Innovative pixel-inversion calculation for model-based sub-resolution assist features and optical proximity correction

Jue Chin Yu*, Pei-Chen Yu, Hsueh Yung Chao

*Corresponding author for this work

Research output: Conference contribution in book/report/conference proceeding (peer-reviewed)



We propose an inversion calculation method based on a simple "pixel-flipping" approach. The method features innovative wavefront-expansion and wavefront-based damping techniques that accentuate corrections near the drawn pattern. It is first employed as a stand-alone optical proximity correction (OPC) solution that directly calculates corrected masks with acceptable contours and image contrast. In addition, we propose a model-based pre-OPC flow in which the initial sizing of drawn patterns and the surrounding sub-resolution assist features (SRAFs) are generated simultaneously in a single iteration using this inversion calculation, minimizing technology-transition risks and costs. A mask simplification technique based on central moments is introduced to snap the corrections into 45-degree and axis-aligned line segments. This approach achieves optimized corrections while minimizing the impact on the existing, validated correction flow.
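The abstract does not give the paper's optical model or its wavefront-based damping scheme, but the core "pixel-flipping" inversion idea can be illustrated with a minimal greedy sketch: flip any mask pixel whose flip reduces the error between a simulated image and the drawn target. Here a uniform blur kernel stands in for the lithographic imaging model, and all function and parameter names are our own assumptions, not the paper's:

```python
import numpy as np

def aerial_image(mask, kernel):
    """Crude imaging model: 2-D convolution of the mask with a blur
    kernel. This is a hypothetical stand-in for the optical model,
    not the lithography simulator used in the paper."""
    kh, kw = kernel.shape
    padded = np.pad(mask, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
    out = np.zeros_like(mask, dtype=float)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def pixel_flip_opc(target, kernel, iterations=3):
    """Greedy pixel-flipping inversion: starting from the drawn pattern,
    flip each binary mask pixel and keep the flip only if it strictly
    reduces the squared image-vs-target error."""
    mask = target.astype(float).copy()
    for _ in range(iterations):
        improved = False
        for i in range(mask.shape[0]):
            for j in range(mask.shape[1]):
                err_before = np.sum((aerial_image(mask, kernel) - target) ** 2)
                mask[i, j] = 1.0 - mask[i, j]          # trial flip
                err_after = np.sum((aerial_image(mask, kernel) - target) ** 2)
                if err_after >= err_before:
                    mask[i, j] = 1.0 - mask[i, j]      # revert: no improvement
                else:
                    improved = True
        if not improved:
            break                                      # converged
    return mask
```

A real implementation would replace the brute-force error recomputation with incremental cost updates, and the paper's wavefront-expansion ordering would visit pixels outward from the drawn edges rather than in raster order.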

Original language: English
Title of host publication: Optical Microlithography XXII
State: Published - 29 May 2009
Event: Optical Microlithography XXII - San Jose, CA, United States
Duration: 24 Feb 2009 - 27 Feb 2009

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
ISSN (Print): 0277-786X


Conference: Optical Microlithography XXII
Country: United States
City: San Jose, CA


Keywords

  • Convergence
  • Inversion calculation
  • Optical proximity correction
  • Sub-resolution assist features
