Effects of self-heating on integrated circuit metallization lifetimes

B. K. Liew*, N. W. Cheung, Chen-Ming Hu

*Corresponding author for this work

Research output: Contribution to journal › Conference article


Abstract

The authors present simulation results, an experimental technique, and a model for estimating the temperature rise and time-to-failure (TTF) of interconnects. They introduce the concept of a derating factor for electromigration TTF due to self-heating: the factor by which the lifetime is reduced by the temperature rise in the interconnect. It is shown that, in the high-frequency limit, the temperature rise can be estimated in a straightforward manner from the root-mean-square current density (J_rms) once the thermal resistance of the structure has been determined from DC measurements. The implication of this J_rms temperature dependence for the commonly quoted average current density (J_ave) design rule is examined. Self-heating is probably not significant for the usual design-rule average current density of 1 × 10⁵ A/cm² at operating frequencies above 10 MHz and duty factors above 0.1%. However, if the design rule is increased to 1 × 10⁶ A/cm², self-heating might become significant.
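The reasoning summarized above can be illustrated with a short sketch. The Python snippet below is not from the paper; it assumes a simple steady-state thermal model in which the temperature rise scales with ρ·J_rms² through an assumed volumetric thermal-resistance parameter, and an Arrhenius temperature dependence of electromigration TTF with an assumed activation energy. All parameter names and numerical values are hypothetical and chosen only to show the structure of the calculation.

```python
import math

# Illustrative sketch (not the authors' code): estimate the interconnect
# temperature rise from the RMS current density and a thermal resistance
# extracted from DC measurements, then compute the electromigration-TTF
# derating factor from an Arrhenius-type temperature dependence.
# All parameter values below are assumed for illustration only.

K_B = 8.617e-5  # Boltzmann constant, eV/K


def temperature_rise(j_rms_a_per_cm2, rho_ohm_cm, theta_k_cm3_per_w):
    """Steady-state rise ~ theta * rho * J_rms^2 (high-frequency limit)."""
    power_density = rho_ohm_cm * j_rms_a_per_cm2 ** 2  # W/cm^3 dissipated in the line
    return theta_k_cm3_per_w * power_density            # temperature rise in K


def derating_factor(delta_t_k, t_ref_k=373.0, e_a_ev=0.6):
    """Factor by which electromigration TTF is reduced by the temperature rise."""
    t_hot = t_ref_k + delta_t_k
    # Ratio of TTF at the self-heated temperature to TTF at the reference temperature
    ttf_ratio = math.exp(e_a_ev / K_B * (1.0 / t_hot - 1.0 / t_ref_k))
    return 1.0 / ttf_ratio  # > 1 means the lifetime is shortened by self-heating


# Example with assumed numbers: Al-like resistivity, J_rms = 1e6 A/cm^2
dT = temperature_rise(1e6, rho_ohm_cm=3e-6, theta_k_cm3_per_w=5e-6)
print(f"temperature rise ~ {dT:.1f} K, derating factor ~ {derating_factor(dT):.2f}")
```

With these assumed values the rise is on the order of tens of kelvin and the lifetime is reduced by roughly a factor of two, which is the kind of trade-off the abstract's J_ave design-rule discussion addresses.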

Original language: English
Pages (from-to): 323-326
Number of pages: 4
Journal: Technical Digest - International Electron Devices Meeting
DOIs
State: Published - 1 Dec 1989
Event: 1989 International Electron Devices Meeting - Technical Digest - Washington, DC, USA
Duration: 3 Dec 1989 - 6 Dec 1989
