The authors present simulation results, an experimental technique, and a model for estimating the temperature rise and time-to-failure (TTF) of interconnects. They introduce the concept of a derating factor for electromigration TTF due to self-heating: the factor by which the lifetime is reduced by the temperature rise in the interconnect. It is shown that, in the limit of high frequencies, the temperature rise can be estimated in a straightforward manner from the root-mean-square current density (Jrms) once the thermal resistance of the structure has been determined from DC measurements. The implication of this dependence on Jrms for the usually quoted average-current-density (Jave) design rule is examined. It is determined that self-heating is probably not significant at the usual design-rule average current density of 1 × 10⁵ A/cm² for operation at frequencies >10 MHz and duty factors >0.1%. However, if the design rule is increased to 1 × 10⁶ A/cm², self-heating might become significant.
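The approach summarized above can be sketched numerically: Joule heating at the RMS current density, combined with a thermal resistance (which the paper obtains from DC measurements), gives a temperature rise, and the Arrhenius temperature term of Black's equation then gives the lifetime derating factor. The sketch below is a minimal illustration of that chain of reasoning; all parameter values (resistivity, line thickness, thermal resistance, activation energy, reference temperature) are assumed for illustration and are not taken from the paper.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def temp_rise(j_rms, rho, thickness, theta_a):
    """Steady-state temperature rise (K) of a line over its substrate.

    Joule power per unit footprint area is rho * j_rms**2 * thickness
    (W/cm^2); multiplying by an area-normalized thermal resistance
    theta_a (K*cm^2/W, e.g. extracted from DC measurements) gives dT.
    """
    return theta_a * rho * j_rms**2 * thickness

def em_derating_factor(d_t, t_ref, e_a=0.7):
    """Factor (>= 1) by which EM lifetime shrinks for a rise d_t.

    Uses only the Arrhenius term of Black's equation,
    TTF ~ exp(E_a / (K_B * T)), so the factor is
    TTF(t_ref) / TTF(t_ref + d_t).
    """
    t_hot = t_ref + d_t
    return math.exp(e_a / K_B * (1.0 / t_ref - 1.0 / t_hot))

# Illustrative (assumed) numbers: Al line, rho = 3e-6 ohm*cm,
# thickness 0.5 um, theta_a = 0.1 K*cm^2/W, reference T = 358 K (85 C).
for j in (1e5, 1e6):
    d_t = temp_rise(j, rho=3e-6, thickness=0.5e-4, theta_a=0.1)
    print(f"J_rms = {j:.0e} A/cm^2: dT = {d_t:.2f} K, "
          f"lifetime derating = {em_derating_factor(d_t, 358.0):.2f}x")
```

With these assumed parameters the rise at 1 × 10⁵ A/cm² is a fraction of a kelvin (derating near 1), while at 1 × 10⁶ A/cm² it is tens of kelvin and the derating factor becomes appreciable, consistent with the abstract's conclusion.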
Number of pages: 4
Journal: Technical Digest - International Electron Devices Meeting
State: Published - 1 Dec 1989
Event: 1989 International Electron Devices Meeting - Technical Digest, Washington, DC, USA (3 Dec 1989 → 6 Dec 1989)