DRAM Reliability

Aging Analysis and Reliability Prediction Model

Abstract

An increasing number of critical applications use DRAM as the main memory in their computing systems. It is therefore extremely important that these memories function correctly throughout their lifetime in order to prevent catastrophic failures. Already during the design phase, the reliability of the circuit needs to be predicted so that a reasonable lifetime expectation can be given. Although the importance of reliability analysis is clear, little research on DRAM reliability is available to designers in the literature. This thesis proposes a two-phase DRAM reliability prediction model that can be used in the circuit design phase. In the first phase, the circuit performance is analyzed under the different wear-out mechanisms affecting the different subcomponents of the design. In the second phase, the results of the first phase are used to determine the reliability of the circuit.
In the first phase, the wear-out effects of Bias Temperature Instability (BTI), Hot Carrier Injection (HCI) and radiation trapping, as well as transistor mismatch, are examined. BTI is modeled using the reaction-diffusion (RD) model, HCI with the lucky electron model, and transistor mismatch with Pelgrom's model. Wear-out caused by radiation trapping is modeled as Gate Induced Drain Leakage (GIDL), the data for which are derived from retention time degradation measurements of an irradiated commercial DRAM. The circuit performance is analyzed per subcomponent of the DRAM design across a range of metrics. Furthermore, the aging effects on a downscaled version of the circuit are investigated.
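As an illustration of the transistor-level wear-out models used in the first phase, a minimal Python sketch is given below. The functional forms are standard (a power law in stress time for BTI under the RD model, a power-law fit for HCI degradation, and Pelgrom's 1/sqrt(W·L) mismatch law), but all coefficients and exponents shown are illustrative placeholders and not the values used in this thesis.
    import math

    def bti_delta_vth(t_stress, A=1e-4, n=1.0/6.0):
        # RD-model BTI shift: a power law in stress time, delta_Vth = A * t^n,
        # with n around 1/6 for H2 diffusion. A and n are assumed values.
        return A * t_stress ** n

    def hci_delta_vth(t_stress, B=5e-5, m=0.5):
        # HCI degradation in the lucky-electron framework is likewise commonly
        # fitted as a power law in stress time; B lumps the bias/field dependence.
        return B * t_stress ** m

    def pelgrom_sigma_vth(W_um, L_um, A_vt_mV_um=3.0):
        # Pelgrom's model: sigma(delta_Vth) = A_VT / sqrt(W*L), with A_VT in mV*um
        # (the value here is a typical, assumed number).
        return A_vt_mV_um / math.sqrt(W_um * L_um)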
In the second phase, reliability functions are derived per wear-out mechanism and per subcomponent from the results of phase one. These reliability functions are then combined in an analytical reliability model which yields the overall circuit reliability.
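One common analytical form for such a combination, sketched below, assumes Weibull-distributed failures and a series system: the circuit fails as soon as any subcomponent fails by any mechanism, so the system reliability is the product of the individual reliability functions. The (eta, beta) parameters and the listed failure modes are hypothetical placeholders, not fitted thesis data.
    import math

    def weibull_reliability(t, eta, beta):
        # Weibull reliability function R(t) = exp(-(t/eta)^beta).
        return math.exp(-((t / eta) ** beta))

    def system_reliability(t, failure_modes):
        # Series combination over all (subcomponent, wear-out mechanism) pairs:
        # the system survives only if every individual mode survives.
        r = 1.0
        for eta, beta in failure_modes:
            r *= weibull_reliability(t, eta, beta)
        return r

    # Hypothetical modes, e.g. sense-amplifier/BTI, sense-amplifier/HCI,
    # reference-cell/BTI -- placeholder parameters only.
    modes = [(9e8, 1.2), (1.5e9, 1.0), (5e9, 0.9)]
    print(system_reliability(1e8, modes))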
The results from the first phase show degradation of the retention time as well as degradation of the sensing delay metrics. Due to the relatively low duty factor of memory cells, BTI and HCI have only a minor impact on the memory cell circuit performance. Radiation, however, renders the circuit useless once the Total Ionizing Dose (TID) exceeds 126 krad. On subcomponents other than the memory cells, BTI and HCI shift the reference voltage, which results in an increase of the retention time. BTI and HCI stressing of the sense amplifier also slightly increases the retention time, but mainly increases the sensing delay. For both the reference cells and the sense amplifier, higher radiation doses break down the circuit completely. The same effects hold for the downscaled circuit, although they are more severe there than in the unscaled device.
The system reliability prediction in the second phase shows the importance of predicting the reliability of the individual subcircuits and wear-out mechanisms separately. This individual analysis makes clear that the system reliability is mostly impacted by the degradation of the sense amplifier delay due to BTI and HCI. Other metric variations, such as the increase in retention time caused by the reference cells, have less impact. The system reliability was found to decrease to 0.84 after 1·10^8 s at a stressing temperature of 300 K.