A New Test Paradigm for Semiconductor Memories in the Nano-Era

Master Thesis (2011)
Contributor(s)

S. Hamdioui – Mentor

Copyright
© 2011 Krishnaswami, V.
Publication Year
2011
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Due to rapid and continuous technology scaling, faults in semiconductor memories (and in ICs in general) are becoming pervasive and weak in nature; weak faults are faults that pass the test program because they do not lead to erroneous behavior during testing. Nevertheless, they may cause a system failure in the field. As a result, the number of test escapes increases, while it becomes increasingly difficult to determine the nature of the failures. Components with weak faults that fail at board and system level are sent back to suppliers, only to be returned as No Trouble Found (NTF). The conventional memory test approach assumes the presence of a single defect at a time causing a strong fault (one that leads to an error in the system), and is therefore unable to deal with weak faults. This thesis presents a new memory test approach able to detect weak faults; it is based on assuming the presence of multiple weak faults at a time in a memory system, rather than a single strong fault at a time. Detecting weak faults reduces the number of escapes, and hence also the number of NTFs. The experimental analysis, performed with SPICE simulation for a case study, shows, for example, that when two simultaneous weak faults are assumed, the missing (defect) coverage can be reduced by up to 10% compared with the conventional approach.
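The core idea, that two weak defects which individually pass the test may together produce an observable error, can be illustrated with a minimal sketch. The Python fragment below uses a toy additive "severity" model with an arbitrary detection threshold; it is only an illustration of the multi-weak-fault assumption, not the thesis's actual SPICE-based fault model, and the names (THRESHOLD, detectable) are hypothetical.

import random
from itertools import combinations

random.seed(0)

# Toy model (illustrative only): each defect has a severity value, and a
# group of defects causes an observable (strong) fault, detectable by the
# test, only if its combined severity crosses a threshold.
THRESHOLD = 1.0
defects = [random.uniform(0.3, 1.2) for _ in range(200)]

def detectable(group):
    """A defect group is observable if its combined severity is strong."""
    return sum(group) >= THRESHOLD

# Conventional assumption: one defect at a time, strong faults only.
strong = [d for d in defects if detectable([d])]
weak = [d for d in defects if not detectable([d])]

# New assumption: additionally consider two simultaneous weak defects.
pairs_detected = sum(1 for a, b in combinations(weak, 2)
                     if detectable([a, b]))
total_pairs = len(weak) * (len(weak) - 1) // 2

print(f"strong (single-defect) faults detected: {len(strong)}/{len(defects)}")
print(f"weak faults missed under the single-fault assumption: {len(weak)}")
print(f"weak pairs that become observable together: "
      f"{pairs_detected}/{total_pairs}")

In this toy model, every weak defect escapes a conventional single-fault test, whereas many weak pairs exceed the threshold jointly and become detectable, which is the mechanism by which the multi-weak-fault assumption reduces missing coverage.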

Files

Thesis.pdf
(pdf | 4.65 MB)
License info not available