As CMOS technologies scale into deep-submicron nodes, the transistor-density and cost improvements that scaling delivers inevitably come with degraded analog performance and increased device variability. This variability presents a major challenge to reliably building large arrays of mixed-signal circuits in nanometer-scale CMOS processes. To assess the feasibility of building highly sensitive readout integrated circuits (ROICs) for imaging detector arrays in 28 nm bulk CMOS, this work develops a test chip that characterizes performance variability in two areas specifically relevant to image sensors: leakage current and random telegraph signal (RTS) noise. High leakage current in the critical per-pixel reset switch can lead to pixel failure, while low-frequency RTS noise can cause pixels to blink erroneously. In imager arrays with tens of thousands of pixels, devices whose parameters fall beyond the 3-sigma point of the distribution can produce dead or defective pixels. To improve the likelihood of observing such behavior experimentally and to obtain statistically meaningful results, the test chip includes nearly 7,000 devices under test (DUTs) for each device type characterized. NMOS and PMOS devices spanning multiple threshold-voltage flavors and dimensions of interest to both digital and analog designs are included. The characterization platform consists of 4,608 pairs of test-device cells and measurement-circuit cells arrayed in a typical ROIC framework. By applying well-proven ROIC design techniques to the variability-measurement problem, device measurements can be performed mostly in the digital domain, and a statistically significant sample of RTS noise data can be collected quickly. Because high-performance imagers are often cooled to cryogenic temperatures to reduce noise, the test chip was characterized from room temperature down to 78 K.
Initial measurements indicate that while leakage current decreases significantly as temperature drops, RTS noise remains a concern down to cryogenic temperatures.
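The statistical argument for the large DUT count can be sketched with a back-of-the-envelope calculation. Assuming a Gaussian parameter distribution (and illustrative array and sample sizes not taken from the abstract), the expected number of beyond-3-sigma devices follows directly from the two-sided tail probability:

```python
import math

def frac_beyond(n_sigma: float) -> float:
    """Two-sided Gaussian tail probability P(|X - mu| > n_sigma * sigma)."""
    return math.erfc(n_sigma / math.sqrt(2.0))

tail = frac_beyond(3.0)   # ~0.27% of devices lie beyond +/-3 sigma
pixels = 50_000           # illustrative "tens of thousands of pixels" array
duts = 7_000              # roughly the per-type DUT count on the test chip

print(f"P(beyond 3 sigma) = {tail:.4%}")                          # ~0.2700%
print(f"expected outlier devices in a {pixels}-pixel array: "
      f"{pixels * tail:.0f}")                                     # ~135
print(f"expected >3-sigma samples among {duts} DUTs: "
      f"{duts * tail:.0f}")                                       # ~19
```

Under these assumptions, a tens-of-thousands-pixel array contains on the order of a hundred beyond-3-sigma devices, and a ~7,000-DUT sample is expected to capture a dozen or more of them, which is why a sample of this size can characterize tail behavior experimentally.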




