Suppressing standby current in memories is critical in low-power design. By lowering the supply voltage (VDD) to its standby limit, the data retention voltage (DRV), SRAM leakage power can be reduced substantially. The theoretical DRV limit is derived to be 52 mV for a 90 nm technology at room temperature; transistor mismatches raise the DRV above this limit. Based on sub-threshold circuit analysis, a practical DRV model is developed and verified with measurement data from several test chips in 130 nm and 90 nm technologies. By reducing the standby VDD of a 32K-bit 130 nm industrial SRAM IP module to around 490 mV (a 390 mV worst-case DRV plus a 100 mV guard band), an 85% leakage power saving is measured relative to the typical standby power at 1 V.

Since the DRV is a strong function of both process and design parameters, the SRAM cell can be optimized to reduce it. Body bias and device channel length are shown to be the most effective knobs for minimizing DRV, a result confirmed with measurement data from a 90 nm SRAM test chip. Building on these results, the feasibility of a 270 mV standby VDD is demonstrated for an optimized 4K-bit SRAM in a 90 nm technology, yielding a 97% leakage power reduction. Simulation results further show that dynamically configuring the body bias during read and write operations improves the active-operation noise margins and data access speed.

Correcting retention errors with an error correction code (ECC) provides a further opportunity to reduce the SRAM standby VDD. To establish a power-per-bit metric, the SRAM leakage power is modeled as a function of the ECC parameters, the DRV distribution, and the standby VDD. This metric is optimized using ECC theory to obtain fundamental bounds on the power saving enabled by error-tolerant design. Taking practical design requirements into account, an error-tolerant SRAM design with a (31, 26, 3) Hamming code is proposed, introducing a further power reduction of 33%.
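The tradeoff behind the power-per-bit metric can be sketched numerically: an ECC that corrects one failing cell per word allows a lower standby VDD (the word survives unless two or more cells exceed their DRV), at the cost of an n/k storage overhead. The Gaussian DRV parameters, leakage model, and failure target below are illustrative assumptions, not measured values from this work.

```python
# Sketch of the power-per-bit metric behind the error-tolerant standby scheme.
import math

# Illustrative assumptions (NOT measured values):
MU_DRV, SIGMA_DRV = 0.25, 0.035   # assumed Gaussian DRV distribution, volts
P_FAIL_TARGET = 1e-9              # allowed retention-failure prob. per word
N, K, T = 31, 26, 1               # (31, 26, 3) Hamming code corrects T = 1 bit

def p_bit_fail(vdd):
    """Probability that a cell's DRV exceeds the standby VDD (Gaussian tail)."""
    return 0.5 * math.erfc((vdd - MU_DRV) / (SIGMA_DRV * math.sqrt(2.0)))

def p_word_fail(vdd, n, t):
    """Probability of more than t cell failures among n bits (binomial tail)."""
    p = p_bit_fail(vdd)
    ok = sum(math.comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(t + 1))
    return 1.0 - ok

def min_standby_vdd(n, t):
    """Smallest standby VDD (1 mV steps) meeting the failure target."""
    vdd = 0.10
    while p_word_fail(vdd, n, t) > P_FAIL_TARGET:
        vdd += 0.001
    return vdd

def leak_power_per_bit(vdd):
    """Toy sub-threshold leakage model: P = VDD * I0 * exp(a * VDD)."""
    I0, a = 1e-9, 4.0
    return vdd * I0 * math.exp(a * vdd)

v_plain = min_standby_vdd(K, 0)   # no ECC: every data bit must retain
v_ecc = min_standby_vdd(N, T)     # ECC: one failure per word is correctable
p_plain = leak_power_per_bit(v_plain)
p_ecc = (N / K) * leak_power_per_bit(v_ecc)  # n/k storage overhead

print(f"standby VDD: {v_plain:.3f} V without ECC, {v_ecc:.3f} V with ECC")
print(f"power-per-bit saving from ECC: {100.0 * (1.0 - p_ecc / p_plain):.0f}%")
```

Under these assumed parameters the single-error-correcting code lowers the feasible standby VDD enough that the leakage saving outweighs the n/k overhead; the thesis optimizes this same tradeoff against the measured DRV distribution.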
Both the circuit optimization and the error-tolerant architecture are implemented in a 90 nm 26K-bit ultra-low-leakage SRAM chip. Measurement results show that the memory data can be reliably retained at a 255 mV standby VDD, yielding a 50X leakage power reduction. While the optimization also improves active SRAM operation, the only tradeoff is a 50% larger area, caused by the longer channel length and the ECC overhead. In summary, this work is the first analytical investigation into the voltage limit of SRAM standby operation. The theoretical and practical DRV models provide insight into future low-voltage SRAM designs. Beyond the analytical study, two novel design solutions are developed that aggressively reduce SRAM leakage, and the error-tolerant standby scheme marks the first use of ECC for memory power minimization.



