Artificial neural network
Algorithm
Computer science
Topology (electrical circuits)
Mathematics
Artificial intelligence
Combinatorics
Authors
Hyundong Jang, Kihoon Nam, Hyeok Yun, Kyeongrae Cho, Seungjoon Eom, Min Sang Park, Rock-Hyun Baek
Identifier
DOI: 10.1109/ted.2023.3288840
Abstract
We used a novel neural network (NN) approach to predict the short-term threshold voltage ($V_t$) of 3-D NAND flash memory. We then optimized the cell structure parameters to improve retention with minimal $V_t$ loss. In other words, we proposed a new NN architecture that combines a multilayer perceptron (MLP) and a long short-term memory (LSTM) NN to predict time-variant characteristics and optimize them with respect to the time-invariant structural parameters. We effectively avoided local minima by modifying the gradient-descent optimization process to find multiple solutions and estimate the optimal point. The NN discovered an optimized structure that reduces $V_t$ loss by 32.37% compared to the minimum value found in random big data. The optimized structure improved retention through a thicker tunneling oxide (TOX) and a longer spacer length than the technology computer-aided design (TCAD) calibration, resulting in a 4.27% improvement in trapped-electron variation. Additionally, sensitivity analysis through iterative optimization experiments identified the tunneling/blocking oxide and channel thickness as the key parameters influencing $V_t$ loss. Unlike traditional engineering methods that optimize individual parameters separately, NN optimization considers the correlations among all input parameters simultaneously. Therefore, the proposed NN optimization can reduce the need for manual optimization and accelerate the design process by allowing designers to explore a wide range of possibilities quickly, providing advanced optimization strategies that offer new alternatives for improving retention characteristics.
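The abstract names two ideas without giving code: a hybrid MLP + LSTM surrogate that maps time-invariant cell-structure parameters to a time-variant $V_t$-loss curve, and a modified gradient-descent optimization run from many starting points to avoid local minima. The sketch below is a minimal, hypothetical illustration of that pattern in PyTorch; the parameter count, number of retention-time steps, layer sizes, and all function and variable names are assumptions for illustration only, not the authors' implementation.

```python
# Hedged sketch (not the paper's code): an MLP encodes time-invariant structure
# parameters, an LSTM unrolls the time-variant Vt-loss curve, and multi-start
# gradient descent over the frozen surrogate's inputs searches for a structure
# with minimal predicted Vt loss. All sizes and names are illustrative assumptions.
import torch
import torch.nn as nn

N_PARAMS = 8   # assumed number of structural parameters (e.g., TOX, spacer length, ...)
N_STEPS = 16   # assumed number of retention-time samples
HIDDEN = 64

class MLPLSTMSurrogate(nn.Module):
    """MLP encodes the time-invariant structure; LSTM predicts the Vt-loss curve."""
    def __init__(self):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(N_PARAMS, HIDDEN), nn.ReLU(),
            nn.Linear(HIDDEN, HIDDEN), nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=HIDDEN, hidden_size=HIDDEN, batch_first=True)
        self.head = nn.Linear(HIDDEN, 1)   # Vt loss at each retention-time step

    def forward(self, params):             # params: (batch, N_PARAMS)
        h = self.mlp(params)                # (batch, HIDDEN)
        seq = h.unsqueeze(1).repeat(1, N_STEPS, 1)  # same encoding fed at every step
        out, _ = self.lstm(seq)             # (batch, N_STEPS, HIDDEN)
        return self.head(out).squeeze(-1)   # (batch, N_STEPS) predicted Vt-loss curve


def optimize_structure(model, n_restarts=32, steps=500, lr=1e-2):
    """Multi-start gradient descent on the surrogate's inputs: many random initial
    structures are refined independently and the best result is kept, which is one
    way to sidestep local minima."""
    model.eval()
    best_loss, best_params = float("inf"), None
    for _ in range(n_restarts):
        x = torch.rand(1, N_PARAMS, requires_grad=True)   # normalized [0, 1] structure
        opt = torch.optim.Adam([x], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            vt_loss = model(x)[:, -1].mean()   # predicted Vt loss at the last time step
            vt_loss.backward()
            opt.step()
            with torch.no_grad():
                x.clamp_(0.0, 1.0)             # keep parameters in the valid range
        if vt_loss.item() < best_loss:
            best_loss, best_params = vt_loss.item(), x.detach().clone()
    return best_params, best_loss


if __name__ == "__main__":
    surrogate = MLPLSTMSurrogate()             # in practice, trained on TCAD data first
    params, loss = optimize_structure(surrogate, n_restarts=4, steps=50)
    print("best normalized structure:", params, "predicted Vt loss:", loss)
```

Repeating the MLP encoding at every LSTM step is one simple way to condition a time series on time-invariant inputs, and keeping the best of several independently refined random starts is a common way to estimate a near-global optimum; the authors' exact coupling of the two networks and their restart strategy may differ.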