Optimization of Self-Heating Driven Leakage Current Properties of Gate-All-Around Field-Effect Transistors Using Neural Network Modeling and Genetic Algorithm
- Author
- Ilgu Yun and Chuntaek Park
- Subjects
- leakage current, neural network modeling, GAAFETs, genetic algorithm, self-heating effect, thermal time constant, optimization
- Abstract
As semiconductor technology nodes have become finer and more complex, progressive scaling has been pursued to achieve higher device densities. Accordingly, three-dimensional (3D) channel field-effect transistors (FETs), such as fin-shaped FETs (FinFETs) and gate-all-around FETs (GAAFETs), have become popular because they increase the effective channel surface area (Weff) under this scaling strategy. Because their channels are completely enclosed by gate oxide and metal, these 3D-channel FETs are prone to the self-heating effect (SHE). The SHE is generally known to degrade the on-state drain current; however, when AC pulsed inputs are applied to these devices, it also degrades the off-state leakage current during the off-phase of the pulse. In this study, an optimization methodology combining neural network modeling and a genetic algorithm is examined to minimize the leakage current generated by the SHE.
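The abstract does not give implementation details, but the general surrogate-plus-GA workflow it names can be illustrated. The sketch below is a minimal, hypothetical example: a neural network is fit to synthetic (design parameter, leakage current) samples standing in for device simulation data, and a simple real-coded genetic algorithm then searches the parameter space for the design minimizing the predicted leakage. The parameter names, ranges, GA settings, and the synthetic leakage function are all illustrative assumptions, not the authors' actual model or data.

```python
# Minimal sketch of a neural-network surrogate plus genetic-algorithm
# optimization loop, assuming synthetic data in place of device simulations.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical design parameters: [channel width (nm), gate-oxide
# thickness (nm)]; the synthetic "leakage" stands in for measured or
# TCAD-simulated off-state leakage current.
def synthetic_leakage(x):
    w, tox = x[:, 0], x[:, 1]
    return 1e-9 * np.exp(0.05 * w - 0.8 * tox) + 1e-12 * w

lo, hi = np.array([5.0, 0.5]), np.array([30.0, 3.0])
X = rng.uniform(lo, hi, size=(500, 2))
y = synthetic_leakage(X)

# Neural-network surrogate for the leakage response surface
# (log scale keeps the targets well conditioned).
nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                  random_state=0).fit(X, np.log10(y))

# Simple real-coded genetic algorithm minimizing the predicted leakage.
pop = rng.uniform(lo, hi, size=(60, 2))
for gen in range(40):
    fitness = nn.predict(pop)            # lower = less predicted leakage
    parents = pop[np.argsort(fitness)[:30]]   # truncation selection
    # Uniform crossover between randomly paired parents.
    i, j = rng.integers(0, 30, 60), rng.integers(0, 30, 60)
    mask = rng.random((60, 2)) < 0.5
    pop = np.where(mask, parents[i], parents[j])
    # Gaussian mutation, clipped to the design bounds.
    pop += rng.normal(0.0, 0.02 * (hi - lo), pop.shape)
    pop = np.clip(pop, lo, hi)

best = pop[np.argmin(nn.predict(pop))]
print("best design (width nm, t_ox nm):", best)
```

In practice the training samples would come from pulsed device simulations capturing the SHE, and the design vector would include whatever geometry and bias parameters the study actually optimized; the loop structure above is only meant to show how the surrogate decouples the GA search from expensive simulations.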
- Published
- 2021