4 results for "Jamtsho, Yeshi"
Search Results
2. EEEA-Net: An Early Exit Evolutionary Neural Architecture Search.
- Author
- Termritthikun, Chakkrit, Jamtsho, Yeshi, Ieamsaard, Jirarat, Muneesawang, Paisarn, and Lee, Ivan
- Subjects
- Evolutionary algorithms, graphics processing units, convolutional neural networks, algorithms, image recognition (computer vision), error rates
- Abstract
The goals of this research were to search for Convolutional Neural Network (CNN) architectures suitable for an on-device processor with limited computing resources, at a substantially lower Neural Architecture Search (NAS) cost. A new algorithm, Early Exit Population Initialisation (EE-PI) for an Evolutionary Algorithm (EA), was developed to achieve both goals. EE-PI reduces the total number of parameters in the search process by keeping only models whose parameter count falls below a maximum threshold; any model exceeding the threshold is discarded and a new model is sought to replace it. This reduces the number of parameters, the memory used for model storage, and the processing time while maintaining the same accuracy. The search time was reduced to 0.52 GPU days, a substantial saving compared with the 4 GPU days of NSGA-Net, the 3,150 GPU days of AmoebaNet, and the 2,000 GPU days of NASNet. As a class of network algorithms, Early Exit Evolutionary Algorithm networks (EEEA-Nets) yield architectures with minimal error and computational cost for a given dataset. In experiments on the CIFAR-10, CIFAR-100, and ImageNet datasets, EEEA-Net achieved the lowest error rate among state-of-the-art NAS models: 2.46% on CIFAR-10, 15.02% on CIFAR-100, and 23.8% on ImageNet. We further applied this image recognition architecture to other tasks, such as object detection, semantic segmentation, and keypoint detection, where EEEA-Net-C2 outperformed MobileNet-V3 on all of them. (The algorithm code is available at https://github.com/chakkritte/EEEA-Net.) [ABSTRACT FROM AUTHOR]
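The EE-PI filtering rule described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the architecture encoding, random_model, count_parameters, and the 300,000-parameter budget are all hypothetical stand-ins; only the accept-or-replace logic follows the abstract's description.

```python
import random

MAX_PARAMS = 300_000  # hypothetical budget; the paper's actual threshold may differ


def random_model() -> dict:
    """Hypothetical stand-in: sample a random architecture encoding."""
    return {"layers": random.randint(4, 20), "width": random.choice([16, 32, 64])}


def count_parameters(model: dict) -> int:
    """Hypothetical stand-in: rough parameter estimate for 3x3 convolutions."""
    return model["layers"] * model["width"] ** 2 * 9


def ee_pi_init(population_size: int, max_params: int = MAX_PARAMS) -> list:
    """Early Exit Population Initialisation (sketch).

    Candidates above the parameter threshold are discarded early and
    replaced with freshly sampled models, so the initial population
    contains only architectures within the budget.
    """
    population = []
    while len(population) < population_size:
        candidate = random_model()
        if count_parameters(candidate) <= max_params:
            population.append(candidate)  # accept: within the parameter budget
        # else: exit early on this candidate and sample a replacement
    return population


population = ee_pi_init(population_size=40)
```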
- Published
- 2021
3. Evolutionary neural architecture search based on efficient CNN models population for image classification
- Author
- Termritthikun, Chakkrit, Jamtsho, Yeshi, Muneesawang, Paisarn, Zhao, Jia, and Lee, Ivan
- Subjects
- Neural architecture search, Computer Networks and Communications, Hardware and Architecture, Media Technology, deep learning, multi-objective evolutionary algorithms, Software, image classification
- Abstract
Refereed/Peer-reviewed. The aim of this work is to search for a Convolutional Neural Network (CNN) architecture, suitable for mobile devices, that performs well across all factors: accuracy, memory footprint, and computing time. Although deep learning has evolved for use on devices with minimal resources, its deployment is hampered by the fact that such devices are not designed to handle complex models such as CNN architectures. To address this limitation, a Neural Architecture Search (NAS) strategy is considered that employs a Multi-Objective Evolutionary Algorithm (MOEA) to create an efficient and robust CNN architecture by pursuing three objectives: fast processing time, reduced storage, and high accuracy. Furthermore, we propose a new Efficient CNN Population Initialization (ECNN-PI) method that generates the first-generation population from a combination of random models and selected strong models. To validate the proposed method, CNN models were trained on the CIFAR-10, CIFAR-100, ImageNet, STL-10, FOOD-101, THFOOD-50, FGVC Aircraft, DTD, and Oxford-IIIT Pets benchmark datasets. The MOEA-Net algorithm outperformed other models on CIFAR-10, whereas MOEA-Net with the ECNN-PI method outperformed other models on both CIFAR-10 and CIFAR-100. Furthermore, both MOEA-Net and MOEA-Net with ECNN-PI outperformed DARTS, P-DARTS, and Relative-NAS on small-scale multi-class and fine-grained datasets.
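The ECNN-PI idea, mixing random models with selected strong ones in the first generation, lends itself to a short sketch. This is a minimal illustration under stated assumptions, not the paper's code: random_model, the STRONG_MODELS pool, and the 25% strong fraction are hypothetical placeholders.

```python
import random

# Hypothetical pool of known strong architecture encodings; the actual
# seed models selected in the paper may differ.
STRONG_MODELS = [
    {"layers": 12, "width": 32},
    {"layers": 16, "width": 24},
    {"layers": 20, "width": 16},
]


def random_model() -> dict:
    """Hypothetical stand-in: sample a random architecture encoding."""
    return {"layers": random.randint(4, 20), "width": random.choice([16, 32, 64])}


def ecnn_pi_init(population_size: int, strong_fraction: float = 0.25) -> list:
    """Efficient CNN Population Initialization (sketch).

    Seed the first generation with a mix of selected strong models and
    random models, instead of random models alone, so the search starts
    from a partly pre-vetted population.
    """
    n_strong = int(population_size * strong_fraction)  # assumed ratio
    population = [dict(random.choice(STRONG_MODELS)) for _ in range(n_strong)]
    population += [random_model() for _ in range(population_size - n_strong)]
    random.shuffle(population)  # avoid positional bias in later selection
    return population


population = ecnn_pi_init(population_size=40)
```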
- Published
- 2022
4. EEEA-Net: An Early Exit Evolutionary Neural Architecture Search
- Author
- Termritthikun, Chakkrit, Jamtsho, Yeshi, Ieamsaard, Jirarat, Muneesawang, Paisarn, and Lee, Ivan
- Subjects
- FOS: Computer and information sciences, Computer Science - Machine Learning (cs.LG), Computer Science - Computer Vision and Pattern Recognition (cs.CV), Computer Science - Neural and Evolutionary Computing (cs.NE), Artificial Intelligence, machine learning, convolutional neural network, evolutionary algorithm, neural architecture search, network architecture, population, word error rate, segmentation, object detection, deep learning, image classification, multi-objective evolutionary algorithms, Electrical and Electronic Engineering, Control and Systems Engineering
- Abstract
The goals of this research were to search for Convolutional Neural Network (CNN) architectures suitable for an on-device processor with limited computing resources, at a substantially lower Neural Architecture Search (NAS) cost. A new algorithm, Early Exit Population Initialisation (EE-PI) for an Evolutionary Algorithm (EA), was developed to achieve both goals. EE-PI reduces the total number of parameters in the search process by keeping only models whose parameter count falls below a maximum threshold; any model exceeding the threshold is discarded and a new model is sought to replace it. This reduces the number of parameters, the memory used for model storage, and the processing time while maintaining the same accuracy. The search time was reduced to 0.52 GPU days, a substantial saving compared with the 4 GPU days of NSGA-Net, the 3,150 GPU days of AmoebaNet, and the 2,000 GPU days of NASNet. As a class of network algorithms, Early Exit Evolutionary Algorithm networks (EEEA-Nets) yield architectures with minimal error and computational cost for a given dataset. In experiments on the CIFAR-10, CIFAR-100, and ImageNet datasets, EEEA-Net achieved the lowest error rate among state-of-the-art NAS models: 2.46% on CIFAR-10, 15.02% on CIFAR-100, and 23.8% on ImageNet. We further applied this image recognition architecture to other tasks, such as object detection, semantic segmentation, and keypoint detection, where EEEA-Net-C2 outperformed MobileNet-V3 on all of them. (The algorithm code is available at https://github.com/chakkritte/EEEA-Net.)
Comment: Published at Engineering Applications of Artificial Intelligence; code and pretrained models available at https://github.com/chakkritte/EEEA-Net
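Around either initialisation, the search itself is multi-objective, trading error against computational cost. Below is a generic, simplified sketch of one evolutionary generation with Pareto-style survival; the mutation operator, the cost proxies in evaluate, and the selection scheme are illustrative assumptions, not the EEEA-Net implementation.

```python
import random


def evaluate(model: dict) -> tuple:
    """Hypothetical proxies for the two objectives (both minimised):
    validation error and parameter count."""
    params = model["layers"] * model["width"] ** 2 * 9
    error = random.uniform(0.02, 0.30)  # placeholder for a real validation run
    return (error, params)


def dominates(a: tuple, b: tuple) -> bool:
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def mutate(model: dict) -> dict:
    """Illustrative mutation: perturb the depth or width of the encoding."""
    child = dict(model)
    if random.random() < 0.5:
        child["layers"] = max(4, child["layers"] + random.choice([-2, 2]))
    else:
        child["width"] = random.choice([16, 32, 64])
    return child


def evolve_step(population: list) -> list:
    """One generation: mutate every model, then let non-dominated models
    survive first and fill the remaining slots arbitrarily."""
    combined = population + [mutate(m) for m in population]
    scored = [(evaluate(m), m) for m in combined]
    front = [m for s, m in scored if not any(dominates(t, s) for t, _ in scored)]
    rest = [m for _, m in scored if m not in front]
    return (front + rest)[: len(population)]


population = [{"layers": random.randint(4, 20), "width": 32} for _ in range(10)]
for _ in range(5):
    population = evolve_step(population)
```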
- Published
- 2021