1. Spatio-temporal deep neural networks for accession classification of Arabidopsis plants using image sequences
- Author
- Jayant Jagtap and Shrikrishna Kolhar
- Subjects
Ecology, Artificial neural network, Computer science, Applied Mathematics, Ecological Modeling, Pattern recognition, Plant taxonomy, Convolutional neural network, Accession, Computer Science Applications, Identification (information), Computational Theory and Mathematics, Modeling and Simulation, Artificial intelligence, Spatial analysis, Ecology, Evolution, Behavior and Systematics, Selection (genetic algorithm), Transformer (machine learning model)
- Abstract
Recently, image-based plant phenotyping has been used extensively for plant trait estimation, plant accession classification and selection, and plant stress analysis. Plant accession classification and selection help identify plants tolerant to local climatic conditions. Over the past few years, convolutional neural network (CNN) models have been used predominantly to classify plants and plant diseases from static plant images using spatial features. In this paper, we introduce three methods, namely 3-dimensional (3-D) CNN, CNN with convolutional long short-term memory (ConvLSTM) layers, and vision transformer, that use temporal information along with spatial information for plant accession classification. We use a publicly available Arabidopsis plant accession classification dataset to test the performance of these methods. Time-series color images of four different Arabidopsis accessions are given as input to the neural network model, which in turn identifies the plant accession. All three methods outperform the existing methods in the literature in terms of average accession classification accuracy. The vision transformer achieves the highest classification accuracy of 98.59%, at the cost of a much larger number of trainable parameters than the other two methods. On the other hand, CNN-ConvLSTM achieves a comparable accuracy of 97.97% with far fewer trainable parameters than the vision transformer. In the future, these models can also be used to identify unknown plant accessions and predict plant growth signatures under different climatic conditions.
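To illustrate the general spatio-temporal idea described in the abstract (spatial features per frame, temporal modeling across the image sequence), the sketch below builds a small CNN-ConvLSTM classifier in Keras. It is not the authors' exact architecture; the frame count, image resolution, layer widths, and optimizer settings are assumptions chosen only for illustration, with the number of classes set to the four Arabidopsis accessions mentioned in the paper.

```python
# Minimal sketch of a CNN-ConvLSTM sequence classifier (assumed architecture,
# not the paper's exact model): 10-frame RGB sequences at 128x128, 4 classes.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_FRAMES, HEIGHT, WIDTH, CHANNELS = 10, 128, 128, 3  # assumed input shape
NUM_CLASSES = 4  # four Arabidopsis accessions

model = models.Sequential([
    layers.Input(shape=(NUM_FRAMES, HEIGHT, WIDTH, CHANNELS)),
    # Per-frame spatial feature extraction: a small CNN applied to every
    # time step via TimeDistributed.
    layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu", padding="same")),
    layers.TimeDistributed(layers.MaxPooling2D(2)),
    layers.TimeDistributed(layers.Conv2D(32, 3, activation="relu", padding="same")),
    layers.TimeDistributed(layers.MaxPooling2D(2)),
    # ConvLSTM layer models the temporal evolution of the spatial feature maps.
    layers.ConvLSTM2D(32, kernel_size=3, padding="same", return_sequences=False),
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # accession probabilities
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

A 3-D CNN variant would instead apply `Conv3D` directly over the stacked frames, and a vision transformer would tokenize spatio-temporal patches; the ConvLSTM route is shown here because the abstract highlights its favorable accuracy-to-parameter trade-off.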
- Published
- 2021