Automatic detection and classification of honey bee comb cells using deep learning
- Author
- Thiago S. Alves, M. Alice Pinto, Cátia J. Neves, Pedro João Rodrigues, Arnaldo Cândido Júnior, David G. Biron, Paulo Ventura, and Pedro Luiz de Paula Filho
- Subjects
- Beekeeping, Source code, Computer science, Horticulture, Convolutional neural network, Hough transform, DeepBee software, Machine learning, Segmentation, Deep learning, Forestry, Pattern recognition, Honey bee, Cell classification, Semantic segmentation, Brood, Computer Science Applications, Artificial intelligence, Agronomy and Crop Science, Apis mellifera L.
- Abstract
In a scenario of worldwide honey bee decline, assessing colony strength is becoming increasingly important for sustainable beekeeping. Temporal counts of the number of comb cells with brood and food reserves offer researchers data for multiple applications, such as modelling colony dynamics, and give beekeepers information on colony strength, an indicator of colony health and honey yield. Counting cells manually in comb images is labour-intensive, tedious, and prone to error. Herein, we developed free software, named DeepBee©, capable of automatically detecting cells in comb images and classifying their contents into seven classes. By distinguishing cells occupied by eggs, larvae, capped brood, pollen, nectar, honey, and other contents, DeepBee© allows an unprecedented level of accuracy in cell classification. Using the Circle Hough Transform and semantic segmentation, we obtained a cell detection rate of 98.7%, which is 16.2% higher than the best result found in the literature. For classification of comb cells, we trained and evaluated thirteen different convolutional neural network (CNN) architectures: DenseNet (121, 169, and 201); InceptionResNetV2; InceptionV3; MobileNet; MobileNetV2; NasNet; NasNetMobile; ResNet50; VGG (16 and 19); and Xception. MobileNet proved to be the best compromise between training cost and accuracy, taking ~9 s to process all cells in a comb image and achieving an F1-score of 94.3%. We present the technical details needed to build a complete pipeline for classifying and counting comb cells, and we make the CNN models, source code, and datasets publicly available. With this effort, we hope to have expanded the frontier of apicultural precision analysis by providing a high-performance tool and its source code to foster improvement by third parties (https://github.com/AvsThiago/DeepBeesource). This research was developed in the framework of the project “BeeHope - Honeybee conservation centers in Western Europe: an innovative strategy using sustainable beekeeping to reduce honeybee decline”, funded through the 2013-2014 BiodivERsA/FACCE-JPI Joint call for research proposals, with the national funders FCT (Portugal), CNRS (France), and MEC (Spain).
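Below is a minimal Python sketch of the kind of pipeline the abstract describes: the Circle Hough Transform (via OpenCV) to locate comb cells, followed by a MobileNet-based CNN (via Keras) to classify each cell crop into the seven classes. The image path, class labels, patch size, Hough parameters, and untrained model head are illustrative assumptions, not the authors' released DeepBee© code or weights.

```python
# Sketch of a comb-cell detection and classification pipeline, assuming
# OpenCV and TensorFlow/Keras; parameters and labels are hypothetical.
import cv2
import numpy as np
from tensorflow.keras.applications import MobileNet
from tensorflow.keras import layers, models

CLASSES = ["egg", "larva", "capped_brood", "pollen", "nectar", "honey", "other"]  # assumed label order

def detect_cells(gray_image):
    """Locate candidate comb cells as circles (x, y, r) with the Circle Hough Transform."""
    blurred = cv2.medianBlur(gray_image, 5)
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=20,
        param1=100, param2=30, minRadius=10, maxRadius=35,  # radii depend on image resolution
    )
    return [] if circles is None else np.round(circles[0]).astype(int)

def build_classifier(input_size=64):
    """MobileNet backbone with a small softmax head for the seven cell classes (untrained here)."""
    base = MobileNet(include_top=False, weights=None,
                     input_shape=(input_size, input_size, 3), pooling="avg")
    head = layers.Dense(len(CLASSES), activation="softmax")(base.output)
    return models.Model(base.input, head)

def classify_cells(image_bgr, circles, model, input_size=64):
    """Crop a square patch around each detected cell and predict its class label."""
    labels = []
    for x, y, r in circles:
        patch = image_bgr[max(0, y - r):y + r, max(0, x - r):x + r]
        if patch.size == 0:
            continue
        patch = cv2.resize(patch, (input_size, input_size)) / 255.0
        probs = model.predict(patch[np.newaxis], verbose=0)[0]
        labels.append(CLASSES[int(np.argmax(probs))])
    return labels

if __name__ == "__main__":
    img = cv2.imread("comb.jpg")                                  # hypothetical comb image
    cells = detect_cells(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY))
    model = build_classifier()                                    # load trained weights in practice
    print(classify_cells(img, cells, model)[:10])
```

The published work additionally refines Hough detections with semantic segmentation; this sketch shows only the circle-detection and per-cell classification steps.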
- Published
- 2020