5 results for "Mark Whitty"
Search Results
2. A computer vision system for early stage grape yield estimation based on shoot detection
- Author
Mark Whitty, Scarlett Liu, Julie Tang, Steve Cossell, and Gregory Dunn
- Subjects
Engineering, Machine vision, Analytical chemistry, Feature extraction, Data classification, Forestry, Feature selection, Image processing, Horticulture, Computer Science Applications, Yield (wine), Unsupervised learning, Computer vision, Artificial intelligence, Agronomy and Crop Science
- Abstract
A vision system for automated yield estimation and variation mapping is proposed. The proposed method produces an average F1-score of 0.90 over four experimental blocks. The developed shoot detection does not require manual labeling to build a classifier. The developed system requires only low-cost off-the-shelf image collection equipment. For yield estimation, the best E-L stage for imaging shoots is around E-L stage 9.
Counting grapevine shoots early in the growing season is critical for adjusting management practices but is challenging to automate due to a range of environmental factors. This paper proposes a completely automatic system for grapevine yield estimation, comprising robust shoot detection and yield estimation based on shoot counts produced from videos. Experiments were conducted on four vine blocks across two cultivars and trellis systems over two seasons. A novel shoot detection framework is presented, including image processing, feature extraction, unsupervised feature selection and unsupervised learning as a final classification step. A procedure for converting shoot counts from videos to yield estimates is then introduced. The shoot detection framework achieved an accuracy of 86.83% with an F1-score of 0.90 across the four experimental blocks and was shown to be robust in a range of lighting conditions in a commercial vineyard. The absolute predicted yield estimation error of the system, when applied to four blocks over two consecutive years, ranged from 1.18% to 36.02% when the videos were filmed around E-L stage 9. The developed system has an advantage over traditional PCD mapping techniques in that yield variation maps can be obtained earlier in the season, allowing farmers to adjust their management practices for improved outputs. The unsupervised feature selection algorithm combined with unsupervised learning removed the requirement for any prior training or labeling, greatly enhancing the applicability of the overall framework and allowing full automation of shoot mapping on a large scale in vineyards.
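The abstract describes the shoot-count-to-yield conversion only at a high level. As a rough illustration, a minimal sketch of one common scaling approach (shoots per vine × bunches per shoot × mean bunch weight) is given below; the function name and all parameter values are illustrative assumptions, not figures from the paper.

```python
# Minimal sketch: converting per-vine shoot counts into a block yield estimate.
# The scaling factors (bunches per shoot, mean bunch weight) are illustrative
# assumptions, not values reported in the paper.

from statistics import mean

def estimate_block_yield_kg(shoot_counts_per_vine, vines_in_block,
                            bunches_per_shoot=1.6, mean_bunch_weight_kg=0.12):
    """Scale detected shoot counts up to a block-level yield estimate."""
    mean_shoots = mean(shoot_counts_per_vine)           # shoots per vine from video counts
    bunches_per_vine = mean_shoots * bunches_per_shoot  # historical bunch-to-shoot ratio
    yield_per_vine_kg = bunches_per_vine * mean_bunch_weight_kg
    return yield_per_vine_kg * vines_in_block

# Example: 200 vines, shoot counts sampled from detections on video frames
print(estimate_block_yield_kg([14, 17, 15, 16], vines_in_block=200))
```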
- Published
- 2017
- Full Text
- View/download PDF
3. DeepPhenology: Estimation of apple flower phenology distributions based on deep learning
- Author
Xu Wang, Mark Whitty, and Julie Tang
- Subjects
Thinning, Contextual image classification, Phenology, Deep learning, Forestry, Pattern recognition, Horticulture, Object detection, Computer Science Applications, RGB color model, Artificial intelligence, Divergence (statistics), Agronomy and Crop Science, Mathematics
- Abstract
Estimation of phenology distribution in horticultural crops is important because it governs the timing of chemical thinning required to produce good-quality fruit. This paper presents a novel phenology distribution estimation method for apple flowers, named DeepPhenology, based on CNNs using RGB images, which efficiently maps the flower distribution at the image, row and block level. The image classification model VGG-16 was trained directly with relative phenology distributions calculated from manual counts of flowers in the field and the acquired imagery. The proposed method removes the need to label images, which overcomes the difficulty of distinguishing overlapping flower clusters or identifying hidden flower clusters in 2D imagery. DeepPhenology was tested on both daytime and night-time images captured with an RGB camera mounted on a ground vehicle, covering both Gala and Pink Lady varieties in an Australian orchard. An average Kullback-Leibler (KL) divergence of 0.23 over all validation sets and 0.27 over all test sets was achieved. Further evaluation compared the proposed model with YOLOv5 and showed that it outperforms this state-of-the-art object detection model for this task. By combining relative phenology distributions from single images into a row-level or block-level distribution, the method gives farmers a precise, high-level overview of block performance to form the basis for decisions on chemical thinning applications.
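The reported Kullback-Leibler divergence compares a predicted phenology distribution against the distribution derived from manual flower counts. A minimal sketch of that comparison is shown below, assuming made-up stage labels and counts; only the KL computation itself reflects the metric named in the abstract.

```python
# Minimal sketch: scoring a predicted phenology distribution against a manual
# count distribution with Kullback-Leibler divergence. Stage labels and counts
# below are hypothetical illustrations, not data from the paper.

import numpy as np
from scipy.stats import entropy

stages = ["tight cluster", "pink", "bloom", "petal fall"]   # hypothetical phenology stages
manual_counts = np.array([12, 45, 80, 23], dtype=float)     # flowers counted per stage
predicted = np.array([0.09, 0.27, 0.49, 0.15])              # model output (sums to 1)

ground_truth = manual_counts / manual_counts.sum()          # relative distribution
kl = entropy(ground_truth, predicted)                       # KL(ground_truth || predicted)
print(f"KL divergence: {kl:.3f}")
```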
- Published
- 2021
- Full Text
- View/download PDF
4. Side-view apple flower mapping using edge-based fully convolutional networks for variable rate chemical thinning
- Author
Julie Tang, Mark Whitty, and Xu Wang
- Subjects
Pixel, Thinning, Machine vision, Computer science, Deep learning, Image processing and computer vision, Forestry, Pattern recognition, Horticulture, Thresholding, Computer Science Applications, Segmentation, Artificial intelligence, F1 score, Agronomy and Crop Science
- Abstract
Apple trees commonly require the removal of excess flowers by thinning to produce high-quality fruit. Machine vision has recently been applied to detect flower density as the first step in this process. Existing work relying on color thresholding is sensitive to imaging conditions, and the most recently published work using deep learning in this context has proven exceptionally slow to process. This paper presents a pixel-level apple flower segmentation method based on a Fully Convolutional Network (FCN), together with a process for generating a map that can be used by a variable rate chemical sprayer. Despite the challenging conditions of an uncontrolled environment, the apple flower detector achieved a pixel-level F1 score of up to 85.6%, a relatively high accuracy for pixel-level segmentation. The method was tested on both daytime and night-time datasets, validating its ability to work under different conditions. The resulting detections are georeferenced and merged into a density map in the format required by a variable rate chemical sprayer. This flower density mapping system will benefit farmers by visualising the whole crop and extracting useful information to support decision-making for chemical thinning.
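The pixel-level F1 score quoted above is a standard segmentation metric; a minimal sketch of how it can be computed from binary masks follows, using toy arrays rather than real flower masks.

```python
# Minimal sketch: pixel-level F1 score for a binary flower segmentation mask,
# the metric quoted in the abstract. The arrays here are toy data.

import numpy as np

def pixel_f1(pred_mask: np.ndarray, gt_mask: np.ndarray) -> float:
    """F1 = 2TP / (2TP + FP + FN) over flower pixels."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()
    return 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 1.0

pred = np.array([[1, 1, 0], [0, 1, 0]])
gt   = np.array([[1, 0, 0], [0, 1, 1]])
print(f"pixel-level F1: {pixel_f1(pred, gt):.2f}")
```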
- Published
- 2020
- Full Text
- View/download PDF
5. A vision-based robust grape berry counting algorithm for fast calibration-free bunch weight estimation in the field
- Author
Xiangdong Zeng, Mark Whitty, and Scarlett Liu
- Subjects
Vision-based, Forestry, Image processing, Horticulture, Computer Science Applications, Bunches, Weight estimation, Robustness (computer science), Grape berry, Algorithm, Calibration-free, Agronomy and Crop Science, Mathematics
- Abstract
Counting the number of berries per bunch is a key component of many yield estimation processes but is exceptionally tedious for farmers to complete. Recent work on image processing in viticulture has produced methods for berry counting; however, these require some degree of manual intervention or need calibration against manual counts for different bunch architectures. This paper therefore introduces a fast and robust calibration-free berry counting algorithm for winegrapes to aid yield estimation. The algorithm was tested on 529 images collected in the field at multiple vineyards at different maturity stages and achieved an accuracy of approximately 89% per bunch. As it would most likely be used to obtain an average value across a block, the low bias of this method resulted in an average accuracy of 99%, and it was shown to be robust from pea-size to harvest and across both red and green bunches. Taking only 0.1 to 1 s per image to process and requiring only a smartphone and a small backing board for capture, the algorithm can readily be applied to images captured in the field by farmers. This allowed bunch weights to be estimated to within 92% accuracy and assisted larger-scale yield estimation processes to achieve accuracies of between 3% and 16%. The robustness of the method lays the foundation for fast fruit-set ratio determination and more detailed bunch architecture studies in vivo on a large scale.
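The abstract does not spell out how berry counts translate into bunch weights; a minimal sketch under the common assumption of a fixed mean berry weight is given below, also showing how low-bias per-bunch errors average out at block level. The berry weight and all counts are hypothetical, not values from the paper.

```python
# Minimal sketch: turning per-bunch berry counts into bunch weight estimates
# and a block-level average. The mean berry weight and the counts below are
# illustrative assumptions.

from statistics import mean

MEAN_BERRY_WEIGHT_G = 1.4   # assumed average berry weight for the cultivar/stage

def bunch_weight_g(berry_count: int) -> float:
    return berry_count * MEAN_BERRY_WEIGHT_G

detected_counts = [94, 121, 88, 107, 128]   # per-bunch counts from the algorithm
true_counts     = [101, 115, 95, 104, 125]  # hypothetical manual counts

est_block_mean  = mean(bunch_weight_g(c) for c in detected_counts)
true_block_mean = mean(bunch_weight_g(c) for c in true_counts)
print(f"block-level weight error: {abs(est_block_mean - true_block_mean) / true_block_mean:.1%}")
```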
- Published
- 2020
- Full Text
- View/download PDF