USDA ARS, Appalachian Fruit Research Laboratory, Kearneysville, West Virginia — Publication #337832

Title: Apple flower detection using deep convolutional networks

Authors: Philipe Dias (Marquette University); Amy Tabb; Henry Medeiros (Marquette University)

Submitted to: Computers in Industry
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 3/15/2018
Publication Date: 3/27/2018
Citation: Dias, P., Tabb, A., Medeiros, H. 2018. Apple flower detection using deep convolutional networks. Computers in Industry. 99:17-28.

Interpretive Summary: Apple trees produce many more blossoms than are needed for optimal fruit production, so growers must estimate the number of blossoms in order to plan adequate removal (called thinning). Currently, the number of blossoms is typically estimated by manual inspection. This paper presents an automatic method for estimating the number of blossoms in an image using a computer program based on convolutional neural networks. The expected impact of this work is increased accuracy and speed of bloom estimation in orchard settings.

Technical Abstract: In order to optimize fruit production, a portion of the flowers and fruitlets of apple trees must be removed early in the growing season. The proportion to be removed is determined by the bloom intensity, i.e., the number of flowers present in the orchard. Several automated computer vision systems have been proposed to estimate bloom intensity, but their overall performance is still far from satisfactory, even in relatively controlled environments. With the goal of devising a technique for flower identification that is robust to clutter and to changes in illumination, this paper presents a method in which a pre-trained convolutional neural network is fine-tuned to become especially sensitive to flowers. Experimental results on a challenging dataset demonstrate that our method significantly outperforms two approaches that represent the state of the art in flower detection, with recall and precision rates higher than 90%. Moreover, a performance assessment on three additional datasets previously unseen by the network, which consist of different flower species and were acquired under different conditions, reveals that the proposed method substantially surpasses baseline approaches in terms of generalization capability.