Vision-based preharvest yield mapping for apple orchards

Pravakar Roy, Abhijeet Kislay, Patrick A. Plonski, James Luby, Volkan Isler

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

We present an end-to-end computer vision system for yield mapping in apple orchards. Our proposed system is platform independent and does not require any specific lighting conditions. Our main technical contributions are (1) a semi-supervised clustering algorithm that utilizes colors to identify apples and (2) an unsupervised clustering method that utilizes spatial properties to estimate fruit counts from apple clusters having arbitrarily complex geometry. Additionally, we utilize camera motion to merge the counts across multiple views. We verified the performance of our algorithms by conducting multiple field trials. Results indicate that the detection method achieves an F1-measure of 0.95–0.97 across multiple color varieties and lighting conditions. The counting method achieves an accuracy of 89–98%. Additionally, we report merged fruit counts from both sides of the tree rows. Our yield estimation method achieves an overall accuracy of 91.98–94.81% across different datasets.
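The paper's detection step is a semi-supervised color clustering algorithm. The exact method is not reproduced on this page; the sketch below only illustrates the general idea under assumed details: a handful of pixels labeled "apple" and "background" seed two color-space centroids, which are then refined with k-means-style updates while the labeled seeds stay anchored to their clusters. The function name, the two-cluster setup, and the RGB color space are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def seeded_color_clustering(pixels, apple_seeds, background_seeds, n_iters=10):
    """Illustrative semi-supervised two-cluster color segmentation.

    pixels:           (N, 3) array of pixel colors to classify
    apple_seeds:      (A, 3) array of pixels labeled as apple
    background_seeds: (B, 3) array of pixels labeled as background
    Returns a boolean mask, True where a pixel joins the apple cluster.
    """
    # Initialize one centroid per class from the labeled seed pixels.
    centroids = np.array([apple_seeds.mean(axis=0),
                          background_seeds.mean(axis=0)])
    seeds = [apple_seeds, background_seeds]
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(n_iters):
        # Assign every pixel to its nearest centroid in color space.
        dists = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Re-estimate centroids; the labeled seeds always stay in their
        # own cluster, which keeps cluster 0 tied to "apple".
        for k in range(2):
            members = np.vstack([pixels[labels == k], seeds[k]])
            centroids[k] = members.mean(axis=0)
    return labels == 0
```

Anchoring the seed pixels to their clusters during the centroid updates is what makes this semi-supervised: unlike plain k-means, the "apple" cluster cannot drift away from the labeled examples.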

Original language: English (US)
Article number: 104897
Journal: Computers and Electronics in Agriculture
Volume: 164
DOIs
State: Published - Sep 2019

Bibliographical note

Funding Information:
The authors thank Joshua Anderson and Professors Emily Hoover and Cindy Tong from the Department of Horticultural Science, University of Minnesota, for their expertise and help with the experiments. This work is supported in part by NSF grant #1317788, USDA NIFA MIN-98-G02, and the MnDrive initiative.

Keywords

  • Apple counting
  • Apple detection
  • Clustering
  • Machine vision
  • Semi-supervised image segmentation
  • Yield estimation
