Aerial hyperspectral imagery and deep neural networks for high-throughput yield phenotyping in wheat

Research output: Contribution to journal › Article › peer-review

39 Scopus citations

Abstract

Crop production needs to increase in a sustainable manner to meet the growing global demand for food. To identify crop varieties with high yield potential, plant scientists and breeders evaluate the performance of hundreds of lines in multiple locations over several years. To facilitate the process of selecting advanced varieties, an automated framework was developed in this study. A hyperspectral camera was mounted on an unmanned aerial vehicle to collect aerial imagery with high spatial and spectral resolution in a fast, cost-effective manner. Aerial images were captured in two consecutive growing seasons from three experimental yield fields composed of hundreds of experimental wheat lines. The grain of more than a thousand wheat plots was harvested by a combine, weighed, and recorded as the ground truth data. To investigate yield variation at the sub-plot scale and leverage the high spatial resolution, plots were divided into sub-plots using image processing techniques integrated with domain knowledge. After extracting features from each sub-plot, deep neural networks were trained for yield estimation. The coefficient of determination for predicting the yield was 0.79 and 0.41, with normalized root mean square error of 0.24 and 0.14 g, at the sub-plot and plot scales, respectively. The results revealed that the proposed framework, as a valuable decision support tool, can facilitate the process of high-throughput yield phenotyping by offering the possibility of remote visual inspection of the plots as well as optimizing plot size to investigate more lines in a dedicated field each year.
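
The abstract describes a pipeline of sub-plot segmentation, feature extraction, and neural-network regression evaluated with a coefficient of determination and normalized RMSE. The sketch below is a minimal, hypothetical illustration of the regression-and-evaluation step only, assuming synthetic per-sub-plot reflectance features; the band count, layer sizes, training settings, and range-based NRMSE normalization are assumptions for demonstration and do not reproduce the authors' actual architecture or data.

```python
# Minimal sketch (not the authors' model): a small feed-forward regressor
# mapping per-sub-plot hyperspectral features (e.g., mean reflectance per
# band) to grain yield, evaluated with R^2 and a normalized RMSE.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from tensorflow import keras

n_subplots, n_bands = 1000, 272           # placeholder dimensions (assumed)
X = np.random.rand(n_subplots, n_bands)   # synthetic reflectance features
y = np.random.rand(n_subplots)            # synthetic yield "ground truth"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = keras.Sequential([
    keras.layers.Input(shape=(n_bands,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),                 # single yield output (regression)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_tr, y_tr, epochs=50, batch_size=32, verbose=0)

y_hat = model.predict(X_te).ravel()
r2 = r2_score(y_te, y_hat)
# NRMSE normalized by the observed yield range (one common convention;
# the paper's exact normalization is not specified in the abstract).
nrmse = np.sqrt(np.mean((y_te - y_hat) ** 2)) / (y_te.max() - y_te.min())
print(f"R^2 = {r2:.2f}, NRMSE = {nrmse:.2f}")
```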

Original language: English (US)
Article number: 105299
Journal: Computers and Electronics in Agriculture
Volume: 172
State: Published - May 2020

Bibliographical note

Funding Information:
The authors would like to gratefully acknowledge the funding from the Minnesota’s Discovery, Research, and InnoVation Economy (MnDRIVE) program through the research area of Robotics, Sensors, and Advanced Manufacturing. We thank Ms. Susan K. Reynolds for her valuable support in managing the fields and collecting the ground truth data, and Mrs. Parisa Kafash for her assistance in preparing the figures. We would also like to acknowledge the graduate student fellowships provided by MnDRIVE Global Food Ventures and the department of Bioproducts and Biosystems Engineering.

Funding Information:
Yang supported this study through the University of Minnesota MnDRIVE startup fund, administered the project, and provided supervision. Anderson provided the field trial and yield data and provided supervision. Moghimi analyzed the data, developed the methodology, and wrote the original draft. All authors contributed to the conceptualization, data curation, investigation, methodology, and editing.

Publisher Copyright:
© 2020 Elsevier B.V.

Keywords

  • Deep learning
  • Endmember
  • Hyperspectral imaging
  • Neural network
  • Phenotyping
  • UAV
  • Unmixing
  • Yield
