Automatic Scoring of Rhizoctonia Crown and Root Rot Affected Sugar Beet Fields from Orthorectified UAV Images Using Machine Learning



Bibliographic Details

Published in: Plant Disease. - 1997. - 108 (2024), No. 3, 04 March, pp. 711-724
Main Author: Ispizua Yamati, Facundo Ramón (Author)
Other Authors: Günder, Maurice, Barreto, Abel, Bömer, Jonas, Laufer, Daniel, Bauckhage, Christian, Mahlein, Anne-Katrin
Format: Online Article
Language: English
Published: 2024
Access to parent work: Plant Disease
Subjects: Journal Article, AutoML, CNN, multiclass classification, resistance breeding, time series, unmanned aerial vehicle, Sugars

Description
Abstract: Rhizoctonia crown and root rot (RCRR), caused by Rhizoctonia solani, can cause severe yield and quality losses in sugar beet. The most common strategy to control the disease is the development of resistant varieties. In the breeding process, field experiments with artificial inoculation are carried out to evaluate the performance of genotypes and varieties. The phenotyping process in breeding trials requires constant monitoring and scoring by skilled experts. This work is time-demanding and shows bias and heterogeneity depending on the experience and capacity of each individual rater. Optical sensors and artificial intelligence have demonstrated great potential to achieve higher accuracy than human raters and to standardize phenotyping applications. A workflow combining red-green-blue and multispectral imagery acquired with an unmanned aerial vehicle (UAV), together with machine learning techniques, was applied to score diseased plants and plots affected by RCRR. Georeferenced annotation of UAV-orthorectified images was carried out. With the annotated images, five convolutional neural networks were trained to score individual plants. The training was carried out with different image analysis strategies and data augmentation. A custom convolutional neural network trained from scratch, together with a pretrained MobileNet, showed the best precision in scoring RCRR (0.73 to 0.85). The per-plot average of spectral information was used to score the plots, and the benefit of adding the information obtained from the scores of individual plants was compared. For this purpose, machine learning models were trained together with data management strategies, and the best-performing model was chosen. A combined pipeline of random forest and k-nearest neighbors showed the best weighted precision (0.67). This research provides a reliable workflow for detecting and scoring RCRR based on aerial imagery. RCRR is often distributed heterogeneously in trial plots; therefore, considering the information from individual plants of the plots showed a significant improvement in UAV-based automated monitoring routines.
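The record contains no code; as an illustration of the kind of plot-level pipeline the abstract describes (random forest combined with k-nearest neighbors over per-plot averaged spectral features, evaluated by weighted precision), a minimal scikit-learn sketch is shown below. All data, feature dimensions, and severity classes here are synthetic assumptions, not the study's actual dataset or model configuration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in: per-plot averaged spectral bands plus
# aggregated individual-plant score features (8 features, 300 plots).
X = rng.normal(size=(300, 8))
y = rng.integers(0, 4, size=300)  # hypothetical RCRR severity classes 0-3
X[:, 0] += y  # make classes weakly separable so the model has signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# One way to combine the two learners: a soft-voting ensemble
# of a random forest and a k-nearest-neighbors classifier.
clf = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    voting="soft",
)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)

# Weighted precision: per-class precision averaged by class support,
# the metric the study reports (0.67) for plot-level scoring.
wp = precision_score(y_te, pred, average="weighted", zero_division=0)
print(f"weighted precision: {wp:.2f}")
```

Soft voting averages the two models' class-probability estimates; the actual combination strategy used in the article may differ.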
Description: Date Completed 01.04.2024
Date Revised 01.04.2024
published: Print-Electronic
Citation Status MEDLINE
ISSN:0191-2917
DOI:10.1094/PDIS-04-23-0779-RE