Detection and Classification of Agave angustifolia Haw Using Deep Learning Models

Bibliographic Details
Main Authors: Idarh Matadamas, Erik Zamora, Teodulfo Aquino-Bolaños
Format: Article
Language: English
Published: MDPI AG 2024-12-01
Series: Agriculture
Online Access: https://www.mdpi.com/2077-0472/14/12/2199
Description
Summary: In Oaxaca, Mexico, there are more than 30 species of the Agave genus, and their cultivation is of great economic and social importance. The incidence of pests, diseases, and environmental stress causes significant losses to the crop, so identifying damage with non-invasive tools based on visual information is important for reducing those losses. The objective of this study was to evaluate and compare five deep learning models for the detection and classification of Agave angustifolia plants in digital images: YOLOv7, YOLOv7-tiny, and YOLOv8, plus two from the Detectron2 library, Faster R-CNN and RetinaNet. In the town of Santiago Matatlán, Oaxaca, 333 images were taken in an open-air plantation, and 1317 plants were labeled into five classes: sick, yellow, healthy, small, and spotted. The models were trained on a 70% random partition of the data, validated on 10%, and tested on the remaining 20%. On the test set, YOLOv7 performed best with an mAP of 0.616, outperforming YOLOv7-tiny and YOLOv8 (both 0.606), demonstrating that detecting and classifying Agave angustifolia plants under plantation conditions from digital images is feasible.
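The record does not include the authors' code; as a rough illustration of the workflow the summary describes (a random 70/10/20 train/validation/test split, training a detector, reporting mAP on the held-out test set), here is a minimal Python sketch using the Ultralytics YOLOv8 API, one of the five models compared. The dataset config name (agave.yaml), the hyperparameters, and the split helper are illustrative assumptions, not the authors' actual setup.

```python
import random
from pathlib import Path

from ultralytics import YOLO  # pip install ultralytics


def split_dataset(image_dir: str, seed: int = 0):
    """Randomly partition images 70/10/20 into train/val/test,
    mirroring the split described in the article."""
    images = sorted(Path(image_dir).glob("*.jpg"))
    random.Random(seed).shuffle(images)
    n = len(images)
    n_train, n_val = int(0.7 * n), int(0.1 * n)
    return (images[:n_train],
            images[n_train:n_train + n_val],
            images[n_train + n_val:])


# Five classes from the article: sick, yellow, healthy, small, spotted.
# "agave.yaml" is a hypothetical dataset config pointing at the splits above.
model = YOLO("yolov8n.pt")                       # pretrained YOLOv8 weights
model.train(data="agave.yaml", epochs=100, imgsz=640)
metrics = model.val(split="test")                # evaluate on the held-out 20%
print(metrics.box.map50)                         # mAP@0.5 over the five classes
```

Here mAP is the mean over the five classes of each class's average precision, so a single score summarizes detection quality across all damage categories.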
ISSN: 2077-0472