A deep learning framework for bone fragment classification in owl pellets using YOLOv12

Bibliographic Details
Main Authors: Nik Fadzly, Lay Wai Kean, Siti Nuramaliati Prijono, Rini Rachmatika, Siti Zulaika, Mohd Nasir, Hasber Salim
Format: Article
Language: English
Published: Nature Portfolio 2025-08-01
Series: Scientific Reports
Subjects:
Online Access: https://doi.org/10.1038/s41598-025-15906-9
Description
Summary: Non-invasive monitoring of small mammal populations is critical for both biodiversity conservation and integrated pest management, particularly in agroecosystems. Barn owl (Tyto alba) pellet analysis has long served as a valuable tool for inferring prey abundance, yet conventional bone classification is labour-intensive and requires specialized expertise. Here, we introduce a deep learning framework that automates the detection and classification of rodent bone fragments from owl pellets using the YOLOv12 object detection architecture. A dataset comprising 978 annotated images, encompassing skull, femur, mandible, and pubis bones, was used to train and validate the model, achieving high detection performance (precision = 0.90, recall = 0.90, mAP@0.5 = 0.984, F1-score = 0.97). The model demonstrated strong generalization across samples from Malaysia and Indonesia. We further developed a Python-based inference script to estimate rodent abundance using skull and paired bone counts. This AI-assisted workflow reduces human error, increases processing throughput, and enables scalable rodent monitoring. By enhancing ecological inference from pellet studies, our approach supports timely biodiversity assessments and pest surveillance strategies across diverse landscapes.
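The abstract describes two computational steps: YOLOv12-based detection of bone classes, and a Python script that converts per-class detection counts into a rodent abundance estimate from skull and paired-bone counts. The authors' released script is not reproduced in this record; the following is a minimal sketch of that idea, assuming an ultralytics-style YOLO interface, a hypothetical weights file "pellet_yolov12.pt", and class names matching the abstract. Abundance is taken here as the maximum over the skull count and half the count of each paired bone, in the spirit of a minimum-number-of-individuals estimate.

# Minimal sketch (not the authors' released script): detect bone fragments in a
# pellet image with a YOLO model, then estimate rodent abundance from the counts.
# Assumptions: an ultralytics-style YOLO API, a hypothetical weights file
# "pellet_yolov12.pt", and class names matching those listed in the abstract.
import math
from collections import Counter

from ultralytics import YOLO  # assumed dependency; YOLOv12 support depends on the installed version

PAIRED_BONES = {"femur", "mandible", "pubis"}  # two per individual
SINGLE_BONES = {"skull"}                       # one per individual


def estimate_abundance(image_path: str, weights: str = "pellet_yolov12.pt") -> int:
    """Return a minimum-number-of-individuals style rodent count for one image."""
    model = YOLO(weights)
    result = model(image_path)[0]  # single image -> single Results object

    # Map each detected box to its class name and tally per-class counts.
    names = result.names
    counts = Counter(names[int(cls)] for cls in result.boxes.cls.tolist())

    # Each skull implies one rodent; each paired bone implies ceil(count / 2) rodents.
    estimates = [counts.get(bone, 0) for bone in SINGLE_BONES]
    estimates += [math.ceil(counts.get(bone, 0) / 2) for bone in PAIRED_BONES]
    return max(estimates, default=0)


if __name__ == "__main__":
    print(estimate_abundance("pellet_sample.jpg"))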
ISSN: 2045-2322