YOLOv9-Based Object Detection Model for Pig Feces on Pig Skin: Improving Biosecurity in Automated Cleaning Systems
DOI: https://doi.org/10.52436/1.jutif.2025.6.2.4240

Keywords: automated cleaning, biosecurity, deep learning, object detection, pig feces detection, YOLOv9

Abstract
This study developed an object detection model using YOLOv9 to identify pig feces on pig skin, addressing challenges in automating pig cleaning systems and reducing the spread of African Swine Fever (ASF). The aim was to enhance biosecurity by minimizing human-pig contact through automation. A specialized dataset of 5,404 images was collected from Nyoman Farm in Bali, Indonesia, under various lighting and cleanliness conditions. The images were annotated into two classes, 'feces' and 'pig', following strict criteria to ensure clarity and distinction. YOLOv9 was chosen because it builds on YOLOv8 with improved object detection capabilities. The model was trained and optimized iteratively to achieve the best performance, reaching a mAP@0.5 of 70.5%, a precision of 70.6%, and a recall of 72.1%. However, the model struggled to distinguish pig skin patterns from feces and to manage false positives caused by similar-looking objects in the barn environment. Despite these challenges, integrating the model into an automated cleaning system can reduce human-pig contact by up to 76%, which is expected to significantly lower the risk of ASF transmission. This study contributes to automated farming technology by demonstrating how well YOLOv9 detects complex objects in agricultural settings and by providing a practical way to enhance biosecurity in pig farming while improving productivity.
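For readers who want to reproduce the workflow described above, the sketch below shows how a YOLOv9 detector for the two annotated classes could be trained, validated, and applied to new barn images. It is a minimal sketch, assuming the Ultralytics YOLO API, a hypothetical dataset configuration file (pig_feces.yaml), and illustrative hyperparameters and file paths; none of these are the authors' exact settings.

```python
# Minimal sketch of the training/evaluation pipeline described in the abstract.
# Assumes the Ultralytics YOLO API and a hypothetical dataset config
# (pig_feces.yaml) listing train/val image paths and the classes 'feces' and 'pig'.
from ultralytics import YOLO

# Start from a pretrained YOLOv9 checkpoint; yolov9c.pt is an assumed variant,
# as the paper does not specify which one was used.
model = YOLO("yolov9c.pt")

# Train on the annotated barn images. Hyperparameters are illustrative only.
model.train(
    data="pig_feces.yaml",  # hypothetical dataset config
    epochs=100,
    imgsz=640,
    batch=16,
)

# Validate and report the metrics cited in the abstract:
# mAP@0.5, mean precision, and mean recall across both classes.
metrics = model.val()
print(f"mAP@0.5:   {metrics.box.map50:.3f}")
print(f"precision: {metrics.box.mp:.3f}")
print(f"recall:    {metrics.box.mr:.3f}")

# Run inference on a new barn image to flag feces regions for the
# automated cleaning system (the image path is hypothetical).
results = model.predict(source="barn_images/pen_03.jpg", conf=0.25)
for r in results:
    for box in r.boxes:
        print(model.names[int(box.cls)], float(box.conf), box.xyxy.tolist())
```

In an automated cleaning setup, the detected 'feces' boxes would be passed to the cleaning controller as target coordinates, while the confidence threshold (conf=0.25 here) could be tuned to trade off the false positives noted in the study against missed detections.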
License
Copyright (c) 2025 I Gede Irvan Pramanta Andika

This work is licensed under a Creative Commons Attribution 4.0 International License.