MACHINE LEARNING ALGORITHM FOR RAPID OBJECT DETECTION BASED ON COLOR FEATURES
DOI: https://doi.org/10.11113/aej.v14.21633
Keywords: Extreme Learning Machine, Object Detection, Pastoral Landscapes, Color Features
Abstract
The identification of objects is of central importance in a wide range of computer vision applications, such as surveillance systems, autonomous cars, and environmental monitoring. Accurate and efficient object recognition methods are crucial in pastoral environments, characterized by the prominent presence of cattle and other objects, to support effective analysis and decision-making. The present study introduces a methodology for efficient identification of objects in pastoral landscapes through the use of a Colour Feature Extreme Learning Machine (CF-ELM). The CF-ELM integrates color features with the extreme learning machine (ELM) algorithm to attain improved object detection accuracy while preserving computational efficiency. The experimental findings provide empirical evidence of the efficacy and efficiency of the proposed approach in detecting objects within pastoral landscapes. In addition to the CF-ELM, an algorithm for desktop-based classification of objects within a pastoral environment is provided, with processing times ranging from 0.05 s to 0.17 s per image, evaluated in each color space. The algorithm is intended for use in challenging and variable terrain, making it suitable for agricultural or pastoral settings.
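To illustrate the general idea behind a color-feature ELM (this is a minimal sketch under stated assumptions, not the authors' CF-ELM implementation), the listing below trains a single-hidden-layer extreme learning machine on per-channel color-histogram features using Python with NumPy; all names (colour_histogram, ELM, and so on) are hypothetical.

# Minimal sketch (assumption, not the paper's CF-ELM code): an extreme learning
# machine with random hidden weights, trained on color-histogram features.
import numpy as np

def colour_histogram(image, bins=16):
    # Concatenate per-channel histograms of an HxWx3 uint8 image into one feature vector.
    feats = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0] for c in range(3)]
    feats = np.concatenate(feats).astype(float)
    return feats / (feats.sum() + 1e-9)  # normalise so image size does not matter

class ELM:
    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        # Random, untrained input weights and biases: the defining trait of an ELM.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)  # hidden-layer activations
        # Output weights solved in closed form via the Moore-Penrose pseudo-inverse,
        # which is what makes ELM training fast compared with iterative backpropagation.
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Hypothetical usage: X is a matrix of colour-histogram features (one row per image
# window), y a one-hot label matrix; the predicted class is the argmax of each output row.

The closed-form output-weight solution is what gives ELM-based detectors their speed advantage on desktop hardware, which is consistent with the per-image timings reported in the abstract.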