Abstract
Deep learning and camera-based monitoring play a pivotal role in effective farm management. However, reliable data availability remains essential for successful deep-learning applications, and cameras are the primary data sources for computer vision models. For effective farm management, a multi-camera setup is often used, and the input dataset for deep learning is prepared by combining the recordings from cameras installed at multiple positions around the farm. However, irregular frame rates across the cameras in such a setup can cause issues such as temporal drift, so the data from the different cameras must be synchronized before it is fed to a deep learning model. In this work, we present a method for frame rate synchronization that leverages the timestamp information embedded in the video and achieves high accuracy. Our method addresses a critical use case in which synchronization is performed after the videos have been recorded. Its effectiveness is demonstrated in real-world animal behavior detection scenarios, where precise synchronization is vital. Through this work, we contribute to robust deep-learning models for farm management and livestock analysis by addressing frame rate irregularities.
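To make the idea of timestamp-based, post-recording alignment concrete, the sketch below shows one possible realization, assuming OpenCV (cv2) and pytesseract are available and that each camera burns a wall-clock timestamp into its frames. The region of interest, timestamp format, and matching tolerance used here are hypothetical illustration choices, not the exact pipeline described in the paper.

```python
# Illustrative sketch (not the authors' published pipeline): align two
# recordings by OCR-reading the burned-in, on-frame timestamps and pairing
# each reference frame with the other camera's temporally closest frame.
from datetime import datetime

import cv2
import pytesseract


def read_frame_timestamps(video_path, roi, ts_format="%Y-%m-%d %H:%M:%S.%f"):
    """Return a list of (frame_index, datetime) pairs OCR-read from `roi`."""
    x, y, w, h = roi  # region of the frame that contains the burned-in clock
    timestamps = []
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        crop = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        text = pytesseract.image_to_string(crop, config="--psm 7").strip()
        try:
            timestamps.append((idx, datetime.strptime(text, ts_format)))
        except ValueError:
            pass  # OCR failed on this frame; skip it
        idx += 1
    cap.release()
    return timestamps


def match_by_nearest_timestamp(ref_ts, other_ts, tolerance_s=0.05):
    """Pair each reference frame with the closest-in-time frame of the other camera."""
    pairs, j = [], 0
    for ref_idx, t_ref in ref_ts:
        # advance j while the next candidate is at least as close to t_ref
        while (j + 1 < len(other_ts)
               and abs((other_ts[j + 1][1] - t_ref).total_seconds())
               <= abs((other_ts[j][1] - t_ref).total_seconds())):
            j += 1
        if abs((other_ts[j][1] - t_ref).total_seconds()) <= tolerance_s:
            pairs.append((ref_idx, other_ts[j][0]))
    return pairs
```

In a sketch like this, reference frames whose nearest counterpart falls outside the tolerance can be dropped or duplicated so that both streams advance in lock-step, which is what prevents slowly drifting frame rates from accumulating into a misaligned dataset.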
Acknowledgements
This work is supported by the Dutch NWO project IMAGEN [P18-19] of the research program Perspectief. Topigs Norsvin, the Netherlands, provided data from the Volmer facility in Germany. The authors would like to thank the EngD Software Technology trainees from the Eindhoven University of Technology for assisting in implementing our method as data pipelines on the IMAGEN Data Analytics Platform.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Dovdon, E., Agarwal, M., Dajsuren, Y., de Vlieg, J. (2024). Irregular Frame Rate Synchronization of Multi-camera Videos for Data-Driven Animal Behavior Detection. In: Daimi, K., Al Sadoon, A. (eds) Proceedings of the Second International Conference on Advances in Computing Research (ACR’24). ACR 2024. Lecture Notes in Networks and Systems, vol 956. Springer, Cham. https://doi.org/10.1007/978-3-031-56950-0_9
DOI: https://doi.org/10.1007/978-3-031-56950-0_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-56949-4
Online ISBN: 978-3-031-56950-0
eBook Packages: Intelligent Technologies and Robotics (R0)