Abstract
Multi-camera systems are frequently used in applications such as panorama video creation, free-viewpoint rendering, and 3D reconstruction. A critical aspect of visual quality in these systems is that the cameras are closely synchronized. In our research, we require high-definition panorama videos generated in real time from several cameras operating in parallel. This is an essential part of our sports analytics system, Bagadus, which has several synchronization requirements. The system is currently in use for soccer games at Alfheim stadium for Tromsø IL and at Ullevaal stadium for the Norwegian national soccer team. Each Bagadus installation combines the video from five 2K cameras into a single 50 fps cylindrical panorama video. Due to proper camera synchronization, the produced panoramas exhibit neither ghosting effects nor other visual inconsistencies at the seams. Our panorama videos are designed to support several members of the trainer team at the same time: using our system, they can interactively and independently pan, tilt, and zoom over the entire field, from an overview shot down to close-ups of individual players at arbitrary locations. To create such panoramas, each camera covers one part of the field with small overlapping regions, and the individual frames are transformed and stitched together into a single view. We faced two main synchronization challenges in the panorama generation process. First, to stitch frames together without visual artifacts and inconsistencies due to motion, the camera shutters had to be synchronized with sub-millisecond accuracy. Second, to circumvent the need for software readjustment of color and brightness around the seams between cameras, the exposure settings had to be synchronized. This chapter describes the synchronization mechanisms that were designed, implemented, evaluated, and integrated into the Bagadus system.
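The second challenge above, synchronized exposure, can be illustrated with a minimal sketch. This is not the chapter's implementation; it assumes a master/slave scheme in which one camera meters the scene and the resulting parameters are reused by all cameras, so overlapping regions match in brightness. The names `Exposure` and `meter`, the 118 target luma, and the 20 ms shutter cap (one 50 fps frame period) are illustrative assumptions:

```python
# Sketch (assumed master/slave scheme, not the chapter's code): one
# master camera meters the scene; the resulting exposure parameters
# are then applied unchanged to every camera in the array.

from dataclasses import dataclass


@dataclass
class Exposure:
    shutter_us: int   # exposure time in microseconds
    gain_db: float    # analog gain


def meter(mean_luma: float, current: Exposure,
          target_luma: float = 118.0) -> Exposure:
    """Proportional exposure controller on the master camera.

    `mean_luma` is the average 8-bit luma of the master's last frame;
    the shutter is scaled toward the target and clamped to the 20 ms
    frame budget of a 50 fps pipeline (gain handling omitted).
    """
    ratio = target_luma / max(mean_luma, 1.0)
    shutter = min(int(current.shutter_us * ratio), 20_000)
    return Exposure(shutter_us=shutter, gain_db=current.gain_db)


# A frame that is half as bright as the target doubles the shutter time:
e = meter(59.0, Exposure(shutter_us=10_000, gain_db=0.0))
print(e)  # -> Exposure(shutter_us=20000, gain_db=0.0)
```

Because every camera receives identical settings, seams need no per-camera color or brightness correction in software, at the cost of some cameras being slightly over- or under-exposed for their own part of the field.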
Notes
- 1.
This is slightly slower than Usain Bolt's 2009 world record for the 100-m sprint of 9.58 s [23]. Bolt's movement between frames would be approximately 42 cm. Ronny Heberson's free kick [24] was clocked at a whopping 221 km/h, which would result in a difference of 2.5 m between frames from different cameras.
- 2.
The Bagadus trigger box design and code are available here: https://bitbucket.org/mpg_code/micro-trigger-box.
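The figures in Note 1 can be reproduced with a short calculation. The quoted numbers (42 cm, 2.5 m) correspond to a 40 ms inter-frame interval, i.e., a worst-case offset of one full frame between unsynchronized 25 fps cameras; that interval is an inference from the numbers, not stated in the note:

```python
# Reproducing the displacement figures from Note 1, assuming the
# worst case: two unsynchronized cameras offset by one full 40 ms
# frame interval.

FRAME_INTERVAL_S = 0.040                  # 40 ms between frames

bolt_speed_ms = 100 / 9.58                # 100 m in 9.58 s [23]
bolt_per_frame = bolt_speed_ms * FRAME_INTERVAL_S

ball_speed_ms = 221 / 3.6                 # 221 km/h free kick [24]
ball_per_frame = ball_speed_ms * FRAME_INTERVAL_S

print(f"sprinter: {bolt_per_frame:.2f} m/frame")   # -> 0.42 m/frame
print(f"ball:     {ball_per_frame:.2f} m/frame")   # -> 2.46 m/frame
```

With sub-millisecond shutter synchronization, the same speeds yield displacements of about 1 cm for the sprinter and 6 cm for the ball, which is why ghosting disappears from the seams.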
References
Halvorsen, P., Sægrov, S., Mortensen, A., Kristensen, D.K., Eichhorn, A., Stenhaug, M., Dahl, S., Stensland, H.K., Gaddam, V.R., Griwodz, C., Johansen, D.: Bagadus: an integrated system for arena sports analytics: a soccer case study. In: Proceedings of ACM MMSys, pp. 48–59 (2013)
Sægrov, S., Eichhorn, A., Emerslund, J., Stensland, H.K., Griwodz, C., Johansen, D., Halvorsen, P.: Bagadus: an integrated system for soccer analysis (demo). In: Proceedings of ICDSC, pp. 1–2 (2012)
Stensland, H.K., Gaddam, V.R., Tennøe, M., Helgedagsrud, E., Næss, M., Alstad, H.K., Mortensen, A., Langseth, R., Ljødal, S., Landsverk, Ø., Griwodz, C., Halvorsen, P., Stenhaug, M., Johansen, D.: Bagadus: an integrated real-time system for soccer analytics. ACM TOMCCAP 10(1s) (2014)
Mortensen, A., Gaddam, V.R., Stensland, H.K., Griwodz, C., Johansen, D., Halvorsen, P.: Automatic event extraction and video summaries from soccer games. In: Proceedings of ACM MMSys, pp. 176–179 (2014)
ChyronHego: ZXY Sport Tracking. http://www.zxy.no/
Mills, D., Martin, J., Burbank, J., Kasch, W.: Network time protocol version 4: protocol and algorithms specification. RFC 5905 (Proposed Standard) (2010)
Johansen, D., Stenhaug, M., Hansen, R.B.A., Christensen, A., Høgmo, P.M.: Muithu: smaller footprint, potentially larger imprint. In: Proceedings of IEEE ICDIM, pp. 205–214 (2012)
Basler acA2000-50gc. http://www.baslerweb.com/products/ace.html?model=173
Azure-0814M5M. http://www.azurephotonicsus.com/products/azure-0814M5M.html
Stensland, H.K., Gaddam, V.R., Tennøe, M., Helgedagsrud, E., Næss, M., Alstad, H.K., Griwodz, C., Halvorsen, P., Johansen, D.: Processing panorama video in real-time. Int. J. Semant. Comput. 8(2) (2014)
Tennøe, M., Helgedagsrud, E., Næss, M., Alstad, H.K., Stensland, H.K., Gaddam, V.R., Johansen, D., Griwodz, C., Halvorsen, P.: Efficient implementation and processing of a real-time panorama video pipeline. In: Proceedings of IEEE ISM (2013)
Langseth, R., Gaddam, V.R., Stensland, H.K., Griwodz, C., Halvorsen, P., Johansen, D.: An experimental evaluation of debayering algorithms on GPUs for recording panoramic video in real-time. Int. J. Multimedia Data Eng. Manag. 6(6) (2015)
Kellerer, L., Gaddam, V.R., Langseth, R., Stensland, H.K., Griwodz, C., Johansen, D., Halvorsen, P.: Real-time HDR panorama video. In: Proceedings of ACM MM, pp. 1205–1208 (2014)
Stensland, H.K., Wilhelmsen, M.A., Gaddam, V.R., Mortensen, A., Langseth, R., Griwodz, C., Halvorsen, P.: Using a commodity hardware video encoder for interactive applications. Int. J. Multimedia Data Eng. Manag. 6(3), 17–31 (2015)
Wilhelmsen, M.A., Stensland, H.K., Gaddam, V.R., Mortensen, A., Langseth, R., Griwodz, C., Johansen, D., Halvorsen, P.: Using a commodity hardware video encoder for interactive video streaming. In: Proceedings of IEEE ISM (2014)
VideoLAN: x264. http://www.videolan.org/developers/x264.html
Gaddam, V.R., Langseth, R., Ljødal, S., Gurdjos, P., Charvillat, V., Griwodz, C., Halvorsen, P.: Interactive zoom and panning from live panoramic video. In: Proceedings of ACM NOSSDAV, pp. 19–24 (2014)
Gaddam, V.R., Bao Ngo, H., Langseth, R., Griwodz, C., Johansen, D., Halvorsen, P.: Tiling of panorama video for interactive virtual cameras: overheads and potential bandwidth requirement. In: Proceedings of IEEE PV, pp. 204–209 (2015)
Gaddam, V.R., Riegler, M., Eg, R., Griwodz, C., Halvorsen, P.: Tiling in interactive panoramic video: approaches and evaluation. IEEE Trans. Multimedia 18(9), 1819–1831 (2016)
Niamut, O.A., Thomas, E., D’Acunto, L., Concolato, C., Denoual, F., Lim, S.Y.: MPEG DASH SRD: spatial relationship description. In: Proceedings of MMSys (2016)
Sanchez, Y., Skupin, R., Schierl, T.: Compressed domain video processing for tile-based panoramic streaming using HEVC. In: Proceedings of IEEE ICIP, pp. 2244–2248 (2015)
NTP.org: NTP faq—How does it work? http://www.ntp.org/ntpfaq/NTP-s-algo.htm
Wikipedia: 100 metres. https://en.wikipedia.org/wiki/100_metres
Quora: How fast can a soccer ball be kicked? https://www.quora.com/How-fast-can-a-soccer-ball-be-kicked
Hasler, N., Rosenhahn, B., Thormahlen, T., Wand, M., Gall, J., Seidel, H.P.: Markerless motion capture with unsynchronized moving cameras. In: Proceedings of IEEE CVPR, pp. 224–231 (2009)
Pourcelot, P., Audigié, F., Degueurce, C., Geiger, D., Denoix, J.M.: A method to synchronise cameras using the direct linear transformation technique. J. Biomech. 33(12), 1751–1754 (2000)
Shrestha, P., Barbieri, M., Weda, H., Sekulovski, D.: Synchronization of multiple camera videos using audio-visual features. IEEE Trans. Multimedia 12(1), 79–92 (2010)
Shrestha, P., Weda, H., Barbieri, M., Sekulovski, D.: Synchronization of multiple video recordings based on still camera flashes. In: Proceedings of ACM MM, New York, USA (2006)
Ruttle, J., Manzke, M., Prazak, M., Dahyot, R.: Synchronized real-time multi-sensor motion capture system. In: Proceedings of ACM SIGGRAPH ASIA (2009)
Bradley, D., Atcheson, B., Ihrke, I., Heidrich, W.: Synchronization and rolling shutter compensation for consumer video camera arrays. In: Proceedings of IEEE CVPR, pp. 1–8 (2009)
Duckworth, T., Roberts, D.J.: Camera image synchronisation in multiple camera real-time 3D reconstruction of moving humans. In: Proceedings of DS-RT, pp. 138–144 (2011)
Moore, C., Duckworth, T., Aspin, R., Roberts, D.: Synchronization of images from multiple cameras to reconstruct a moving human. In: Proceedings of IEEE/ACM DS-RT, pp. 53–60 (2010)
Chang, R., Ieng, S., Benosman, R.: Shapes to synchronize camera networks. In: Proceedings of IEEE ICPR, pp. 1–4 (2008)
Sinha, S., Pollefeys, M.: Synchronization and calibration of camera networks from silhouettes. In: Proceedings of ICPR, vol. 1, pp. 116–119 (2004)
Sinha, S.N., Pollefeys, M.: Camera network calibration and synchronization from silhouettes in archived video. Int. J. Comput. Vis. 87(3), 266–283 (2010)
Topçu, O., Ercan, A.Ö., Alatan, A.A.: Recovery of temporal synchronization error through online 3D tracking with two cameras. In: Proceedings of ICDSC, pp. 1–6 (2014)
Haufmann, T.A., Brodtkorb, A.R., Berge, A., Kim, A.: Real-time online camera synchronization for volume carving on GPU. In: Proceedings of AVSS, pp. 288–293 (2013)
Nischt, M., Swaminathan, R.: Self-calibration of asynchronized camera networks. In: Proceedings of ICCV Workshops, pp. 2164–2171 (2009)
Shankar, S., Lasenby, J., Kokaram, A.: Synchronization of user-generated videos through trajectory correspondence and a refinement procedure. In: Proceedings of CVMP, pp. 1–10 (2013)
Shankar, S., Lasenby, J., Kokaram, A.: Warping trajectories for video synchronization. In: Proceedings of ARTEMIS, pp. 41–48 (2013)
Tao, J., Risse, B., Jiang, X., Klette, R.: 3D trajectory estimation of simulated fruit flies. In: Proceedings of IVCNZ (2012)
Velipasalar, S., Wolf, W.H.: Frame-level temporal calibration of video sequences from unsynchronized cameras. Mach. Vis. Appl. 19(5–6), 395–409 (2008)
Whitehead, A., Laganiere, R., Bose, P.: Temporal synchronization of video sequences in theory and in practice. In: Proceedings of IEEE WACV/MOTION, pp. 132–137 (2005)
Kovacs, J.: AN005 Application Note—An Overview of Genlock. http://www.mivs.com/old-site/documents/appnotes/an005.html
Smith, S.L.: Application of high-speed videography in sports analysis. In: Proceedings of SPIE 1757 (1993)
Collins, R.T., Amidi, O., Kanade, T.: An active camera system for acquiring multi-view video. In: Proceedings of ICIP, pp. 520–527 (2002)
Lin, M.Y.: Shutter synchronization circuit for stereoscopic systems (1998). https://www.google.com/patents/US5808588
Gross, M., Lang, S., Strehlke, K., Moere, A.V., Staadt, O., Würmlin, S., Naef, M., Lamboray, E., Spagno, C., Kunz, A., Koller-Meier, E., Svoboda, T., Van Gool, L.: Blue-c: a spatially immersive display and 3D video portal for telepresence. In: Proceedings of ACM SIGGRAPH (2003)
Wilburn, B., Joshi, N., Vaish, V., Levoy, M., Horowitz, M.: High-speed videography using a dense camera array. In: Proceedings of IEEE CVPR, vol. 2, pp. 294–301 (2004)
Meyer, F., Bahr, A., Lochmatter, T., Borrani, F.: Wireless GPS-based phase-locked synchronization system for outdoor environment. J. Biomech. 45(1), 188–190 (2012)
Litos, G., Zabulis, X., Triantafyllidis, G.: Synchronous image acquisition based on network synchronization. In: Proceedings of CVPR Workshops (3DCINE), pp. 167–167
Sousa, R.M., Wäny, M., Santos, P., Dias, M.: Multi-camera synchronization core implemented on USB3 based FPGA platform. In: Proceedings of SPIE 9403 (2015)
Nguyen, H., Nguyen, D., Wang, Z., Kieu, H., Le, M.: Real-time, high-accuracy 3D imaging and shape measurement. Appl. Opt. 54(1), A9 (2015)
Gaddam, V.R., Griwodz, C., Halvorsen, P.: Automatic exposure for panoramic systems in uncontrolled lighting conditions: a football stadium case study. In: Proceedings of SPIE 9012—The Engineering Reality of Virtual Reality (2014)
Hasler, D., Süsstrunk, S.: Mapping colour in image stitching applications. J. Vis. Commun. Image Represent. 15(1), 65–90 (2004)
Tian, G.Y., Gledhill, D., Taylor, D., Clarke, D.: Colour correction for panoramic imaging. In: Proceedings of IV (2002)
Doutre, C., Nasiopoulos, P.: Fast vignetting correction and color matching for panoramic image stitching. In: Proceedings of ICIP, pp. 709–712 (2009)
Xu, W., Mulligan, J.: Performance evaluation of color correction approaches for automatic multi-view image and video stitching. In: Proceedings of IEEE CVPR, pp. 263–270 (2010)
Xiong, Y., Pulli, K.: Color correction for mobile panorama imaging. In: Proceedings of ICIMCS, pp. 219–226 (2009)
Ibrahim, M., Hafiz, R., Khan, M., Cho, Y., Cha, J.: Automatic reference selection for parametric color correction schemes for panoramic video stitching. Adv. Visual Comput. Lect. Notes Comput. Sci. 7431, 492–501 (2012)
Debevec, P.E., Malik, J.: Recovering high dynamic range radiance maps from photographs. In: Proceedings of SIGGRAPH, pp. 369–378 (1997)
Larson, G.W., Rushmeier, H., Piatko, C.: A visibility matching tone reproduction operator for high dynamic range scenes. IEEE Trans. Visual. Comput. Graph. 3(4), 291–306 (1997)
Reinhard, E., Stark, M., Shirley, P., Ferwerda, J.: Photographic tone reproduction for digital images. ACM Trans. Graph. 21(3), 267–276 (2002)
Gonzalez, R.C., Woods, R.E.: Digital Image Processing, 3rd edn. Prentice-Hall, Inc. (2006)
Langseth, R.: Implementation of a distributed real-time video panorama pipeline for creating high quality virtual views. University of Oslo (2014)
Acknowledgements
This work has been performed in the context of the iAD Centre for Research-based Innovation (project number 174867) and is also supported in part by the EONS project (project number 231687), both funded by the Research Council of Norway. Furthermore, numerous students and researchers have worked on Bagadus or post-Bagadus solutions. For the synchronization of cameras, the authors want to acknowledge, in alphabetical order: Alexander Eichhorn, Martin Stensgård, and Simen Sægrov.
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
About this chapter
Cite this chapter
Gaddam, V.R. et al. (2018). Camera Synchronization for Panoramic Videos. In: Montagud, M., Cesar, P., Boronat, F., Jansen, J. (eds) MediaSync. Springer, Cham. https://doi.org/10.1007/978-3-319-65840-7_20
DOI: https://doi.org/10.1007/978-3-319-65840-7_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-65839-1
Online ISBN: 978-3-319-65840-7