Automatic resource allocation in a distributed camera network

  • Original Paper
  • Published in: Machine Vision and Applications

Abstract

In this paper, we present a hierarchical smart-resource coordination and reconfiguration framework for distributed systems. We view the coordination problem as one of context-aware resource reconfiguration. The fundamental unit in this hierarchy is a Fault Containment Unit (FCU), which provides run-time fault tolerance by deciding on the best alternative course of action when a failure occurs. FCUs are composed hierarchically and are responsible for dynamically reconfiguring failing FCUs at lower levels. When such a reconfiguration is not possible, an FCU propagates the failure upward for resolution. We evaluate the effectiveness of our framework in a people-tracking application using a network of cameras. The task for our multi-camera network is to allocate pairs of cameras that localize a subject optimally given the current run-time context. The system automatically derives policies for switching between camera pairs that enable robust tracking while remaining attentive to certain performance measures. Our approach is unique in that our model of the scene dynamics and the camera network configuration steers the policies that provide robust tracking.
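The reconfigure-or-escalate behavior of an FCU described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: all class, method, and camera names are hypothetical, and the "alternative courses of action" are reduced to an ordered list of fallback camera pairs.

```python
# Hypothetical sketch of the Fault Containment Unit (FCU) hierarchy:
# each FCU tries a local alternative configuration when a failure
# occurs, and escalates to its parent FCU when no alternative remains.
# Names and structure are illustrative, not taken from the paper.

class FCU:
    def __init__(self, name, alternatives=None, parent=None):
        self.name = name
        # Ordered fallback configurations this FCU can switch to.
        self.alternatives = list(alternatives or [])
        self.parent = parent

    def handle_failure(self, failed_config):
        """Try to reconfigure locally; otherwise propagate upward."""
        while self.alternatives:
            candidate = self.alternatives.pop(0)
            if candidate != failed_config:
                # Local reconfiguration succeeded at this level.
                return (self.name, candidate)
        if self.parent is not None:
            # No local alternative: propagate the failure upward.
            return self.parent.handle_failure(failed_config)
        return None  # no resolution anywhere in the hierarchy


# Example: a camera-pair selector whose pair (cam1, cam2) fails; it
# falls back locally, and escalates only once its pairs are exhausted.
root = FCU("tracking-task", alternatives=[("cam3", "cam4")])
pair_selector = FCU("pair-selector",
                    alternatives=[("cam1", "cam3")], parent=root)

print(pair_selector.handle_failure(("cam1", "cam2")))
# → ('pair-selector', ('cam1', 'cam3'))
```

In this sketch the policy for switching between camera pairs is just a static priority list; in the paper, those policies are derived automatically from the run-time context.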



Author information

Correspondence to Zhigang Zhu.

Additional information

In fond memory of Edward M. Riseman.

This work is supported by DARPA/ITO Mobile Autonomous Robots S/W (MARS) (Contract Number DOD DABT63-99-1-004) and Software for Distributed Robotics (SDR) (Contract Number DOD DABT63-99-1-0022), by ARO (Grant No. W911NF-05-1-0011) and the NSF CRI Program (Grants No. CNS-0424539 and No. CNS-0551598).


Cite this article

Karuppiah, D.R., Grupen, R.A., Zhu, Z. et al. Automatic resource allocation in a distributed camera network. Machine Vision and Applications 21, 517–528 (2010). https://doi.org/10.1007/s00138-008-0182-7

