Abstract
In this paper, we present a hierarchical framework for smart resource coordination and reconfiguration in distributed systems. We view coordination as a problem of context-aware resource reconfiguration. The fundamental unit in this hierarchy is the Fault Containment Unit (FCU), which provides run-time fault tolerance by selecting the best alternative course of action when a failure occurs. FCUs are composed hierarchically, and each FCU is responsible for dynamically reconfiguring failing FCUs at the levels below it. When such a reconfiguration is not possible, the FCU propagates the failure upward for resolution. We evaluate the effectiveness of the framework in a people-tracking application using a network of cameras. The task for our multi-camera network is to allocate pairs of cameras that localize a subject optimally given the current run-time context. The system automatically derives policies for switching between camera pairs that enable robust tracking while remaining attentive to specified performance measures. Our approach is unique in that we model the dynamics of both the scene and the camera network configuration, and this model steers the policies to provide robust tracking.
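The core FCU mechanism described above — try a local reconfiguration first, and escalate the failure to the parent only when no alternative remains — can be sketched as follows. This is a minimal illustration under our own assumptions, not the authors' implementation; the class name `FCU`, the configuration strings, and the two-level hierarchy are hypothetical.

```python
# Minimal sketch (not the paper's implementation) of hierarchical
# Fault Containment Units: each FCU first attempts a local
# reconfiguration, and propagates the failure upward only when
# no alternative configuration remains.

class FCU:
    def __init__(self, name, alternatives, parent=None):
        self.name = name
        self.alternatives = list(alternatives)  # ordered fallback configurations
        self.parent = parent
        self.active = self.alternatives[0] if self.alternatives else None

    def handle_failure(self, failed_config):
        """Switch to the next viable configuration, or escalate to the parent."""
        remaining = [a for a in self.alternatives if a != failed_config]
        if remaining:
            self.active = remaining[0]       # local reconfiguration succeeds
            self.alternatives = remaining
            return self.active
        if self.parent is not None:          # containment failed: escalate
            return self.parent.handle_failure(self.name)
        raise RuntimeError("unrecoverable failure at root FCU")


# Hypothetical example: a tracking FCU that falls back between camera
# pairs, under a root FCU that can restart the whole subsystem.
root = FCU("root", ["restart-tracking"])
tracker = FCU("tracker", ["pair(cam1,cam2)", "pair(cam2,cam3)"], parent=root)

print(tracker.handle_failure("pair(cam1,cam2)"))  # -> pair(cam2,cam3)
print(tracker.handle_failure("pair(cam2,cam3)"))  # escalates -> restart-tracking
```

In this sketch, switching between camera pairs plays the role of the policy-driven reconfiguration in the paper, and the upward call to `parent.handle_failure` plays the role of failure propagation.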
Additional information
In fond memory of Edward M. Riseman.
This work is supported by DARPA/ITO Mobile Autonomous Robots S/W (MARS) (Contract Number DOD DABT63-99-1-004) and Software for Distributed Robotics (SDR) (Contract Number DOD DABT63-99-1-0022), by ARO (Grant No. W911NF-05-1-0011) and the NSF CRI Program (Grants No. CNS-0424539 and No. CNS-0551598).
Cite this article
Karuppiah, D.R., Grupen, R.A., Zhu, Z. et al. Automatic resource allocation in a distributed camera network. Machine Vision and Applications 21, 517–528 (2010). https://doi.org/10.1007/s00138-008-0182-7