
A Camera-Based Input Device for Large Interactive Displays

Published: 01 July 2005

Abstract

Human–computer interaction using large-format displays is an active area of research into how people can work more effectively with computers and other machines. For this to happen, there must be an enabling technology that creates the interface between human and machine. Touch capability is especially valuable in a large-format display: a large display area is informationally dense, and touch provides a natural, life-size interface to that information. This article describes a new enabling technology in the form of a camera-based input device that uses smart cameras to analyze the scene directly in front of a large-format computer display. The analysis determines where a user has touched the display and treats that contact as a mouse click, thereby controlling the computer. Significant technological problems have been overcome to make the system robust enough for commercialization. The article also describes the camera-based system architecture, which offers some interesting advantages as well as new capabilities. The technology is ideally suited to large-format computer displays, creating a natural interface with familiar usage paradigms for human–computer interaction.
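As a rough illustration of how a camera-based touch system can turn camera observations into screen coordinates, consider one common arrangement (an assumption for this sketch, not a detail stated in the abstract): two smart cameras mounted at the top corners of the display, each looking across the screen surface. Each camera reports only the angle at which it sees the touching finger; intersecting the two viewing rays yields the touch position, which can then be reported as a mouse event.

```python
import math

def triangulate_touch(theta_left: float, theta_right: float, width: float):
    """Locate a touch point from two corner-mounted cameras.

    theta_left:  angle (radians) from the top edge, seen by the camera
                 at the top-left corner (0, 0).
    theta_right: angle (radians) from the top edge, seen by the camera
                 at the top-right corner (width, 0).
    width:       distance between the two cameras (display width).
    Returns (x, y) in display coordinates, y increasing downward.
    """
    # Ray from the left camera:  y = x * tan(theta_left)
    # Ray from the right camera: y = (width - x) * tan(theta_right)
    # Setting the two equal and solving for x:
    tl, tr = math.tan(theta_left), math.tan(theta_right)
    x = width * tr / (tl + tr)
    y = x * tl
    return x, y

# A touch at the center of a 100-unit-wide display is seen at 45 degrees
# by both cameras:
print(triangulate_touch(math.pi / 4, math.pi / 4, 100.0))  # → (50.0, 50.0)
```

This is only the geometric core; a working device must also detect the finger in each camera image, decide whether it is actually touching (rather than hovering near) the surface, and calibrate camera angles to display pixels.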




Published In
IEEE Computer Graphics and Applications  Volume 25, Issue 4
July 2005
84 pages

Publisher

IEEE Computer Society Press

Washington, DC, United States


Author Tags

  1. computer vision
  2. human–computer interaction
  3. large-format touch display
  4. machine vision
  5. smart cameras

Qualifiers

  • Research-article

Cited By
  • (2011) "Multi-user pointing and gesture interaction for large screen using infrared emitters and accelerometers," Proc. 14th Int'l Conf. Human-Computer Interaction: Interaction Techniques and Environments, Part II, pp. 185-193, 9 July 2011. doi:10.5555/2022466.2022489
  • (2010) "Federate resource management in a Distributed Virtual Environment," Future Generation Computer Systems, vol. 26, no. 3, pp. 308-317, 1 March 2010. doi:10.1016/j.future.2009.08.014
  • (2008) "Blurring the line between real and digital," Proc. 2008 Workshop on Immersive Projection Technologies/Emerging Display Technologies, pp. 1-5, 9 August 2008. doi:10.1145/1394669.1394671
  • (2007) "A system for hybrid vision- and sound-based interaction with distal and proximal targets on wall-sized, high-resolution tiled displays," Proc. 2007 IEEE Int'l Conf. Human-Computer Interaction, pp. 59-68, 20 October 2007. doi:10.5555/1779576.1779583
  • (2007) "A System for Hybrid Vision- and Sound-Based Interaction with Distal and Proximal Targets on Wall-Sized, High-Resolution Tiled Displays," Human-Computer Interaction, pp. 59-68, 20 October 2007. doi:10.1007/978-3-540-75773-3_7
  • (2006) "Human-centered visualization environments," 5 March 2006.
