DOI: 10.1145/2044354.2044360

Using the user's point of view for interaction on mobile devices

Published: 24 October 2011

Abstract

We study interaction modalities for mobile devices (smartphones and tablets) that rely on camera-based head tracking. This technique opens new possibilities for both input and output interaction. For output, by computing the position of the device relative to the user's head, it is possible, for example, to realistically control the viewpoint on a 3D scene (Head-Coupled Perspective, HCP). This technique improves the output interaction bandwidth by enhancing depth perception and by allowing the visualization of large workspaces (virtual window). For input, head movement can be used as a means of interacting with a mobile device; moreover, such an input modality requires no additional sensor beyond the built-in front-facing camera. In this paper, we classify the interaction possibilities offered by head tracking on smartphones and tablets. We then focus on output interaction by introducing several applications of HCP on both smartphones and tablets and by presenting the results of a qualitative user experiment.
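To illustrate the geometry behind HCP (this is a sketch for intuition, not code from the paper): the screen is treated as a physical window onto the 3D scene, and the tracked head position drives an off-axis viewing frustum whose window is the screen itself. A minimal Python sketch, assuming a hypothetical tracker that reports the eye position in screen-centred units:

```python
def hcp_frustum(eye, screen_w, screen_h, near=0.01):
    """Off-axis frustum bounds for head-coupled perspective (HCP).

    eye      -- (x, y, z) head position relative to the screen centre,
                with z > 0 in front of the screen (hypothetical tracker output).
    screen_w, screen_h -- physical screen size, in the same units as eye.
    Returns (left, right, bottom, top): clipping bounds at the near plane,
    as consumed by an OpenGL-style glFrustum call.
    """
    ex, ey, ez = eye
    s = near / ez                       # similar triangles: near plane vs. head distance
    left   = (-screen_w / 2 - ex) * s
    right  = ( screen_w / 2 - ex) * s
    bottom = (-screen_h / 2 - ey) * s
    top    = ( screen_h / 2 - ey) * s
    return left, right, bottom, top


# Head centred in front of the screen: a symmetric (ordinary) frustum.
print(hcp_frustum((0.0, 0.0, 0.4), 0.10, 0.06))
# Head moved to the right: the frustum skews left, changing the viewpoint.
print(hcp_frustum((0.03, 0.0, 0.4), 0.10, 0.06))
```

A renderer would pair these bounds with a translation of the scene by the negated eye position, so that moving the head around the device produces the realistic "virtual window" parallax the abstract describes.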






Published In

IHM '11: Proceedings of the 23rd Conference on l'Interaction Homme-Machine
October 2011
169 pages
ISBN:9781450308229
DOI:10.1145/2044354

Sponsors

  • UNSA: University of Nice Sophia Antipolis
  • AFIHM: Association Francophone d'Interaction Homme-Machine
  • I3S lab
  • INRIA: Institut National de Recherche en Informatique et en Automatique

Publisher

Association for Computing Machinery, New York, NY, United States



Author Tags

  1. 3D interface
  2. head-coupled perspective
  3. interaction modality
  4. mobile device

Qualifiers

  • Research-article


Acceptance Rates

Overall Acceptance Rate 103 of 199 submissions, 52%

Article Metrics

  • Downloads (Last 12 months)12
  • Downloads (Last 6 weeks)2
Reflects downloads up to 26 Dec 2024


Cited By

  • (2023) Human-to-Computer Interactivity Features Incorporated Into Behavioral Health mHealth Apps: Systematic Search. JMIR Formative Research, 7, e44926. DOI: 10.2196/44926. Online publication date: 30-Jun-2023.
  • (2022) Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf Smartphones. Multimodal Technologies and Interaction, 6(10), 94. DOI: 10.3390/mti6100094. Online publication date: 19-Oct-2022.
  • (2021) 3D Displays: Their Evolution, Inherent Challenges and Future Perspectives. Proceedings of the Future Technologies Conference (FTC) 2021, Volume 3, 397-415. DOI: 10.1007/978-3-030-89912-7_31. Online publication date: 25-Oct-2021.
  • (2019) Usability Evaluations of Mobile Mental Health Technologies: A Systematic Review Study (Preprint). Journal of Medical Internet Research. DOI: 10.2196/15337. Online publication date: 2-Jul-2019.
  • (2018) User Behavior and the Importance of Stereo for Depth Perception in Fish Tank Virtual Reality. PRESENCE: Virtual and Augmented Reality, 27(2), 206-225. DOI: 10.1162/pres_a_00327. Online publication date: 1-Feb-2018.
  • (2018) User-Perspective Rendering for Handheld Applications. 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 270-274. DOI: 10.1109/ISMAR-Adjunct.2018.00084. Online publication date: Oct-2018.
  • (2017) Constructing a 3D Multiple Mobile Medical Imaging System through Service Science, Management, Engineering and Design. Systems, 5(1), 5. DOI: 10.3390/systems5010005. Online publication date: 17-Jan-2017.
  • (2017) Tablet fish tank virtual reality. Proceedings of the 27th International Conference on Artificial Reality and Telexistence and 22nd Eurographics Symposium on Virtual Environments: Posters and Demos, 27-28. DOI: 10.2312/egve.20171377. Online publication date: 22-Nov-2017.
  • (2016) Automatic display zoom using face size of camera image. Proceedings of the 26th International Conference on Artificial Reality and Telexistence and the 21st Eurographics Symposium on Virtual Environments: Posters and Demos, 1-2. DOI: 10.5555/3059962.3059964. Online publication date: 7-Oct-2016.
  • (2016) A Robust Camera-Based Interface for Mobile Entertainment. Sensors, 16(2), 254. DOI: 10.3390/s16020254. Online publication date: 19-Feb-2016.
