
Hand Interfaces: Using Hands to Imitate Objects in AR/VR for Expressive Interactions

Siyou Pei, Alexander Chen, Jaewook Lee, Yang Zhang 🏆 CHI '22 Honorable Mention

*Figure: a user imitates a joystick with a thumbs-up pose and manipulates the joystick by grabbing the thumb with the other hand.*

📘 paper   🎬 preview   🎙️ full presentation   🖥️ lab website

Table of Contents

Motivation

What does Hand Interfaces do?

Quick Start

Full Project Implementation

Help

Acknowledgments

Citation

In the End

Motivation

In AR and VR, there are many virtual objects to retrieve and interact with, and this rich set of objects is central to the user experience. The prevailing way to manipulate them is to hold hand controllers, but holding hardware all the time can be cumbersome. We wanted to introduce a new interaction technique that is more readily available.

In the rock-paper-scissors game, people shape their hands to imitate a rock, paper, and scissors. The mapping is intuitive and self-revealing.

So we wondered: can we generalize this idea to many other objects and interactions in AR and VR? This led us to propose Hand Interfaces!

*Figure: 28 examples of Hand Interfaces.*

What does Hand Interfaces do?

We propose Hand Interfaces, a new free-hand interaction technique that lets users embody objects by imitating them with their hands. In short, the hands now BECOME those virtual objects.

Furthermore, Hand Interfaces supports not only object retrieval but also interactive control.

*Figure: on the left, a hand performs a thumbs-up and becomes a joystick (object retrieval); on the right, the other hand manipulates the imitating hand's thumb (interactive control).*

Quick Start

If you have a Meta Quest headset, we have a minimal app that is easy to install: you only need the Scissors_Demo.apk in this repo, nothing else. The app contains a pair of scissors. Perform a peace gesture to retrieve the scissors in mid-air, then close your index and middle fingers to snip the virtual objects in front of you (e.g., a cake, a loaf of bread, or even a phone!).
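To give a sense of how such gestures might be detected, here is a minimal, hypothetical Python sketch. The repo itself is a Unity project, so the `Hand` structure, threshold values, and coordinate units below are illustrative assumptions, not the repo's actual API; the sketch only classifies a peace gesture and a snip from tracked fingertip positions:

```python
from dataclasses import dataclass
import math

@dataclass
class Hand:
    # 3D positions (x, y, z) as a hand-tracking API might report them.
    # All fields and units here are illustrative assumptions.
    index_tip: tuple
    middle_tip: tuple
    ring_tip: tuple
    pinky_tip: tuple
    wrist: tuple

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_peace_gesture(h, extend_thresh=0.12, curl_thresh=0.07):
    """Peace gesture: index and middle extended, ring and pinky curled."""
    return (dist(h.index_tip, h.wrist) > extend_thresh
            and dist(h.middle_tip, h.wrist) > extend_thresh
            and dist(h.ring_tip, h.wrist) < curl_thresh
            and dist(h.pinky_tip, h.wrist) < curl_thresh)

def is_snip(h, close_thresh=0.02):
    """Snip event: index and middle fingertips pinched together."""
    return dist(h.index_tip, h.middle_tip) < close_thresh
```

In a real app this check would run every frame on the tracked skeleton, with some debouncing so a single snip triggers one cut.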

To install it, please:

  1. Download the Scissors_Demo.apk in this repo.
  2. Make sure your Quest account has developer mode enabled. Official tutorial
  3. Sideload the apk onto your Quest headset through Oculus Developer Hub, SideQuest, or ADB commands.
  4. Find the app in your Oculus library under "Unknown Sources". Have fun!
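For step 3, the ADB route boils down to a single install command once the headset is connected over USB and developer mode is confirmed. A minimal helper sketch (dry run by default; set RUN=1 to actually invoke adb, which must be on your PATH):

```shell
# Sideload helper sketch: prints the command it would run, and only
# executes it when RUN=1 and adb is available on this machine.
APK="Scissors_Demo.apk"
CMD="adb install $APK"
echo "$CMD"
if [ "${RUN:-0}" = "1" ] && command -v adb >/dev/null 2>&1; then
  $CMD
fi
```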

Full Project Implementation

The 11 objects used in our user studies are included in this GitHub project. The two tasks, object retrieval and interactive control, were split into separate scenes in the user studies.

Dependencies and Configuration

To build and run the full project, you need to set up your Quest device for Unity development. Here is a tutorial I made while I was a TA for a VR course.

Project Structure

handinterfaces/Assets/Scenes/prototypes stores early versions of idea prototypes.

handinterfaces/Assets/Scenes/user study contains what we showed to participants during the user studies. The Unity scenes starting with "Retr" are for object retrieval, while those starting with "Inter" are for interactive control.

As indicated in the paper, DM, VG, and HI are abbreviations for "Drop-down Menu", "VirtualGrasp", and "Hand Interfaces", while FG, VM, and HI are abbreviations for "Fist Gesture", "Virtual Manipulation", and "Hand Interfaces".

handinterfaces/Assets/Example Applications/ includes the three applications mentioned in the paper: "education", "entertainment", and "IoT". Demo videos can be found here.

Help

Feel free to create a new issue in the 'Issues' tab!

Acknowledgments

https://github.com/DavidArayan/ezy-slice
https://github.com/dilmerv/VRDraw
https://github.com/pharan/Unity-MeshSaver

In addition, many thanks to Quentin Valembois, Dilmer Valecillos and Antony Vitillo for their contribution to the AR/VR content creation community.

Special thanks to the authors of VirtualGrasp, Yukang Yan, et al. for inspiring this thread of work in designing AR/VR interfaces around hands’ expressivity.

Citation

@inproceedings{pei2022hand,
author = {Pei, Siyou and Chen, Alexander and Lee, Jaewook and Zhang, Yang},
title = {Hand Interfaces: Using Hands to Imitate Objects in AR/VR for Expressive Interactions},
year = {2022},
isbn = {9781450391573},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3491102.3501898},
doi = {10.1145/3491102.3501898},
booktitle = {Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems},
articleno = {429},
numpages = {16},
keywords = {Interaction design, AR/VR, Embodiment, Free-hand interactions, Imitation, On-body interactions},
location = {New Orleans, LA, USA},
series = {CHI '22}
}

In the End

Thank you for reading! Isn't it exciting to create next-gen user experiences for XR? :D

You can also find me at @SiyouPei.
