
US10140507B2 - Apparatus and method for recognizing hand gestures in a virtual reality headset - Google Patents

Apparatus and method for recognizing hand gestures in a virtual reality headset

Info

Publication number
US10140507B2
Authority
US
United States
Prior art keywords
user
hand
detected
detected object
vision sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/982,299
Other versions
US20170185830A1 (en)
Inventor
Gaurav Srivastava
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US14/982,299
Assigned to SAMSUNG ELECTRONICS CO., LTD. (ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: SRIVASTAVA, GAURAV)
Priority to KR1020160026357A
Priority to EP16181893.5A
Priority to CN201610801612.XA
Publication of US20170185830A1
Application granted
Publication of US10140507B2
Legal status: Active
Adjusted expiration

Classifications

    • G06K9/00355
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A virtual reality (VR) headset configured to be worn by a user. The VR headset comprises: i) a forward-looking vision sensor for detecting objects in the forward field of view of the VR headset; ii) a downward-looking vision sensor for detecting objects in the downward field of view of the VR headset; iii) a controller coupled to the forward-looking vision sensor and the downward-looking vision sensor. The controller is configured to: a) detect a hand in a first image captured by the forward-looking vision sensor; b) detect an arm of the user in a second image captured by the downward-looking vision sensor; and c) determine whether the detected hand in the first image is a hand of the user.

Description

TECHNICAL FIELD
The present application relates generally to virtual reality (VR) headsets and, in particular, to a system for correctly identifying the hand gestures of the legitimate user of a VR headset.
BACKGROUND
Virtual reality (VR) equipment, a term used here to include augmented reality (AR) equipment, is becoming increasingly popular for entertainment, training, and commercial uses. A user experiences virtual reality by wearing a VR head-mounted display (HMD) or similar equipment and operating a virtual reality software application that controls the VR equipment. The VR headset projects three-dimensional (3D) images of a virtual world that may appear quite real to the user.
One of the key features of a VR headset is the ability to recognize and identify the hand gestures of the user of the VR headset. However, when the front vision sensor on the VR headset detects a hand in the scene for the purpose of identifying user hand gestures, it is difficult to determine whether the hand belongs to the legitimate user of the VR headset or to an intruder in the field of vision of the front vision sensor. The default assumption of conventional VR headsets is that a detected hand belongs to the actual user (i.e., the person wearing and operating the VR device). But it is possible that another person (i.e., an “intruder”) may accidentally or intentionally wave his or her hand in front of the VR device. The intruder's detected hand gesture(s) may trigger undesirable effects on the user interface, causing an unpleasant experience for the main user.
For example, the main user may be editing a document on a virtual reality desktop and the intruder's hand gesture may close the document. Likewise, the main user may be finishing up an online purchase using a VR device when the intruder's hand gesture clicks the BACK button. Or, the main user may be watching a movie in the VR device and the intruder's hand gesture may click the STOP or CLOSE button on the movie window. In sum, there are numerous situations where the intentional or accidental hand gesture of an intruder may cause an undesirable experience for the main or legitimate user.
Therefore, there is a need in the art for an improved apparatus and method for identifying the legitimate hand gestures of the user of a virtual reality device.
SUMMARY
To address the above-discussed deficiencies of the prior art, it is a primary object to provide a virtual reality (VR) headset configured to be worn by a user. In a preferred embodiment of the disclosure, the VR headset comprises: i) a forward-looking vision sensor for detecting objects in the forward field of view of the VR headset; ii) a downward-looking vision sensor for detecting objects in the downward field of view of the VR headset; iii) a controller coupled to the forward-looking vision sensor and the downward-looking vision sensor. The controller is configured to: a) detect a hand in a first image captured by the forward-looking vision sensor; b) detect an arm of the user in a second image captured by the downward-looking vision sensor; and c) determine whether the detected hand in the first image is a hand of the user.
In one embodiment, the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative position of the detected hand in the first image and a relative position of the detected arm of the user in the second image.
In another embodiment, the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative movement of the detected hand in the first image and a relative movement of the detected arm of the user in the second image.
In still another embodiment, the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative alignment of the detected hand in the first image and a relative alignment of the detected arm of the user in the second image.
Before undertaking the DETAILED DESCRIPTION below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document: the terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation; the term “or” is inclusive, meaning and/or; the phrases “associated with” and “associated therewith,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like; and the term “controller” means any device, system or part thereof that controls at least one operation; such a device may be implemented in hardware, firmware or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. Definitions for certain words and phrases are provided throughout this patent document; those of ordinary skill in the art should understand that in many, if not most instances, such definitions apply to prior as well as future uses of such defined words and phrases.
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:
FIG. 1A is a perspective view of a virtual reality (VR) headset according to one embodiment of the disclosure.
FIG. 1B is a front view of a virtual reality (VR) headset according to one embodiment of the disclosure.
FIG. 2 illustrates a hand gesture detection operation of a virtual reality (VR) headset according to one embodiment of the disclosure.
FIG. 3 illustrates detected hands in the field of view of the forward-looking vision sensor and detected arms in the field of view of the downward-looking vision sensor of a virtual reality (VR) headset according to one embodiment of the disclosure.
FIG. 4 is a schematic block diagram of a virtual reality (VR) headset according to one embodiment of the disclosure.
FIG. 5 is a flow diagram illustrating the operation of a virtual reality (VR) headset according to one embodiment of the disclosure.
DETAILED DESCRIPTION
FIGS. 1 through 5, discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged virtual reality headset.
In the disclosure below, the phrase “virtual reality” will be used generically for both virtual reality and augmented reality in order to simplify the descriptions that follow. Also, the following terms have the following meanings unless otherwise specified: i) “vision sensor” refers to any video camera (e.g., RGB camera), depth sensor, or motion detection device; ii) “main user” or “user” refers to the person actually wearing and operating the virtual reality (VR) head-mounted display (HMD) or headset; and iii) “intruder” refers to any person other than the user whose hand gestures intentionally or accidentally trigger undesirable effects on the VR user interface of the HMD/headset.
FIG. 1A is a perspective view of virtual reality (VR) headset 100 according to one embodiment of the disclosure. FIG. 1B is a front view of virtual reality (VR) headset 100 according to one embodiment of the disclosure. VR headset 100 comprises chassis (or housing) 105, forward vision sensor 110, head strap 120, and downward vision sensor 130. Chassis 105 houses the electronics of VR headset 100. A user places VR headset 100 on his or her head and tightens head strap 120 to hold VR headset 100 in place. Forward vision sensor 110 captures forward field of view (FOV) 150 and displays forward FOV 150 on the internal display of VR headset 100. The user may then view on the internal display any objects in the forward FOV 150.
When the forward vision sensor 110 and the internal processor(s) of VR headset 100 detect a hand in forward FOV 150 for the purpose of determining hand gestures, it may be difficult to determine whether the hand belongs to the main user or to an intruder. It is necessary to prevent a hand gesture from an intruder from causing undesirable interference with the user interface. The present disclosure provides a method of distinguishing legitimate user hand gestures from intruder hand gestures by using downward vision sensor 130, which captures downward field of view (FOV) 160. Downward vision sensor 130 and the internal processor(s) of VR headset 100 are operable to detect and identify the arm(s) of the user in downward FOV 160 and then to correlate and/or associate the user hand movements with the user arm movements. In this way, VR headset 100 is capable of determining whether a detected hand in the forward FOV 150 belongs to the legitimate user of VR headset 100 or to an intruder. Once this determination is made, the internal processor(s) of VR headset 100 will only process hand gesture commands from the user and will ignore hand gestures from an intruder.
FIG. 2 illustrates a hand gesture detection operation of virtual reality (VR) headset 100 according to one embodiment of the disclosure. In FIG. 2, the user extends her arm and hand forward to interact with object(s) in the virtual world. Forward vision sensor 110 detects user hand 210 in forward FOV 150 and downward vision sensor 130 detects user arm 220 in downward FOV 160. VR headset 100 then determines whether user hand 210 belongs to the user by comparing the alignments and/or positions of user hand 210 and user arm 220. VR headset 100 may also determine whether user hand 210 belongs to the user by comparing the relative movements of user hand 210 and user arm 220. The tracked movements may include left-right (lateral) movement of the hands and arms, up-down (vertical) movement of the hands and arms, and/or forward-backward (extension) movements of the hands and arms away from or toward the body of the user.
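To make this alignment-and-position comparison concrete, the sketch below pairs a hand detected in the forward image with the arm whose lateral bearing best matches it. It is a minimal illustration only, not the implementation of the disclosure: the normalized coordinates, the 0.2 tolerance, and the name best_arm_match are assumptions introduced here.

```python
from typing import Optional

def best_arm_match(hand_x: float, arm_xs: list[float],
                   tolerance: float = 0.2) -> Optional[int]:
    """Return the index of the arm whose lateral bearing best matches a
    detected hand, or None if no arm is close enough.

    hand_x and each entry of arm_xs are horizontal object centers,
    normalized to [-1, 1] in their respective images (0 is the optical
    axis). The normalization and the tolerance are illustrative
    assumptions, not values from the disclosure.
    """
    # Both sensors ride on the same headset, so a hand attached to the
    # user's arm should appear at a similar lateral offset in the
    # forward view and the downward view; an intruder's hand will
    # usually not line up with any of the user's tracked arms.
    if not arm_xs:
        return None
    i = min(range(len(arm_xs)), key=lambda k: abs(arm_xs[k] - hand_x))
    return i if abs(arm_xs[i] - hand_x) <= tolerance else None
```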
FIG. 3 illustrates detected hands 310 and 320 in forward FOV 150 of forward vision sensor 110 and detected arms 311 and 321 in the downward FOV 160 of downward vision sensor 130 of virtual reality (VR) headset 100 according to one embodiment of the disclosure. Generally, the user will only see detected hands 310 and 320 in forward FOV 150 on the internal display of VR headset 100. Detected arms 311 and 321 are only seen and analyzed by the internal processor(s) of VR headset 100. In FIG. 3, the lateral movements of detected arms 311 and 321 (indicated by left-right arrows) may be correlated with similar lateral movements of detected hands 310 and 320, thereby identifying detected hands 310 and 320 as the hands of the user of VR headset 100 and not the hands of an intruder.
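The movement comparison can be sketched in the same spirit. Assuming each vision sensor yields one horizontal centroid position per frame for every tracked object, the per-frame displacements of a detected hand and a detected arm can be tested for strong positive correlation; the Pearson statistic and the 0.8 threshold are illustrative choices, not values from the disclosure (statistics.correlation requires Python 3.10 or later).

```python
import statistics

def movements_correlate(hand_xs: list[float], arm_xs: list[float],
                        threshold: float = 0.8) -> bool:
    """Decide whether a hand tracked in the forward FOV moves in
    lockstep with an arm tracked in the downward FOV.

    hand_xs and arm_xs hold horizontal centroid positions over the same
    window of frames, one value per frame. The threshold is an
    illustrative assumption.
    """
    # Per-frame lateral displacements of each object.
    hand_v = [b - a for a, b in zip(hand_xs, hand_xs[1:])]
    arm_v = [b - a for a, b in zip(arm_xs, arm_xs[1:])]
    if len(hand_v) < 2 or len(hand_v) != len(arm_v):
        return False
    try:
        # Strong positive correlation: the hand sweeps left-right
        # together with the arm, so it is likely the user's own hand.
        return statistics.correlation(hand_v, arm_v) >= threshold
    except statistics.StatisticsError:
        # Zero variance means neither object moved; no evidence either way.
        return False
```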
FIG. 4 is a schematic block diagram of virtual reality (VR) headset 100 according to one embodiment of the disclosure. VR headset 100 comprises forward vision sensor (VS) 110 and downward vision sensor (VS) 130. VR headset 100 further comprises VR headset controller 410, memory 420, VR source video 430, video processor 440, display 450, and speakers 460. In an exemplary embodiment, forward VS 110 and downward VS 130 may comprise conventional video cameras (e.g., RGB video cameras).
VR headset controller 410 is a microprocessor or microcontroller that controls the overall operation of VR headset 100 by executing an operating system program and one or more application programs stored in memory 420. Video processor 440 receives source video from VR source video 430, which video processor 440 then displays on one or more screens of display 450. VR source video 430 may be an external VR video player coupled wirelessly or by wireline to VR headset 100. Alternatively, VR source video 430 may be an internal memory (including a part of memory 420) in which VR video content is stored. In camera mode, VR headset controller 410 directs the real-world outputs of forward VS 110 and downward VS 130 to video processor 440 so that the user can see the real world around the user on display 450, as well as augmented reality (AR) video content.
According to the principles of the disclosure, VR headset controller 410 is configured to direct video processor 440 to detect the hand(s) of the user in forward FOV 150 in the video output of forward VS 110 and to detect the arm(s) of the user in downward FOV 160 in the video output of downward VS 130. VR headset controller 410 is further configured to direct video processor 440 to correlate and/or to associate the user hand movements with the user arm movements. In this way, video processor 440 is capable of determining if a detected hand in forward FOV 150 belongs to the legitimate user of VR headset 100 or to an intruder.
FIG. 5 is a flow diagram illustrating the operation of virtual reality (VR) headset 100 according to one embodiment of the disclosure. Initially, the user activates VR headset 100 and places VR headset 100 on his or her head (step 505). After activation, the user may launch an application that may be controlled by user hand gestures. In response, video processor 440 detects one or more hand(s) in forward FOV 150 (step 510). Video processor 440 also detects a portion (e.g., a forearm) of at least one arm of the user in downward FOV 160 (step 515).
Video processor 440 then attempts to determine whether a detected hand in forward FOV 150 is the hand of the user or of an intruder. Video processor 440 may do this by comparing and analyzing detected objects in forward FOV 150 and downward FOV 160 in order to correlate the alignments and/or movements of a detected hand(s) and a detected forearm(s) (step 520). From this comparison, video processor 440 identifies the hand(s) of the legitimate user of VR headset 100 and ignores the detected hand(s) of intruder(s) (step 525). Thereafter, video processor 440 and/or VR headset controller 410 process the hand gestures of the legitimate user (step 530).
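Taken together, steps 520 through 530 could be organized along the lines of the following sketch, which reuses movements_correlate from the earlier sketch. The Track structure and the dispatch_gesture callback are hypothetical stand-ins for whatever internal representation video processor 440 actually uses.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Track:
    """Per-frame horizontal centroid positions of one detected object."""
    xs: list[float]

def process_window(hand_tracks: list[Track], arm_tracks: list[Track],
                   dispatch_gesture: Callable[[Track], None]) -> None:
    """Steps 520-530 of FIG. 5, assuming upstream detectors (steps 510
    and 515) have already produced hand tracks from the forward FOV and
    arm tracks from the downward FOV. All names here are illustrative.
    """
    for hand in hand_tracks:
        # Step 520: correlate the hand's movement with each tracked arm.
        if any(movements_correlate(hand.xs, arm.xs) for arm in arm_tracks):
            # Step 530: only the legitimate user's gestures are processed.
            dispatch_gesture(hand)
        # Step 525: hands matching no arm are attributed to an intruder
        # and ignored.
```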
Although the present disclosure has been described with an exemplary embodiment, various changes and modifications may be suggested to one skilled in the art. It is intended that the present disclosure encompass such changes and modifications as fall within the scope of the appended claims.

Claims (19)

What is claimed is:
1. A virtual reality apparatus comprising:
a plurality of vision sensors configured to detect objects in a vicinity of a user of the virtual reality apparatus; and
a processor coupled to the plurality of vision sensors and configured to:
compare a first object detected by a first vision sensor with a second object detected by a second vision sensor,
in response to the comparison, to:
identify the first detected object as a hand of the user;
differentiate the hand of the user from a hand of a person other than the user;
detect an arm of the user;
compare a relative position of the detected hand and a relative position of the detected arm of the user; and
ignore hand gestures of the person other than the user.
2. The virtual reality apparatus as set forth in claim 1, wherein the processor is configured to detect the first detected object in an image captured by the first vision sensor and to identify the first detected object as a hand.
3. The virtual reality apparatus as set forth in claim 2, wherein the processor is configured to detect the second detected object in an image captured by the second vision sensor and to identify the second detected object as at least a portion of the arm of the user.
4. The virtual reality apparatus as set forth in claim 3, wherein the processor is configured to compare a relative position of the first detected object and a relative position of the second detected object in order to identify the first detected object as the hand of the user.
5. The virtual reality apparatus as set forth in claim 3, wherein the processor is configured to compare a relative movement of the first detected object and a relative movement of the second detected object in order to identify the first detected object as the hand of the user.
6. The virtual reality apparatus as set forth in claim 3, wherein the processor is configured to compare a relative alignment of the first detected object and a relative alignment of the second detected object in order to identify the first detected object as the hand of the user.
7. The virtual reality apparatus as set forth in claim 3, wherein the processor is further configured to compare a third object detected by the first vision sensor with the second object detected by the second vision sensor and, in response to the comparison, to identify the third detected object as the hand of the person other than the user.
8. The virtual reality apparatus as set forth in claim 3, wherein the first and second vision sensors comprise video cameras.
9. A method of operating a virtual reality apparatus comprising:
in a plurality of vision sensors, detecting objects in a vicinity of a user of the virtual reality apparatus;
comparing a first object detected by a first vision sensor with a second object detected by a second vision sensor;
in response to the comparison, identifying the first detected object as a hand of the user and differentiating the hand of the user from a hand of a person other than the user;
detecting an arm of the user;
comparing a relative position of the detected hand and a relative position of the detected arm of the user; and
ignoring hand gestures of the person other than the user.
10. The method as set forth in claim 9, further comprising:
detecting the first detected object in an image captured by the first vision sensor; and
identifying the first detected object as a hand.
11. The method as set forth in claim 10, further comprising:
detecting the second detected object in an image captured by the second vision sensor; and identifying the second detected object as at least a portion of the arm of the user.
12. The method as set forth in claim 11, wherein identifying the first detected object as a hand of the user comprises:
comparing a relative position of the first detected object and a relative position of the second detected object in order to identify the first detected object as the hand of the user.
13. The method as set forth in claim 11, wherein identifying the first detected object as a hand of the user comprises:
comparing a relative movement of the first detected object and a relative movement of the second detected object in order to identify the first detected object as the hand of the user.
14. The method as set forth in claim 11, wherein identifying the first detected object as a hand of the user comprises:
comparing a relative alignment of the first detected object and a relative alignment of the second detected object in order to identify the first detected object as the hand of the user.
15. The method as set forth in claim 11, further comprising:
comparing a third object detected by the first vision sensor with the second object detected by the second vision sensor; and
in response to the comparison, identifying the third detected object as the hand of the person other than the user.
16. The method as set forth in claim 11, wherein the first and second vision sensors comprise video cameras.
17. A virtual reality (VR) headset configured to be worn by a user, the VR headset comprising:
a forward-looking vision sensor for detecting objects in a forward field of view of the VR headset;
a downward-looking vision sensor for detecting objects in a downward field of view of the VR headset;
a processor coupled to the forward-looking vision sensor and the downward-looking vision sensor and configured to:
detect a hand in a first image captured by the forward-looking vision sensor;
detect an arm of the user in a second image captured by the downward-looking vision sensor;
determine whether the detected hand in the first image is a hand of the user or a hand of a person other than the user;
compare a relative position of the detected hand in the first image and a relative position of the detected arm of the user in the second image; and
ignore hand gestures of the person other than the user.
18. The virtual reality (VR) headset as set forth in claim 17, wherein the processor determines whether the detected hand in the first image is the hand of the user by comparing a relative movement of the detected hand in the first image and a relative movement of the detected arm of the user in the second image.
19. The virtual reality (VR) headset as set forth in claim 17, wherein the processor determines whether the detected hand in the first image is the hand of the user by comparing a relative alignment of the detected hand in the first image and a relative alignment of the detected arm of the user in the second image.
US14/982,299 2015-12-29 2015-12-29 Apparatus and method for recognizing hand gestures in a virtual reality headset Active 2036-04-27 US10140507B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/982,299 US10140507B2 (en) 2015-12-29 2015-12-29 Apparatus and method for recognizing hand gestures in a virtual reality headset
KR1020160026357A KR102568708B1 (en) 2015-12-29 2016-03-04 Apparatus and method for recognizing hand gestures in a virtual reality headset
EP16181893.5A EP3188075B1 (en) 2015-12-29 2016-07-29 Apparatus and method for recognizing hand gestures in a virtual reality headset
CN201610801612.XA CN106933343B (en) 2015-12-29 2016-09-05 Apparatus and method for recognizing gestures in virtual reality headset

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/982,299 US10140507B2 (en) 2015-12-29 2015-12-29 Apparatus and method for recognizing hand gestures in a virtual reality headset

Publications (2)

Publication Number Publication Date
US20170185830A1 US20170185830A1 (en) 2017-06-29
US10140507B2 (en) 2018-11-27

Family

ID=59087895

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/982,299 Active 2036-04-27 US10140507B2 (en) 2015-12-29 2015-12-29 Apparatus and method for recognizing hand gestures in a virtual reality headset

Country Status (3)

Country Link
US (1) US10140507B2 (en)
KR (1) KR102568708B1 (en)
CN (1) CN106933343B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220004750A1 (en) * 2018-12-26 2022-01-06 Samsung Electronics Co., Ltd. Method for identifying user's real hand and wearable device therefor

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10650591B1 (en) * 2016-05-24 2020-05-12 Out of Sight Vision Systems LLC Collision avoidance system for head mounted display utilized in room scale virtual reality system
US10981060B1 (en) 2016-05-24 2021-04-20 Out of Sight Vision Systems LLC Collision avoidance system for room scale virtual reality system
US10497179B2 (en) 2018-02-23 2019-12-03 Hong Kong Applied Science and Technology Research Institute Company Limited Apparatus and method for performing real object detection and control using a virtual reality head mounted display system
US20190385372A1 (en) * 2018-06-15 2019-12-19 Microsoft Technology Licensing, Llc Positioning a virtual reality passthrough region at a known distance
US11107293B2 (en) 2019-04-23 2021-08-31 XRSpace CO., LTD. Head mounted display system capable of assigning at least one predetermined interactive characteristic to a virtual object in a virtual environment created according to a real object in a real environment, a related method and a related non-transitory computer readable storage medium
US11176374B2 (en) * 2019-05-01 2021-11-16 Microsoft Technology Licensing, Llc Deriving information from images
US11751800B2 (en) * 2020-10-22 2023-09-12 International Business Machines Corporation Seizure detection using contextual motion
US20220197277A1 (en) * 2020-12-23 2022-06-23 Qatar Foundation For Education, Science And Community Development Telepresence control schemes for hazardous environments
US11681146B2 (en) * 2021-03-18 2023-06-20 Snap Inc. Augmented reality display for macular degeneration

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080013793A1 (en) * 2006-07-13 2008-01-17 Northrop Grumman Corporation Gesture recognition simulation system and method
US20120249416A1 (en) * 2011-03-29 2012-10-04 Giuliano Maciocci Modular mobile connected pico projectors for a local multi-user collaboration
US20140243614A1 (en) * 2013-02-26 2014-08-28 Butterfly Network, Inc. Transmissive imaging and related apparatus and methods
US20150199824A1 (en) 2014-01-10 2015-07-16 Electronics And Telecommunications Research Institute Apparatus and method for detecting multiple arms and hands by using three-dimensional image
US20150241959A1 (en) * 2013-07-12 2015-08-27 Magic Leap, Inc. Method and system for updating a virtual world
US20150312561A1 (en) * 2011-12-06 2015-10-29 Microsoft Technology Licensing, Llc Virtual 3d monitor

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006309562A (en) * 2005-04-28 2006-11-09 Hitachi Omron Terminal Solutions Corp Biological information registering device
US9901828B2 (en) * 2010-03-30 2018-02-27 Sony Interactive Entertainment America Llc Method for an augmented reality character to maintain and exhibit awareness of an observer
JP2012203475A (en) * 2011-03-23 2012-10-22 Toshiba Corp Communication device and control method therefor
US8686943B1 (en) * 2011-05-13 2014-04-01 Imimtek, Inc. Two-dimensional method and system enabling three-dimensional user interaction with a device
CN102799271A (en) * 2012-07-02 2012-11-28 Tcl集团股份有限公司 Method and system for identifying interactive commands based on human hand gestures
CN103077373B (en) * 2012-12-30 2015-08-26 信帧电子技术(北京)有限公司 The method detecting behavior of fighting is pushed and shoved based on upper limbs
US10133342B2 (en) * 2013-02-14 2018-11-20 Qualcomm Incorporated Human-body-gesture-based region and volume selection for HMD
CN104007865B (en) * 2013-02-27 2017-04-19 联想(北京)有限公司 Recognition method and electronic device
CN104424470B (en) * 2013-09-03 2018-04-27 联想(北京)有限公司 A kind of gesture identification method and device
CN104714635B (en) * 2013-12-16 2018-07-06 联想(北京)有限公司 The method and electronic equipment of information processing
US10585486B2 (en) * 2014-01-03 2020-03-10 Harman International Industries, Incorporated Gesture interactive wearable spatial audio system
US10613627B2 (en) * 2014-05-12 2020-04-07 Immersion Corporation Systems and methods for providing haptic feedback for remote interactions
US9690367B2 (en) * 2014-12-10 2017-06-27 Sixense Entertainment, Inc. System and method for assisting a user in locating physical objects while the user is in a virtual reality environment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080013793A1 (en) * 2006-07-13 2008-01-17 Northrop Grumman Corporation Gesture recognition simulation system and method
US20120249416A1 (en) * 2011-03-29 2012-10-04 Giuliano Maciocci Modular mobile connected pico projectors for a local multi-user collaboration
US20120249741A1 (en) * 2011-03-29 2012-10-04 Giuliano Maciocci Anchoring virtual images to real world surfaces in augmented reality systems
US20150312561A1 (en) * 2011-12-06 2015-10-29 Microsoft Technology Licensing, Llc Virtual 3d monitor
US20140243614A1 (en) * 2013-02-26 2014-08-28 Butterfly Network, Inc. Transmissive imaging and related apparatus and methods
US20150241959A1 (en) * 2013-07-12 2015-08-27 Magic Leap, Inc. Method and system for updating a virtual world
US20150199824A1 (en) 2014-01-10 2015-07-16 Electronics And Telecommunications Research Institute Apparatus and method for detecting multiple arms and hands by using three-dimensional image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Foreign Communication from Related Counterpart Application; European Patent Application No. 16181893.5; Extended European Search Report dated May 8, 2017; 7 pages.
Lau, D.; "Leading Edge Views: 3-D Imaging Advances Capabilities of Machine Vision: Part 1"; retrieved from the Internet: URL: http://www.vision-systems.com/articles/print/volume-17/issue-4/departments/leading-edge-views/3-d-imaging-advances-capabilities-of-machine-vision-part-i.html [retrieved on Feb. 11, 2016]; 7 pages.
Thelen et al.; "Enhancing Large Display interaction with User Tracking Data"; Proceedings of the International Conference on Computer Graphics and Virtual Reality (CGVR); Jan. 1, 2012; 6 pages.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220004750A1 (en) * 2018-12-26 2022-01-06 Samsung Electronics Co., Ltd. Method for identifying user's real hand and wearable device therefor
US11941906B2 (en) * 2018-12-26 2024-03-26 Samsung Electronics Co., Ltd. Method for identifying user's real hand and wearable device therefor

Also Published As

Publication number Publication date
US20170185830A1 (en) 2017-06-29
CN106933343B (en) 2021-08-31
CN106933343A (en) 2017-07-07
KR102568708B1 (en) 2023-08-21
KR20170078488A (en) 2017-07-07

Similar Documents

Publication Publication Date Title
US10140507B2 (en) Apparatus and method for recognizing hand gestures in a virtual reality headset
EP3188075B1 (en) Apparatus and method for recognizing hand gestures in a virtual reality headset
US10049497B2 (en) Display control device and display control method
US9972136B2 (en) Method, system and device for navigating in a virtual reality environment
US20110298829A1 (en) Selecting View Orientation in Portable Device via Image Analysis
US20110080337A1 (en) Image display device and display control method thereof
US10310583B2 (en) Attention-based rendering and fidelity
US20180133593A1 (en) Algorithm for identifying three-dimensional point-of-gaze
US20160171780A1 (en) Computer device in form of wearable glasses and user interface thereof
CN105847540A (en) Method and mobile phone for controlling picture movement of VR (Virtual Reality) glasses based on eyeball tracking and VR glasses
WO2016008265A1 (en) Method and apparatus for locating position
US20130308835A1 (en) Mobile Communication Device with Image Recognition and Method of Operation Therefor
US20220335734A1 (en) Head-mounted display, display control method, and program
US20240198211A1 (en) Device including plurality of markers
WO2018198499A1 (en) Information processing device, information processing method, and recording medium
WO2018146922A1 (en) Information processing device, information processing method, and program
US11029753B2 (en) Human computer interaction system and human computer interaction method
US20190333468A1 (en) Head mounted display device and visual aiding method
US20200175712A1 (en) Head mounted device for virtual or augmented reality combining reliable gesture recognition with motion tracking algorithm
US20140043443A1 (en) Method and system for displaying content to have a fixed pose
JP2015052895A (en) Information processor and method of processing information
WO2023238678A1 (en) Information processing device, controller display method and computer program
CN112578983B (en) Finger orientation touch detection

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SRIVASTAVA, GAURAV;REEL/FRAME:037374/0715

Effective date: 20151228

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4