CN102253713A - Display system oriented to three-dimensional images - Google Patents
- Publication number: CN102253713A (application CN201110171017.XA; granted as CN102253713B)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a display system oriented to three-dimensional stereoscopic images, aiming to solve the technical problem of improving both the human-machine interaction effect and the three-dimensional image effect. In the technical scheme adopted by the invention, the display system comprises a stereoscopic image display unit, a depth image sensing unit, a depth image processing unit, a limb action recognition unit, a three-dimensional user interface unit and a three-dimensional position indicator unit. Compared with the prior art, the disclosed system has the advantage that the depth image sensing unit and the depth image processing unit are used to sense all or part of a user's limb actions in three-dimensional space, while the three-dimensional user interface is used to interact with the system, thereby achieving a fully immersive human-machine interaction effect and greatly improving the user experience.
Description
Technical field
The present invention relates to the field of stereoscopic display, and in particular to a display system oriented to three-dimensional stereoscopic images.
Background technology
As a popular household leisure appliance and the most common display device, the television set entered countless homes long ago, and with the development of display technology it has passed through several stages: black-and-white television, color television, LCD television and three-dimensional stereoscopic television. In the last two years in particular, as 3D televisions of various brands have come onto the market, stereoscopic television has finally moved from concept into consumers' lives. A stereoscopic television exploits the principle that the two human eyes observe an object from slightly different angles, which produces stereoscopic vision: the images seen by the left and right eyes are separated in various ways, so that the user experiences stereoscopic vision through stereo glasses or with the naked eye. The human-machine interface, meanwhile, is the medium and dialog interface for transmitting and exchanging information between a person and a computer, and is an important component of a computer system. The user can interact with the computer through various human-computer interaction devices and modes, such as the classic keyboard-and-mouse and remote-control patterns, touch control, gesture control, and even the latest motion-sensing interaction.
At present, however, all user interfaces are two-dimensional: every operation is directed at the plane of the display screen. Taking the mouse as an example, moving, clicking and dragging all act only on target objects in the plane of the display device. For a two-dimensional display this is natural and guarantees a good interaction experience. With motion-sensing interaction, however, the user operates the interface in a contactless way, which can be imagined as touching an invisible touch screen in mid-air to interact with the system, while the feedback on the user's operations still appears on the two-dimensional screen. The user experience in this case suffers, and for a three-dimensional stereoscopic display device in particular, a traditional two-dimensional user interface cannot meet practical requirements. How to improve this user experience is a problem demanding a prompt solution.
Summary of the invention
The purpose of the present invention is to provide a display system oriented to three-dimensional stereoscopic images. The technical problems to be solved are improving the human-machine interaction effect and improving the effect of three-dimensional stereoscopic images.
The present invention adopts the following technical solution: a display system oriented to three-dimensional stereoscopic images, comprising the following components:
Stereoscopic image display unit: used to receive the three-dimensional stereoscopic images and the three-dimensional graphical user interface sent by the three-dimensional user interface unit;
Depth image sensing unit: used to acquire and sense the environment around the stereoscopic display unit, including the user's depth image information;
Depth image processing unit: used to analyze the scene depth information within the visual range acquired by the depth image sensing unit, and to identify the user object and the whole-body or partial limb motions obtained by the limb action recognition unit;
Limb action recognition unit: used to track and recognize user actions, and to provide a user input control interface for the three-dimensional user interface unit;
Three-dimensional user interface unit: used to send the graphical interface to the stereoscopic image display unit for presentation in the form of three-dimensional stereoscopic images, and, after receiving a user action from the limb action recognition unit, to feed the user's action back onto the stereoscopic image display unit;
Three-dimensional position indicator unit: used to identify the current user's specific location after receiving the scene depth information within the visual range from the depth image processing unit, and to give feedback on the user's control actions in the three-dimensional user interface;
The depth image sensing unit is connected to the depth image processing unit, the depth image processing unit is connected to the limb action recognition unit, the limb action recognition unit is connected to the three-dimensional user interface unit, and the three-dimensional user interface unit is connected to the stereoscopic image display unit and the three-dimensional position indicator unit respectively.
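The connection order above describes a one-way processing pipeline from sensor to display. The sketch below wires toy stand-ins for the units in exactly that order; all class and method names are hypothetical illustrations, not identifiers from the patent, and the "sensor" returns a hard-coded frame in place of real hardware I/O.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DepthFrame:
    """One frame of per-pixel depth values in metres, as produced by the sensing unit."""
    depths: List[List[float]]

class DepthImageSensingUnit:
    def capture(self) -> DepthFrame:
        # Stand-in for real sensor I/O: a flat scene at 3 m with the user's hand at 1.5 m.
        return DepthFrame(depths=[[3.0, 3.0], [1.5, 3.0]])

class DepthImageProcessingUnit:
    def nearest_point(self, frame: DepthFrame) -> float:
        # Crude "user segmentation": the closest depth value is taken as the hand.
        return min(d for row in frame.depths for d in row)

class ThreeDUserInterfaceUnit:
    def __init__(self) -> None:
        self.display_log: List[str] = []          # messages sent to the display unit
        self.cursor_depth: Optional[float] = None  # state of the 3D position indicator

    def on_user_action(self, hand_depth: float) -> None:
        self.cursor_depth = hand_depth
        self.display_log.append(f"render cursor at {hand_depth:.2f} m")

class LimbActionRecognitionUnit:
    def __init__(self, ui: ThreeDUserInterfaceUnit) -> None:
        self.ui = ui

    def process(self, hand_depth: float) -> None:
        # Forward the tracked hand to the 3D user interface unit.
        self.ui.on_user_action(hand_depth)

# Wire the units as the description connects them:
# sensor -> processor -> recognizer -> UI -> (display, indicator)
ui = ThreeDUserInterfaceUnit()
recognizer = LimbActionRecognitionUnit(ui)
sensor = DepthImageSensingUnit()
processor = DepthImageProcessingUnit()

frame = sensor.capture()
recognizer.process(processor.nearest_point(frame))
print(ui.cursor_depth)  # 1.5
```

The one-way chain means each unit only needs to know its immediate downstream neighbour, which matches the pairwise connections recited in the claim.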
Compared with the prior art, the present invention adopts a depth image sensing unit and a depth image processing unit oriented to the stereoscopic display system to sense all or part of the user's limb actions in three-dimensional space, while the three-dimensional user interface is used to interact with the system. This three-dimensional user interface is consistent with the display depth space of the stereoscopic display system, feedback on the user's operations is given through the three-dimensional position indicator, and the user's effective operating range is likewise consistent with the display depth space. A fully immersive human-machine interaction effect is thereby achieved, greatly improving the user experience.
Description of drawings
Fig. 1 is a structural block diagram of the present invention.
Fig. 2 is a flowchart of the present invention.
Embodiment
The present invention is described in further detail below in conjunction with the drawings and embodiments.
As shown in Figure 1, the display system oriented to three-dimensional stereoscopic images of the present invention comprises the following components:
Stereoscopic image display unit: used to receive the three-dimensional stereoscopic images and the three-dimensional graphical user interface sent by the three-dimensional user interface unit;
Depth image sensing unit: used to acquire and sense the environment around the stereoscopic display unit, including the user's depth image information;
Depth image processing unit: used to analyze the scene depth information within the visual range acquired by the depth image sensing unit, and to identify the user object and the whole-body or partial limb motions obtained by the limb action recognition unit;
Limb action recognition unit: used to track and recognize user actions, and to provide a user input control interface for the three-dimensional user interface unit;
Three-dimensional user interface unit: used to send the graphical interface to the stereoscopic image display unit for presentation in the form of three-dimensional stereoscopic images, and, after receiving a user action from the limb action recognition unit, to feed the user's action back onto the stereoscopic image display unit;
Three-dimensional position indicator unit: used to identify the current user's specific location after receiving the scene depth information within the visual range from the depth image processing unit, and to give feedback on the user's control actions in the three-dimensional user interface.
The depth image sensing unit is connected to the depth image processing unit, the depth image processing unit is connected to the limb action recognition unit, the limb action recognition unit is connected to the three-dimensional user interface unit, and the three-dimensional user interface unit is connected to the stereoscopic image display unit and the three-dimensional position indicator unit respectively.
The three-dimensional user interface unit of the present invention takes Microsoft's Windows operating system as a reference. The unit is graphical and may include interface elements such as windows, icons, files and a desktop. Under the stereoscopic display environment every interface element has a depth attribute, i.e. the distance of the interface object from the physical display screen, so the whole can be understood as a three-dimensional desktop in three-dimensional space. The user can rotate and zoom this desktop with gesture actions. All windows are suspended vertically in this space, and the depth difference between any two open windows is kept above a preset threshold. The user can grasp, move and drag a window of interest with gestures and make it the focus window. The spatial extent of this stereoscopic desktop is consistent with the effective range of the user's gestures: for example, when the user's hand extends to the far upper-right corner of the effective range (relative to the user), the three-dimensional position indicator also moves to the far upper-right corner of the desktop; likewise, when the hand retracts to the near lower-left corner of the effective range, the indicator moves to the near lower-left corner of the desktop.
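The corner-to-corner correspondence described above amounts to a per-axis linear mapping from the gesture-effective box to the desktop volume. A minimal sketch, with all ranges and function names chosen for illustration rather than taken from the patent:

```python
def map_hand_to_desktop(hand, hand_min, hand_max, desk_min, desk_max):
    """Linearly map a hand position inside the gesture-effective box to the
    corresponding point of the virtual 3D desktop, axis by axis (x, y, depth)."""
    out = []
    for h, h0, h1, d0, d1 in zip(hand, hand_min, hand_max, desk_min, desk_max):
        t = (h - h0) / (h1 - h0)   # normalised position within the hand range
        t = max(0.0, min(1.0, t))  # clamp so positions outside the box stay on the border
        out.append(d0 + t * (d1 - d0))
    return tuple(out)

# A hand extended to the far upper-right corner of its effective range
# lands the indicator at the far upper-right corner of the desktop:
pos = map_hand_to_desktop(
    hand=(0.6, 0.4, 3.5),
    hand_min=(-0.6, -0.4, 0.8), hand_max=(0.6, 0.4, 3.5),   # metres, assumed
    desk_min=(-1.0, -1.0, 0.0), desk_max=(1.0, 1.0, 1.0),   # desktop units, assumed
)
print(pos)  # (1.0, 1.0, 1.0)
```

Because both boxes share the same normalised coordinates, the desktop extent and the gesture range stay "consistent" in the sense the description requires.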
The three-dimensional position indicator unit of the present invention can take the form of a small stereoscopic hand in the three-dimensional graphical user interface, and changes form according to the user's gestures: the small hand is in a five-finger spread state when the user performs a push action, and in a clenched-fist grasping state when the user performs a drag action. When the user operates with one hand, a single small-hand indicator is shown; when the user operates with both hands, two small-hand indicators are shown. The physical position of the indicator in the interface is determined by the relative displacement of the user's hand, and the direction and speed of the indicator's movement are consistent with those of the hand.
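The indicator's form selection is a simple mapping from recognized gesture to cursor shape, with one cursor per tracked hand. A sketch under assumed gesture and form names (none of these strings come from the patent):

```python
def indicator_forms(gestures):
    """Return one indicator form per currently tracked hand:
    an open five-finger hand for a push, a clenched fist for a drag."""
    form_for_gesture = {"push": "open-hand", "drag": "fist"}
    # Unrecognized gestures fall back to the neutral open-hand form.
    return [form_for_gesture.get(g, "open-hand") for g in gestures]

print(indicator_forms(["push"]))          # one hand operating
print(indicator_forms(["drag", "push"]))  # two hands -> two indicators
```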
Under the action of the depth image sensing unit, the system can obtain the user's limb action information. Taking gestures as an example, the gesture-effective range is a specific region between the physical display screen and the user's physical position. Only after the user's hand enters this region can the system recognize the gesture; while the hand is outside the effective region, its movements are ignored by the system. The spatial extent of the gesture-effective range is consistent with that of the three-dimensional user interface, so the user's operations take effect directly in front of the stereoscopic display, achieving a fully immersive human-machine interaction effect.
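Gating gestures on the effective region reduces to a depth range check per frame. The bounds below borrow the 0.8 m to 3.5 m region quoted later in the embodiment; the function name is an illustration:

```python
def in_effective_region(hand_depth_m, near=0.8, far=3.5):
    """True while the hand lies inside the active slab between the display
    and the user (0.8-3.5 m in the described embodiment); gestures outside
    this slab are ignored by the system."""
    return near <= hand_depth_m <= far

print(in_effective_region(1.2))  # True: inside the slab, gesture is recognized
print(in_effective_region(0.5))  # False: too close to the screen, ignored
```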
As shown in Figure 2, the display system oriented to three-dimensional stereoscopic images of the present invention is realized through the following steps: one, the stereoscopic image display unit projects the virtual three-dimensional graphical user interface into physical space; two, after the interface has been projected, the depth image sensing unit acquires depth image information of the user's environment and sends it to the depth image processing unit; three, on receiving the depth image information, the depth image processing unit identifies the user object, tracks the user's hand motion, and feeds the hand motion information back to the limb action recognition unit; four, the limb action recognition unit judges whether the user's gesture is within the effective region: if so, it proceeds to the next step, otherwise it returns to the previous step; five, the limb action recognition unit recognizes the user's gesture and determines the corresponding command; six, the recognized command is sent to the three-dimensional user interface unit; seven, the three-dimensional user interface unit sends the information to the three-dimensional position indicator unit, which updates the indicator's position and state; eight, the three-dimensional user interface unit makes the corresponding feedback to the user's control command and presents it on the three-dimensional user interface.
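One pass of the loop above can be sketched as a single function: sense, gate on the effective region, recognize, then update the indicator and give feedback. The gesture classifier here is a deliberately trivial placeholder (depth threshold), since the patent does not specify the recognition algorithm:

```python
def interaction_step(hand):
    """One pass of the Figure 2 loop for a tracked hand given as (x, y, depth_m),
    or None when no hand is tracked. Returns a description of the outcome."""
    if hand is None:
        return "idle"                      # step three: no user hand tracked yet
    x, y, depth = hand
    if not (0.8 <= depth <= 3.5):          # step four: gate on the effective region
        return "ignored"
    # Step five: recognize the gesture. A real recognizer would classify the
    # hand trajectory; this toy version calls any close approach a "push".
    command = "push" if depth < 1.5 else "hover"
    # Steps six to eight: forward the command, update the indicator, feed back.
    return f"indicator updated, feedback: {command}"

print(interaction_step(None))            # idle
print(interaction_step((0.0, 0.0, 0.5))) # ignored
print(interaction_step((0.0, 0.0, 2.0))) # indicator updated, feedback: hover
```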
The limb action recognition unit of the present invention can recognize two-handed operation, with the following steps: one, one of the user's hands enters the gesture-effective region; two, the three-dimensional position indicator unit updates the position and form of the stereoscopic small hand; three, the user moves within the effective region and performs a push action while the indicator is within the bounds of a window; four, the limb action recognition unit identifies this effective gesture as a click command, and the three-dimensional user interface unit sets the window as the current focus window and places it in front of all other windows; five, the user extends the other hand into the effective region; six, the three-dimensional position indicator unit updates to show two stereoscopic small hands whose positions match those of the two hands; seven, within the bounds of the current focus window, the user moves both hands apart; eight, the limb action recognition unit identifies this effective gesture as a window-enlargement command, and the three-dimensional user interface enlarges the window in proportion to the distance by which the hands have moved apart.
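The proportional enlargement in step eight can be read as scaling the window by the ratio of the current two-hand separation to the separation when the gesture began. A sketch of that reading, with illustrative names and sizes:

```python
def zoom_factor(separation_start_m, separation_now_m):
    """Window scale tracks the ratio of the current two-hand separation to the
    separation at the start of the zoom gesture, so the window grows in the
    same proportion as the hands move apart (and shrinks as they close)."""
    return separation_now_m / separation_start_m

# Hands move from 0.2 m apart to 0.5 m apart: a 400x300 focus window
# is enlarged by the same factor of 2.5.
w, h = 400, 300
f = zoom_factor(0.2, 0.5)
print(round(w * f), round(h * f))  # 1000 750
```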
In one embodiment, the stereoscopic image display unit is a model LC42MS96PD stereoscopic display produced by KangJia Group Co., Ltd.; the depth image sensing unit is an image sensor produced by PrimeSense; the depth image processing unit is a model KK-Depther graphics processing unit produced by KangJia Group Co., Ltd.; the limb action recognition unit is a model KK-Montion limb action recognizer produced by KangJia Group Co., Ltd.; the three-dimensional user interface unit is a model KK-3DUI three-dimensional user interface unit produced by KangJia Group Co., Ltd.; and the three-dimensional position indicator unit is a model KK-3DCur three-dimensional position indicator produced by KangJia Group Co., Ltd. The effective action region lies between 0.8 meters and 3.5 meters.
Claims (1)
1. A display system oriented to three-dimensional stereoscopic images, characterized in that the display system comprises the following components:
Stereoscopic image display unit: used to receive the three-dimensional stereoscopic images and the three-dimensional graphical user interface sent by the three-dimensional user interface unit;
Depth image sensing unit: used to acquire and sense the environment around the stereoscopic display unit, including the user's depth image information;
Depth image processing unit: used to analyze the scene depth information within the visual range acquired by the depth image sensing unit, and to identify the user object and the whole-body or partial limb motions obtained by the limb action recognition unit;
Limb action recognition unit: used to track and recognize user actions, and to provide a user input control interface for the three-dimensional user interface unit;
Three-dimensional user interface unit: used to send the graphical interface to the stereoscopic image display unit for presentation in the form of three-dimensional stereoscopic images, and, after receiving a user action from the limb action recognition unit, to feed the user's action back onto the stereoscopic image display unit;
Three-dimensional position indicator unit: used to identify the current user's specific location after receiving the scene depth information within the visual range from the depth image processing unit, and to give feedback on the user's control actions in the three-dimensional user interface;
wherein the depth image sensing unit is connected to the depth image processing unit, the depth image processing unit is connected to the limb action recognition unit, the limb action recognition unit is connected to the three-dimensional user interface unit, and the three-dimensional user interface unit is connected to the stereoscopic image display unit and the three-dimensional position indicator unit respectively.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110171017.XA CN102253713B (en) | 2011-06-23 | 2011-06-23 | Towards 3 D stereoscopic image display system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102253713A true CN102253713A (en) | 2011-11-23 |
CN102253713B CN102253713B (en) | 2016-10-12 |
Family
ID=44981016
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201110171017.XA Active CN102253713B (en) | 2011-06-23 | 2011-06-23 | Towards 3 D stereoscopic image display system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102253713B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060031776A1 (en) * | 2004-08-03 | 2006-02-09 | Glein Christopher A | Multi-planar three-dimensional user interface |
CN101952818A (en) * | 2007-09-14 | 2011-01-19 | 智慧投资控股67有限责任公司 | Processing based on the user interactions of attitude |
CN101986255A (en) * | 2010-11-05 | 2011-03-16 | 福州瑞芯微电子有限公司 | Semitransparent gradual three-dimensional user interface |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106984041A (en) * | 2011-02-11 | 2017-07-28 | 漳州市爵晟电子科技有限公司 | A kind of human-computer interaction control system |
CN102789312A (en) * | 2011-12-23 | 2012-11-21 | 乾行讯科(北京)科技有限公司 | User interaction system and method |
CN102789312B (en) * | 2011-12-23 | 2016-03-23 | 苏州触达信息技术有限公司 | A kind of user interactive system and method |
CN102707878A (en) * | 2012-04-06 | 2012-10-03 | 深圳创维数字技术股份有限公司 | User interface operation control method and device |
CN102902355A (en) * | 2012-08-31 | 2013-01-30 | 中国科学院自动化研究所 | Space interaction method of mobile equipment |
CN102902355B (en) * | 2012-08-31 | 2015-12-02 | 中国科学院自动化研究所 | The space interaction method of mobile device |
TWI494792B (en) * | 2012-09-07 | 2015-08-01 | Pixart Imaging Inc | Gesture recognition system and method |
US9628698B2 (en) | 2012-09-07 | 2017-04-18 | Pixart Imaging Inc. | Gesture recognition system and gesture recognition method based on sharpness values |
CN103839040B (en) * | 2012-11-27 | 2017-08-25 | 株式会社理光 | Gesture identification method and device based on depth image |
CN103839040A (en) * | 2012-11-27 | 2014-06-04 | 株式会社理光 | Gesture identification method and device based on depth images |
CN103916660A (en) * | 2013-01-07 | 2014-07-09 | 义明科技股份有限公司 | 3D image sensing device and 3D image sensing method |
CN103916660B (en) * | 2013-01-07 | 2016-05-04 | 义明科技股份有限公司 | 3D image sensing device and 3D image sensing method |
CN103067727A (en) * | 2013-01-17 | 2013-04-24 | 乾行讯科(北京)科技有限公司 | Three-dimensional 3D glasses and three-dimensional 3D display system |
WO2014117675A1 (en) * | 2013-01-30 | 2014-08-07 | 联想(北京)有限公司 | Information processing method and electronic device |
CN103543830B (en) * | 2013-10-28 | 2017-02-15 | 四川大学 | Method for mapping human skeleton points to virtual three-dimensional space points in three-dimensional display |
CN103543830A (en) * | 2013-10-28 | 2014-01-29 | 四川大学 | Method for mapping human skeleton points to virtual three-dimensional space points in three-dimensional display |
WO2015062248A1 (en) * | 2013-10-31 | 2015-05-07 | 京东方科技集团股份有限公司 | Display device and control method therefor, and gesture recognition method |
CN103530060A (en) * | 2013-10-31 | 2014-01-22 | 京东方科技集团股份有限公司 | Display device and control method thereof and gesture recognition method |
CN103995620A (en) * | 2013-12-02 | 2014-08-20 | 深圳市云立方信息科技有限公司 | Air touch system |
CN104915979A (en) * | 2014-03-10 | 2015-09-16 | 苏州天魂网络科技有限公司 | System capable of realizing immersive virtual reality across mobile platforms |
CN105389146A (en) * | 2014-09-03 | 2016-03-09 | 三星电子株式会社 | Method for displaying images and electronic device thereof |
CN105511599A (en) * | 2014-09-29 | 2016-04-20 | 联想(北京)有限公司 | Method and device for information processing |
CN105511599B (en) * | 2014-09-29 | 2019-06-25 | 联想(北京)有限公司 | Information processing method and device |
CN104536575A (en) * | 2015-01-04 | 2015-04-22 | 苏州易乐展示系统工程有限公司 | Large screen interaction system realization method based on 3D sensing |
CN106681497A (en) * | 2016-12-07 | 2017-05-17 | 南京仁光电子科技有限公司 | Method and device based on somatosensory control application program |
CN106846564A (en) * | 2016-12-29 | 2017-06-13 | 湖南拓视觉信息技术有限公司 | A kind of intelligent access control system and control method |
CN108388351A (en) * | 2018-04-12 | 2018-08-10 | 深圳市正图科技有限公司 | A kind of mixed reality experiencing system |
CN108388351B (en) * | 2018-04-12 | 2024-03-12 | 深圳市正图科技有限公司 | Mixed reality experience system |
CN111045558A (en) * | 2018-10-12 | 2020-04-21 | 上海博泰悦臻电子设备制造有限公司 | Interface control method based on three-dimensional scene, vehicle-mounted equipment and vehicle |
CN112925430A (en) * | 2019-12-05 | 2021-06-08 | 北京芯海视界三维科技有限公司 | Method for realizing suspension touch control, 3D display equipment and 3D terminal |
CN111722769A (en) * | 2020-07-16 | 2020-09-29 | 腾讯科技(深圳)有限公司 | Interaction method, interaction device, display equipment and storage medium |
CN111722769B (en) * | 2020-07-16 | 2024-03-05 | 腾讯科技(深圳)有限公司 | Interaction method, interaction device, display equipment and storage medium |
CN117437701A (en) * | 2022-07-14 | 2024-01-23 | 腾讯科技(深圳)有限公司 | Palm placement area prompting method and device, storage medium and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN102253713B (en) | 2016-10-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102253713A (en) | Display system orienting to three-dimensional images | |
US12032746B2 (en) | Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments | |
US20220382379A1 (en) | Touch Free User Interface | |
Sharma et al. | Human computer interaction using hand gesture | |
Suarez et al. | Hand gesture recognition with depth images: A review | |
CN103809733B (en) | Man-machine interactive system and method | |
US20160098094A1 (en) | User interface enabled by 3d reversals | |
CN103999018B (en) | The user of response three-dimensional display object selects the method and system of posture | |
Kumar et al. | Hand data glove: A new generation real-time mouse for human-computer interaction | |
WO2016189390A2 (en) | Gesture control system and method for smart home | |
CN103246351A (en) | User interaction system and method | |
CN102934060A (en) | Virtual touch interface | |
CN107357428A (en) | Man-machine interaction method and device based on gesture identification, system | |
WO2008018943A1 (en) | Virtual controller for visual displays | |
CN103064514A (en) | Method for achieving space menu in immersive virtual reality system | |
CN103823548B (en) | Electronic equipment, wearable device, control system and method | |
CN106716331A (en) | Simulating real-time responsiveness for touch displays | |
CN114138106A (en) | Transitioning between states in a mixed virtual reality desktop computing environment | |
Liu | Design of human-computer interaction system based on virtual reality and its application in the dissemination of study lodge culture | |
WO2016102948A1 (en) | Coherent touchless interaction with stereoscopic 3d images | |
Dhamanskar et al. | Human computer interaction using hand gestures and voice | |
CN110766804B (en) | Method for cooperatively grabbing object by human and machine in VR scene | |
Khan et al. | Gesture recognition using Open-CV | |
Gope et al. | Interaction with Large Screen Display using Fingertip & Virtual Touch Screen | |
Zhang et al. | Free-hand gesture control with" touchable" virtual interface for human-3DTV interaction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |