
WO2017113674A1 - Method and system for implementing motion sensing control based on a smart device, and smart device - Google Patents

Method and system for implementing motion sensing control based on a smart device, and smart device

Info

Publication number
WO2017113674A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
motion trajectory
camera
image
contour
Prior art date
Application number
PCT/CN2016/088314
Other languages
English (en)
Chinese (zh)
Inventor
陈建如
Original Assignee
乐视控股(北京)有限公司
乐视移动智能信息技术(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司 and 乐视移动智能信息技术(北京)有限公司
Priority to JP2016570245A (publication JP2018507448A)
Priority to EP16763711.5A (publication EP3206188A4)
Priority to US15/243,966 (publication US20170193668A1)
Publication of WO2017113674A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/013 Force feedback applied to a game
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20112 Image segmentation details
    • G06T 2207/20116 Active contour; Active surface; Snakes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30196 Human being; Person

Definitions

  • the present invention relates to the field of computer vision technology, and in particular, to a method, system, and smart device for implementing somatosensory control based on an intelligent device.
  • somatosensory games are gradually entering people's lives.
  • somatosensory game consoles use a somatosensory camera to sense human body movements and operate games, such as the Xbox 360 somatosensory games produced by Microsoft Corporation.
  • the Kinect uses three somatosensory cameras to capture human body motions and convert them into operation commands that control the game, so that players get a better operating feel while gaming and also exercise as they play.
  • the technical problem to be solved by the present invention is that somatosensory cameras are expensive, which hinders the application of somatosensory technology in people's lives.
  • an embodiment of the present invention provides a method for implementing somatosensory control based on a smart device, the smart device having a camera, the method comprising: collecting user image data; obtaining an image contour of the user according to the image data; acquiring a first motion trajectory of the user on the imaging plane according to the image contour; acquiring a second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in a feature length on the image contour and/or a change in the focal length of the camera; and generating somatosensory data according to the first motion trajectory and the second motion trajectory.
  • the feature length comprises a hand contour length/width, a leg contour length/width or a head contour length/width.
  • the method further comprises: separating the user image from the foreground and the background.
  • preferably, between acquiring the second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in the feature length on the image contour and/or a change in the focal length of the camera, and generating the somatosensory data according to the first motion trajectory and the second motion trajectory, the method further includes: correcting the second motion trajectory according to the distance between each part of the user's body, as measured by a ranging module, and the camera.
  • the ranging module is an infrared ranging module or a laser ranging module.
  • An embodiment of the present invention further provides a system for implementing somatosensory control based on a smart device, the smart device having a camera, the system comprising: an acquisition unit for collecting user image data; an image contour acquisition unit for obtaining an image contour of the user according to the user image data; a first motion trajectory unit configured to acquire a first motion trajectory of the user on the imaging plane according to the image contour; a second motion trajectory unit configured to acquire a second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in the feature length on the image contour and/or a change in the focal length of the camera; and a somatosensory data unit configured to generate somatosensory data according to the first motion trajectory and the second motion trajectory.
  • preferably, the system further comprises: a separating unit, configured to separate the user image from the foreground and the background between the collecting unit collecting the user image data and the image contour acquisition unit acquiring the image contour of the user according to the user image data.
  • preferably, the system further comprises: a correction unit, configured to correct the second motion trajectory according to the distance between each part of the user's body, as measured by the ranging module, and the camera, between the second motion trajectory unit acquiring the second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in the feature length on the image contour and/or a change in the focal length of the camera, and the somatosensory data unit generating the somatosensory data according to the first motion trajectory and the second motion trajectory.
  • the embodiment of the present invention further provides a smart device, including: a camera for collecting user image data; and a processor for acquiring an image contour of the user according to the user image data, acquiring a first motion trajectory of the user on the imaging plane according to the image contour, acquiring a second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in a feature length on the image contour and/or a change in the focal length of the camera, and generating somatosensory data according to the first motion trajectory and the second motion trajectory.
  • preferably, the processor is further configured to receive the distance from each part of the user's body, as measured by an external ranging module, to the camera, and to correct the second motion trajectory according to the distance.
  • the embodiment of the present invention discloses a system for implementing a somatosensory control based on a smart device, the smart device having a camera, wherein the system comprises:
  • one or more processors; a memory;
  • one or more programs, the one or more programs being stored in the memory and, when executed by the one or more processors, performing the following operations: acquiring user image data; acquiring an image contour of the user according to the user image data; acquiring a first motion trajectory of the user on the imaging plane according to the image contour; acquiring a second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in a feature length on the image contour and/or a focal length change of the camera; and generating the somatosensory data according to the first motion trajectory and the second motion trajectory.
  • the system, wherein the user image is separated from the foreground and the background between acquiring the user image data and acquiring the image contour of the user according to the user image data.
  • preferably, between acquiring the second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in the feature length on the image contour and/or a change in the focal length of the camera, and generating the somatosensory data according to the first motion trajectory and the second motion trajectory, the second motion trajectory is corrected according to the distance between each part of the user's body measured by the ranging module and the camera.
  • the method, system, and smart device for implementing somatosensory control based on a smart device use only the camera on a smart device such as a smartphone to acquire user image data, and obtain from the image data the user's first motion trajectory on the imaging plane and second motion trajectory in the direction perpendicular to the imaging plane, thereby obtaining the user's motion trajectory in three-dimensional space and generating somatosensory data. This allows the user to experience somatosensory technology without additional equipment, which is beneficial to the popularization and application of somatosensory technology.
  • FIG. 1 is a schematic diagram of an application scenario for implementing somatosensory control based on a smart device according to an embodiment of the invention
  • FIG. 2 shows a flow chart of a method for implementing somatosensory control based on a smart device according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a system for implementing somatosensory control based on a smart device according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of a system for implementing somatosensory control based on a smart device with a processor according to an embodiment of the present invention
  • FIG. 5 is a schematic diagram of a system for implementing somatosensory control based on a smart device with two processors according to an embodiment of the present invention.
  • a method for implementing somatosensory control based on a smart device requires a smart device with a camera; the smart device can be a smartphone, tablet, laptop, etc.
  • the user needs to keep a certain distance from the camera of the smart device, so that the camera can collect image data of the whole body of the user.
  • some somatosensory controls only require hand motion control; in that case the camera only needs to capture image data of the user's hand.
  • an embodiment of the present invention provides a method for implementing a somatosensory control based on a smart device, where the smart device has a camera, and the method includes the following steps:
  • S1. Collect user image data. As shown in FIG. 1, the camera captures image data of the user on the imaging plane, i.e., the x-y plane.
  • this step is optional; any existing image separation method can be used to separate the user image from the foreground and the background, which reduces interference from foreground and background images and reduces the amount of computation in the processor's later processing.
  • the feature length may be the hand contour length/width, the leg contour length/width, the head contour length/width, etc. For example, when the hand contour length becomes longer or its width becomes wider, the hand can be judged to be moving toward the camera; when the hand contour length becomes shorter or its width becomes narrower, the hand can be judged to be moving away from the camera. In this way, the movement of each part of the body in the z direction can be judged.
  • the camera constantly changes its focal length while capturing the user image in order to keep the image in focus.
  • from the focal length change of the camera, it can be judged whether the user is moving toward or away from the camera, and thus the user's movement trajectory in the direction perpendicular to the imaging plane can be judged.
  • a comprehensive judgment can be made from the two cues to obtain a more accurate result.
  • S6. Generate somatosensory data according to the first motion trajectory and the second motion trajectory.
  • combining the first motion trajectory on the imaging plane with the second motion trajectory in the direction perpendicular to the imaging plane yields the user's motion trajectory in three-dimensional space, from which the somatosensory data can be obtained; the somatosensory data is then input to the somatosensory application.
  • the method for implementing somatosensory control based on a smart device uses only the camera on a smart device such as a smartphone to acquire user image data, and obtains from the image data the user's first motion trajectory on the imaging plane and second motion trajectory in the direction perpendicular to the imaging plane, thereby obtaining the user's motion trajectory in three-dimensional space to generate the somatosensory data; the user can thus experience somatosensory technology without additional equipment, which is beneficial to the popularization and application of somatosensory technology.
  • the ranging module may be an infrared ranging module or a laser ranging module, and the ranging module may be wired or wireless.
  • the module is connected to a smart device such as a smartphone to transmit the measured distance to the smart device; the smart device acquires the distance from each part of the user's body, as measured by the ranging module, to the camera, corrects the second motion trajectory according to the obtained distance, and finally generates more accurate somatosensory data according to the first motion trajectory and the corrected second motion trajectory.
  • the embodiment of the invention further provides a system for implementing somatosensory control based on a smart device, the smart device having a camera, the system comprising:
  • the collecting unit 1 is configured to collect user image data
  • An image contour acquiring unit 3 configured to acquire an image contour of the user according to the user image data
  • a first motion trajectory unit 4, configured to acquire a first motion trajectory of the user on the imaging plane according to the image contour;
  • the feature length includes a hand contour length/width, a leg contour length/width, or a head contour length/width;
  • the somatosensory data unit 7 is configured to generate somatosensory data according to the first motion trajectory and the second motion trajectory.
  • the system for implementing somatosensory control based on a smart device uses only the camera on a smart device such as a smartphone to acquire user image data, and obtains from the image data the user's first motion trajectory on the imaging plane and second motion trajectory in the direction perpendicular to the imaging plane, thereby obtaining the user's motion trajectory in three-dimensional space to generate the somatosensory data; the user can thus experience somatosensory technology without additional equipment, which is beneficial to the popularization and application of somatosensory technology.
  • preferably, the system for implementing somatosensory control based on the smart device further includes: a separating unit 2, configured to separate the user image from the foreground and the background between the collecting unit 1 collecting the user image data and the image contour acquiring unit 3 acquiring the image contour of the user according to the user image data.
  • preferably, the system for implementing somatosensory control based on the smart device further comprises: a correction unit 6, configured to correct the second motion trajectory according to the distance from each part of the user's body, as measured by the ranging module, to the camera, between the second motion trajectory unit 5 acquiring the second motion trajectory of the user in the direction perpendicular to the imaging plane according to the change in the feature length on the image contour and/or the focal length change of the camera, and the somatosensory data unit 7 generating the somatosensory data according to the first motion trajectory and the second motion trajectory.
  • the ranging module is an infrared ranging module or a laser ranging module.
  • the embodiment of the present invention further provides a smart device, which may be a smart phone, a tablet computer, a notebook computer, etc., and includes:
  • a processor, configured to acquire an image contour of the user according to the user image data, acquire a first motion trajectory of the user on the imaging plane according to the image contour, acquire a second motion trajectory of the user in a direction perpendicular to the imaging plane according to the change in the feature length on the image contour and/or the change in the focal length of the camera, and generate somatosensory data from the first motion trajectory and the second motion trajectory.
  • the smart device of the embodiment of the present invention can obtain the first motion trajectory of the user on the imaging plane and the second motion trajectory in the direction perpendicular to the imaging plane, thereby obtaining the user's motion trajectory in three-dimensional space and generating somatosensory data; users can thus experience somatosensory technology without additional equipment, which is conducive to the promotion and application of somatosensory technology.
  • preferably, the processor is further configured to receive the distance from each part of the user's body, as measured by the external ranging module, to the camera, and to correct the second motion trajectory according to the distance.
  • the embodiment discloses a system for implementing somatosensory control based on a smart device, wherein the smart device has a camera, and the system includes: one or more processors 200; a memory 100; and one or more programs, the one or more programs being stored in the memory 100 and, when executed by the one or more processors 200, performing the following operations: acquiring user image data; acquiring an image contour of the user according to the user image data; acquiring a first motion trajectory of the user on the imaging plane according to the image contour; acquiring a second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in the feature length on the image contour and/or a focal length change of the camera; and generating somatosensory data according to the first motion trajectory and the second motion trajectory.
  • as shown in FIG. 4, one processor 200 may be included, and as shown in FIG. 5, two processors 200 may be included.
  • preferably, the user image is separated from the foreground and the background between acquiring the user image data and acquiring the image contour of the user according to the user image data.
  • the system of the present embodiment preferably corrects the second motion trajectory according to the distance between each part of the user's body measured by the ranging module and the camera, between acquiring the second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in the feature length on the image contour and/or a change in the focal length of the camera, and generating the somatosensory data according to the first motion trajectory and the second motion trajectory.
  • embodiments of the invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or a combination of software and hardware. Moreover, the invention can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.
  • the computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising an instruction device; the instruction device implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • these computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
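The optional foreground/background separation step described in this section can be illustrated with a minimal frame-differencing sketch. This is an assumption-laden illustration, not the method claimed by the patent (which permits any existing image separation method); the function name, the list-of-lists frame representation, and the threshold are all hypothetical:

```python
# Hypothetical sketch of the optional user/background separation step.
# Frames are modeled as 2-D lists of grayscale values (0-255); a real
# implementation would operate on camera frames via a vision library.

def separate_user(frame, background, threshold=30):
    """Return a binary mask: 1 where a pixel differs from the stored
    background by more than the threshold (i.e. likely part of the user)."""
    return [
        [1 if abs(p - b) > threshold else 0 for p, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

background = [[10, 10, 10], [10, 10, 10]]
frame = [[10, 200, 10], [10, 210, 10]]  # the user occupies the middle column
mask = separate_user(frame, background)
print(mask)  # [[0, 1, 0], [0, 1, 0]]
```

Masking out non-user pixels before contour extraction reduces the foreground/background interference and the post-processing cost noted in the description.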
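The depth cue based on feature-length change can be sketched with the standard pinhole relation Z = f·W/w: for a feature of fixed real width W and a focal length f expressed in pixels, a wider imaged contour implies a smaller distance Z, i.e. motion toward the camera. The numeric constants below are illustrative assumptions, not values from the patent:

```python
def depth_from_contour_width(pixel_width, focal_px=800.0, real_width_m=0.09):
    """Pinhole-camera estimate of distance: Z = f * W / w.
    focal_px and real_width_m (an assumed hand width) are illustrative."""
    return focal_px * real_width_m / pixel_width

z_far = depth_from_contour_width(60.0)   # narrower hand contour -> farther
z_near = depth_from_contour_width(90.0)  # wider hand contour   -> nearer
print(z_far, z_near)  # 1.2 0.8
assert z_near < z_far  # a widening contour means motion toward the camera
```

The same relation applies to the leg or head contour; tracking Z per frame yields the second motion trajectory described above.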
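The second depth cue, the camera's autofocus adjustments, can be sketched as a per-frame direction classifier, assuming (as the text does) that the focus setting tracks the user's distance. The function name and labels are assumptions for illustration:

```python
def z_direction_from_focus(focus_values):
    """Classify each frame-to-frame focus change: a decreasing focus
    distance is read as motion toward the camera, an increase as motion
    away, and no change as standing still."""
    labels = []
    for prev, cur in zip(focus_values, focus_values[1:]):
        if cur < prev:
            labels.append("toward")
        elif cur > prev:
            labels.append("away")
        else:
            labels.append("still")
    return labels

print(z_direction_from_focus([5.0, 4.0, 4.0, 6.0]))  # ['toward', 'still', 'away']
```

As the description notes, this cue can be combined with the feature-length cue to make a more accurate comprehensive judgment.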
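Step S6, combining the in-plane trajectory with the perpendicular one, amounts to zipping the two tracks into 3-D points; a simple stand-in for the somatosensory data is the sequence of per-frame displacement vectors. Both function names and the displacement representation are hypothetical, since the patent does not define the somatosensory data format:

```python
def compose_trajectory(first_xy, second_z):
    """Merge the first motion trajectory (x, y per frame) with the second
    motion trajectory (z per frame) into one 3-D trajectory."""
    assert len(first_xy) == len(second_z)
    return [(x, y, z) for (x, y), z in zip(first_xy, second_z)]

def to_somatosensory_data(traj3d):
    """Illustrative 'somatosensory data': per-frame 3-D displacement vectors."""
    return [tuple(b - a for a, b in zip(p, q)) for p, q in zip(traj3d, traj3d[1:])]

traj = compose_trajectory([(0, 0), (1, 0), (2, 1)], [2.0, 1.5, 1.0])
print(traj)                         # [(0, 0, 2.0), (1, 0, 1.5), (2, 1, 1.0)]
print(to_somatosensory_data(traj))  # [(1, 0, -0.5), (1, 1, -0.5)]
```

The resulting vectors could then be mapped to operation commands by a somatosensory application, as the description suggests.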
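The optional ranging-module correction can be sketched as blending the camera-based z estimate with the measured distance for frames where a measurement exists. The weighting scheme is an assumption, since the patent does not specify how the correction is computed:

```python
def correct_z(estimated_z, measured_z, weight=0.7):
    """Correct the second motion trajectory: where the ranging module
    supplies a distance (not None), pull the estimate toward it with the
    given weight; otherwise keep the camera-based estimate."""
    return [
        e if m is None else weight * m + (1.0 - weight) * e
        for e, m in zip(estimated_z, measured_z)
    ]

corrected = correct_z([1.0, 1.2, 1.4], [None, 1.0, None])
print(corrected)  # roughly [1.0, 1.06, 1.4]
```

An infrared or laser ranging module, wired or wireless, would supply the `measured_z` values, and the corrected trajectory then feeds the somatosensory data generation step.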

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to a method and system for implementing motion sensing control based on a smart device, and a smart device. The method comprises: collecting user image data; acquiring an image contour of a user according to the user image data; acquiring a first motion trajectory of the user on an imaging plane according to the image contour; acquiring a second motion trajectory of the user in a direction perpendicular to the imaging plane according to a change in a feature length of the image contour and/or a change in the focal length of a camera; and generating motion sensing data according to the first motion trajectory and the second motion trajectory.
PCT/CN2016/088314 2015-12-31 2016-07-04 Method and system for implementing motion sensing control based on a smart device, and smart device WO2017113674A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2016570245A JP2018507448A (ja) 2015-12-31 2016-07-04 Method, system and smart device for realizing somatosensory control based on a smart device
EP16763711.5A EP3206188A4 (fr) 2015-12-31 2016-07-04 Method and system for implementing motion sensing control based on a smart device, and smart device
US15/243,966 US20170193668A1 (en) 2015-12-31 2016-08-23 Intelligent Equipment-Based Motion Sensing Control Method, Electronic Device and Intelligent Equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201511034014.6A CN105894533A (zh) 2015-12-31 2015-12-31 Method, system and smart device for implementing somatosensory control based on a smart device
CN201511034014.6 2015-12-31

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/243,966 Continuation US20170193668A1 (en) 2015-12-31 2016-08-23 Intelligent Equipment-Based Motion Sensing Control Method, Electronic Device and Intelligent Equipment

Publications (1)

Publication Number Publication Date
WO2017113674A1 true WO2017113674A1 (fr) 2017-07-06

Family

ID=57002309

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/088314 WO2017113674A1 (fr) 2015-12-31 2016-07-04 Method and system for implementing motion sensing control based on a smart device, and smart device

Country Status (5)

Country Link
US (1) US20170193668A1 (fr)
EP (1) EP3206188A4 (fr)
JP (1) JP2018507448A (fr)
CN (1) CN105894533A (fr)
WO (1) WO2017113674A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106547357B (zh) * 2016-11-22 2018-06-29 包磊 体感传感数据的通信处理方法及装置
CN107590823B (zh) * 2017-07-21 2021-02-23 昆山国显光电有限公司 三维形态的捕捉方法和装置
CN109064776A (zh) * 2018-09-26 2018-12-21 广东省交通规划设计研究院股份有限公司 预警方法、系统、计算机设备和存储介质
KR20210099988A (ko) 2020-02-05 2021-08-13 삼성전자주식회사 뉴럴 네트워크의 메타 학습 방법 및 장치와 뉴럴 네트워크의 클래스 벡터 학습 방법 및 장치

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102074018A (zh) * 2010-12-22 2011-05-25 Tcl集团股份有限公司 一种基于深度信息的轮廓跟踪方法
CN102226880A (zh) * 2011-06-03 2011-10-26 北京新岸线网络技术有限公司 一种基于虚拟现实的体感操作方法及系统
CN102350057A (zh) * 2011-10-21 2012-02-15 上海魔迅信息科技有限公司 基于电视机顶盒实现体感游戏操控的系统及方法
WO2012128399A1 (fr) * 2011-03-21 2012-09-27 Lg Electronics Inc. Dispositif d'affichage et procédé de commande associé
CN103345301A (zh) * 2013-06-18 2013-10-09 华为技术有限公司 一种深度信息获取方法和装置
CN103679124A (zh) * 2012-09-17 2014-03-26 原相科技股份有限公司 手势识别系统及方法
CN105138111A (zh) * 2015-07-09 2015-12-09 中山大学 一种基于单摄像头的体感交互方法及系统

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6374225B1 (en) * 1998-10-09 2002-04-16 Enounce, Incorporated Method and apparatus to prepare listener-interest-filtered works
JP2002041038A (ja) * 2000-07-31 2002-02-08 Taito Corp 仮想楽器演奏装置
JP2006107060A (ja) * 2004-10-04 2006-04-20 Sharp Corp 入退室検知装置
US11325029B2 (en) * 2007-09-14 2022-05-10 National Institute Of Advanced Industrial Science And Technology Virtual reality environment generating apparatus and controller apparatus
JP5520463B2 (ja) * 2008-09-04 2014-06-11 株式会社ソニー・コンピュータエンタテインメント 画像処理装置、対象物追跡装置および画像処理方法
US8964298B2 (en) * 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
US8576253B2 (en) * 2010-04-27 2013-11-05 Microsoft Corporation Grasp simulation of a virtual object
JP4650961B2 (ja) * 2010-04-29 2011-03-16 株式会社バンダイナムコゲームス ゲーム装置
JP5438601B2 (ja) * 2010-06-15 2014-03-12 日本放送協会 人物動作判定装置およびそのプログラム
US8475367B1 (en) * 2011-01-09 2013-07-02 Fitbit, Inc. Biometric monitoring device having a body weight sensor, and methods of operating same
US9734304B2 (en) * 2011-12-02 2017-08-15 Lumiradx Uk Ltd Versatile sensors with data fusion functionality
CN103577793B (zh) * 2012-07-27 2017-04-05 中兴通讯股份有限公司 手势识别方法及装置
CA2825635A1 (fr) * 2012-08-28 2014-02-28 Solink Corporation Systeme de verification de transaction
US9489743B2 (en) * 2013-03-13 2016-11-08 Mecommerce, Inc. Determining dimension of target object in an image using reference object
AU2014308590B2 (en) * 2013-08-22 2016-04-28 Bespoke, Inc. Method and system to create custom products
US20150058427A1 (en) * 2013-08-23 2015-02-26 Jean Rene' Grignon Limited Area Temporary Instantaneous Network
KR102233728B1 (ko) * 2013-10-31 2021-03-30 삼성전자주식회사 전자 장치의 제어 방법, 장치 및 컴퓨터 판독 가능한 기록 매체
JP2017505553A (ja) * 2013-11-29 2017-02-16 インテル・コーポレーション 顔検出によるカメラ制御
JP2015158745A (ja) * 2014-02-21 2015-09-03 日本電信電話株式会社 行動識別器生成装置、行動認識装置及びプログラム
US9916010B2 (en) * 2014-05-16 2018-03-13 Visa International Service Association Gesture recognition cloud command platform, system, method, and apparatus
US9922236B2 (en) * 2014-09-17 2018-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable eyeglasses for providing social and environmental awareness
US10048835B2 (en) * 2014-10-31 2018-08-14 Microsoft Technology Licensing, Llc User interface functionality for facilitating interaction between users and their environments
US10213688B2 (en) * 2015-08-26 2019-02-26 Warner Bros. Entertainment, Inc. Social and procedural effects for computer-generated environments

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102074018A (zh) * 2010-12-22 2011-05-25 Tcl集团股份有限公司 一种基于深度信息的轮廓跟踪方法
WO2012128399A1 (fr) * 2011-03-21 2012-09-27 Lg Electronics Inc. Dispositif d'affichage et procédé de commande associé
CN102226880A (zh) * 2011-06-03 2011-10-26 北京新岸线网络技术有限公司 一种基于虚拟现实的体感操作方法及系统
CN102350057A (zh) * 2011-10-21 2012-02-15 上海魔迅信息科技有限公司 基于电视机顶盒实现体感游戏操控的系统及方法
CN103679124A (zh) * 2012-09-17 2014-03-26 原相科技股份有限公司 手势识别系统及方法
CN103345301A (zh) * 2013-06-18 2013-10-09 华为技术有限公司 一种深度信息获取方法和装置
CN105138111A (zh) * 2015-07-09 2015-12-09 中山大学 一种基于单摄像头的体感交互方法及系统

Also Published As

Publication number Publication date
US20170193668A1 (en) 2017-07-06
EP3206188A1 (fr) 2017-08-16
CN105894533A (zh) 2016-08-24
JP2018507448A (ja) 2018-03-15
EP3206188A4 (fr) 2017-08-16

Similar Documents

Publication Publication Date Title
US10674142B2 (en) Optimized object scanning using sensor fusion
JP7457082B2 (ja) 反応型映像生成方法及び生成プログラム
WO2019120032A1 (fr) Procédé de construction de modèle, procédé de photographie, dispositif, support d'informations et terminal
US10659769B2 (en) Image processing apparatus, image processing method, and storage medium
US10755438B2 (en) Robust head pose estimation with a depth camera
CN103310186B (zh) 校正图像中用户的注视方向的方法和便携式终端
US10469829B2 (en) Information processor and information processing method
US20170345183A1 (en) Robust Head Pose Estimation with a Depth Camera
US20160048964A1 (en) Scene analysis for improved eye tracking
KR101718837B1 (ko) 응용프로그램의 제어방법, 장치 및 전자장비
KR20170031733A (ko) 디스플레이를 위한 캡처된 이미지의 시각을 조정하는 기술들
US20170316582A1 (en) Robust Head Pose Estimation with a Depth Camera
JP2015526927A (ja) カメラ・パラメータのコンテキスト駆動型調整
US20140177926A1 (en) Information notification apparatus that notifies information of motion of a subject
WO2017113674A1 (fr) Procédé et système pour réaliser une commande de détection de mouvement sur la base d'un dispositif intelligent, et dispositif intelligent
US20150109528A1 (en) Apparatus and method for providing motion haptic effect using video analysis
WO2022174594A1 (fr) Procédé et système de suivi et d'affichage de main nue basés sur plusieurs caméras, et appareil
US20100145232A1 (en) Methods and apparatuses for correcting sport postures captured by a digital image processing apparatus
US20150379333A1 (en) Three-Dimensional Motion Analysis System
US20170140215A1 (en) Gesture recognition method and virtual reality display output device
US20210133985A1 (en) Method, system, and computer-accessible recording medium for motion recognition based on an atomic pose
CN108885496B (zh) 信息处理装置、信息处理方法和程序
US10291845B2 (en) Method, apparatus, and computer program product for personalized depth of field omnidirectional video
KR102147930B1 (ko) 포즈 인식 방법 및 장치
KR101414362B1 (ko) 영상인지 기반 공간 베젤 인터페이스 방법 및 장치

Legal Events

Date Code Title Description
REEP Request for entry into the european phase

Ref document number: 2016763711

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2016763711

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2016570245

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE