CN110262660A - Virtual throwing and catching system based on Kinect somatosensory device - Google Patents
Virtual throwing and catching system based on Kinect somatosensory device
- Publication number
- CN110262660A (application CN201910534256.3A)
- Authority
- CN
- China
- Prior art keywords
- ball
- throwing
- drop point
- wooden box
- catching
- Prior art date
- Legal status (assumed by Google; not a legal conclusion)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses a virtual throwing and catching system based on a Kinect somatosensory device. In a scene simulated in Unity3D, a virtual digital skeleton and a 3D model of an open wooden box are deployed; after the user completes a throwing motion, a ball is instantiated and performs the parabolic motion triggered by the switch, the catching drop point is calculated, and the box is moved to the drop point to catch the ball. The triggering of the throwing switch relies on multiple trigger bits; when they are satisfied, the throwing function is triggered, an initial velocity, direction and initial position are assigned to the ball, the ball is instantiated, and the physics engine of Unity3D simulates its trajectory. The catching drop point is calculated from the velocity components of the ball along the x, y and z axes and the height of the throw point, yielding the fall time and the drop point of the ball. For the movement of the box, the box is moved to the drop point of the ball according to the difference between the box's current position and the drop point and waits for the ball to fall into it. The invention realizes the virtual effect of throwing and catching.
Description
Technical field
The present invention relates to somatosensory devices, and more particularly to a virtual throwing and catching system based on a Kinect somatosensory device.
Background art
Contemporary computer technology provides us with many convenient input modes, including voice input and image/video input; the somatosensory input built on top of these is, together with them, referred to as a natural user interface.
The working principle of Microsoft's Kinect somatosensory device is as follows: on the basis of image recognition, algorithms analyze the video information, decomposing the video into individual images, while an infrared light-coding technique is used to mine depth data from the video information, from which the motion information of the human body can be obtained. Kinect captures the limb information of the user standing in front of its camera, condenses it into the three-dimensional spatial coordinates of 20 skeleton nodes, updates these coordinates in real time as the body moves, and supplies the data through an interface to the computer that performs the motion processing and response. Although Kinect has a high-performance processor that can process the captured video information at high speed, as an auxiliary device it can only capture human skeleton node information and cannot process that information further. To recognize a motion, the human-body information captured by Kinect must be collected and judged on another computer.
Kinect V1.0 can track the key skeleton nodes of one or at most two users within the receiving range of its camera, tracking at most 20 skeleton nodes of the human body. Because Kinect uses "light coding" technology, in which the infrared rays emitted by the infrared transmitter are spatially encoded, the measured result is not affected by ambient light; and because infrared light is invisible, it causes no disturbance to the user. Depth-data mining of this kind achieves a three-dimensional, digitized description of the environment, i.e. Kinect can resolve the distance of the position represented by each pixel within its imaging range.
However, as an auxiliary device, Kinect can only capture human skeleton node information and cannot process it further. To recognize a motion, the human-body information captured by Kinect must be collected and judged on a computer. In addition, as an auxiliary device Kinect only provides real-time right-hand skeleton node information; the triggering of the throw, the calculation of the catching drop point and the movement of the box require further software and hardware design.
Summary of the invention
1. Object of the invention.
To solve the problem of virtual throwing and catching, the present invention proposes a virtual throwing and catching system based on a Kinect somatosensory device.
2. Technical solution adopted by the present invention.
The invention discloses a virtual throwing and catching system based on a Kinect somatosensory device. In a scene simulated in Unity3D, a virtual digital skeleton and a 3D model of an open wooden box are deployed; after the user completes a throwing motion, a ball is instantiated and performs the parabolic motion triggered by the switch, the catching drop point is calculated, and the box is moved to the drop point to catch the ball.
The triggering of the throwing switch relies on multiple trigger bits, i.e. bool values; when all trigger bits meet their conditions, i.e. all bool values are true, the throwing function is triggered, an initial velocity, direction and initial position are assigned to the ball, the ball is instantiated, and the physics engine of Unity3D simulates its trajectory.
The catching drop point is calculated from the velocity components of the ball along the x, y and z axes and the height of the throw point, yielding the fall time and the drop point of the ball.
For the movement of the box, the box is moved to the drop point of the ball according to the difference between the box's current position and the drop point and waits for the ball to fall into it.
After the user has thrown the object, the switch closes, i.e. its value becomes false; the user needs to press the space bar to reopen the switch and set the throwSwitch value back to true before the next throw can be made.
Further, the triggering of the throwing switch is specifically as follows:
Step 1: obtain the three-dimensional spatial coordinates of the right-hand skeleton node; since these coordinates change N frames per second, the real-time velocity components and acceleration components of the right hand along the three axes can be calculated.
Step 2: taking 1/N second as the unit time, a counter variable is incremented frame by frame and, each time it reaches the set count, it is reset and one velocity and one acceleration are calculated, achieving the effect of N calculations per second; the velocity and acceleration values are stored as float.
Step 3: the velocity components along the x, y and z axes and the resultant speed are calculated as:
Vx = dx/t; Vy = dy/t; Vz = dz/t; V = (Vx^2 + Vy^2 + Vz^2)^0.5
where dx, dy, dz are the coordinate differences along the three axes within the unit time. Velocity is a vector; its direction in space is obtained from the velocity components along the three axes.
Step 4: from the velocity components, the accelerations of the right hand along the three axes and the resultant acceleration are obtained as:
Ax = dVx/t; Ay = dVy/t; Az = dVz/t; A = (Ax^2 + Ay^2 + Az^2)^0.5
where the unit time t is likewise 1/N second and dVx, dVy, dVz are the velocity differences along the three axes within the unit time.
Further, in the triggering of the throw, a right-hand speed exceeding 7 m/s together with an acceleration exceeding 70 m/s² serves as the switch that triggers the throwing function.
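For illustration only, a minimal C# sketch of the above steps and of the threshold test is given below; the class and member names (ThrowTrigger, Update, Velocity, Acceleration) are illustrative and are not taken from the patent.

```csharp
using UnityEngine;

// Illustrative helper (names are not from the patent): keeps the previous right-hand
// sample and derives the per-axis velocity, the per-axis acceleration and the
// throw-trigger decision described in steps 1-4 above.
public class ThrowTrigger
{
    private Vector3 lastPos;   // right-hand position at the previous sample
    private Vector3 lastVel;   // velocity computed at the previous sample
    private bool hasSample;    // false until the first position has been stored

    public Vector3 Velocity { get; private set; }
    public Vector3 Acceleration { get; private set; }

    // Feed one right-hand position sampled every t = 1/N seconds.
    // Returns true when the speed/acceleration thresholds of the patent are exceeded.
    // (The very first acceleration value is inaccurate because lastVel starts at zero.)
    public bool Update(Vector3 handPos, float t)
    {
        if (!hasSample)
        {
            lastPos = handPos;
            hasSample = true;
            return false;
        }

        Velocity = (handPos - lastPos) / t;        // Vx = dx/t, Vy = dy/t, Vz = dz/t
        Acceleration = (Velocity - lastVel) / t;   // Ax = dVx/t, Ay = dVy/t, Az = dVz/t

        lastPos = handPos;
        lastVel = Velocity;

        float speed = Velocity.magnitude;          // V = (Vx^2 + Vy^2 + Vz^2)^0.5
        float accel = Acceleration.magnitude;      // A = (Ax^2 + Ay^2 + Az^2)^0.5

        // Thresholds from the patent: speed above 7 m/s and acceleration above 70 m/s^2.
        return speed > 7f && accel > 70f;
    }
}
```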
Further, the catching drop point is determined as follows: at the moment the ball leaves the hand, the velocity components of the ball along the x, y and z axes are recorded together with the height of the release moment, and the theoretical drop point of the ball, i.e. the distance the ball moves along the x axis and the z axis, can be calculated with the physical equations. In the formulas below, Vx, Vy, Vz are the velocity components of the ball along the three axes at the moment of release, h is the height of the ball at the moment of release, g is the acceleration of gravity, and in the Unity3D environment the y axis is the vertical axis:
Sx = (Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5) * Vx, the moving distance along the x axis;
Sz = (Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5) * Vz, the moving distance along the z axis;
the total time of the ball from release to landing is T = Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5.
Once the total time from the moment of release to landing is obtained, the drop point of the ball, i.e. the landing coordinates, can be calculated.
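A minimal C# sketch of this calculation is shown below, assuming the release velocity (with Vy positive upward), the release position (with y = h above the ground plane y = 0) and the gravity magnitude are already known; the DropPoint and Compute names are illustrative, not from the patent.

```csharp
using UnityEngine;

// Illustrative helper implementing the drop-point formulas above; not code from the patent.
public static class DropPoint
{
    // velocity: (Vx, Vy, Vz) at the moment of release (Vy positive means upward);
    // release: position of the ball at release (release.y = h); g: gravity magnitude, e.g. 9.81f.
    // Returns the landing position on the ground plane y = 0.
    public static Vector3 Compute(Vector3 velocity, Vector3 release, float g)
    {
        float h = release.y;
        // T = Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5, the total time from release to landing
        float T = velocity.y / g + Mathf.Sqrt(velocity.y * velocity.y + 2f * g * h) / g;
        float sx = velocity.x * T;   // Sx = Vx * T
        float sz = velocity.z * T;   // Sz = Vz * T
        return new Vector3(release.x + sx, 0f, release.z + sz);   // (x', 0, z')
    }
}
```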
Further, the movement of the box:
Sx = Vx*T; Sz = Vz*T
are the moving distances of the ball in the x-z plane; air resistance is not considered, so the initial velocities of the ball along the x and z axes at the moment of release remain constant, where Vx is the initial velocity of the ball along the x axis, Vz is the initial velocity of the ball along the z axis, and T is the total time required for the fall.
The drop point (x', y', z') of the ball is the release position (x, y, z) of the ball plus the offset, calculated as:
x' = x + Sx; z' = z + Sz; y' = 0.
Finally, the difference between the current coordinates of the box and the landing coordinates of the ball is calculated, the box is moved to the ball's drop point by a function, and it waits there for the ball to fall onto the drop point, so that catching by translation of the box is realized, where (x1, y1, z1) is the three-dimensional coordinate of the box's original position and (x', y', z') is the three-dimensional coordinate of the drop point:
x1 = x'; z1 = z'; y1 = y' = 0.
Since the box moves only in the x-z plane, its y coordinate remains unchanged.
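A minimal sketch of this box translation, assuming the drop point has already been computed; the BoxMover name is illustrative.

```csharp
using UnityEngine;

// Illustrative component: snaps the box to the computed drop point in the x-z plane.
public class BoxMover : MonoBehaviour
{
    public void MoveToDropPoint(Vector3 dropPoint)
    {
        // x1 = x', z1 = z'; y is unchanged because the box moves only in the x-z plane.
        transform.position = new Vector3(dropPoint.x, transform.position.y, dropPoint.z);
    }
}
```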
Further, a panel component is added in Unity3D to display the video image information captured by the Kinect camera; a C# script is bound to the panel component, the control statements for calling the Kinect device are written in the script, and the color video image obtained from Kinect is printed to the panel in real time; after Kinect obtains the three-dimensional coordinate information of the human skeleton nodes, the rainbow skeleton in Unity3D is controlled to make limb actions consistent with the captured video image information.
3. Technical effects produced by the present invention.
(1) The present invention overcomes the shortcomings of the traditional Kinect somatosensory device: using the real-time right-hand skeleton node information provided by the Kinect somatosensory device, it defines the throw trigger, calculates the drop point of the ball thrown by the right hand, and moves the box to catch the ball.
(2) The present invention can be applied in many fields, for example letting a waving motion page through a PowerPoint presentation.
(3) The present invention uses a Kinect somatosensory device combined with Unity3D for development; Kinect collects the limb information of the user throwing the ball, the throw drop point and the catching movement of the box are calculated, and the information is passed to Unity3D, which controls the open-source skeleton model used in the invention through scripts, realizing the virtual effect of throwing and catching.
Description of the drawings
Fig. 1 is a schematic diagram of the speed in the ordinary (non-throwing) state.
Fig. 2 is a schematic diagram of the speed in the throwing state.
Fig. 3 is a schematic diagram of the acceleration in the ordinary state.
Fig. 4 is a schematic diagram of the acceleration in the throwing state.
Fig. 5 is the velocity waveform in the ordinary state.
Fig. 6 is the velocity waveform in the throwing state.
Fig. 7 is a schematic diagram with no user present.
Fig. 8 is a schematic diagram with a user present and moving the right hand.
Fig. 9 is a schematic diagram of the box moving from its original position (x1, y1, z1) to (x2, y2, z2).
Fig. 10 is a schematic diagram of the program started with no user present.
Fig. 11 is a schematic diagram of a completed throwing motion with the ball caught.
Specific embodiment
Embodiment
The design of the invention is to realize throwing and catching. In a scene simulated in Unity3D, a virtual user character skeleton and a 3D model of an open wooden box are deployed; after the user completes a throwing motion, a ball is instantiated and performs a projectile motion, its drop point is calculated, and the box model is moved to the drop point to catch it. The core of the whole invention is the throw trigger, the calculation of the catching drop point and the movement of the box. The throw trigger relies on multiple trigger bits, i.e. bool values; when all trigger bits meet their conditions, i.e. all bool values are true, the throwing function is triggered and the values of the release velocity, release direction and release position are assigned to the ball. From the velocity components of the ball along the x, y and z axes and the height of the throw point, the fall time and the drop point of the ball are calculated; the box is moved to the drop point of the ball according to the difference between its current position and the drop point and waits for the ball to fall into it.
The concept of a gesture is a broad one in Kinect development; here it is understood as the movement of the limbs. The right hand, which serves as the triggering object of the throw in the system, moves with varying velocity. The motion state of the user's right hand during a throw differs from its state in ordinary use: the velocity and acceleration of the right hand in ordinary use differ considerably from those at the moment of a throw. As shown in Fig. 1, Fig. 2, Fig. 3 and Fig. 4, the difference can be seen intuitively from the function graphs.
If the speed of the right hand in ordinary use and its speed during a throw are calculated and plotted for comparison — taking the unit time as the interval, drawing each speed as a vertical line segment whose length is its magnitude, stringing the midpoints of the segments along a straight line, and then connecting the top end of each speed segment to the bottom end of the next with straight lines — speed "waveform" plots of the right hand in the ordinary motion state and in the throwing state are obtained, similar to an audio waveform, as shown in Fig. 5 and Fig. 6; the vertical lines mark the magnitude of the speed in each unit time. Just as sound can be analyzed through its waveform, the velocity waveform of a motion can, by the same method, be used as the basis for analyzing that motion; from the speed waveform we can see more intuitively how the speed of the right hand changes in the ordinary state and in the throwing state.
From the figures it is evident that the speed waveform during a throw varies much more sharply than the waveform in ordinary use. The throwing algorithm exploits exactly this characteristic of the throwing state: the peak value and rate of change of the speed during a throw are higher than those of ordinary motion, and this is used as the trigger source of the throw.
Once the three-dimensional spatial coordinates of the right-hand skeleton node are obtained — and these coordinates change 30 frames per second — the real-time velocity components and acceleration components of the right hand along the three axes can be calculated. If 1/30 second were taken as the unit time, 30 calculations would be needed every second. Such a calculation frequency is too high, and the calculated values would be too small, beyond the range the float type can usefully handle; although such a calculation would be the finest-grained, it would impose a large computational load on the computer. To improve computational efficiency, the 30 calculations per second are reduced to 10 per second: an integer variable is incremented each frame and, every time it reaches 3, it is reset and one velocity and one acceleration are calculated, achieving 10 calculations per second. The program does not store the values with the double type; velocity and acceleration values are stored as float.
The velocity components along the x, y and z axes and the resultant speed are calculated with the following formula (in the actual implementation the unit time t is 1/10 second; practical testing showed that with a unit time below 1/10 second the float variables could no longer resolve the values, and 1/10 second also makes the formulas easy to understand and compute during development; dx, dy, dz are the coordinate differences along the three axes within the unit time):
Vx = dx/t; Vy = dy/t; Vz = dz/t; V = (Vx^2 + Vy^2 + Vz^2)^0.5.
Velocity is a vector; with the velocity components along the three axes, its direction in space is obtained. From the velocity components, the accelerations of the right hand along the three axes and the resultant acceleration are further obtained (the unit time t is likewise 1/10 second; dVx, dVy, dVz are the velocity differences along the three axes within the unit time):
Ax = dVx/t; Ay = dVy/t; Az = dVz/t; A = (Ax^2 + Ay^2 + Az^2)^0.5.
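The frame-counting scheme just described (one calculation every three frames at roughly 30 fps, i.e. ten per second) might look like the following sketch; the HandSampler name is illustrative, the scene is assumed to run at about 30 fps, and the right-hand Transform is assumed to be driven by the skeleton controller.

```csharp
using UnityEngine;

// Illustrative sampling loop: at ~30 fps, compute velocity/acceleration once every 3 frames.
public class HandSampler : MonoBehaviour
{
    public Transform rightHand;          // assumed to follow the right-hand skeleton node
    private readonly ThrowTrigger trigger = new ThrowTrigger(); // sketch shown earlier
    private int frameCounter;

    void Update()
    {
        frameCounter++;
        if (frameCounter < 3) return;    // skip two frames out of three
        frameCounter = 0;

        // unit time t = 1/10 s, as chosen in the text above
        bool thrown = trigger.Update(rightHand.position, 0.1f);
        if (thrown)
        {
            Debug.Log("Throw detected: speed " + trigger.Velocity.magnitude + " m/s");
        }
    }
}
```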
While the program is running, these data are printed to the screen, which makes observation and debugging convenient, as shown in Fig. 7 and Fig. 8. Fig. 7 shows the situation just after the program has started, with no user present; Fig. 8 shows a user operating the program and moving the right hand. From the data on the screen it can be observed that the speed and acceleration of the user's right hand change constantly; when the user's right hand stops moving and keeps its relative position unchanged, the digits to the left of the decimal point and the first two digits to the right of it are all zero. Because of recognition jitter, the velocity and acceleration data are never entirely zero.
Every movement of the human limbs is essentially motion with varying velocity; when the user moves the right hand, its speed and acceleration are changing rapidly. In everyday life, however, the user's right-hand movements do not produce an excessive acceleration, whereas when throwing an object the right hand produces a comparatively large acceleration. Experiments showed that under ordinary conditions the speed of the right hand does not exceed 2 m/s, while when throwing an object the speed of the right hand can exceed 7 m/s. A right-hand speed exceeding 7 m/s together with an acceleration exceeding 70 m/s² is therefore used as the switch that triggers the throwing function. A throwing switch (throwSwitch) is set in the program; this switch is a bool value, and only when its value is true can the user throw. After the user has thrown the object, the switch closes, i.e. its value becomes false; the user needs to press the space bar to reopen the switch and set the throwSwitch value back to true before the next throw can be made. When the user is throwing and the right-hand speed exceeds 7 m/s and the acceleration exceeds 70 m/s², the throwing function is triggered: an initial velocity, direction and initial position are assigned to the ball, the ball is instantiated, and it is handed over to the physics engine of Unity3D to simulate its trajectory.
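A sketch of the throwSwitch logic and of handing the ball over to the physics engine is given below; the ThrowController name and the prefab field are illustrative, and the initial velocity of the ball is assumed to be the hand velocity at the trigger moment.

```csharp
using UnityEngine;

// Illustrative throw controller: gate with throwSwitch, re-arm with the space bar,
// and let Unity's physics engine simulate the ball after release.
public class ThrowController : MonoBehaviour
{
    public GameObject ballPrefab;        // a sphere prefab with a Rigidbody attached
    public Transform rightHand;          // follows the right-hand skeleton node
    private bool throwSwitch = true;     // true: a throw is allowed

    void Update()
    {
        // Re-open the switch with the space bar, as described in the text.
        if (!throwSwitch && Input.GetKeyDown(KeyCode.Space))
        {
            throwSwitch = true;
        }
    }

    // Called when the speed/acceleration trigger fires, with the hand velocity at that moment.
    public void OnThrowTriggered(Vector3 handVelocity)
    {
        if (!throwSwitch) return;

        GameObject ball = Instantiate(ballPrefab, rightHand.position, Quaternion.identity);
        ball.GetComponent<Rigidbody>().velocity = handVelocity;   // physics engine takes over

        throwSwitch = false;             // close the switch until space is pressed again
    }
}
```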
At the moment the ball leaves the hand, the velocity components of the ball along the x, y and z axes are recorded together with the height of the release moment, and the theoretical drop point of the ball, i.e. the distance the ball moves along the x axis and the z axis, can be calculated with the physical equations. In the formulas below, Vx, Vy, Vz are the velocity components of the ball along the three axes at the moment of release, h is the height of the ball at the moment of release, and g is the acceleration of gravity; note that in the Unity3D environment the y axis is the vertical axis.
Sx = (Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5) * Vx; the moving distance along the x axis
Sz = (Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5) * Vz; the moving distance along the z axis
The derivation of the formula is as follows. In the physics engine of Unity3D the y axis represents the vertical direction. At the moment of release the speed of the ball along the y axis is Vy; the ball first decelerates while moving upward, its speed reaching zero at the highest point. The time needed from release to the highest point is Vy/g, and the distance moved is Vy^2/(2*g). After the ball passes the highest point it undergoes free fall in the vertical direction; the fall height is the sum of the y coordinate of the ball at release and the distance of its upward deceleration, i.e. Vy^2/(2*g) + h (h is the y coordinate of the ball at the moment of release, a known quantity), and the free-fall time is ((Vy^2 + 2*g*h)/g^2)^0.5. Therefore the total time of the ball from release to landing is T = Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5.
Once the total time from the moment of release to landing is obtained, the drop point of the ball, i.e. the landing coordinates, can be calculated. The moving distances of the ball in the x-z plane are given by the following formulas (air resistance is not considered in the invention, so the initial velocities of the ball along the x and z axes at the moment of release remain constant; Vx is the initial velocity along the x axis, Vz is the initial velocity along the z axis, and T is the total time required for the fall):
Sx = Vx*T; Sz = Vz*T.
The drop point (x', y', z') of the ball is the release position (x, y, z) of the ball plus the offset, calculated as:
x' = x + Sx; z' = z + Sz; y' = 0.
Finally, the difference between the current coordinates of the box and the landing coordinates of the ball is calculated, the box is moved to the ball's drop point by a function, and it waits there for the ball to fall onto the drop point, so that catching is realized. The calculation is as follows ((x1, y1, z1) is the three-dimensional coordinate of the box's original position, (x', y', z') is the three-dimensional coordinate of the drop point):
x1 = x'; z1 = z'; y1 = y' = 0.
Since the box moves only in the x-z plane, its y coordinate remains unchanged.
The schematic is shown in Fig. 9.
The final effect of the present invention is realized through the simulation of a 3D scene. A UI interface is provided in Unity3D, and a panel component is added to display the video image information captured by the Kinect camera. A C# script is bound to the panel component; the control statements for calling the Kinect device are written in the script, and the color video image obtained from Kinect is printed to the panel in real time.
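A minimal sketch of such a panel binding is given below. The patent does not name the Kinect driver calls it uses, so KinectColorSource and GetLatestColorTexture are hypothetical placeholders for whatever interface the driver actually provides; only the Unity UI side (RawImage) is standard API.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical abstraction over the Kinect driver's color stream; the real interface
// is whatever the bound C# script calls, which the patent does not specify.
public abstract class KinectColorSource : MonoBehaviour
{
    public abstract Texture2D GetLatestColorTexture();
}

// Illustrative panel binding: draw the latest color frame onto a UI RawImage each frame.
public class KinectPanelDisplay : MonoBehaviour
{
    public RawImage panel;               // UI component the color frames are drawn onto
    public KinectColorSource colorSource;

    void Update()
    {
        Texture2D frame = colorSource.GetLatestColorTexture();
        if (frame != null)
        {
            panel.texture = frame;       // real-time "printing" of the color image to the panel
        }
    }
}
```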
The rainbow skeleton is an open-source model developed for Kinect combined with Unity3D; it contains colored spherical models for the 20 skeleton nodes of the human body, connected by colored cylinders, so that a human skeleton model is presented in Unity. After the program has been written, the Kinect driver has been installed correctly on the machine and the Kinect device has been connected, the model follows the player's movements and makes the corresponding actions when the user stands in front of the Kinect camera. This is in effect a digitized human skeleton, i.e. the rainbow skeleton model.
KinectModelControllerV2 is an open-source C# script developed for Kinect combined with Unity3D; it calls the interface provided by the Kinect driver. After Kinect obtains the three-dimensional coordinate information of the human skeleton nodes, the script can control the rainbow skeleton in Unity3D to make limb actions consistent with those of the user. It can be seen that, when Unity3D and Kinect are developed jointly, the spatial nature of Kinect's capture and processing of video image information is well embodied, as is the motion of the skeleton nodes in three-dimensional space. Throughout the design, the KinectModelControllerV2.cs script receives the skeleton node information sent by Kinect and controls the digital skeleton model in the scene to make motions consistent with the user's; the throwing and catching algorithm code is also written inside this script. Through this script, recognition of the throwing gesture, calculation of the ball's drop point and movement of the box are realized.
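For illustration, the sketches above could be combined inside one script, in the spirit of the description of KinectModelControllerV2.cs; the ThrowCatchCoordinator class below is a new illustrative sketch, not the open-source script itself, and the rightHand Transform is assumed to be driven by the skeleton controller.

```csharp
using UnityEngine;

// Illustrative glue script combining the earlier sketches: sample the right hand,
// detect the throw, spawn the ball, compute the drop point and move the box.
public class ThrowCatchCoordinator : MonoBehaviour
{
    public Transform rightHand;          // driven by the skeleton controller
    public ThrowController throwController;
    public BoxMover boxMover;

    private readonly ThrowTrigger trigger = new ThrowTrigger();
    private int frameCounter;

    void Update()
    {
        frameCounter++;
        if (frameCounter < 3) return;    // ~10 samples per second at 30 fps
        frameCounter = 0;

        if (trigger.Update(rightHand.position, 0.1f))
        {
            Vector3 v = trigger.Velocity;
            throwController.OnThrowTriggered(v);

            // Predict the landing point from the release velocity and height, then move the box.
            Vector3 drop = DropPoint.Compute(v, rightHand.position, Mathf.Abs(Physics.gravity.y));
            boxMover.MoveToDropPoint(drop);
        }
    }
}
```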
When the program is run, a full-screen interface is obtained. The ground is given a grid texture so that the drop point of the ball can be observed conveniently. When the user stands in front of the Kinect device and quickly completes a throwing motion with the right hand moving forward and upward, a ball is thrown out, and the box model moves to the drop point of the ball according to the ball's landing data and catches it. Fig. 10 and Fig. 11 show the final interface of the virtual throwing and catching system.
The above embodiment is a preferred embodiment of the present invention, but the embodiments of the present invention are not limited by the above embodiment; any other change, modification, substitution, combination or simplification made without departing from the spirit and principle of the present invention shall be an equivalent replacement and is included within the protection scope of the present invention.
Claims (6)
1. A virtual throwing and catching system based on a Kinect somatosensory device, characterized in that:
in a scene simulated in Unity3D, a virtual digital skeleton and a 3D model of an open wooden box are deployed; after the user completes a throwing motion, a ball is instantiated and performs the parabolic motion triggered by the switch, the catching drop point is calculated, and the box is moved to the drop point to catch the ball;
the triggering of the throwing switch relies on multiple trigger bits, i.e. bool values; when all trigger bits meet their conditions, i.e. all bool values are true, the throwing function is triggered, an initial velocity, direction and initial position are assigned to the ball, the ball is instantiated, and the physics engine of Unity3D simulates its trajectory;
the catching drop point is calculated from the velocity components of the ball along the x, y and z axes and the height of the throw point, yielding the fall time and the drop point of the ball;
for the movement of the box, the box is moved to the drop point of the ball according to the difference between the box's current position and the drop point and waits for the ball to fall into it;
after the user has thrown the object, the switch closes, i.e. its value becomes false; the user needs to press the space bar to reopen the switch and set the throwSwitch value back to true before the next throw can be made.
2. The virtual throwing and catching system based on a Kinect somatosensory device according to claim 1, characterized in that the triggering of the throwing switch is specifically:
step 1: obtain the three-dimensional spatial coordinates of the right-hand skeleton node; since these coordinates change N frames per second, the real-time velocity components and acceleration components of the right hand along the three axes can be calculated;
step 2: taking 1/N second as the unit time, a counter variable is incremented frame by frame and, each time it reaches the set count, it is reset and one velocity and one acceleration are calculated, achieving the effect of N calculations per second; the velocity and acceleration values are stored as float;
step 3: the velocity components along the x, y and z axes and the resultant speed are calculated as:
Vx = dx/t; Vy = dy/t; Vz = dz/t; V = (Vx^2 + Vy^2 + Vz^2)^0.5
where dx, dy, dz are the coordinate differences along the three axes within the unit time; velocity is a vector, and its direction in space is obtained from the velocity components along the three axes;
step 4: from the velocity components, the accelerations of the right hand along the three axes and the resultant acceleration are obtained as:
Ax = dVx/t; Ay = dVy/t; Az = dVz/t; A = (Ax^2 + Ay^2 + Az^2)^0.5
where the unit time t is likewise 1/N second and dVx, dVy, dVz are the velocity differences along the three axes within the unit time.
3. The virtual throwing and catching system based on a Kinect somatosensory device according to claim 1, characterized in that: in the triggering of the throw, a right-hand speed exceeding 7 m/s together with an acceleration exceeding 70 m/s² serves as the switch that triggers the throwing function.
4. The virtual throwing and catching system based on a Kinect somatosensory device according to claim 1, characterized in that the catching drop point is determined as follows: at the moment the ball leaves the hand, the velocity components of the ball along the x, y and z axes are recorded together with the height of the release moment, and the theoretical drop point of the ball, i.e. the distance the ball moves along the x axis and the z axis, is calculated with the physical equations, where Vx, Vy, Vz are the velocity components of the ball along the three axes at the moment of release, h is the height of the ball at the moment of release, g is the acceleration of gravity, and in the Unity3D environment the y axis is the vertical axis:
Sx = (Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5) * Vx, the moving distance along the x axis;
Sz = (Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5) * Vz, the moving distance along the z axis;
the total time of the ball from release to landing is T = Vy/g + ((Vy^2 + 2*g*h)/g^2)^0.5;
once the total time from the moment of release to landing is obtained, the drop point of the ball, i.e. the landing coordinates, can be calculated.
5. The virtual throwing and catching system based on a Kinect somatosensory device according to claim 1, characterized in that the movement of the box is:
Sx = Vx*T; Sz = Vz*T
which are the moving distances of the ball in the x-z plane; air resistance is not considered, so the initial velocities of the ball along the x and z axes at the moment of release remain constant, where Vx is the initial velocity of the ball along the x axis, Vz is the initial velocity of the ball along the z axis, and T is the total time required for the fall;
the drop point (x', y', z') of the ball is the release position (x, y, z) of the ball plus the offset, calculated as:
x' = x + Sx; z' = z + Sz; y' = 0;
finally, the difference between the current coordinates of the box and the landing coordinates of the ball is calculated, the box is moved to the ball's drop point by a function, and it waits there for the ball to fall onto the drop point, so that catching by translation of the box is realized, where (x1, y1, z1) is the three-dimensional coordinate of the box's original position and (x', y', z') is the three-dimensional coordinate of the drop point:
x1 = x'; z1 = z'; y1 = y' = 0;
since the box moves only in the x-z plane, its y coordinate remains unchanged.
6. The virtual throwing and catching system based on a Kinect somatosensory device according to claim 1, characterized in that:
a panel component is added in Unity3D to display the video image information captured by the Kinect camera; a C# script is bound to the panel component, the control statements for calling the Kinect device are written in the script, and the color video image obtained from Kinect is printed to the panel in real time; after Kinect obtains the three-dimensional coordinate information of the human skeleton nodes, the rainbow skeleton in Unity3D is controlled to make limb actions consistent with the captured video image information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910534256.3A CN110262660B (en) | 2019-06-20 | 2019-06-20 | Virtual throwing and containing system based on Kinect somatosensory equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110262660A true CN110262660A (en) | 2019-09-20 |
CN110262660B CN110262660B (en) | 2023-05-23 |
Family
ID=67919538
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910534256.3A Active CN110262660B (en) | 2019-06-20 | 2019-06-20 | Virtual throwing and containing system based on Kinect somatosensory equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110262660B (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105373128A (en) * | 2015-06-17 | 2016-03-02 | 电子科技大学 | Badminton robot and whole-field positioning method thereof |
US20180104573A1 (en) * | 2016-10-17 | 2018-04-19 | Aquimo, Llc | Method and system for using sensors of a control device for control of a game |
CN107943292A (en) * | 2017-11-24 | 2018-04-20 | Throwing and catching system based on Kinect somatosensory device |
CN108037827A (en) * | 2017-12-08 | 2018-05-15 | Virtual object throwing simulation method and system for a virtual reality environment |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113624996A (en) * | 2021-09-07 | 2021-11-09 | 珠海格力电器股份有限公司 | Cargo throwing state identification method and system |
CN113624996B (en) * | 2021-09-07 | 2022-07-12 | 珠海格力电器股份有限公司 | Cargo throwing state identification method and system |
Also Published As
Publication number | Publication date |
---|---|
CN110262660B (en) | 2023-05-23 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |