
CN107847187A - Apparatus and method for motion tracking of at least a part of a limb - Google Patents


Info

Publication number
CN107847187A
CN107847187A (application CN201680039901.9A)
Authority
CN
China
Prior art keywords
limbs
space
orientation
movement
mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201680039901.9A
Other languages
Chinese (zh)
Other versions
CN107847187B (en)
Inventor
陈城
王进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV
Publication of CN107847187A
Application granted
Publication of CN107847187B
Status: Expired - Fee Related

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Dentistry (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention proposes a device for motion tracking of at least a part of a limb, the limb comprising two straight parts connected by a joint. The device comprises: a motion sensor (110c) attached to the part of the limb to measure the orientation of that part; a camera (120) for capturing images of the moving part of the limb together with a marker attached at a predefined position (110a, 110b, 110d) on that part, the position of the marker in each image captured by the camera serving as a reference position; and a processor for estimating the movement of the part of the limb based on the reference positions, the orientation corresponding to each reference position, the predetermined length of the part of the limb, and the predefined position. Instead of tracking the position of the joint directly in the images, the position is derived from the observable position of one marker together with the orientation information. In this way the occlusion problem is solved, and the movement of the limb can be estimated based on accurate and instantaneous measurements of the positions of interest.

Description

Apparatus and method for motion tracking of at least a part of a limb
Technical field
The present disclosure relates generally to motion tracking of at least a part of a limb, and more particularly to motion tracking of at least a part of a limb during rehabilitation.
Background
Stroke is one of the leading causes of death worldwide, and most stroke survivors suffer from multiple functional impairments. One of the most common is impaired mobility. Long-term rehabilitation is needed to help patients improve their motor ability. Typically a patient is hospitalized for about one month and is then discharged home, owing to the limited number of therapists and rehabilitation centers. Unsupervised rehabilitation systems have therefore been developed to help patients at home reach their maximum possible rehabilitation outcome.
WO2014108824A1 discloses a vision-based motion tracking system for rehabilitation. The system has a camera that captures images of a limb in motion, with markers attached to at least two points on the upper limb, so that the range of movement of the upper limb can be assessed from the marker positions in the images. Because the projected length of the limb in the image changes with the orientation of the limb, the captured positions of at least two markers change correspondingly in the images, and orientation information for movement estimation is derived from these position changes. However, because of the 2D nature of the images, a marker becomes occluded during the motion of the limb whenever it is covered by another marker or by a part of the limb, which makes the position of the overlapped marker in the image undetectable. The continuity of the motion tracking is thereby interrupted. Although interpolation can help to overcome this problem, it provides only low-accuracy estimates of the instantaneous joint positions.
Summary of the invention
It is desirable to obtain a more robust system that facilitates continuous motion tracking of at least a part of a limb and provides motion tracking of better accuracy for further assessment.
A device for motion tracking of at least a part of a limb, the limb comprising two straight parts connected by a joint, the device comprising:
a motion sensor 110c attached to the part of the limb for measuring the orientation of that part, the orientation being derived from the output of the motion sensor 110c;
a camera 120 for capturing images of the moving part of the limb together with a marker attached at a predefined position 110a, 110b, 110d, the position of the marker in each image captured by the camera 120 serving as a reference position;
a processor for estimating the movement of the part of the limb by determining the spatial positions of both ends of that part, based on the reference positions, the orientation corresponding to each reference position, the predetermined length of the part of the limb, and the predefined position.
The present invention proposes introducing a motion sensor to provide the orientation information of at least a part of the limb (or the whole limb) in motion, information that cannot be derived from the images captured by the camera when occlusion occurs. With the provided orientation information, the spatial position of the joint can be estimated even if only one marker is attached. The position to which the marker is attached is predefined according to the specific rehabilitation exercise to be tracked, to ensure that the marker remains detectable during the exercise. Instead of tracking the position of the joint directly in the images, the spatial position is derived from the observable position of one marker and the corresponding orientation information. In this way the occlusion problem is solved, and the movement of the part of the limb (or the whole limb) can be estimated from accurate and instantaneous measurements of the spatial positions of interest. When the whole limb is the limb of interest, the limb is kept stretched without bending. A straight part of a limb can be an upper arm, a forearm, a lower leg or a thigh.
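The derivation above — recovering an occluded joint position from a single visible marker, the sensor-derived orientation and the known segment length — can be sketched as follows. This is an illustrative reconstruction, not code from the patent: it adopts the conventions of the Fig. 3 discussion (direction cosines for the limb axis, the camera projecting onto the x-y plane, the proximal joint assumed to stay in the z = 0 plane), and all names are hypothetical.

```python
import math

def limb_endpoints(marker_xy, angles_deg, length):
    """Estimate the 3D positions of both ends of a straight limb segment.

    marker_xy  -- (x, y) image position of the marker at the distal end
                  (e.g. the hand): the observable reference position.
    angles_deg -- (theta_x, theta_y, theta_z): angles between the limb
                  axis and the x, y, z axes, from the motion sensor.
    length     -- predetermined segment length L, in image-space units.

    The proximal joint (e.g. the shoulder) is assumed to remain in the
    z = 0 plane, so only one visible marker is needed.
    """
    # Direction cosines of the limb axis.
    ux, uy, uz = (math.cos(math.radians(t)) for t in angles_deg)
    xb, yb = marker_xy
    distal = (xb, yb, length * uz)                        # B0 (hand)
    proximal = (xb - length * ux, yb - length * uy, 0.0)  # A0 (shoulder)
    return proximal, distal
```

Even when the shoulder itself overlaps the hand in the image, `proximal` is recovered from the hand marker and the orientation alone, which is how the occlusion problem is avoided.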
In one embodiment, the marker is one of multiple distinguishable markers, each attached at a respective predefined position 110a, 110b, 110d on the part of the limb, to ensure that at least one marker is observable in each image captured by the camera 120 during the movement.
Because of constraints of the rehabilitation exercise or of the patient's condition, there may be no single position at which an attached marker remains detectable throughout the whole exercise. In that case, multiple markers that are distinguishable from one another in the images are used, to ensure that at any given moment at least one marker is observable. Each marker is attached at a respective predefined position on the part of the limb (or the whole limb). The spatial position of the joint of interest can then be estimated based on the position of the marker identified in the image, the predefined position corresponding to that marker, and the orientation information measured by the motion sensor.
In one embodiment, the device further comprises an alignment unit that aligns the orientation space with the image space by determining the angular difference between the coordinate system of the orientation space and the coordinate system of the image space, wherein the orientation space corresponds to the space in which the orientation is derived.
The orientation information derived from the output of the motion sensor is expressed in the coordinate system of the orientation space, which may differ from the coordinate system describing the image space. The image space refers to the coordinate system used to describe the position of the marker in each image captured by the camera. An alignment unit is therefore provided to align the orientation space with the image space by determining the angular difference between the two coordinate systems, so that the orientation measured by the motion sensor can be used in the image space to determine the spatial position of the joint of interest.
In one embodiment, the alignment unit comprises a compass attached to the camera.
The orientation information of the limb is derived in a world-fixed coordinate system whose axes point towards the geographic north pole, along the direction of gravity, and perpendicular to those two axes. A compass attached to the camera helps to calibrate a camera fixed at a certain position, so that one of the two edges of the camera (of the surface defined by the shutter) is aligned with the direction towards the geographic north pole. The other edge can then easily be aligned with the direction of gravity, for example by hanging naturally. The orientation space and the image space are then aligned, so that the orientation information of the part of the limb (or the whole limb) generated by the motion sensor equals the orientation angle of that part (or the whole limb) in the image space.
In one embodiment, the alignment unit comprises a motion sensor attached to the camera.
The alignment unit may be implemented as a motion sensor attached to the camera, for example placed along the gravity-aligned edge of the camera. The output of the motion sensor attached to the camera indicates the orientation difference between the image space and the orientation space. The camera can be adjusted until the gravity component of the output equals 0 and the other two components equal 90 degrees, which means that the gravity axes are aligned. The remaining axes can then be aligned in the same way, after which the two spaces are aligned. Alternatively, instead of adjusting the camera, the outputs of motion sensors attached along the respective edges of the camera, which indicate the angular difference between the two spaces, can be recorded for later compensation.
In one embodiment, the processor further compensates the estimate of the movement of the part of the limb by the angular difference measured by the motion sensor 110c.
Based on the generated orientation information, compensation information is collected to reduce the discrepancy between movement measurements in the orientation space and in the image space.
In one embodiment, the spatial position is a position in the image space, and the movement is estimated in the image space.
The movement is measured in the image space, which can be sufficient for the purpose of rehabilitation assessment.
In one embodiment, the processor further estimates the movement in real space by mapping the spatial positions in the image space to the real space based on a predetermined mapping rule.
Instead of the movement in the image space, the movement of interest in real space is derived by the mapping. The mapping rule is predetermined according to the initial setup of the spatial relation (e.g., the distance) between the camera and the subject.
In one embodiment, the movement is one of the following: rotation angle, angular velocity, angular acceleration, linear velocity and movement trajectory.
In one embodiment, the motion sensor comprises two or more of the following: an accelerometer, a magnetometer and a gyroscope.
The motion sensor comprises at least an accelerometer and a magnetometer. With the accelerometer, the 2D tilt can be determined. Combined with the magnetometer, the rotation around the vertical axis is further determined, from which the full 3D orientation in the world-fixed coordinate system is derived. A drawback of orientation reconstruction using an accelerometer and a magnetometer is that it only works well when the acceleration caused by the movement is small compared with gravity. To improve the accuracy, a gyroscope can be included in the orientation reconstruction, which then integrates the gyroscope data. In another embodiment, a combination of an accelerometer and a gyroscope can be implemented to measure the full 3D orientation. A detailed description of orientation reconstruction using an accelerometer, a magnetometer and a gyroscope can be found in the prior literature, for example in Victor van Acht, Edwin Bongers, Niek Lambert and Rene Verberne, "Miniature Wireless Inertial Sensor for Measuring Human Motions", EMBC 2007, Lyon, France.
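As a concrete illustration of the accelerometer-plus-magnetometer scheme described above, the sketch below reconstructs roll and pitch from the measured gravity vector and then takes a tilt-compensated heading from the magnetometer. This is a minimal textbook formulation under the stated static assumption (motion-induced acceleration small compared with gravity); the axis and sign conventions are one common body-frame choice, not those of the patent or of the cited EMBC 2007 sensor.

```python
import math

def orientation_from_accel_mag(accel, mag):
    """Reconstruct a full 3D orientation (roll, pitch, yaw in radians)
    from one accelerometer and one magnetometer reading.

    The accelerometer gives the 2D tilt relative to gravity; the
    magnetometer then fixes the remaining rotation about the vertical
    axis.  Only valid while the sensor is near-static.
    """
    ax, ay, az = accel
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    mx, my, mz = mag
    # Tilt-compensate the magnetometer before taking the heading.
    mx2 = mx * math.cos(pitch) + mz * math.sin(pitch)
    my2 = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-my2, mx2)
    return roll, pitch, yaw
```

Adding a gyroscope, as the description suggests, would replace this per-sample reconstruction with an integration of angular rate, corrected by the slower accelerometer/magnetometer estimate.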
The present invention also includes a method comprising the following steps:
aligning (S100) the orientation space with the image space by determining the angular difference between the coordinate system of the orientation space and the coordinate system of the image space, wherein the orientation space corresponds to the space in which the orientation is derived;
measuring (S101) the orientation of the part of the limb, the orientation being derived from the output of a motion sensor 110c attached to the part of the limb;
capturing (S102) images of the moving part of the limb together with a marker attached at a predefined position 110a, 110b, 110d, the position of the marker in each image captured by the camera 120 serving as a reference position;
estimating (S103) the movement of the part of the limb by determining the spatial positions of both ends of that part, based on the reference positions, the orientation corresponding to each reference position, the predetermined length of the part of the limb and the predefined position.
Various aspects and features of the disclosure are described in more detail below. Other objects and advantages of the present invention will become apparent from the following description taken in conjunction with the accompanying drawings.
Brief description of the drawings
The present invention will be described and explained in more detail below in connection with embodiments and with reference to the accompanying drawings, in which:
Fig. 1 shows a schematic diagram of a device for motion tracking of a limb according to an embodiment of the invention;
Fig. 2(a)-Fig. 2(d) illustrate posture examples designed for upper-limb rehabilitation;
Fig. 3 shows a schematic diagram of the measurement of limb movement based on the motion sensor output and the captured images in an embodiment of the invention;
Fig. 4(a) shows a schematic diagram illustrating the state of the upper limb of a stroke patient at the start of shoulder flexion;
Fig. 4(b) shows a schematic diagram illustrating the state of the upper limb of a stroke patient in the middle of shoulder flexion;
Fig. 5 shows a method for motion tracking of a limb.
In the figures, similar or corresponding features and/or functions are designated with the same reference numerals.
Detailed description of embodiments
The present invention will be described with respect to particular embodiments and with reference to certain drawings, but the invention is not limited thereto and is defined only by the claims. The drawings described are only schematic and are non-limiting. In the drawings, the size of some of the elements may be exaggerated and not drawn to scale for illustrative purposes.
Fig. 1 is the schematic diagram according to an embodiment of the invention for being used to carry out upper limbs in the device of motion tracking, and it includes The mark that is attached on the hand 110b of object, the motion sensor 110c being attached on upper limbs, the phase in front of object The processor (not shown) of the movement of machine 120 and estimation upper limbs.Upper limbs includes two straight parts:Upper arm and forearm.Below It will discuss on the embodiment for the movement for only estimating upper arm or forearm.Although camera is illustrated as being positioned in object in Fig. 1 Front, but camera can also be positioned in the side of object, should not be assumed that limitation in this respect be present.
The motion sensor 110c is placed along the limb to measure the orientation of the limb in motion, for example while doing a rehabilitation exercise, and the camera 120 is configured to simultaneously capture images of the limb in motion together with the attached marker. The marker is attached to the hand 110b, which is predefined before the exercise.
For the rehabilitation exercise, postures that move the forearm or upper arm are designed to assess the recovery of the upper limb after stroke. Each posture can emphasize the rotation of a particular joint in different directions. Fig. 2(a)-Fig. 2(d) give some examples of designed postures, in which the shoulder or the elbow serves as the joint under examination and remains stationary in the posture. The degrees of freedom and the range of movement of the upper arm or forearm are measured accordingly during the exercise.
Fig. 2(a) illustrates the shoulder flexion exercise.
Start position: parallel to the midaxillary line of the trunk;
End position: parallel to the longitudinal axis of the humerus, pointing towards the lateral epicondyle;
Movement: shoulder flexion;
Range: 30°~180°.
Fig. 2(b) illustrates the shoulder abduction exercise.
Start position: parallel to the midaxillary line of the body;
End position: the front of the upper arm parallel to the longitudinal axis of the humerus;
Movement: shoulder elevation in the scapular plane or frontal plane;
Range: 30°~180°.
Fig. 2(c) illustrates the shoulder internal/external rotation exercise.
Start position: parallel to the support surface or perpendicular to the floor;
End position: parallel to the longitudinal axis of the ulna, pointing towards the styloid process;
Movement: internal rotation and external rotation;
Range: 0°~80° internal, 0°~60° external.
Fig. 2(d) illustrates the elbow flexion exercise.
Start position: parallel to the longitudinal axis of the humerus, pointing towards the tip of the acromion;
End position: parallel to the longitudinal axis of the radius, pointing towards the radial styloid process;
Movement: elbow flexion;
Range: 30°~150°.
Fig. 3 is a schematic diagram showing the measurement of limb movement during shoulder flexion based on the output of the motion sensor 110c and the captured images in an embodiment of the invention. It is assumed that the image space and the orientation space are perfectly aligned, and the image is described in the world-fixed coordinate system of the motion sensor 110c, in which x indicates the direction towards the geographic north pole, y indicates the direction of gravity, and z indicates the direction perpendicular to the two aforementioned directions. Image 210 is captured by the camera 120 for the limb A0B0 at a particular moment during shoulder flexion. In image 210, the reference position B'0 (x'b0, y'b0, 0) of the hand indicates the coordinates in the image of the observed marker attached to the hand 110b, which is the projection of the position B0 (xb0, yb0, zb0) onto the x-y plane. Using the orientation information θ0 (θx0, θy0, θz0) measured by the motion sensor, which indicates the orientation of the limb A0B0 in the image space (not shown in the figure; θx0 indicates the angle between the limb A0B0 and the x-axis, θy0 the angle between the limb A0B0 and the y-axis, and θz0 the angle between the limb A0B0 and the z-axis), together with the predefined length L of the limb A0B0, the coordinates B0 (xb0, yb0, zb0) of the hand and A0 (xa0, ya0, 0) of the shoulder can be determined, where the displacement of the shoulder in the z direction is assumed to be negligible. Thus, even when the limb is perpendicular to the camera shutter, so that with a conventional method the shoulder and the hand would overlap in the captured image, the position A0 of the shoulder can still be estimated. Based on the shoulder and hand coordinates derived for each image captured by the camera 120, the movement of the limb can be estimated continuously and instantaneously. The predefined position to which the marker is attached may be any position on the limb, such as the other end of the limb (the shoulder 110a), the midpoint of the limb or a quarter point of the limb; the criterion is that the marker is observable in the images captured by the camera 120 throughout the motion.
In practice, however, a single marker may not be detectable at all times during the rehabilitation exercise, for example because of restrictions of the subject's condition. Fig. 4(a)-Fig. 4(b) illustrate one situation that may occur. A stroke patient may not be able to stretch the upper limb outwards, the forearm and upper arm keeping a certain angle between each other. During the movement, for example during shoulder flexion, the upper arm is regarded as the target to be evaluated (the part of the limb), and the motion sensor 110c is attached along the upper arm. However, neither the shoulder 110a nor the elbow 110d is observable throughout the whole exercise. At the beginning of the exercise, as illustrated in Fig. 4(a), the elbow 110d is occluded by the hand 110b and the forearm. In the middle of the exercise, as illustrated in Fig. 4(b), the shoulder 110a is occluded by the hand 110b and the forearm. Therefore, two distinguishable markers are attached to the shoulder 110a and the elbow 110d respectively, to ensure that at least one marker is observable in the captured images. Thus, based on the one marker identified in each image (either the one attached to the elbow 110d or the one attached to the shoulder 110a) and the output of the motion sensor 110c, the coordinates of the shoulder 110a and the elbow 110d can still be determined. If two markers are not sufficient, more markers can be applied accordingly. The distinguishing features among multiple markers can be visual attributes such as color and reflection intensity, and geometric attributes such as shape and pattern.
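The per-frame fallback between distinguishable markers described above can be sketched as a simple selection; the marker identifiers, the detection interface and the priority order are illustrative assumptions, not details from the patent.

```python
def pick_visible_marker(detections, priority=("elbow", "shoulder")):
    """Return the id and image (x, y) of the first marker in `priority`
    that the current frame actually shows.

    `detections` maps a marker id to its detected image position, or to
    None when that marker is occluded (e.g. hidden behind the hand or
    the forearm).
    """
    for marker_id in priority:
        position = detections.get(marker_id)
        if position is not None:
            return marker_id, position
    raise ValueError("no marker observable in this frame")
```

The predefined attachment point of the selected marker then determines which point of the segment the reference position corresponds to, so the same orientation-based derivation applies whichever marker is visible.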
The alignment between the orientation space and the image space is achieved by an alignment unit that determines the angular difference between the coordinate system of the orientation space and the coordinate system of the image space. The orientation space is the world-fixed coordinate system, the coordinate system in which the orientation is derived from the output of the motion sensor 110c. The image space relates to the positioning of the camera, with the x-axis and y-axis parallel to the edges of the camera. If the image space and the orientation space are already aligned, the alignment unit is optional.
One embodiment of the alignment unit is a compass attached to the camera. A compass attached to one edge of the upper surface of the camera helps to align one edge of the surface containing the shutter with the direction towards the geographic north pole. The other edge of the surface containing the shutter can be aligned with the direction of gravity by hanging naturally. To achieve a precise calibration, an inclinometer can be used to align the other edge with the direction of gravity.
Another embodiment of the alignment unit is a motion sensor attached along one edge of the surface containing the shutter. During calibration, the position of the camera is adjusted until one component of the output equals 0 and the other two components equal 90 degrees, which means that this axis is aligned. The other edge of the surface is aligned in the same way. When the two axes are aligned, the image space and the orientation space are aligned.
In another embodiment, the outputs of motion sensors attached along the two axes of the surface containing the shutter are recorded and used for further compensation in the estimation of the movement of the limb, the outputs indicating the angular difference between the image space and the orientation space.
The motion sensor can also be placed along the edge perpendicular to the surface containing the shutter, so as to calibrate the two spaces in the same way as mentioned above.
The length of the part of the limb is derived from the limb length based on a statistical ratio. For example, the length of the upper arm is calculated based on the statistical ratio between the length of the whole upper limb and the length of the upper arm. After the length of the naturally hanging upper limb is determined in the image, the predetermined limb length L is determined accordingly based on the statistical ratio.
The estimation of the movement can be completed in the image space first and then further converted into the absolute movement in real space. The mapping rule for such a conversion is predetermined, based on the distance between the camera and the subject, as the ratio between a length in real space and the corresponding length in the image space.
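Both ratio-based steps above — deriving the segment length L from the whole-limb length, and converting image-space results to real space — reduce to a single scaling; the sketch below makes the two predetermined ratios explicit. The function names and any numeric ratio values used with them are purely illustrative.

```python
def segment_length(whole_limb_image_length, statistical_ratio):
    """Predetermined segment length L: the measured image length of the
    naturally hanging limb, scaled by a statistical segment/limb ratio."""
    return whole_limb_image_length * statistical_ratio

def image_to_real(point, scale):
    """Map an image-space position to real space.

    `scale` is the predetermined ratio of a real reference length to
    its measured length in the image, fixed by the initial setup of the
    camera-subject distance."""
    return tuple(c * scale for c in point)
```

In a setup phase one would measure a reference of known real length in the image once to fix `scale`, after which every estimated joint position can be mapped out of the image space.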
The movement is one of the following: rotation angle, angular velocity, angular acceleration, linear velocity and movement trajectory.
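Given the per-image endpoint positions, the angular quantities in this list follow from finite differences; a minimal sketch, assuming a fixed frame rate and an in-plane angle definition that are not specified in the patent:

```python
import math

def segment_angles(endpoints):
    """In-plane rotation angle (degrees) of the limb segment in each
    frame, from (proximal, distal) endpoint pairs in image coordinates."""
    return [math.degrees(math.atan2(d[1] - p[1], d[0] - p[0]))
            for p, d in endpoints]

def angular_velocity(angles_deg, fps):
    """Finite-difference angular velocity in degrees per second, from a
    per-frame angle series sampled at `fps` frames per second."""
    return [(b - a) * fps for a, b in zip(angles_deg, angles_deg[1:])]
```

Applying `angular_velocity` to its own output would give a finite-difference angular acceleration; the rotation angle itself is simply the difference between the first and last entries of the angle series.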
The device proposed by the invention can also be applied to measure the movement of the lower limbs. The estimation of the movement of the thigh, the lower leg or the whole leg can be facilitated in a similar way to that mentioned above, where the thigh corresponds to the upper arm, the lower leg corresponds to the forearm, the hip corresponds to the shoulder, the knee corresponds to the elbow, and the foot corresponds to the hand.
Fig. 5 shows a method for continuous motion tracking of a limb; the method 100 comprises the following steps:
Step S100: aligning the orientation space with the image space by determining the angular difference between the coordinate system of the orientation space and the coordinate system of the image space, wherein the orientation space corresponds to the space in which the orientation is derived.
Step S101: measuring the orientation of the part of the limb, the orientation being derived from the output of the motion sensor 110c attached to the part of the limb;
Step S102: capturing images of the moving part of the limb together with a marker attached at a predefined position 110a, 110b, 110d on the part of the limb, the position of the marker in each image captured by the camera 120 serving as a reference position;
Step S103: estimating the movement of the part of the limb by determining the spatial positions of both ends of that part, based on the reference positions, the orientation corresponding to each reference position, the predetermined length of the part of the limb and the predefined position.
If the image space is already aligned with the orientation space, step S100 is optional.
The method further relates to an additional step facilitating the compensation of the angular difference between the image space and the orientation space, the step comprising:
measuring the angular difference between the orientation space and the image space;
compensating the estimate of the movement of the part of the limb with the measured angular difference.
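The two compensation sub-steps above can be sketched as a rotation of the sensor-derived limb direction by the recorded offset. Modelling the residual misalignment as a single rotation about the gravity axis (the y axis in the Fig. 3 convention, shared by both spaces once the camera is levelled) is a simplifying assumption; the patent leaves the exact compensation scheme open.

```python
import math

def rotation_about_y(deg):
    """3x3 rotation matrix about the y axis (the gravity axis in the
    Fig. 3 convention of this document)."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def compensate_direction(limb_dir, offset_deg):
    """Rotate a sensor-derived limb direction vector from orientation
    space into image space, undoing the recorded angular difference."""
    R = rotation_about_y(-offset_deg)
    return tuple(sum(R[i][j] * limb_dir[j] for j in range(3))
                 for i in range(3))
```

With the direction corrected in this way, the endpoint derivation can proceed in image-space coordinates even when the camera was never physically re-aligned.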
Other modifications to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the description and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims (15)

1. A device for motion tracking of at least a part of a limb, the limb comprising two straight parts connected by a joint, the device comprising:
a motion sensor (110c) attached to the part of the limb to measure the orientation of the part of the limb, the orientation being derived from the output of the motion sensor (110c);
a camera (120) for capturing images of the moving part of the limb together with a marker attached at a predefined position (110a, 110b, 110d), the position of the marker in each image captured by the camera (120) serving as a reference position;
a processor for estimating the movement of the part of the limb by determining the spatial positions, in the image space, of both ends of the part of the limb, based on the reference positions, the orientation corresponding to each reference position, the predetermined length of the part of the limb and the predefined position;
wherein the limb is an arm or a leg.
2. device according to claim 1, wherein, the mark be attached to the limbs the part it is respective A mark in multiple marks that can be distinguished of predefined position (110a, 110b, 110d), to ensure that at least one mark exists It can be observed in each image captured by the camera (120) during the motion.
3. the device according to any one of claim 1 or claim 2, in addition to alignment unit, the calibration is single Member by determine be orientated space coordinate system and described image space coordinate system between angular difference come make it is described orientation space with Described image spacial alignment, wherein, the orientation space corresponds to the derived space being orientated of institute.
4. device according to claim 3, wherein, the alignment unit includes the compass that be attached to the camera.
5. The apparatus according to claim 3, wherein the calibration unit comprises a motion sensor to be attached to the camera.
6. The apparatus according to claim 5, wherein the processor further compensates the estimation of the movement of the part of the limb with the angular difference measured by the motion sensor (110c).
7. The apparatus according to any one of claims 1 to 6, wherein the length of the part of the limb is derived from the length of the limb based on a statistical ratio.
8. The apparatus according to any one of claims 1 to 7, wherein the spatial positions are positions in the image space.
9. The apparatus according to claim 8, wherein the processor further estimates the movement in a real space by mapping the spatial positions in the image space to the real space based on a predetermined mapping rule.
10. The apparatus according to any one of claims 1 to 9, wherein the movement is one of the following: a rotation angle, an angular velocity, an angular acceleration, a linear velocity and a movement trajectory.
11. The apparatus according to any one of claims 1 to 9, wherein the motion sensor comprises two or more of the following: an accelerometer, a magnetometer and a gyroscope.
12. A method for motion tracking of at least a portion of a limb, the limb comprising two straight parts connected by a joint, the method comprising the following steps:
measuring (S101) an orientation of a part of the limb, the orientation being derived from the output of a motion sensor (110c) attached to the part of the limb;
capturing (S102) images of the part of the limb in motion and of a marker attached to a predefined position (110a, 110b, 110d), the position of the marker in each image captured by the camera (120) serving as a reference position;
estimating (S103) the movement of the part of the limb by determining the spatial positions of both ends of the part of the limb in an image space, based on the reference positions, the orientation corresponding to each reference position, a predetermined length of the part of the limb and the predefined position;
wherein the limb is an arm or a leg.
13. The method according to claim 12, further comprising the step of:
aligning (S100) an orientation space with the image space by determining an angular difference between a coordinate system of the orientation space and a coordinate system of the image space, wherein the orientation space corresponds to the space in which the orientation is derived.
14. The method according to claim 12, further comprising the steps of:
measuring an angular difference between the orientation space and the image space; and
compensating the estimation of the movement of the part of the limb with the measured angular difference.
15. A computer program product comprising computer program code means which, when run on a computer, cause the computer to carry out the steps of the method according to claim 12.
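The claimed estimation can be illustrated with a short sketch (not part of the claims): given a marker's reference position in an image, the sensor-derived orientation compensated by the calibrated angular difference (claims 3, 13 and 14), and the predetermined part length (claim 7), the positions of both ends of the limb part in image space follow directly. The planar simplification, function names and the default ratio below are assumptions of this illustration, not values from the patent.

```python
import math

# Illustrative sketch only, not part of the claims. It reduces the claimed
# estimation to a planar case: one marker reference position, one
# sensor-derived orientation, and a predetermined part length together fix
# both ends of the limb part in image space.

def part_length_from_limb(limb_length, ratio=0.45):
    """Claim 7: derive the part length from the limb length via a
    statistical ratio (0.45 is a placeholder, not a patent value)."""
    return ratio * limb_length

def estimate_endpoints(reference_px, orientation_deg, part_length_px,
                       angular_difference_deg=0.0):
    """Claims 1/12: return (proximal, distal) endpoint positions of the
    limb part in image space. angular_difference_deg is the calibration
    offset between orientation space and image space (claims 3, 13, 14)."""
    # Compensate the sensor orientation with the calibrated angular
    # difference so that it is expressed in image-space coordinates.
    theta = math.radians(orientation_deg - angular_difference_deg)
    x, y = reference_px
    # The distal end lies one part length away along the compensated
    # orientation, which determines both ends of the part.
    return (x, y), (x + part_length_px * math.cos(theta),
                    y + part_length_px * math.sin(theta))

def angular_velocity(prev_deg, curr_deg, dt):
    """Claim 10 lists angular velocity among the trackable movements."""
    return (curr_deg - prev_deg) / dt

# Example: marker at pixel (100, 200), sensor reports 90 degrees,
# forearm assumed to span 50 px in the image.
proximal, distal = estimate_endpoints((100.0, 200.0), 90.0, 50.0)
```

Mapping these image-space positions to real space (claim 9) would additionally require the predetermined mapping rule, e.g. a camera calibration, which is outside this sketch.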
CN201680039901.9A 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least part of a limb Expired - Fee Related CN107847187B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN2015083491 2015-07-07
CNPCT/CN2015/083491 2015-07-07
EP15189716.2 2015-10-14
EP15189716 2015-10-14
PCT/EP2016/065266 WO2017005591A1 (en) 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least a portion of a limb

Publications (2)

Publication Number Publication Date
CN107847187A true CN107847187A (en) 2018-03-27
CN107847187B CN107847187B (en) 2021-08-17

Family ID=56321940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680039901.9A Expired - Fee Related CN107847187B (en) 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least part of a limb

Country Status (2)

Country Link
CN (1) CN107847187B (en)
WO (1) WO2017005591A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021129487A1 (en) * 2019-12-25 2021-07-01 Huawei Technologies Co., Ltd. Method and apparatus for determining position of limb node of user, medium and system

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
EP3621083A1 (en) * 2018-09-10 2020-03-11 Koninklijke Philips N.V. Rehabilitation device and method
GB201813125D0 (en) 2018-08-10 2018-09-26 Univ Oxford Innovation Ltd Apparatus and method for determining an indication of blood flow
CN109394232B (en) * 2018-12-11 2023-06-23 Shanghai Jinshi Robot Technology Co., Ltd. Exercise capacity monitoring system and method based on wolf scale

Citations (8)

Publication number Priority date Publication date Assignee Title
US6377401B1 (en) * 1999-07-28 2002-04-23 Bae Systems Electronics Limited Head tracker system
CN1877340A (en) * 2005-06-09 2006-12-13 Sony Corporation Activity recognition apparatus, method and program
KR20070120443A (en) * 2006-06-19 2007-12-24 Sony Corporation Motion capture apparatus and method, and motion capture program
US20100144414A1 (en) * 2008-12-04 2010-06-10 Home Box Office, Inc. System and method for gathering and analyzing objective motion data
CN102176888A (en) * 2008-08-25 2011-09-07 University of Zurich, Faculty of Mathematics and Natural Sciences Adjustable virtual reality system
CN102781320A (en) * 2009-10-30 2012-11-14 Medical Motion Ltd. Systems and methods for comprehensive human movement analysis
CN103099623A (en) * 2013-01-25 2013-05-15 Institute of Automation, Chinese Academy of Sciences Extraction method of kinesiology parameters
CN104106262A (en) * 2012-02-08 2014-10-15 Microsoft Corporation Head pose tracking using a depth camera

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
CA2605016A1 (en) * 2005-04-18 2006-10-26 Bioness Development, Llc System and related method for determining a measurement between locations on a body
US8531396B2 (en) * 2006-02-08 2013-09-10 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
JP2009543649A (en) * 2006-07-19 2009-12-10 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Health management device
DE102007033486B4 (en) * 2007-07-18 2010-06-17 Metaio Gmbh Method and system for mixing a virtual data model with an image generated by a camera or a presentation device
US8696458B2 (en) * 2008-02-15 2014-04-15 Thales Visionix, Inc. Motion tracking system and method using camera and non-camera sensors
US9119569B2 (en) * 2011-11-08 2015-09-01 Nanyang Technological University Method and apparatus for calibrating a motion tracking system
RU2015133516A (en) 2013-01-11 2017-02-17 Koninklijke Philips N.V. System and method for assessing the range of motion of a subject

Cited By (3)

Publication number Priority date Publication date Assignee Title
WO2021129487A1 (en) * 2019-12-25 2021-07-01 Huawei Technologies Co., Ltd. Method and apparatus for determining position of limb node of user, medium and system
CN113111678A (en) * 2019-12-25 2021-07-13 Huawei Technologies Co., Ltd. Method, device, medium and system for determining position of limb node of user
CN113111678B (en) * 2019-12-25 2024-05-24 Huawei Technologies Co., Ltd. Method, device, medium and system for determining position of limb node of user

Also Published As

Publication number Publication date
WO2017005591A1 (en) 2017-01-12
CN107847187B (en) 2021-08-17

Similar Documents

Publication Publication Date Title
JP7342864B2 (en) Positioning program, positioning method, and positioning device
Roetenberg Inertial and magnetic sensing of human motion
Roetenberg et al. Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors
US9401025B2 (en) Visual and physical motion sensing for three-dimensional motion capture
WO2016183812A1 (en) Mixed motion capturing system and method
US20110218458A1 (en) Mems-based method and system for tracking a femoral frame of reference
Jakob et al. Estimation of the knee flexion-extension angle during dynamic sport motions using body-worn inertial sensors
CN104834917A (en) Mixed motion capturing system and mixed motion capturing method
JP6145072B2 (en) Sensor module position acquisition method and apparatus, and motion measurement method and apparatus
JP6573156B2 (en) Data analysis apparatus, data analysis method, and data analysis program
CN107847187A (en) Apparatus and method for carrying out motion tracking at least part of limbs
Meng et al. Biomechanical model-based displacement estimation in micro-sensor motion capture
CN110609621B (en) Gesture calibration method and human motion capture system based on microsensor
CN101755191A Device and method for determining the path of an object moving in two dimensions
WO2017004403A1 (en) Biomechanical information determination
Madrigal et al. 3D motion tracking of the shoulder joint with respect to the thorax using MARG sensors and data fusion algorithm
Abbate et al. Development of a MEMS based wearable motion capture system
Ambrósio et al. Spatial reconstruction of human motion by means of a single camera and a biomechanical model
Qiu et al. Heterogeneous data fusion for three-dimensional gait analysis using wearable MARG sensors
Lin et al. Development of an ultra-miniaturized inertial measurement unit WB-3 for human body motion tracking
Yuan et al. Method to calibrate the skeleton model using orientation sensors
JP2021157289A (en) Animation generation device, animation generation method and program
JP6205387B2 (en) Method and apparatus for acquiring position information of virtual marker, and operation measurement method
KR20120028416A (en) Integration motion sensing device
JP5424224B2 (en) Relative angle estimation system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210817