CN111002292B - Robot arm humanoid motion teaching method based on similarity measurement - Google Patents
- Publication number: CN111002292B
- Application number: CN201911263108.9A
- Authority
- CN
- China
- Prior art keywords
- joint
- mechanical arm
- arm
- human
- elbow
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion)
Classifications
- B — Performing operations; transporting
- B25J — Manipulators; chambers provided with manipulation devices
- B25J9/0081 — Programme-controlled manipulators with master teach-in means
- B25J9/161 — Programme controls: hardware, e.g. neural networks, fuzzy logic, interfaces, processor
- B25J9/1612 — Programme controls characterised by the hand, wrist, grip control
- B25J9/1633 — Programme controls characterised by the control loop: compliant, force, torque control, e.g. combined with position control
Abstract
The invention discloses a similarity-measurement-based method for teaching humanoid motion to a mechanical arm. The method obtains the elbow-joint angle of a bent human arm, defines the elbow joint of a multi-degree-of-freedom mechanical arm through a similarity measure between the human arm and the mechanical arm, and then computes the remaining joint angles of the mechanical arm to realize humanoid motion. The method simplifies operation, enables humanoid motion for multi-degree-of-freedom mechanical arms whose joint number and link lengths are not fixed in advance, and packages the calling program into a functional module that suits most robot platforms, improving the robot's universality.
Description
Technical Field
The invention belongs to the technical field of human-robot teaching, and particularly relates to a similarity-measurement-based humanoid-motion teaching method for a multi-degree-of-freedom mechanical arm.
Background
At present, human-robot teaching mainly records human motion directly. When only motion capture is of interest, a variety of existing motion-tracking systems based on vision, exoskeletons, or other wearable motion sensors can be used. These external devices, which track the motion of the body, return accurate measurements of joint angular displacement for various whole-body motion tasks. Their advantage is that they allow the person to move freely, but the joint-matching problem must still be solved well. Usually this is done through an explicit mapping between human and robot joints; if the robot's structure differs significantly from the human's, however, explicit mapping becomes quite difficult. The main reason is that the elbow joint of a multi-degree-of-freedom mechanical arm cannot be defined precisely, which makes human-robot joint mapping hard.
With the development of robotics, more and more industries place higher demands on humanoid robot technology. In particular, using a non-humanoid mechanical arm to imitate human motion is highly valuable for improving the robot's universality and simplifying computation.
Disclosure of Invention
Aiming at the technical problem that current human-robot teaching modes limit the universality of the robot, the invention provides a similarity-measurement-based humanoid-motion teaching method for a mechanical arm.
In order to achieve the technical purpose, the invention adopts the following technical scheme:
the invention provides a robot arm humanoid motion teaching method based on similarity measurement, which comprises the following steps:
acquiring the spatial coordinates of the shoulder, elbow, and wrist of the human arm with a motion capture system, and determining the angle of each joint of the bent human arm from these coordinates; then, with the elbow joint of the mechanical arm determined in advance from the similarity between the mechanical arm and the human arm, calculating the remaining joint angles of the mechanical arm to realize humanoid motion.
Further, the method of predetermining the elbow joint of the arm is as follows:
taking the mechanical-arm base as the starting point, changing the joint angles one by one from the first joint to the last so that each candidate joint's angle matches the preset elbow-joint angle of the bent human arm, and evaluating the similarity between the mechanical arm and the human arm for each changed joint;
and, using a motion-quantification comparison algorithm, setting the changed mechanical-arm joint with the highest similarity as the mechanical arm's elbow joint.
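The joint-by-joint search described above can be sketched as follows. This is an illustrative outline, not the patent's code: `find_elbow_joint`, the `similarity` callback, and the toy scores are all hypothetical names standing in for the similarity evaluation defined later.

```python
# Sketch of the elbow-joint search: starting from the base, each candidate
# joint k is bent to the human elbow angle theta_elbow, and the joint whose
# resulting motion is most similar to the human arm's is declared the
# mechanical arm's "elbow".

def find_elbow_joint(num_joints, theta_elbow, similarity):
    """similarity(k, theta) -> percentage in [0, 100] for bending joint k."""
    best_joint, best_score = None, -1.0
    for k in range(num_joints):          # joints ordered from the base outward
        score = similarity(k, theta_elbow)
        if score > best_score:
            best_joint, best_score = k, score
    return best_joint, best_score

# Toy stand-in: pretend joint 3 of a 7-DoF arm imitates the elbow best.
toy_scores = [10.0, 35.0, 60.0, 92.5, 71.0, 20.0, 5.0]
joint, score = find_elbow_joint(7, 120.0, lambda k, th: toy_scores[k])
```

In practice `similarity` would run the full motion-quantification comparison of S2; the loop itself only selects the argmax.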
Furthermore, a motion capture system acquires the spatial coordinates of the human shoulder, elbow, and wrist, and the elbow-joint angle of the bent arm is determined from these coordinates by the law of cosines:

d(j_a, j_b) = ||j(x_a, y_a, z_a) - j(x_b, y_b, z_b)||

θ_Elbow = arccos[(d(j_s, j_e)² + d(j_e, j_w)² - d(j_s, j_w)²) / (2·d(j_s, j_e)·d(j_e, j_w))]

where d(j_a, j_b) is the Euclidean distance between joints j_a and j_b; θ_Elbow is the elbow-joint angle of the bent arm; j(x_a, y_a, z_a) and j(x_b, y_b, z_b) are the spatial coordinates of joints j_a and j_b; and j_s, j_e, j_w denote the shoulder, elbow, and wrist joints.
Further, the specific method for predetermining the elbow joint of the mechanical arm comprises the following steps:
the acquisition mechanical arm changes the key frame image of each joint and simultaneously acquires the elbow bending angle thetaElbowMotion key frame images of time.
Respectively carrying out quantization processing on the collected key frame images of the motion of the human arm and the multi-degree-of-freedom mechanical arm:
obtaining a global motion transformation matrix of the human arm and the multi-degree-of-freedom mechanical arm;
obtaining a local motion transformation matrix of the human arm and the multi-degree-of-freedom mechanical arm;
calculating the bending-joint angles of the human limb and of the multi-degree-of-freedom mechanical arm, and determining the joint direction by checking these angles in each frame;
using a comparison algorithm to calculate the differences in the global motion transformation matrix, the local motion transformation matrix, and the bending-joint angle between the human arm and the mechanical arm; evaluating their similarity against given thresholds for each of these three quantities; and determining the mechanical arm's elbow-joint position from the similarity.
Still further, the method for acquiring the key frame image of each joint changed by the mechanical arm comprises the following steps:
key frames are extracted by adaptive sampling, and the number N of key frames for each group of actions is found with the K-means clustering algorithm; the frame at angle 0 is selected as the initial frame and the frame at elbow-joint angle θ_Elbow as the end frame.
Further, the global and local transformation matrices are calculated in the same way; only the chosen origin differs. The calculation is as follows:

Set_α = {p_i^α} is the set of start-frame three-dimensional nodes of the shoulder, elbow, and wrist, where node 1 is the shoulder, node 2 the elbow, and node 3 the wrist; Set_β = {p_i^β} is the corresponding set of end-frame nodes; p_i^α is the three-dimensional coordinate of the i-th node in the start frame, and p_i^β that of the i-th node in the end frame; Cent_α is the centroid of Set_α, and Cent_β the centroid of Set_β;

H = Σ_i (p_i^α - Cent_α)(p_i^β - Cent_β)^T

[M, N, O] = SVD(H)

R = O·M^T

T = Cent_β - R·Cent_α

where H is the covariance matrix; SVD is the singular value decomposition, which factors H into the three matrices M, N, and O; the rotation matrix R is computed from O and M; and T is the translation of the points along the x, y, and z axes;

TF = [R T; 0 1] is the transformation matrix of the joint motion from Set_α to Set_β;

θ_X, θ_Y, θ_Z = Euler(R)

where Euler computes from the rotation matrix R its rotation angles θ_X, θ_Y, and θ_Z about the X, Y, and Z axes respectively.
Further, for the global motion transformation of the human arm, the right arm is used to judge similarity and the neck is taken as the origin; the global motion transformation of the multi-degree-of-freedom mechanical arm takes the base coordinate frame as the origin.
Further, the local motion transformation matrices of the human arm and of the multi-degree-of-freedom mechanical arm each take the parent joint point of the joint concerned as the origin.
Still further, the similarity of the global motion transformation matrix, the local motion transformation matrix, and the bending-joint angle is evaluated against their given thresholds as follows:
(1) for the N frames of motion images, calculate, between joint j of the human arm and of the multi-degree-of-freedom mechanical arm after quantization of the i-th frame, the global joint translation G_dt, the local joint translation L_dt, the global rotation angle G_dθ, and the local rotation angle L_dθ:

G_dt = ||T_G^arm - T_G^mech||

L_dt = ||T_L^arm - T_L^mech||

G_dθ = ||R_G^arm - R_G^mech||

L_dθ = ||R_L^arm - R_L^mech||

where T_G^arm and T_G^mech are the global translation matrices of joint j of the human arm and of the mechanical arm after quantization of the i-th frame; T_L^arm and T_L^mech the corresponding local translation matrices; R_G^arm and R_G^mech the corresponding global rotation matrices; and R_L^arm and R_L^mech the corresponding local rotation matrices;
(2) set, according to the required strictness of similarity, the global rotation-angle threshold G_θ, the local rotation-angle threshold L_θ, the global joint-translation threshold G_t, and the local joint-translation threshold L_t for the j-th joint.

When G_dθ ≤ G_θ: GFrame1_j% = GFrame% + G_θj%

When G_dt ≤ G_t: GFrame2_j% = GFrame1_j% + G_tj%

When L_dθ ≤ L_θ: LFrame1_j% = LFrame% + L_θj%

When L_dt ≤ L_t: LFrame2_j% = LFrame1_j% + L_tj%

where GFrame% is the initial global percentage, preferably set to 0; LFrame% is the initial local percentage, preferably set to 0; GFrame1_j% and GFrame2_j% are global percentages; LFrame1_j% and LFrame2_j% are local percentages; G_θj% is the global rotation-angle percentage, L_θj% the local rotation-angle percentage, G_tj% the global translation percentage, and L_tj% the local translation percentage; the percentages are set by weight and satisfy G_θj% + G_tj% = 100% and L_θj% + L_tj% = 100%.
Calculate, after quantization of the i-th frame of the human arm and of the multi-degree-of-freedom mechanical arm, the difference d_1 between the human shoulder angle a_shoulder and the mechanical-arm shoulder-joint angle b_shoulder, and the difference d_2 between the human elbow angle a_elbow and the mechanical-arm elbow-joint angle b_elbow:

d_1 = ||a_shoulder - b_shoulder||

d_2 = ||a_elbow - b_elbow||
When d_1 ≤ θ_1: AngFrame_1% = AngFrame% + a1%

When d_2 ≤ θ_2: AngFrame_2% = AngFrame_1% + a2%

where θ_1 is the shoulder-joint angle threshold, θ_2 the elbow-joint angle threshold, and a1%, a2% are angle percentages with a1% + a2% = 100%, set by weight; AngFrame% is the initial angle percentage, and AngFrame_1% and AngFrame_2% are the accumulated angle percentages.

When the angle percentage AngFrame_2% > 50%, the similarity percentage between the two frames is

Frame_i% = ((GFrame_2% + LFrame_2%) + AngFrame_2%)/2

If AngFrame_2% ≤ 50%, then Frame_i% = 0. The total similarity percentage of the two motions, Percent_sim, is

Percent_sim = (Σ_{i=1}^{N} Frame_i%)/N
the beneficial technical effects are as follows:
the invention determines the elbow joint of the multi-degree-of-freedom mechanical arm by a similarity measurement method in image processing, realizes the humanoid motion of the multi-degree-of-freedom mechanical arm with undetermined joint number and joint length, encapsulates a calling program into a functional module, is suitable for most robot platforms, and improves the universality of the robot. The accuracy of judging the similarity of two groups of similar actions is improved through an improved motion quantization comparison algorithm. The similarity percentage of the human arm and the mechanical arm is calculated in a combined mode through three modes of a global transformation matrix, a local transformation matrix and a bent joint angle, so that the calculation precision can be obviously improved, and errors can be reduced. The elbow joint of the multi-degree-of-freedom mechanical arm is defined by the method, and the human-computer teaching technology is combined, so that the simulation of the mechanical arm with any degree of freedom on the actions of a human body can be realized, and the technical requirements of the human-computer teaching technology on the mechanical arm are greatly reduced.
Drawings
FIG. 1 is a block diagram of a design flow of a specific embodiment of the present invention;
FIG. 2 is a comparison graph of elbow joint similarity for an embodiment of the present invention;
FIG. 3 illustrates global and local motion coordinates of a human arm and robotic arm in an embodiment of the present invention;
fig. 4 is a framework of a motion quantization and comparison method in an embodiment of the invention.
Detailed Description
The present invention will be further described with reference to the accompanying drawings. The following examples merely illustrate the technical solutions of the present invention and do not limit its scope of protection or application.
As shown in FIG. 1, the invention uses a Kinect (other motion capture systems may be used in other embodiments) to obtain the spatial coordinates of the shoulder, elbow, and wrist of the human arm; the elbow-joint angle is then obtained by the law of cosines. Taking the multi-degree-of-freedom mechanical-arm base as the starting point, the first joint angle is changed to match the human elbow-joint angle; the arm's joints are then selected one by one in joint order, an improved motion-quantization comparison algorithm computes how closely the mechanical arm imitates the human arm's motion for each candidate, and the selected joint with the highest similarity is defined as the mechanical arm's elbow joint. The Kinect then recognizes the motion state of the human arm, and the remaining joint angles of the multi-degree-of-freedom mechanical arm are calculated to realize humanoid motion. The method performs elbow-joint definition and teaching for a multi-degree-of-freedom mechanical arm based on similarity judgment, and comprises the following steps:
and S1, acquiring the elbow joint angle of the bent human arm.
And S1-1, identifying the space coordinates of three points of the shoulder, elbow and wrist on the arm of the human hand by using Kinect.
S1-2, obtaining the elbow joint angle theta of the bent arm of the human hand through the cosine lawElbow。
d(ja,jb)=||j(xa,ya,za)-j(xb,yb,zb)||
d(ja,jb) Is a joint jaAnd joint jbEuclidean distance between; thetaElbowThe elbow joint angle after the arm is bent;j(xa,ya,za) Is a joint jaSpatial coordinates of j (x)b,yb,zb) Is a joint jbThe spatial coordinates of (a).
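The law-of-cosines computation of S1-2 can be sketched directly from three captured joint coordinates. The function name and test points are illustrative, not the patent's:

```python
import numpy as np

# Elbow angle from the shoulder/elbow/wrist coordinates returned by the
# motion capture system, via the law of cosines on the triangle they form.
def elbow_angle(shoulder, elbow, wrist):
    s, e, w = (np.asarray(p, dtype=float) for p in (shoulder, elbow, wrist))
    d_se = np.linalg.norm(s - e)          # d(j_shoulder, j_elbow)
    d_ew = np.linalg.norm(e - w)          # d(j_elbow, j_wrist)
    d_sw = np.linalg.norm(s - w)          # d(j_shoulder, j_wrist)
    cos_theta = (d_se**2 + d_ew**2 - d_sw**2) / (2.0 * d_se * d_ew)
    # clip guards against rounding slightly outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Right angle at the elbow: shoulder above the elbow, wrist in front of it.
theta = elbow_angle([0, 1, 0], [0, 0, 0], [1, 0, 0])
```

With these toy coordinates the two arm segments are perpendicular, so `theta` comes out at 90 degrees.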
S2, determine the position of the mechanical arm's elbow joint through similarity, as shown in FIG. 2.
S2-1, quantize the motion of the human arm and of the multi-degree-of-freedom mechanical arm. Taking the mechanical-arm base as the starting point, select the arm's joints one by one in joint order and change each to the acquired human elbow-joint angle θ_Elbow, as shown in FIG. 2. The skeletal data are preprocessed by dimension reduction, the video sequence is decomposed into image frames, and key images are selected. Key frames are extracted by adaptive sampling, and the number N of key frames for each group of actions is found with the K-means clustering algorithm. The frame at angle 0 is selected as the initial frame and the frame at angle θ_Elbow as the end frame. For frames i to i+1 (1 < i ≤ n), the motion of the human arm and of the multi-degree-of-freedom mechanical arm is quantized as

A = R·B + T

where R is the rotation matrix, T is the translation vector, A is the start of the motion, and B is the end of the motion.
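The adaptive key-frame selection described above can be sketched as follows. This is a minimal stand-in, not the patent's implementation: it clusters the frames with a plain K-means loop and keeps the real frame nearest each cluster centre; the function name and the 1-D ramp data are illustrative.

```python
import numpy as np

# Key-frame selection sketch: cluster the frames of a motion sequence with
# K-means, then keep the frame nearest each cluster centre, in time order.
def select_keyframes(frames, n_keyframes, iters=20, seed=0):
    frames = np.asarray(frames, dtype=float)
    rng = np.random.default_rng(seed)
    centers = frames[rng.choice(len(frames), n_keyframes, replace=False)]
    for _ in range(iters):
        # assign each frame to its nearest centre, then re-estimate centres
        labels = np.argmin(
            np.linalg.norm(frames[:, None] - centers[None], axis=2), axis=1)
        for k in range(n_keyframes):
            if np.any(labels == k):
                centers[k] = frames[labels == k].mean(axis=0)
    # keep the real frame closest to each centre, sorted by frame index
    idx = sorted({int(np.argmin(np.linalg.norm(frames - c, axis=1)))
                  for c in centers})
    return idx

# 60 frames of a 1-D "joint angle" ramp from 0 to theta_Elbow = 118 degrees
angles = np.linspace(0.0, 118.0, 60)[:, None]
key_idx = select_keyframes(angles, n_keyframes=5)
```

The first and last selected frames approximate the patent's initial frame (angle 0) and end frame (angle θ_Elbow); a real feature vector per frame would replace the 1-D angle.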
S2-1-1, find the transformation matrix of the global motion of the human arm and of the multi-degree-of-freedom mechanical arm. For the global transformation, the human right arm is used to judge similarity with the neck as the origin, and the mechanical arm takes the base coordinate frame as the origin, as shown in FIG. 3.

Global motion transformation matrix of the human arm:

Set_α = {p_i^α} is the set of start-frame three-dimensional nodes of the shoulder, elbow, and wrist of the human arm; Set_β = {p_i^β} is the corresponding set of end-frame nodes; p_i^α is the three-dimensional coordinate of the i-th node in the start frame, and p_i^β that of the i-th node in the end frame; Cent_α is the centroid of Set_α, and Cent_β the centroid of Set_β;

H = Σ_i (p_i^α - Cent_α)(p_i^β - Cent_β)^T
[M, N, O] = SVD(H)

R = O·M^T

T = Cent_β - R·Cent_α

where H is the covariance matrix; SVD is the singular value decomposition, which factors H into the three matrices M, N, and O. The rotation matrix R is computed from O and M, and T = [t_x, t_y, t_z]^T represents the motion of a point along the x, y, and z axes, where t_x, t_y, and t_z are the translations along the x-, y-, and z-axes respectively;
TF = [R T; 0 1] is the transformation matrix of the joint motion from Set_α to Set_β.

θ_X, θ_Y, θ_Z = Euler(R)

where Euler computes from the rotation matrix R its rotation angles θ_X, θ_Y, and θ_Z about the X, Y, and Z axes respectively.
The global and local motion transformation matrices of the multi-degree-of-freedom mechanical arm are calculated in the same way as those of the human arm, replacing the three-dimensional node coordinates of the i-th node in the human-arm key frames with those of the i-th node in the mechanical-arm key frames.
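The SVD-based recovery of R and T above is the standard rigid-alignment (Kabsch) procedure, and can be sketched for the three shoulder/elbow/wrist points of a start and an end frame. The function name and test data are illustrative; the reflection guard is a standard addition not spelled out in the patent text:

```python
import numpy as np

# Recover R and T such that end ~= R @ start + T, per the SVD construction:
# H is the centred covariance, [M, N, O] = SVD(H), R = O * M^T,
# T = Cent_beta - R * Cent_alpha.
def rigid_transform(set_a, set_b):
    A, B = np.asarray(set_a, float), np.asarray(set_b, float)
    cent_a, cent_b = A.mean(axis=0), B.mean(axis=0)
    H = (A - cent_a).T @ (B - cent_b)          # covariance matrix H
    M, _, Ot = np.linalg.svd(H)                # numpy returns M, singular values, O^T
    R = Ot.T @ M.T                             # R = O * M^T
    if np.linalg.det(R) < 0:                   # guard against reflections
        Ot[-1] *= -1
        R = Ot.T @ M.T
    T = cent_b - R @ cent_a                    # T = Cent_beta - R * Cent_alpha
    return R, T

# 90-degree rotation about Z plus a translation, applied to three joints
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
start = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0]])
end = start @ Rz.T + np.array([0.5, 0.2, 0.0])
R, T = rigid_transform(start, end)
theta_z = np.degrees(np.arctan2(R[1, 0], R[0, 0]))   # Euler(R), Z component
```

The recovered `R` and `T` reproduce the synthetic rotation and translation, and `theta_z` recovers the 90-degree rotation angle about Z.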
S2-1-2, find the transformation matrix of the local motion; the local transformation matrix takes the parent joint as the origin. As shown in FIG. 3, j1, j2, j3, and j4 correspond respectively to the neck, shoulder, elbow, and wrist of the human body, and to the base, joints 1 to 3, joint 4, and joints 5 to 7 of the multi-degree-of-freedom mechanical arm; j3 is the parent joint point of j4.
S2-1-3, calculate the bending-joint angles of the human limb and of the multi-degree-of-freedom mechanical arm, determining the joint direction by checking these angles in each frame. As in S1-2, the law of cosines gives the angle of the triangle formed by the arm or the mechanical arm from its vertex coordinates; the bending-joint angle is calculated in each frame:

d(j_a, j_b) = ||j(x_a, y_a, z_a) - j(x_b, y_b, z_b)||

In the same way, a_shoulder, b_shoulder, a_elbow, and b_elbow can be calculated, where a_shoulder is the human arm's shoulder angle; b_shoulder the mechanical arm's shoulder-joint angle; a_elbow the human arm's elbow angle; and b_elbow the mechanical arm's elbow-joint angle.
S2-2, determine the elbow-joint position of the multi-degree-of-freedom mechanical arm through similarity measurement. As shown in FIG. 4, after each change of a mechanical-arm joint angle, an improved motion-quantification comparison algorithm calculates the degree of similarity between the mechanical arm's motion and the human arm's motion: the comparison algorithm computes the distance between the two motion metrics, and their similarity is then evaluated against a given threshold for each metric. The state with the highest degree of similarity is selected, and the joint changed in that state is taken as the elbow joint of the multi-degree-of-freedom mechanical arm imitating the arm motion.
S2-2-1, for the N frames of motion images, calculate, between joint j of the human arm and of the multi-degree-of-freedom mechanical arm after quantization of the i-th frame, the global joint translation G_dt, the local joint translation L_dt, the global rotation angle G_dθ, and the local rotation angle L_dθ:

G_dt = ||T_G^arm - T_G^mech||

L_dt = ||T_L^arm - T_L^mech||

G_dθ = ||R_G^arm - R_G^mech||

L_dθ = ||R_L^arm - R_L^mech||

where T_G^arm and T_G^mech are the global translation matrices of joint j of the human arm and of the mechanical arm after quantization of the i-th frame; T_L^arm and T_L^mech the corresponding local translation matrices; R_G^arm and R_G^mech the corresponding global rotation matrices; and R_L^arm and R_L^mech the corresponding local rotation matrices.
S2-2-2, set, according to the required strictness of similarity, the global rotation-angle threshold G_θ, the local rotation-angle threshold L_θ, the global joint-translation threshold G_t, and the local joint-translation threshold L_t for joint j.

When G_dθ ≤ G_θ: GFrame1_j% = GFrame% + G_θj%

When G_dt ≤ G_t: GFrame2_j% = GFrame1_j% + G_tj%

When L_dθ ≤ L_θ: LFrame1_j% = LFrame% + L_θj%

When L_dt ≤ L_t: LFrame2_j% = LFrame1_j% + L_tj%

where GFrame% = 0 is the initial global percentage and LFrame% = 0 the initial local percentage; GFrame1_j% and GFrame2_j% are global percentages, and LFrame1_j% and LFrame2_j% local percentages; G_θj% is the global rotation-angle percentage, L_θj% the local rotation-angle percentage, G_tj% the global translation percentage, and L_tj% the local translation percentage; the percentages are set by weight and satisfy G_θj% + G_tj% = 100% and L_θj% + L_tj% = 100%.
Calculate, after quantization of the i-th frame of the human arm and of the multi-degree-of-freedom mechanical arm, the difference d_1 between the human shoulder angle a_shoulder and the mechanical-arm shoulder-joint angle b_shoulder, and the difference d_2 between the human elbow angle a_elbow and the mechanical-arm elbow-joint angle b_elbow:

d_1 = ||a_shoulder - b_shoulder||

d_2 = ||a_elbow - b_elbow||
When d_1 ≤ θ_1: AngFrame_1% = AngFrame% + a1%

When d_2 ≤ θ_2: AngFrame_2% = AngFrame_1% + a2%

where θ_1 and θ_2 are the shoulder- and elbow-joint angle thresholds, and a1%, a2% are angle percentages with a1% + a2% = 100%, set by weight; AngFrame% is the initial angle percentage, and AngFrame_1% and AngFrame_2% are the accumulated angle percentages.

When the angle percentage AngFrame_2% > 50%, the similarity percentage between the two frames is

Frame_i% = ((GFrame_2% + LFrame_2%) + AngFrame_2%)/2

If AngFrame_2% ≤ 50%, then Frame_i% = 0. The total similarity percentage of the two motions, Percent_sim, is

Percent_sim = (Σ_{i=1}^{N} Frame_i%)/N
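The per-frame scoring above can be sketched in a few lines. This is a toy illustration with made-up percentage values, and the frame-averaging aggregation is an assumption about how the per-frame scores combine into the total:

```python
# Toy sketch of the per-frame similarity scoring: the accumulated global,
# local, and angle percentages of a frame are combined, and frames whose
# angle score is at most 50% count as dissimilar.

def frame_similarity(g_frame2, l_frame2, ang_frame2):
    """Frame_i% = ((GFrame_2% + LFrame_2%) + AngFrame_2%)/2, gated on angle."""
    if ang_frame2 <= 50.0:
        return 0.0
    return ((g_frame2 + l_frame2) + ang_frame2) / 2.0

def total_similarity(per_frame_percentages):
    """Average of Frame_i% over the N key frames (assumed aggregation)."""
    n = len(per_frame_percentages)
    return sum(per_frame_percentages) / n if n else 0.0

frames = [frame_similarity(40.0, 45.0, 80.0),   # similar frame
          frame_similarity(30.0, 20.0, 40.0)]   # angle score too low -> 0
percent_sim = total_similarity(frames)
```

Here the first frame scores 82.5% and the second is gated to 0, giving a total similarity of 41.25%; the candidate joint with the highest such total is chosen as the elbow.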
the method and the device jointly calculate the similarity percentage of the human arm and the mechanical arm through three calculation modes of the global transformation matrix, the local transformation matrix and the bent joint angle, can obviously improve the calculation precision and reduce errors. The elbow joint of the multi-degree-of-freedom mechanical arm is defined by the method, and the human-computer teaching technology is combined, so that the simulation of the mechanical arm with any degree of freedom on the actions of a human body can be realized, and the technical requirements of the human-computer teaching technology on the mechanical arm are greatly reduced.
S3, use the Kinect to identify the motion state of the human arm, and calculate the remaining joint angles of the multi-degree-of-freedom mechanical arm to realize humanoid motion.

S3-1, perform human-robot joint mapping: map the shoulder, elbow, and wrist joints of the human arm one-to-one onto those of the multi-degree-of-freedom mechanical arm.

S3-2, describe the arm motion a second time: the changes of the Kinect-recognized coordinates during arm motion are described in the mechanical arm's base coordinate system through translation and rotation transformations.
S3-3, perform joint prediction, using several earlier frames of captured joint data as training samples to repair joint-point loss caused by limb occlusion and similar effects. The skeletal-joint prediction formula is derived from the backward difference of the N-th order Taylor formula (without the remainder term):

x_{n+1|n} = x_n + Σ_{i=1}^{N} f^(i)(n)/i!

where x_{n+1|n} is the frame-(n+1) skeleton data predicted from the frame-n skeleton data and the preceding frames; N is the number of prediction samples; and f^(i)(n) is the i-th derivative in the Taylor formula.

The first, second, and third derivatives in the Taylor formula can be approximated by backward differences:

f^(1)(n) ≈ x_n - x_(n-1)

f^(2)(n) ≈ x_n - 2x_(n-1) + x_(n-2)

f^(3)(n) ≈ x_n - 3x_(n-1) + 3x_(n-2) - x_(n-3)

where x_n is the input n-th frame of skeleton data.
This predicts the skeleton data for the x coordinate of frame n+1; the other two coordinates are obtained in the same way. An improved lost-joint repair algorithm is also used to repair several consecutive missing joints or missing limb-end joints.
S3-4, combine the kinematic homogeneous-transformation method with the space-vector method, and solve each joint angle of the multi-degree-of-freedom mechanical arm by inverse kinematics. A jitter-elimination filter then removes the abnormal information produced when the Kinect fails to recognize the hand gesture, reducing jitter of the mechanical arm. Finally, the multi-degree-of-freedom mechanical arm realizes humanoid motion.
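The patent solves the joint angles with homogeneous transforms and an inverse-kinematics method; as a much-simplified stand-in (not the patent's solver), the classic planar two-link arm can be solved analytically for the shoulder and elbow angles reaching a target point:

```python
import numpy as np

# Analytic inverse kinematics for a planar two-link arm with link lengths
# l1, l2 reaching the point (x, y); an illustrative simplification of the
# multi-DoF inverse-kinematics step.
def two_link_ik(x, y, l1, l2):
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    c2 = np.clip(c2, -1.0, 1.0)                         # clamp rounding error
    theta2 = np.arccos(c2)                              # elbow angle
    k1, k2 = l1 + l2 * np.cos(theta2), l2 * np.sin(theta2)
    theta1 = np.arctan2(y, x) - np.arctan2(k2, k1)      # shoulder angle
    return theta1, theta2

# Reach the point (1, 1) with two unit links: the elbow bends 90 degrees.
t1, t2 = two_link_ik(1.0, 1.0, 1.0, 1.0)
```

A full arm would need the joint-by-joint homogeneous transforms of the patent's method, but the structure of the computation (geometric constraints yielding joint angles) is the same.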
S4, evaluate the humanoid action of the multi-degree-of-freedom mechanical arm and observe whether it meets the requirements.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (8)
1. The mechanical arm humanoid motion teaching method based on similarity measurement is characterized by comprising the following steps:
acquiring space coordinates of three points of a shoulder, an elbow and a wrist of a human arm by adopting a motion capture system, and determining the angle of each joint of the human arm after bending based on the acquired space coordinates; based on a mechanical arm elbow joint determined by the similarity between the mechanical arm and the human arm in advance, calculating other joint angles of the mechanical arm to realize human-simulated motion, wherein the specific method for determining the mechanical arm elbow joint in advance comprises the following steps: taking a mechanical arm base as a starting point, sequentially changing joint angles from a first joint to a last joint to enable the joint angle of the mechanical arm to be consistent with a preset elbow joint angle after the human arm is bent, and determining the similarity between the mechanical arm and the human arm based on the joint changed by the mechanical arm each time; and calculating the difference value between the global motion transformation matrix, the local motion transformation matrix and the angle of the bending joint in the human arm and the mechanical arm by using a comparison algorithm, evaluating the similarity of the global motion transformation matrix, the local motion transformation matrix and the angle of the bending joint according to given threshold values of the global motion transformation matrix, the local motion transformation matrix and the angle of the bending joint, determining the position of the elbow joint of the mechanical arm according to the similarity, and setting the changed mechanical arm joint corresponding to the highest similarity as the elbow joint of the mechanical arm.
2. The method for teaching human-simulated motion of the mechanical arm based on similarity measurement as claimed in claim 1, wherein the motion capture system acquires the space coordinates of the three points of the human shoulder, elbow, and wrist, and the elbow joint angle of the bent human arm is determined from the acquired space coordinates; the calculation formula is as follows:
d(j_a, j_b) = ||j(x_a, y_a, z_a) − j(x_b, y_b, z_b)||
d(j_a, j_b) is the Euclidean distance between joint j_a and joint j_b; θ_Elbow is the elbow joint angle after the arm is bent; j(x_a, y_a, z_a) are the spatial coordinates of joint j_a, and j(x_b, y_b, z_b) are the spatial coordinates of joint j_b.
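The claim gives the joint-distance formula but the θ_Elbow expression itself is not reproduced in the text; a standard construction from the three joint distances is the law of cosines, sketched here under that assumption:

```python
import numpy as np

def joint_distance(a, b):
    """Euclidean distance d(j_a, j_b) between two joint coordinates."""
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

def elbow_angle(shoulder, elbow, wrist):
    """Elbow bend angle theta_Elbow (degrees) from shoulder, elbow and wrist
    coordinates via the law of cosines (assumed form; the patent's own
    theta_Elbow formula is not reproduced in the text)."""
    u = joint_distance(shoulder, elbow)   # upper-arm length
    f = joint_distance(elbow, wrist)      # forearm length
    d = joint_distance(shoulder, wrist)   # shoulder-to-wrist distance
    cos_t = (u ** 2 + f ** 2 - d ** 2) / (2 * u * f)
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))
```

For a right-angle bend (shoulder at the origin, elbow one unit along x, wrist one unit above the elbow) the function returns 90 degrees.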
3. The method for teaching human-simulated motion of mechanical arms based on similarity measurement as claimed in claim 1, wherein the specific method for predetermining the elbow joints of the mechanical arms comprises the following steps:
the acquisition mechanical arm changes the key frame image of each joint and simultaneously acquires the elbow bending angle thetaElbowTemporal motion key frame images;
respectively carrying out quantization processing on the collected key frame images of the motion of the human arm and the multi-degree-of-freedom mechanical arm:
obtaining a global motion transformation matrix of the human arm and the multi-degree-of-freedom mechanical arm;
obtaining a local motion transformation matrix of the human arm and the multi-degree-of-freedom mechanical arm;
and calculating the angles of the bending joints of the limbs and the multi-degree-of-freedom mechanical arm, and determining the direction of the joints by checking the sizes of the angles of the bending joints of the limbs and the multi-degree-of-freedom mechanical arm in each frame.
4. The method for teaching human-simulated motion of mechanical arm based on similarity measurement as claimed in claim 3, wherein the method for collecting the key frame image of mechanical arm to change each joint comprises the following steps:
extracting key frames by adaptive sampling, and using a K-means clustering algorithm to obtain the N key frames of each group of actions; the frame at angle 0 is selected as the start frame and the frame at elbow joint angle θ_Elbow as the end frame.
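The K-means keyframe selection of claim 4 can be sketched as follows; clustering per-frame feature vectors and keeping the frame nearest each cluster centre is a common realization, and the feature choice and iteration count here are assumptions, not the patent's exact parameters:

```python
import numpy as np

def select_keyframes(frames, n_keyframes, iters=50, seed=0):
    """Pick N key frames by K-means clustering of per-frame feature vectors;
    the frame closest to each cluster centre is kept, in temporal order.
    A simplified sketch of the adaptive-sampling step."""
    X = np.asarray(frames, dtype=float)
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), n_keyframes, replace=False)]
    for _ in range(iters):
        # assign every frame to its nearest centre
        labels = np.argmin(((X[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        for k in range(n_keyframes):
            if np.any(labels == k):          # skip empty clusters
                centres[k] = X[labels == k].mean(axis=0)
    # index of the frame nearest each centre, sorted by time
    return sorted(int(np.argmin(((X - c) ** 2).sum(-1))) for c in centres)
```

With three well-separated single-feature frames and N = 3, each frame becomes its own keyframe.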
5. The method for teaching human-simulated motion of the mechanical arm based on similarity measurement as claimed in claim 3, wherein the global transformation matrix and the local transformation matrix are computed by the same method and differ only in the selected origin; the calculation formula is as follows:
Set_α is the start-frame set of the three-dimensional shoulder, elbow, and wrist nodes, wherein node 1 is the shoulder node, node 2 the elbow node, and node 3 the wrist node; Set_β is the corresponding end-frame set; Set_α(i) is the three-dimensional coordinate of the i-th node in the start frame, and Set_β(i) is the three-dimensional coordinate of the i-th node in the end frame; Cent_α is the centroid of Set_α; Cent_β is the centroid of Set_β;

H = Σ_i (Set_α(i) − Cent_α)(Set_β(i) − Cent_β)^T
[M, N, O] = SVD(H)
R = O·M^T
T = Cent_β − R·Cent_α
wherein H is the covariance matrix of the two node sets; SVD is the singular value decomposition function, decomposing the matrix H into the three matrices M, N, and O; the rotation matrix R is computed from O and M; T = [t_X, t_Y, t_Z]^T is the translation matrix of point motion along the x, y, and z axes, where t_X, t_Y, and t_Z represent the motion of a point along the x-, y-, and z-axes respectively; TF is the transformation matrix of joint motion from Set_α to Set_β;
θ_X, θ_Y, θ_Z = Euler(R)

wherein Euler is the function that computes the rotation angles θ_X, θ_Y, and θ_Z of the rotation matrix R about the X-, Y-, and Z-axes respectively.
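The SVD step of claim 5 corresponds to the standard Kabsch rigid-alignment procedure; a minimal numpy sketch follows (the reflection correction is standard practice rather than spelled out in the claim, and the ZYX Euler convention is an assumption, since the claim does not fix one):

```python
import numpy as np

def rigid_transform(set_a, set_b):
    """Rotation R and translation T aligning node set Set_alpha (start frame)
    onto Set_beta (end frame), via SVD of the covariance matrix H."""
    A, B = np.asarray(set_a, float), np.asarray(set_b, float)
    cent_a, cent_b = A.mean(axis=0), B.mean(axis=0)   # centroids
    H = (A - cent_a).T @ (B - cent_b)                 # 3x3 covariance matrix
    M, N, Ot = np.linalg.svd(H)                       # H = M diag(N) Ot
    R = Ot.T @ M.T                                    # R = O M^T
    if np.linalg.det(R) < 0:                          # guard against reflections
        Ot[-1, :] *= -1
        R = Ot.T @ M.T
    T = cent_b - R @ cent_a                           # T = Cent_beta - R Cent_alpha
    return R, T

def euler_zyx(R):
    """Rotation angles theta_X, theta_Y, theta_Z of R (ZYX convention)."""
    theta_x = np.arctan2(R[2, 1], R[2, 2])
    theta_y = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
    theta_z = np.arctan2(R[1, 0], R[0, 0])
    return theta_x, theta_y, theta_z
```

Applying a known rotation and translation to a node set and re-estimating them recovers the original transform.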
6. The method for teaching human-simulated motion of the mechanical arm based on similarity measurement as claimed in claim 4, wherein for the global motion transformation of the human arm, the right arm is used to judge similarity and the neck is taken as the origin; the global motion transformation of the multi-degree-of-freedom mechanical arm takes the base coordinate as the origin.
7. The method for teaching human-simulated motion of the mechanical arm based on similarity measurement as claimed in claim 6, wherein the transformation matrices for the local motion of the human arm and of the multi-degree-of-freedom mechanical arm take the parent joint point of each joint of the human arm and of the mechanical arm, respectively, as the origin.
8. The method for teaching human-simulated motion of the mechanical arm based on similarity measurement as claimed in claim 7, wherein the method for evaluating the similarity of the global motion transformation matrix, the local motion transformation matrix, and the bending-joint angle against the given thresholds is as follows:
(1) for the N frames of motion images, calculating, for the j-th joint after quantization of the i-th frame of the human arm and of the multi-degree-of-freedom mechanical arm, the global joint translation G_dt, the local joint translation L_dt, the global rotation angle G_dθ, and the local rotation angle L_dθ;
G_dt = ||T_G,arm − T_G,mechanical arm||
L_dt = ||T_L,arm − T_L,mechanical arm||
G_dθ = ||R_G,arm − R_G,mechanical arm||
L_dθ = ||R_L,arm − R_L,mechanical arm||

wherein T_G,arm is the global translation matrix of the j-th joint of the human arm after quantization of the i-th frame, and T_G,mechanical arm is that of the mechanical arm; T_L,arm and T_L,mechanical arm are the corresponding local translation matrices; R_G,arm and R_G,mechanical arm are the corresponding global rotation matrices; R_L,arm and R_L,mechanical arm are the corresponding local rotation matrices;
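Step (1) reduces to four norm differences per joint per frame; a minimal sketch, assuming the unspecified ||·|| is the Frobenius/Euclidean norm:

```python
import numpy as np

def motion_differences(T_g_h, T_g_r, T_l_h, T_l_r, R_g_h, R_g_r, R_l_h, R_l_r):
    """Differences between human-arm (h) and mechanical-arm (r) transforms
    for one joint in one frame: global/local translation G_dt, L_dt and
    global/local rotation G_dtheta, L_dtheta. The norm choice is an
    assumption; the claim writes ||.|| without specifying one."""
    G_dt = np.linalg.norm(np.asarray(T_g_h, float) - np.asarray(T_g_r, float))
    L_dt = np.linalg.norm(np.asarray(T_l_h, float) - np.asarray(T_l_r, float))
    G_dtheta = np.linalg.norm(np.asarray(R_g_h, float) - np.asarray(R_g_r, float))
    L_dtheta = np.linalg.norm(np.asarray(R_l_h, float) - np.asarray(R_l_r, float))
    return G_dt, L_dt, G_dtheta, L_dtheta
```

Identical transforms yield all-zero differences, ready for the threshold tests of step (2).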
(2) setting, according to the strictness of the similarity requirement, the global rotation angle threshold G_θ, the local rotation angle threshold L_θ, the global joint translation threshold G_t, and the local joint translation threshold L_t of the j-th joint;
when G_dθ ≤ G_θ, the global percentage is incremented by the global rotation angle percentage; when G_dt ≤ G_t, it is incremented by the global translation percentage; when L_dθ ≤ L_θ, the local percentage is incremented by the local rotation angle percentage; when L_dt ≤ L_t, it is incremented by the local translation percentage;

wherein GFrame% is the initial global percentage and LFrame% the initial local percentage; GFrame1j% and GFrame2j% are the global percentages after the two global increments, and LFrame1j% and LFrame2j% the local percentages after the two local increments; the global rotation angle percentage, global translation percentage, local rotation angle percentage, and local translation percentage are the respective increment weights;
calculating, after quantization of the i-th frame of the human arm and of the multi-degree-of-freedom mechanical arm, the difference d_1 between the human left-shoulder angle a_human shoulder and the mechanical arm shoulder joint angle b_mechanical arm shoulder, and the difference d_2 between the human left-elbow angle a_human elbow and the mechanical arm elbow joint angle b_mechanical arm elbow:

d_1 = ||a_human shoulder − b_mechanical arm shoulder||
d_2 = ||a_human elbow − b_mechanical arm elbow||
when d_1 ≤ θ_1:

AngFrame_1% = AngFrame% + a1%

when d_2 ≤ θ_2:

AngFrame_2% = AngFrame_1% + a2%

wherein θ_1 is the shoulder joint angle threshold and θ_2 the elbow joint angle threshold; a1% and a2% are angle percentages, set by weight, with a1% + a2% = 100%; AngFrame% is the initial angle percentage, and AngFrame_1% and AngFrame_2% are the updated angle percentages;
when the angle percentage AngFrame_2% > 50%, the similarity percentage Frame_i% between the two frames is:

Frame_i% = ((GFrame_2% + LFrame_2%) + AngFrame_2%) / 2

if the angle percentage AngFrame_2% ≤ 50%, then Frame_i% = 0; the total similarity percentage %_sim of the two motions is:
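The angle accumulation and per-frame gating of claim 8 can be sketched as below; the equal default weights a1% = a2% = 50% are an assumption ("set by weight"), and the global/local accumulation is taken as already computed upstream, since its exact formula is not fully reproduced in the text:

```python
def angle_percentage(d1, d2, theta1, theta2, a1=50.0, a2=50.0):
    """AngFrame_2% built up from the shoulder/elbow differences d1, d2 and
    thresholds theta_1, theta_2, with a1% + a2% = 100% (equal weights
    assumed here)."""
    ang = 0.0                 # AngFrame%, the initial angle percentage
    if d1 <= theta1:
        ang += a1             # -> AngFrame_1%
    if d2 <= theta2:
        ang += a2             # -> AngFrame_2%
    return ang

def frame_similarity(g_frame2, l_frame2, ang_frame2):
    """Per-frame similarity Frame_i%: frames whose angle percentage does
    not exceed 50% score zero, per the claim's gating rule."""
    if ang_frame2 > 50.0:
        return ((g_frame2 + l_frame2) + ang_frame2) / 2.0
    return 0.0
```

A frame whose shoulder and elbow differences both pass their thresholds reaches AngFrame_2% = 100% and is scored; a frame failing both is gated to zero.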
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911263108.9A CN111002292B (en) | 2019-12-11 | 2019-12-11 | Robot arm humanoid motion teaching method based on similarity measurement |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111002292A CN111002292A (en) | 2020-04-14 |
CN111002292B true CN111002292B (en) | 2021-04-16 |
Family
ID=70115036
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911263108.9A Active CN111002292B (en) | 2019-12-11 | 2019-12-11 | Robot arm humanoid motion teaching method based on similarity measurement |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111002292B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111515959B (en) * | 2020-05-19 | 2021-11-23 | 厦门大学 | Programmable puppet performance robot control method and system and robot |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090271038A1 (en) * | 2008-04-25 | 2009-10-29 | Samsung Electronics Co., Ltd. | System and method for motion control of humanoid robot |
CN102509025A (en) * | 2011-11-25 | 2012-06-20 | 苏州大学 | Method for quick solution of six-degree-of-freedom humanoid dexterous arm inverse kinematics |
CN104635762A (en) * | 2015-01-13 | 2015-05-20 | 北京航空航天大学 | Self-motion angle calculating method facing SRS anthropomorphic arm |
CN106737671A (en) * | 2016-12-21 | 2017-05-31 | 西安科技大学 | The bilayer personification motion planning method of seven degrees of freedom copy man mechanical arm |
CN108241339A (en) * | 2017-12-27 | 2018-07-03 | 北京航空航天大学 | The movement solution of apery mechanical arm and configuration control method |
CN108638069A (en) * | 2018-05-18 | 2018-10-12 | 南昌大学 | A kind of mechanical arm tail end precise motion control method |
CN208163215U (en) * | 2018-03-24 | 2018-11-30 | 南通洪源地质工程材料有限公司 | Automation crumb loading system based on drilling pipe punching lathe |
CN110480634A (en) * | 2019-08-08 | 2019-11-22 | 北京科技大学 | A kind of arm guided-moving control method for manipulator motion control |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Li | Human–robot interaction based on gesture and movement recognition | |
CN109101966B (en) | Workpiece recognition positioning and attitude estimation system and method based on deep learning | |
CN105096341B (en) | Mobile robot position and orientation estimation method based on trifocal tensor and key frame strategy | |
CN114454174B (en) | Mechanical arm motion capturing method, medium, electronic device and system | |
CN115847422A (en) | Gesture recognition method, device and system for teleoperation | |
Lin et al. | Ball trajectory tracking and prediction for a ping-pong robot | |
CN111002292B (en) | Robot arm humanoid motion teaching method based on similarity measurement | |
Lee et al. | Toward vision-based high sampling interaction force estimation with master position and orientation for teleoperation | |
Yang et al. | Skeleton-based hand gesture recognition for assembly line operation | |
CN112894794B (en) | Human body arm action simulation method and device, terminal equipment and storage medium | |
JP7061272B2 (en) | Motion analysis device, motion analysis method, motion analysis program and motion analysis system | |
Zhang et al. | Big-Net: Deep learning for grasping with a bio-inspired soft gripper | |
CN115205750B (en) | Motion real-time counting method and system based on deep learning model | |
CN116423520A (en) | Mechanical arm track planning method based on vision and dynamic motion primitives | |
CN116079727A (en) | Humanoid robot motion simulation method and device based on 3D human body posture estimation | |
Khalil et al. | Visual monitoring of surface deformations on objects manipulated with a robotic hand | |
CN115311353A (en) | Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system | |
Zhou et al. | Analysing the effects of pooling combinations on invariance to position and deformation in convolutional neural networks | |
Wanyan et al. | Scene Prediction and Manipulator Grasp Pose Estimation Based on YOLO-GraspNet | |
CN113608622A (en) | Human body posture real-time prediction method, system, medium and equipment | |
Heickal et al. | Real-time 3D full body motion gesture recognition | |
Liu et al. | Real time pose estimation based on extended Kalman filter for binocular camera | |
CN114083545B (en) | Moving object robot grabbing method and device based on visual perception | |
Heickal et al. | Computer vision-based real-time 3D gesture recognition using depth image | |
Deng et al. | High-precision control of robotic arms based on active visual under unstructured scenes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||