
WO2018000854A1 - Human upper limb motion intention recognition and assistance method and device - Google Patents

Human upper limb motion intention recognition and assistance method and device

Info

Publication number
WO2018000854A1
WO2018000854A1 (PCT/CN2017/076273)
Authority
WO
WIPO (PCT)
Prior art keywords
exoskeleton
data
time point
joint
dimensional force
Prior art date
Application number
PCT/CN2017/076273
Other languages
French (fr)
Chinese (zh)
Inventor
刘若鹏
舒良轩
王宇驰
Original Assignee
深圳光启合众科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳光启合众科技有限公司 filed Critical 深圳光启合众科技有限公司
Publication of WO2018000854A1 publication Critical patent/WO2018000854A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators

Definitions

  • the invention relates to the field of limb movement intent recognition and assistance.
  • a one-dimensional voltage time-series signal can be obtained by attaching a myoelectric sensor to the surface of the limb, the signal being guided and amplified by surface electrodes.
  • the system using the myoelectric sensor acquires the motion intention of the limb by pattern recognition of the acquired signal.
  • a method for motion intent recognition and assistance of a human upper limb on which an exoskeleton robotic arm is mounted, comprising the steps of: collecting, in real time, the three-dimensional force data generated between the exoskeleton robotic arm and the upper limb of the human body; acquiring the position data of each joint of the exoskeleton robotic arm at the current time point; calculating the three-dimensional force data and the current-time-point position data to obtain the desired position data of each joint of the exoskeleton robotic arm at the next time point; generating, from the current-time-point position data and the desired next-time-point position data of each joint, a control signal for driving the exoskeleton robotic arm; and controlling the exoskeleton robotic arm according to the control signal to perform the corresponding action and assist the upper limb of the human body.
  • the calculating includes: performing a coordinate transformation on the three-dimensional force data; substituting the coordinate-transformed three-dimensional force data into a force impedance control model to obtain the desired acceleration, velocity, and position vectors of each joint of the exoskeleton robotic arm at the next time point; and converting the resulting desired acceleration, velocity, and position vectors into desired position data, wherein the control signal includes the obtained desired position data.
  • performing the coordinate transformation on the three-dimensional force data includes transforming the three-dimensional force data from the sensor coordinate system to a coordinate system fixed with respect to the back of the human body by a homogeneous transformation matrix, and converting the resulting desired acceleration, velocity, and position vectors into desired position data includes substituting the resulting desired position vector into the position correspondence formula, which expresses the position of the origin of the sensor coordinate system in the coordinate system fixed with respect to the back of the human body in terms of the rotation angles of the joints of the exoskeleton robotic arm and the lengths of its links, to obtain the desired position data.
  • the coordinate transformation is F_0 = T_0s F_s, where F_s is the representation of the three-dimensional force data in the sensor coordinate system Os, F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, and T_0s is the homogeneous transformation matrix from the Os coordinate system to the O0 coordinate system (its entries, given in the original as an equation image, are functions of the link lengths and joint angles defined below).
  • l1, l2, and l3 are the lengths of the first link above the shoulder joint of the exoskeleton robotic arm, the second link between the shoulder joint and the elbow joint, and the third link below the elbow joint, respectively; θ1, θ2, and θ3 are the shoulder internal/external rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle, respectively; the obtained desired position vector is denoted p_d.
  • the position correspondence formula, referred to below as formula (2), expresses the position of the Os origin in the O0 coordinate system in terms of l1, l2, l3 and θ1, θ2, θ3 (the equation image is not reproduced here).
  • the time interval between the current time point and the next time point depends on the time required for the calculation and on the transmission time of the three-dimensional force data and the position data, wherein the time interval is greater than or equal to the sum of the time required for the calculation and the transmission time.
  • the force impedance control model, referred to below as formula (1), relates the transformed three-dimensional force to the differences between the desired and the measured acceleration, velocity, and position vectors through the ideal inertia, damping, and stiffness parameters of the exoskeleton robotic arm.
  • the parameters of the force impedance control model can be dynamically adjusted via feedback.
  • an apparatus for motion intent recognition and assistance of a human upper limb, comprising: an exoskeleton robotic arm mounted on the upper limb of the human body, the exoskeleton robotic arm being able to sense the position data of each of its joints at the current time point; a three-dimensional force sensor that collects, in real time, the three-dimensional force data generated between the exoskeleton robotic arm and the upper limb of the human body; a receiving unit configured to receive the data sensed by the three-dimensional force sensor and the robotic arm; and a processing unit configured to: calculate the three-dimensional force data and the current-time-point position data to obtain the desired position data of each joint of the exoskeleton robotic arm at the next time point; generate, from the current-time-point position data and the desired next-time-point position data of each joint, a control signal for driving the exoskeleton robotic arm; and control the exoskeleton robotic arm according to the control signal to perform the corresponding action and assist the upper limb of the human body.
  • the calculating includes: performing a coordinate transformation on the three-dimensional force data; substituting the coordinate-transformed three-dimensional force data into a force impedance control model to obtain the desired acceleration, velocity, and position vectors of each joint of the exoskeleton robotic arm at the next time point; and converting the resulting desired acceleration, velocity, and position vectors into desired position data, wherein the control signal includes the obtained desired position data.
  • performing the coordinate transformation on the three-dimensional force data includes transforming the three-dimensional force data from the sensor coordinate system to a coordinate system fixed with respect to the back of the human body by a homogeneous transformation matrix, and converting the resulting desired acceleration, velocity, and position vectors into desired position data includes substituting the resulting desired position vector into the position correspondence formula, which expresses the position of the origin of the sensor coordinate system in the coordinate system fixed with respect to the back of the human body in terms of the rotation angles of the joints of the exoskeleton robotic arm and the lengths of its links, to obtain the desired position data.
  • the coordinate transformation is F_0 = T_0s F_s, where F_s is the representation of the three-dimensional force data in the sensor coordinate system Os, F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, and T_0s is the homogeneous transformation matrix from the Os coordinate system to the O0 coordinate system (its entries are given in the original as an equation image).
  • l1, l2, and l3 are the lengths of the first link above the shoulder joint of the exoskeleton robotic arm, the second link between the shoulder joint and the elbow joint, and the third link below the elbow joint, respectively; θ1, θ2, and θ3 are the shoulder internal (external) rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle, respectively; the obtained desired position vector is denoted p_d, and the position correspondence formula (2) expresses the position of the Os origin in the O0 coordinate system in terms of l1, l2, l3 and θ1, θ2, θ3.
  • the time interval between the current time point and the next time point depends on the time required for the calculation and on the transmission time of the three-dimensional force data and the position data, wherein the time interval is greater than or equal to the sum of the time required for the calculation and the transmission time.
  • the exoskeleton robot arm includes an angle sensor, wherein position data of a current time point of each joint of the exoskeleton robot arm is sensed by the angle sensor.
  • the mechanical arm includes a left mechanical arm and a right mechanical arm, and the two mechanical arms are coupled together by a back bracket.
  • the force impedance control model is formula (1), F_0 = M(a_d - a) + C(v_d - v) + K(p_d - p), where F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, a_d, v_d, and p_d are the desired acceleration, velocity, and position vectors of the origin of the coordinate system O0, and a, v, and p are the acceleration, velocity, and position vectors of that origin at the current time point as measured by the exoskeleton robotic arm; the parameter M represents the ideal inertia of the exoskeleton robotic arm, the parameter C represents its ideal damping, and the parameter K represents its ideal stiffness, and these parameters are dynamically determined using an adaptive control algorithm.
  • the parameters of the force impedance control model can be dynamically adjusted via feedback.
  • FIG. 1 is a block diagram of an apparatus for performing motion intent recognition and assistance on a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
  • FIG. 2 is a schematic skeletal model diagram of an apparatus for performing motion intent recognition and assistance on a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
  • FIG. 3 is a flow chart of a method for motion intent recognition and assistance of a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
  • FIG. 1 shows a block diagram of an apparatus 100 for motion intent recognition and assistance of a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
  • the device 100 includes a three-dimensional force sensor 102, an exoskeleton robot arm 103, a receiving unit 104, and a processing unit 101.
  • the three-dimensional force sensor 102 can collect, in real time, the three-dimensional force data generated between the exoskeleton robotic arm 103 and the upper limb of the human body and represent it as F_s, i.e., expressed in the sensor coordinate system Os of the three-dimensional force sensor 102.
  • the data sensed by the three-dimensional force sensor 102 in real time is transmitted to the receiving unit 104.
  • the three-dimensional force sensor 102 can communicate with the receiving unit 104 in any suitable manner to communicate data representative of the sensed force.
  • the three-dimensional force sensor 102 can communicate with the receiving unit 104 via various wired or wireless communication technologies.
  • the exoskeleton robot 103 also senses the position of the exoskeleton arm by an angle sensor (not shown) on the exoskeleton arm.
  • the exoskeleton robot arm 103 is mounted on the upper limb of the human body, and can sense position data of the current time point of each joint of the exoskeleton robot arm 103.
  • the exoskeleton robot arm 103 includes three arm joints, namely, a robot arm joint 1, a robot arm joint 2, and a robot arm joint 3.
  • the exoskeleton robot arm 103 includes an angle sensor, wherein the position data of the current time point of each joint of the exoskeleton robot arm 103 is sensed by the angle sensor.
  • an angle sensor can be placed at each of these robot arm joints to obtain the position of each joint.
  • in an example, the exoskeleton robotic arm 103 may transmit only the position data measured by the angle sensors, and the corresponding speed and acceleration data may be calculated by the processing unit 101 from the received position data.
  • the exoskeleton robotic arm 103 can also include a speed sensor and/or an acceleration sensor such that corresponding speed and/or acceleration data can be directly measured and transmitted.
  • the exoskeleton robotic arm 103 can communicate with the receiving unit 104 to communicate the sensed data in any suitable manner.
  • the exoskeleton robotic arm 103 can communicate with the receiving unit 104 via various wired or wireless communication technologies.
  • although the three-dimensional force sensor 102 and the exoskeleton robot 103 are shown separately in FIG. 1, those skilled in the art will appreciate that this is for illustrative purposes only.
  • the three-dimensional force sensor 102 and the exoskeleton robot arm 103 can be integrated together or separated from one another without departing from the scope of the present disclosure.
  • the three-dimensional force sensor 102 can be integrated onto a handle that is coupled to the exoskeleton robotic arm 103, or the three-dimensional force sensor 102 can be separated from the exoskeleton robotic arm 103.
  • the receiving unit 104 forwards the data sensed by the three-dimensional force sensor 102 and the exoskeleton robot 103 to the processing unit 101.
  • the processing unit 101 calculates a control signal for the exoskeleton robot 103 based on the data.
  • the processing unit 101 calculates the received three-dimensional force data and the current-time-point position data to obtain the desired position data of each joint of the exoskeleton robot 103 at the next time point, and generates, from the current-time-point position data and the desired next-time-point position data of each joint, a control signal for driving the exoskeleton robot 103.
  • the control signal includes the desired joint position data obtained.
  • the control signal is the desired position, velocity, and acceleration vector.
  • the control signal includes three components that are respectively transmitted to the three robot arm joints, each component being a combination of position, velocity, and acceleration vectors; the processing unit 101 can therefore control the exoskeleton robot arm 103 according to the control signal to perform the corresponding action and assist the upper limb of the human body.
  • the time interval between the current time point and the next time point depends on the time required for the calculation and on the transmission time of the three-dimensional force data and the joint position data.
  • assuming, for example, that the calculation takes L seconds and that the transmission of the three-dimensional force data and the joint position data takes M seconds, the time interval is at least L+M seconds so that the processing unit 101 can perform the processing effectively.
  • the processing unit 101 may be any device having processing capabilities, such as a stand-alone general-purpose personal computer, a mobile phone, a tablet computer, a dedicated processing device, a DSP, an ASIC, or any processing device integrated with the exoskeleton robot arm 103.
  • the control signal is calculated by the processing unit 101 through a force impedance control model.
  • the force impedance control model and the associated calculation of the control signals will be described in more detail below in connection with FIG. 2.
  • the drive unit (not shown in FIG. 1) of the exoskeleton robot 103 controls the action of the exoskeleton robot arm 103, based on the received control signal, so that the exoskeleton arm 103 follows the movement of the limb or provides assistance to the limb.
  • the device 100 includes two exoskeleton robot arms, and wherein the two exoskeleton robot arms are coupled together by a back bracket.
  • the skeleton model diagram of this embodiment will be described in more detail below in connection with FIG. 2.
  • FIG. 2 is a schematic skeletal model diagram of an apparatus 200 for performing motion intent recognition and assistance on a human upper limb based on a three-dimensional force sensor, in accordance with an embodiment of the present disclosure.
  • device 200 includes two exoskeleton robotic arms 201, 202, and wherein the two exoskeleton robotic arms are coupled together by a back bracket 203.
  • the three-dimensional force sensor based limb intent recognition and assisting device of the present disclosure may include only one exoskeleton robotic arm.
  • the operation of device 200 will be described below in connection with exoskeleton robotic arm 201, but those skilled in the art will appreciate that these operations are equally applicable to exoskeleton robotic arm 202.
  • the exoskeleton robot arm 201 includes a first link l1 above the shoulder joint, a second link l2 between the shoulder joint and the elbow joint, and a third link l3 below the elbow joint, wherein the first link l1 and the second link l2 are connected by a joint 205 (shoulder joint), the second link l2 and the third link l3 are connected by a joint 204 (elbow joint), and the first link l1 and the back support 203 are connected by a joint 206.
  • the joints 204, 205, 206 may be the robot arm joints 1-3 shown in FIG.
  • an example of use of an embodiment of the present disclosure is described below in conjunction with FIGS. 1-2.
  • the exoskeleton robotic arm 201 is worn on the user's upper limb (eg, the left or right arm).
  • the device 200 senses the motion intent of the wearer's upper limb by a three-dimensional force sensor at the end of the exoskeleton robotic arm 201 (eg, the three-dimensional force sensor 102 in FIG. 1).
  • the three-dimensional force sensor may be located at the end of the third link l3 opposite the joint 204, or it may be separate from the third link l3.
  • a three-dimensional force sensor is placed on the handle attached to the exoskeleton robotic arms 201, 202.
  • the user grasps the handle on which the three-dimensional force sensor is placed; when the user's upper limb moves, a force is produced through the handle (such as the force shown in FIG. 1), and this force can be measured by the three-dimensional force sensor.
  • the measured three-dimensional force is represented in the sensor coordinate system Os, i.e., as F_s; as can be appreciated, Os is a follower (moving) coordinate system.
  • the calculations performed by the processing unit 101 of FIG. 1 include coordinate transformation of the three-dimensional force data.
  • performing coordinate transformation on the three-dimensional force data includes transforming the three-dimensional force data from the sensor coordinate system to a coordinate system fixed relative to the back of the human body by the homogeneous transformation matrix.
  • the coordinate transformation is F_0 = T_0s F_s.
  • T_0s is the homogeneous transformation matrix from the Os coordinate system to the O0 coordinate system; its entries are given in the original as an equation image.
  • l1, l2, and l3 are the lengths of the first link above the shoulder joint of the exoskeleton robotic arm, the second link between the shoulder joint and the elbow joint, and the third link below the elbow joint, respectively, and θ1, θ2, and θ3 are the shoulder internal (external) rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle, respectively.
  • processing unit 101 substitutes the coordinate transformed force data into the force impedance control model to obtain the desired acceleration, velocity, and position vector, and then converts the resulting desired acceleration, velocity, and position vector into the desired joint position data.
  • the processing unit 101 substitutes the transformed force F_0 into the force impedance control model (for example, N. Hogan's force impedance control model) given as formula (1).
  • in formula (1), a_d, v_d, and p_d are the desired acceleration, velocity, and position vectors of the origin of the coordinate system O0, and a, v, and p are the acceleration, velocity, and position vectors of that origin at the current time point as measured by the exoskeleton robot 201; the parameter M represents the ideal inertia of the robot, the parameter C represents the ideal damping of the robot, and the parameter K represents the ideal stiffness of the robot, and these parameters are determined by the model of the device.
  • solving the formula (1) can obtain the desired acceleration, velocity and position vector.
  • the processing unit 101 substitutes the obtained desired position vector into the position correspondence formula, which expresses the position of the origin of the sensor coordinate system in the coordinate system fixed with respect to the back of the human body in terms of the rotation angles of the joints of the exoskeleton robot 103 and the lengths of its links.
  • the rotation angles of the joints 204, 205, 206 of the exoskeleton robotic arm 201 and the lengths of its links can represent the origin of the Os coordinate system in the O0 coordinate system, as in formula (2), where l1, l2, and l3 are the lengths of the first link, the second link, and the third link, respectively, and θ1, θ2, and θ3 are the shoulder internal (external) rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle, respectively.
  • the desired position vector can be substituted into formula (2) above, and the kinematic inverse solution can be used to obtain the desired positions θ1, θ2, and θ3 of the three joints (a schematic numerical sketch of such an inverse solution is given after this list).
  • the desired position of the joint can be used as an input to control the movement of the drive unit (not shown in FIG. 2) of the exoskeleton robot 201, thereby enabling the exoskeleton arm to follow the intended movement of the human body or to assist the arm.
  • the parameters M, K, and C of the force impedance control model in equation (1) are not fixed, but are dynamically determined using an adaptive control algorithm.
  • the values of the parameters M, K, and C can first be calculated by the adaptive control algorithm based on the received force and position vectors, and the calculated parameter values are then used to calculate the desired position vector.
  • alternatively, a lookup table of parameters M, K, C and the corresponding forces and position vectors can be maintained; upon receiving the force and position vector, device 200 can look up the table to obtain the corresponding M, K, C values and use these parameter values to calculate the desired position vector.
  • the parameters M, K, C of the force impedance control model in equation (1) can be adjusted via feedback.
  • a user wearing the exoskeleton robotic arm 201 can indicate that the calculated desired position vector always lags behind the user's intent. Based on this feedback, device 200 can recalculate and adjust the values of parameters M, K, C.
  • the exoskeleton robotic arm 201 can also include a pressure sensor for sensing the pressure between the user's arm and the exoskeleton robotic arm.
  • the device 200 can determine whether the motion of the exoskeleton robot arm 201 matches the user's intention from the pressure change sensed by the pressure sensor, and adjust the parameters M, K, C of the force impedance control model depending on the result of this determination.
  • FIG. 3 is a flow diagram of a method 300 of performing motion intent recognition and assistance on a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
  • method 300 includes acquiring, in 310, three-dimensional force data generated between the exoskeleton robot and the upper limb of the human body in real time.
  • the three-dimensional force sensor 102 senses the force generated by the limb in real time.
  • method 300 includes obtaining positional data for a current time point of each joint of the exoskeleton robotic arm.
  • the exoskeleton robotic arm 103 is mounted on the upper limb of the human body, senses the position data of its joints (e.g., the robot arm joints 1-3 of FIG. 1 or the joints 204, 205, 206 of FIG. 2), and transmits this data to the receiving unit 104.
  • the position data is sensed by angle sensors on the exoskeleton arm.
  • an angle sensor is mounted on the joints 204, 205, 206 for sensing angular position data of the joint.
  • the method 300 further includes calculating the three-dimensional force data and the current-time-point position data to obtain the desired position data of each joint of the exoskeleton arm at the next time point.
  • the processing unit 101 processes the received data to obtain the desired position data of each joint of the exoskeleton robotic arm at the next time point.
  • the calculating comprises: performing a coordinate transformation on the three-dimensional force data; substituting the coordinate-transformed force data into a force impedance control model to obtain the desired acceleration, velocity, and position vectors; and converting the resulting desired acceleration, velocity, and position vectors into the desired joint position data.
  • as described above in connection with FIGS. 1-2, performing the coordinate transformation of the three-dimensional force data includes transforming the three-dimensional force data from the sensor coordinate system to a coordinate system fixed relative to the back of the human body by a homogeneous transformation matrix; converting the resulting desired acceleration, velocity, and position vectors into desired joint position data includes substituting the resulting desired position vector into the formula that expresses the position of the sensor coordinate system origin in the coordinate system fixed with respect to the back of the human body in terms of the rotation angles of the joints of the exoskeleton arm and the lengths of its links, to obtain the desired joint position data; for example, formula (2) gives the corresponding formula.
  • method 300 includes generating a control signal that drives the exoskeleton arm based on position data at a current time point of each joint of the exoskeleton manipulator and position data desired at a next point in time.
  • the control signal is calculated by the force impedance control model, as shown in the above formula (1).
  • the parameters of the force impedance control model are dynamically determined using an adaptive control algorithm.
  • the parameters of the force impedance control model can be adjusted via feedback. For example, as described above in connection with FIG. 2, various parameters of the force impedance control model can be adjusted by lookup tables, dynamic calculations, user feedback, and the like.
  • method 300 further includes controlling the exoskeleton robotic arm to perform a corresponding action to assist the upper limb of the human body based on the control signal.
  • processing unit 101 transmits a control signal to exoskeleton robot arm 103 to control the motion of the exoskeleton robotic arm.
  • the control of the exoskeleton manipulator is achieved via a drive unit of the exoskeleton manipulator.
  • the time interval between the current time point and the next time point depends on the time required for the calculation performed by the processing unit 101 and on the transmission time of the three-dimensional force data and the joint position data. As described above in connection with FIGS. 1-2, the time interval is greater than or equal to the sum of the time required for the calculation and the transmission time.
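As referenced above, the following is a minimal numerical sketch of the kinematic inverse solution step. Because the explicit position correspondence formula (2) is available only as an unreproduced equation image, the forward_position function below is a deliberately simplified, hypothetical three-joint chain used purely to make the inverse solution concrete; it is not the patent's actual kinematics, and the link lengths and solver settings are illustrative assumptions.

```python
import numpy as np

def forward_position(theta, l1=0.20, l2=0.30, l3=0.30):
    """Hypothetical stand-in for the position correspondence formula (2):
    maps joint angles (theta1, theta2, theta3) to the position of the sensor
    frame origin in O0.  This is NOT the patent's kinematics, only a simple
    three-joint chain used to make the numerical inverse solution concrete."""
    t1, t2, t3 = theta
    r = l2 * np.cos(t2) + l3 * np.cos(t2 + t3)
    z = -l1 - l2 * np.sin(t2) - l3 * np.sin(t2 + t3)
    return np.array([r * np.cos(t1), r * np.sin(t1), z])

def inverse_kinematics(p_desired, theta0, tol=1e-6, max_iter=100):
    """Damped least-squares inverse solution: find joint angles whose forward
    position matches the desired position vector."""
    theta = np.array(theta0, dtype=float)
    for _ in range(max_iter):
        err = np.asarray(p_desired) - forward_position(theta)
        if np.linalg.norm(err) < tol:
            break
        # Numerical Jacobian of the forward position with respect to the angles.
        J = np.zeros((3, 3))
        h = 1e-6
        for j in range(3):
            d = np.zeros(3)
            d[j] = h
            J[:, j] = (forward_position(theta + d) - forward_position(theta)) / h
        # Damped least-squares update (avoids blow-up near singular poses).
        lam = 1e-4
        theta += np.linalg.solve(J.T @ J + lam * np.eye(3), J.T @ err)
    return theta

# Example: recover joint angles that reach a reachable target position.
target = forward_position(np.array([0.3, 0.4, 0.5]))
print(inverse_kinematics(target, theta0=[0.0, 0.2, 0.2]))
```

In a real implementation, forward_position would be replaced by the arm's own formula (2), and the previous control cycle's joint angles would serve as the initial guess so that the solver converges in a few iterations per cycle.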

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Rehabilitation Tools (AREA)
  • Manipulator (AREA)

Abstract

A method and device for motion intention recognition and assistance of a human upper limb based on a three-dimensional force sensor. The upper limbs of the human body are equipped with an exoskeleton manipulator. The method comprises the following steps: collecting three-dimensional force data between the exoskeleton manipulator and the upper limb of the human body in real time; acquiring position data of each exoskeleton manipulator joint at the current time point; obtaining desired position data at the next time point of each exoskeleton manipulator joint by calculating the three-dimensional force data and the position data at the current time point; generating a control signal for driving the exoskeleton manipulator according to the position data at the current time point and the desired position data at the next time point of the exoskeleton manipulator; and controlling the exoskeleton manipulator to perform the corresponding motion according to the control signal to assist the upper limb of the human body.

Description

Method and device for performing motion intent recognition and assistance on human upper limbs
Technical Field
The invention relates to the field of limb movement intent recognition and assistance.
Background Art
Currently, motion recognition of limbs is mainly performed with myoelectric sensors. Since the neuromuscular system produces bioelectrical changes during voluntary and involuntary activities, a one-dimensional voltage time-series signal can be obtained by attaching a myoelectric sensor to the surface of the limb, the signal being guided and amplified by surface electrodes. A system using myoelectric sensors acquires the motion intention of the limb by pattern recognition of the acquired signal.
However, the acquisition process using myoelectric sensors is inconvenient, the acquired signal is difficult to recognize, and the modeling is complicated.
The present disclosure provides improvements with respect to, but not limited to, the factors described above.
Summary of the Invention
A brief overview of one or more aspects is provided below to give a basic understanding of these aspects. This summary is not an exhaustive overview of all contemplated aspects, and it is neither intended to identify key or critical elements of all aspects nor to delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later.
According to one aspect of the present disclosure, a method for motion intent recognition and assistance of a human upper limb is provided, wherein an exoskeleton robotic arm is mounted on the upper limb. The method includes the following steps: collecting, in real time, the three-dimensional force data generated between the exoskeleton robotic arm and the upper limb of the human body; acquiring the position data of each joint of the exoskeleton robotic arm at the current time point; calculating the three-dimensional force data and the current-time-point position data to obtain the desired position data of each joint of the exoskeleton robotic arm at the next time point; generating, from the current-time-point position data and the desired next-time-point position data of each joint, a control signal for driving the exoskeleton robotic arm; and controlling the exoskeleton robotic arm according to the control signal to perform the corresponding action and assist the upper limb of the human body.
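Read as a control procedure, the method above is a periodic sense-compute-drive loop. The sketch below is illustrative only; the callables it accepts (read_force, read_joint_angles, compute_desired_joints, send_control) are hypothetical stand-ins for the sensor, arm, and drive interfaces, and the timing constants are assumptions rather than values from the disclosure.

```python
import time

# Assumed timing budget: the control interval must be at least the sum of the
# calculation time and the data transmission time.
CALC_TIME = 0.005    # assumed calculation time, seconds
TRANS_TIME = 0.005   # assumed transmission time of force and position data, seconds
TIME_INTERVAL = CALC_TIME + TRANS_TIME


def control_loop(read_force, read_joint_angles, compute_desired_joints,
                 send_control, n_steps=1000):
    """Run the sense -> compute -> drive cycle for a fixed number of steps."""
    for _ in range(n_steps):
        f_s = read_force()                  # 3-D force data in the sensor frame Os
        theta_now = read_joint_angles()     # current position of each joint

        # Desired joint positions at the next time point, obtained from the
        # force data and the current position data.
        theta_next = compute_desired_joints(f_s, theta_now)

        # The control signal combines current and desired position data and
        # drives the exoskeleton arm to assist the upper limb.
        send_control(theta_now, theta_next)

        time.sleep(TIME_INTERVAL)
```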
According to an embodiment of the present disclosure, the calculation includes: performing a coordinate transformation on the three-dimensional force data; substituting the coordinate-transformed three-dimensional force data into a force impedance control model to obtain the desired acceleration, velocity, and position vectors of each joint of the exoskeleton robotic arm at the next time point; and converting the resulting desired acceleration, velocity, and position vectors into desired position data, wherein the control signal includes the obtained desired position data.
According to an embodiment of the present disclosure, performing the coordinate transformation on the three-dimensional force data includes transforming the three-dimensional force data from the sensor coordinate system to a coordinate system fixed with respect to the back of the human body by a homogeneous transformation matrix, and converting the resulting desired acceleration, velocity, and position vectors into desired position data includes substituting the resulting desired position vector into the position correspondence formula, which expresses the position of the origin of the sensor coordinate system in the coordinate system fixed with respect to the back of the human body in terms of the rotation angles of the joints of the exoskeleton robotic arm and the lengths of its links, to obtain the desired position data.
According to an embodiment of the present disclosure, the coordinate transformation is
F_0 = T_0s F_s
where F_s is the representation of the three-dimensional force data in the sensor coordinate system Os, F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, and T_0s is the homogeneous transformation matrix from the Os coordinate system to the O0 coordinate system; its entries, given in the original as an equation image, are functions of the quantities defined next. Here l1, l2, and l3 are the lengths of the first link above the shoulder joint of the exoskeleton robotic arm, the second link between the shoulder joint and the elbow joint, and the third link below the elbow joint, respectively, and θ1, θ2, and θ3 are the shoulder internal/external rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle, respectively. The obtained desired position vector is denoted p_d, and the position correspondence formula (2) expresses the position of the Os origin in the O0 coordinate system in terms of l1, l2, l3 and θ1, θ2, θ3 (the equation image is not reproduced here).
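A minimal sketch of the transformation step is given below. Since the actual entries of T_0s are provided only as an equation image and depend on l1, l2, l3 and θ1, θ2, θ3, the matrix used here is a placeholder (identity rotation with an arbitrary translation); only the rotation block acts on a force vector.

```python
import numpy as np

def force_to_back_frame(f_s, T_0s):
    """Express a 3-D force measured in the sensor frame Os in the back-fixed
    frame O0 using the homogeneous transformation matrix T_0s (4x4).

    Only the 3x3 rotation block of T_0s affects a force vector; the translation
    part matters for positions, not for free vectors.
    """
    R_0s = T_0s[:3, :3]              # rotation block of the homogeneous transform
    return R_0s @ np.asarray(f_s)

# Placeholder transform: identity rotation and an arbitrary translation.  The
# real T_0s is a function of the link lengths l1, l2, l3 and joint angles
# theta1, theta2, theta3, whose explicit form is not reproduced here.
T_0s = np.eye(4)
T_0s[:3, 3] = [0.10, 0.20, 0.30]

f_s = [1.0, 0.0, -2.0]               # example force reading in the sensor frame
f_0 = force_to_back_frame(f_s, T_0s)
print(f_0)
```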
According to an embodiment of the present disclosure, the time interval between the current time point and the next time point depends on the time required for the calculation and on the transmission time of the three-dimensional force data and the position data, wherein the time interval is greater than or equal to the sum of the time required for the calculation and the transmission time.
According to an embodiment of the present disclosure, the force impedance control model is
F_0 = M(a_d - a) + C(v_d - v) + K(p_d - p)    (1)
where F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, a_d, v_d, and p_d are the desired acceleration, velocity, and position vectors of the origin of the coordinate system O0, and a, v, and p are the acceleration, velocity, and position vectors of that origin at the current time point as measured by the exoskeleton robotic arm. The parameter M represents the ideal inertia of the exoskeleton robotic arm, the parameter C represents its ideal damping, and the parameter K represents its ideal stiffness, and these parameters are dynamically determined using an adaptive control algorithm.
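The following sketch shows one way formula (1) can be evaluated in discrete time: the impedance relation is solved for the offset between the desired and the measured motion and integrated over one control interval. The scalar parameters, the explicit Euler integration, and the way the offsets are carried between steps are simplifying assumptions made for illustration, not details taken from the patent.

```python
import numpy as np

def impedance_step(f_0, p, v, a, e, de, dt, M=2.0, C=20.0, K=100.0):
    """One discrete step of the force impedance control model (1),
    F0 = M*(a_d - a) + C*(v_d - v) + K*(p_d - p).

    f_0      : 3-D force expressed in the back-fixed frame O0
    p, v, a  : measured position, velocity and acceleration at the current time
    e, de    : offsets p_d - p and v_d - v carried over from the previous step
    dt       : interval between the current and the next time point
    M, C, K  : ideal inertia, damping and stiffness (assumed scalar here)

    Returns (p_d, v_d, a_d, e, de): the desired position, velocity and
    acceleration for the next time point plus the updated offsets.
    """
    f_0, p, v, a, e, de = map(np.asarray, (f_0, p, v, a, e, de))

    # Solve the impedance relation for the desired-minus-measured acceleration
    # and integrate the offsets one interval forward (semi-implicit Euler).
    dde = (f_0 - C * de - K * e) / M
    de = de + dde * dt
    e = e + de * dt
    return p + e, v + de, a + dde, e, de

# Example: a constant 2 N push along x gradually builds up a desired offset
# while the measured state is held at zero for illustration.
e, de = np.zeros(3), np.zeros(3)
for _ in range(100):
    p_d, v_d, a_d, e, de = impedance_step(
        [2.0, 0.0, 0.0], p=np.zeros(3), v=np.zeros(3), a=np.zeros(3),
        e=e, de=de, dt=0.01)
print(p_d)
```

The measured velocity and acceleration supplied to such a step can come from dedicated speed/acceleration sensors or be approximated by finite differences of successive position samples, e.g. v ≈ (p_k - p_{k-1}) / dt.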
According to an embodiment of the present disclosure, the parameters of the force impedance control model can be dynamically adjusted via feedback.
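One simple way to realize such dynamic adjustment is a lookup table keyed by the magnitude of the applied force, optionally scaled by user feedback; the thresholds, parameter values, and feedback rule in the sketch below are illustrative assumptions only and are not taken from the patent.

```python
# Illustrative lookup-table adaptation of the impedance parameters M, C, K.
# The force thresholds, parameter values and feedback rule are assumptions
# chosen only to show the mechanism, not values from the patent.

PARAM_TABLE = [
    # (max force magnitude in newtons, (M, C, K))
    (5.0,  (2.0, 30.0, 150.0)),   # small interaction force: stiffer behaviour
    (20.0, (2.0, 20.0, 100.0)),   # medium force
    (1e9,  (2.0, 10.0, 50.0)),    # large force: more compliant behaviour
]

def select_parameters(force_magnitude, lag_feedback=0.0):
    """Pick (M, C, K) from the table and soften them if the user reports that
    the computed desired motion lags behind his or her intent.

    lag_feedback: 0.0 means no complaint; values up to 1.0 reduce damping and
    stiffness proportionally so the arm follows the user more readily.
    """
    for max_force, (m, c, k) in PARAM_TABLE:
        if force_magnitude <= max_force:
            scale = 1.0 - 0.5 * min(max(lag_feedback, 0.0), 1.0)
            return m, c * scale, k * scale
    return PARAM_TABLE[-1][1]
```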
According to another aspect of the present disclosure, an apparatus for motion intent recognition and assistance of a human upper limb is provided, the apparatus including: an exoskeleton robotic arm mounted on the upper limb of the human body, the exoskeleton robotic arm being able to sense the position data of each of its joints at the current time point; a three-dimensional force sensor that collects, in real time, the three-dimensional force data generated between the exoskeleton robotic arm and the upper limb of the human body; a receiving unit configured to receive the data sensed by the three-dimensional force sensor and the robotic arm; and a processing unit configured to: calculate the three-dimensional force data and the current-time-point position data to obtain the desired position data of each joint of the exoskeleton robotic arm at the next time point; generate, from the current-time-point position data and the desired next-time-point position data of each joint, a control signal for driving the exoskeleton robotic arm; and control the exoskeleton robotic arm according to the control signal to perform the corresponding action and assist the upper limb of the human body.
According to an embodiment of the present disclosure, the calculation includes: performing a coordinate transformation on the three-dimensional force data; substituting the coordinate-transformed three-dimensional force data into a force impedance control model to obtain the desired acceleration, velocity, and position vectors of each joint of the exoskeleton robotic arm at the next time point; and converting the resulting desired acceleration, velocity, and position vectors into desired position data, wherein the control signal includes the obtained desired position data.
According to an embodiment of the present disclosure, performing the coordinate transformation on the three-dimensional force data includes transforming the three-dimensional force data from the sensor coordinate system to a coordinate system fixed with respect to the back of the human body by a homogeneous transformation matrix, and converting the resulting desired acceleration, velocity, and position vectors into desired position data includes substituting the resulting desired position vector into the position correspondence formula, which expresses the position of the origin of the sensor coordinate system in the coordinate system fixed with respect to the back of the human body in terms of the rotation angles of the joints of the exoskeleton robotic arm and the lengths of its links, to obtain the desired position data.
According to an embodiment of the present disclosure, the coordinate transformation is F_0 = T_0s F_s, where F_s is the representation of the three-dimensional force data in the sensor coordinate system Os, F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, and T_0s is the homogeneous transformation matrix from the Os coordinate system to the O0 coordinate system (its entries are given in the original as an equation image). Here l1, l2, and l3 are the lengths of the first link above the shoulder joint of the exoskeleton robotic arm, the second link between the shoulder joint and the elbow joint, and the third link below the elbow joint, respectively, θ1, θ2, and θ3 are the shoulder internal (external) rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle, respectively, p_d denotes the obtained desired position vector, and the position correspondence formula (2) expresses the position of the Os origin in the O0 coordinate system in terms of these quantities (the equation image is not reproduced here).
According to an embodiment of the present disclosure, the time interval between the current time point and the next time point depends on the time required for the calculation and on the transmission time of the three-dimensional force data and the position data, wherein the time interval is greater than or equal to the sum of the time required for the calculation and the transmission time.
According to an embodiment of the present disclosure, the exoskeleton robotic arm includes an angle sensor, wherein the position data of each joint of the exoskeleton robotic arm at the current time point is sensed by the angle sensor.
According to an embodiment of the present disclosure, the mechanical arm includes a left mechanical arm and a right mechanical arm, and the two mechanical arms are coupled together by a back bracket.
According to an embodiment of the present disclosure, the force impedance control model is formula (1), F_0 = M(a_d - a) + C(v_d - v) + K(p_d - p), where F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, a_d, v_d, and p_d are the desired acceleration, velocity, and position vectors of the origin of the coordinate system O0, and a, v, and p are the acceleration, velocity, and position vectors of that origin at the current time point as measured by the exoskeleton robotic arm. The parameter M represents the ideal inertia of the exoskeleton robotic arm, the parameter C represents its ideal damping, and the parameter K represents its ideal stiffness, and these parameters are dynamically determined using an adaptive control algorithm.
According to an embodiment of the present disclosure, the parameters of the force impedance control model can be dynamically adjusted via feedback.
As described above, by using a three-dimensional force sensor instead of a myoelectric sensor for motion intent recognition of the human upper limb, drawbacks of the prior art such as difficult signal acquisition and signals that are hard to recognize are overcome, and the modeling of the acquired signal is relatively simple. This makes recognition of the motion intent of the human upper limb more convenient and faster, thereby improving the accuracy and efficiency of recognition.
Brief Description of the Drawings
The above features and advantages of the present invention will be better understood after reading the detailed description of the embodiments of the present disclosure in conjunction with the following drawings. In the figures, components are not necessarily drawn to scale, and components having similar related characteristics or features may have the same or similar reference numerals.
FIG. 1 is a block diagram of an apparatus for performing motion intent recognition and assistance on a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
FIG. 2 is a schematic skeletal model diagram of an apparatus for performing motion intent recognition and assistance on a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
FIG. 3 is a flow chart of a method for motion intent recognition and assistance of a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
Detailed Description
The invention is described in detail below with reference to the drawings and specific embodiments. Note that the aspects described below in conjunction with the drawings and the specific embodiments are merely exemplary and are not to be construed as limiting the scope of the invention.
It will be understood that although the various embodiments are illustrated herein with the human upper limb as an example, embodiments of the present disclosure may be applied to the upper limbs, lower limbs, and the like of other animals or of robots. Moreover, embodiments of the present disclosure may be applied to a whole-body exoskeleton rather than just a limb. For the sake of brevity, however, the various embodiments of the present disclosure are described below only for the human upper limb (arm).
FIG. 1 shows a block diagram of an apparatus 100 for motion intent recognition and assistance of a human upper limb based on a three-dimensional force sensor in accordance with the present disclosure.
如图1所示,装置100包括三维力传感器102、外骨骼机械臂103、接收单元104以及处理单元101。在图1所示的示例中,
Figure PCTCN2017076273-appb-000021
是外骨骼机械臂103与人的接触力;三维力传感器102能实时采集外骨骼机械臂103与人体上肢之间产生的三维力数据
Figure PCTCN2017076273-appb-000022
并将其表示为
Figure PCTCN2017076273-appb-000023
Figure PCTCN2017076273-appb-000024
是在三维力传感器102的传感器坐标系中表示的。随后,如图1所示,三维力传感器102实时感测到的数据被传送给接收单元104。在一示例中,三维力传感器102可通过任何合适的方式与接收单元104通信来传递表示感测到的力的数据。例如,三维力传感器102可通过各种有线或无线通信技术来与接收单元104通信。
As shown in FIG. 1, the device 100 includes a three-dimensional force sensor 102, an exoskeleton robot arm 103, a receiving unit 104, and a processing unit 101. In the example shown in Figure 1,
Figure PCTCN2017076273-appb-000021
Is the contact force between the exoskeleton robot 103 and the human; the three-dimensional force sensor 102 can collect the three-dimensional force data generated between the exoskeleton robot 103 and the upper limb of the human body in real time.
Figure PCTCN2017076273-appb-000022
And represent it as
Figure PCTCN2017076273-appb-000023
which is
Figure PCTCN2017076273-appb-000024
It is represented in the sensor coordinate system of the three-dimensional force sensor 102. Subsequently, as shown in FIG. 1, the data sensed by the three-dimensional force sensor 102 in real time is transmitted to the receiving unit 104. In an example, the three-dimensional force sensor 102 can communicate with the receiving unit 104 in any suitable manner to communicate data representative of the sensed force. For example, the three-dimensional force sensor 102 can communicate with the receiving unit 104 via various wired or wireless communication technologies.
在图1所示的示例中,外骨骼机械臂103也通过外骨骼机械臂上的角度传感器(未示出)来感测外骨骼机械臂的位置。例如,外骨骼机械臂103安装在人体上肢上,并且能感测外骨骼机械臂103的各个关节当前时间点的位置数据。例如,如图1所示,外骨骼机械臂103包括三个机械臂关节,即机械臂关节1、机械臂关节2、以及机械臂关节3。根据本公开的一实施例,外骨骼机械臂103上包括角度传感器,其中外骨骼机械臂103各个关节当前时间点的位置数据是 由所述角度传感器来感测到的。在一示例中,角度传感器可被置于这些机械臂关节处,从而获得各机械臂关节的位置。此外,根据角度传感器测量到的位置
Figure PCTCN2017076273-appb-000025
然而,在一示例中,外骨骼机械臂103可以只传送位置数据,而相应的速度以及加速度数据可由处理单元101来根据接收到的位置数据计算得到。在本公开的又一实施例中,外骨骼机械臂103还可包括速度传感器和/或加速度传感器,从而可直接测量并传送相应的速度和/或加速度数据。在一示例中,外骨骼机械臂103可通过任何合适的方式与接收单元104通信来传递感测到的数据。例如,外骨骼机械臂103可通过各种有线或无线通信技术来与接收单元104通信。
In the example shown in FIG. 1, the exoskeleton robot 103 also senses the position of the exoskeleton arm by an angle sensor (not shown) on the exoskeleton arm. For example, the exoskeleton robot arm 103 is mounted on the upper limb of the human body, and can sense position data of the current time point of each joint of the exoskeleton robot arm 103. For example, as shown in FIG. 1, the exoskeleton robot arm 103 includes three arm joints, namely, a robot arm joint 1, a robot arm joint 2, and a robot arm joint 3. According to an embodiment of the present disclosure, the exoskeleton robot arm 103 includes an angle sensor, wherein the position data of the current time point of each joint of the exoskeleton robot arm 103 is sensed by the angle sensor. In an example, an angle sensor can be placed at the joints of the robot arms to obtain the position of the joints of the various arm arms. In addition, the position measured according to the angle sensor
Figure PCTCN2017076273-appb-000025
However, in an example, the exoskeleton robot 103 may only transmit position data, and the corresponding speed and acceleration data may be calculated by the processing unit 101 based on the received position data. In yet another embodiment of the present disclosure, the exoskeleton robotic arm 103 can also include a speed sensor and/or an acceleration sensor such that corresponding speed and/or acceleration data can be directly measured and transmitted. In an example, the exoskeleton robotic arm 103 can communicate with the receiving unit 104 to communicate the sensed data in any suitable manner. For example, the exoskeleton robotic arm 103 can communicate with the receiving unit 104 via various wired or wireless communication technologies.
根据本公开的又一实施例,尽管图1中将三维力传感器102和外骨骼机械臂103分开地示出,本领域技术人员将理解,这只是用于说明的目的。三维力传感器102和外骨骼机械臂103可被集成在一起或彼此分开,而不背离本公开的范围。例如,三维力传感器102可被集成到与外骨骼机械臂103连接在一起的手柄上,或者三维力传感器102可与外骨骼机械臂103彼此分开。In accordance with yet another embodiment of the present disclosure, although the three-dimensional force sensor 102 and the exoskeleton robot 103 are shown separately in FIG. 1, those skilled in the art will appreciate that this is for illustrative purposes only. The three-dimensional force sensor 102 and the exoskeleton robot arm 103 can be integrated together or separated from one another without departing from the scope of the present disclosure. For example, the three-dimensional force sensor 102 can be integrated onto a handle that is coupled to the exoskeleton robotic arm 103, or the three-dimensional force sensor 102 can be separated from the exoskeleton robotic arm 103.
在图1所示的示例中,接收单元104在接收到由三维力传感器102和外骨骼机械臂103感测到的数据之后,将其转发给处理单元101。In the example shown in FIG. 1, the receiving unit 104 forwards the data sensed by the three-dimensional force sensor 102 and the exoskeleton robot 103 to the processing unit 101.
处理单元101在接收到来自接收单元104的数据之后,根据这些数据计算针对外骨骼机械臂103的控制信号。在一实施例中,处理单元101对接收到的三维力数据和当前时间点的位置数据进行计算得到外骨骼机械臂103各个关节下一时间点所期望的位置数据,并根据外骨骼机械臂103各个关节当前时间点的位置数据和下一时间点所期望的位置数据生成对外骨骼机械臂103进行驱动的控制信号。在一示例中,控制信号包括所获得的期望的关节位置数据。例如,如图1所示,控制信号是期望的位置、速度、以及加速度向量
Figure PCTCN2017076273-appb-000026
在一示例中,控制信号包括分别传送给三个机械臂关节的三个分量,每一分量是位置、速度、以及加速度向量的组合。从而,处理单元101可根据控制信号控制外骨骼机械臂103进行相应动作对人体上肢进行助力。
After receiving the data from the receiving unit 104, the processing unit 101 calculates a control signal for the exoskeleton robot 103 based on the data. In an embodiment, the processing unit 101 calculates the received three-dimensional force data and the position data of the current time point to obtain the desired position data of the next time point of each joint of the exoskeleton robot 103, and according to the exoskeleton robot 103 The position data of the current time point of each joint and the position data desired at the next time point generate a control signal for driving the exoskeleton robot 103. In an example, the control signal includes the desired joint position data obtained. For example, as shown in Figure 1, the control signal is the desired position, velocity, and acceleration vector.
Figure PCTCN2017076273-appb-000026
In an example, the control signal includes three components that are respectively transmitted to the joints of the three robot arms, each component being a combination of position, velocity, and acceleration vectors. Therefore, the processing unit 101 can control the exoskeleton robot arm 103 to perform corresponding actions on the human body upper limb according to the control signal.
根据本公开的一实施例,当前时间点与下一时间点的时间间隔取决于所述计算所需的时间、三维力数据和关节位置数据的传输时间。在一示例中,例如, 假定处理单元进行计算所需的时间是L秒,且三维力数据和关节位置数据的传输时间是M秒,则为使处理单元101能有效地进行处理,所述时间间隔最少等于L+M秒。According to an embodiment of the present disclosure, the time interval between the current time point and the next time point depends on the time required for the calculation, the three-dimensional force data, and the transmission time of the joint position data. In an example, for example, Assuming that the time required for the processing unit to perform the calculation is L seconds, and the transmission time of the three-dimensional force data and the joint position data is M seconds, the time interval is at least equal to L+M seconds for the processing unit 101 to perform the processing efficiently. .
在本公开的一实施例中,处理单元101可以是具有处理能力的任何装置,如独立的通用个人计算机、移动电话、平板计算机、专用处理装置、DSP、ASIC、或与外骨骼机械臂103集成在一起的任何处理装置。In an embodiment of the present disclosure, the processing unit 101 may be any device having processing capabilities, such as a stand-alone general purpose personal computer, a mobile phone, a tablet computer, a dedicated processing device, a DSP, an ASIC, or integrated with an exoskeleton robot 103. Any processing device that is together.
根据本公开的另一实施例,控制信号是由处理单元101通过力阻抗控制模型来计算得到的。力阻抗控制模型以及控制信号的相关计算将在下文结合图2更详细地描述。According to another embodiment of the present disclosure, the control signal is calculated by the processing unit 101 through a force impedance control model. The force impedance control model and the associated calculation of the control signals will be described in more detail below in connection with FIG. 2.
在一实施例中,基于接收到的控制信号,外骨骼机械臂103的驱动单元(图1中未示出)控制外骨骼机械臂103的动作,以实现外骨骼机械臂103跟随肢体的运动或为肢体提供助力。In an embodiment, based on the received control signal, the drive unit (not shown in FIG. 1) of the exoskeleton robot 103 controls the action of the exoskeleton robot 103 to enable the exoskeleton arm 103 to follow the movement of the limb or Provides assistance to the limbs.
根据本公开的一实施例,装置100包括两个外骨骼机械臂,并且其中这两个外骨骼机械臂通过背部支架连接在一起。这一实施例的骨架模型图将在下文结合图2来更详细描述。According to an embodiment of the present disclosure, the device 100 includes two exoskeleton robot arms, and wherein the two exoskeleton robot arms are coupled together by a back bracket. The skeleton model diagram of this embodiment will be described in more detail below in connection with FIG.
FIG. 2 is a schematic skeleton model diagram of a device 200 for recognizing the motion intention of, and assisting, a human upper limb based on a three-dimensional force sensor according to an embodiment of the present disclosure.
As shown in FIG. 2, the device 200 includes two exoskeleton robot arms 201, 202, and these two exoskeleton robot arms are connected together by a back bracket 203. However, those skilled in the art will appreciate that the three-dimensional-force-sensor-based limb intention recognition and assistance device of the present disclosure may include only one exoskeleton robot arm. The operation of the device 200 is described below in connection with the exoskeleton robot arm 201, but those skilled in the art will understand that these operations apply equally to the exoskeleton robot arm 202.
As shown in FIG. 2, the exoskeleton robot arm 201 includes a first link l1 above the shoulder joint, a second link l2 between the shoulder joint and the elbow joint, and a third link l3 below the elbow joint, where the first link l1 and the second link l2 are connected by a joint 205 (the shoulder joint), the second link l2 and the third link l3 are connected by a joint 204 (the elbow joint), and the first link l1 and the back bracket 203 are connected by a joint 206. In an embodiment of the present disclosure, the joints 204, 205, 206 may be the robot arm joints 1-3 shown in FIG. 1.
An example of use of an embodiment of the present disclosure is described below in connection with FIGS. 1-2.
In use, the exoskeleton robot arm 201 is worn on the user's upper limb (for example, the left arm or the right arm). The device 200 senses the motion intention of the wearer's upper limb through a three-dimensional force sensor at the end of the exoskeleton robot arm 201 (for example, the three-dimensional force sensor 102 in FIG. 1). For example, the three-dimensional force sensor may be located at the end of the third link l3 opposite the joint 204, or it may be separate from the third link l3. In an embodiment of the present disclosure, the three-dimensional force sensor is placed in a handle attached to the exoskeleton robot arms 201, 202. In this embodiment, the user's hand grips the handle in which the three-dimensional force sensor is placed; when the user's upper limb moves, a force is generated through the handle (such as the force shown in FIG. 1), and this force can be measured by the three-dimensional force sensor. As shown in FIG. 1, the measured three-dimensional force is expressed in the sensor coordinate system Os and is denoted F_s here. As can be appreciated, Os is a moving (follower) coordinate system.
According to an embodiment of the present disclosure, F_s needs to be transformed into the coordinate system O0, which is established at the joint 206 and is fixed relative to the back. Thus, the calculation performed by the processing unit 101 of FIG. 1 includes a coordinate transformation of the three-dimensional force data. According to an embodiment of the present disclosure, the coordinate transformation of the three-dimensional force data includes transforming the three-dimensional force data from the sensor coordinate system to the coordinate system fixed relative to the back of the human body by means of a homogeneous transformation matrix. For example, the coordinate transformation is

F_0 = T0s · F_s

where T0s is the homogeneous transformation matrix from the Os coordinate system to the O0 coordinate system (the explicit matrix is given as an image in the original and is expressed in terms of l1, l2, l3 and θ1, θ2, θ3), and where l1, l2, l3 are respectively the lengths of the first link above the shoulder joint of the exoskeleton robot arm, the second link between the shoulder joint and the elbow joint, and the third link below the elbow joint, and θ1, θ2, θ3 are respectively the shoulder internal/external rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle.
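To make the transformation step concrete, the following is a minimal, hypothetical sketch rather than the patent's implementation: the patent applies the homogeneous matrix T0s, while the sketch uses only its rotation block, which is what affects a free force vector, and assumes an illustrative z-y-y axis sequence for the three joint angles. The function names and the axis choice are assumptions, not taken from the original.

```python
import numpy as np

def rot_z(a):
    """Elementary rotation about the z axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    """Elementary rotation about the y axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def force_sensor_to_back(f_s, theta1, theta2, theta3):
    """Re-express the sensor-frame force f_s = [Fxs, Fys, Fzs] in the
    back-fixed frame O0, assuming an illustrative z-y-y joint sequence
    (shoulder rotation, shoulder flexion, elbow flexion)."""
    R_0s = rot_z(theta1) @ rot_y(theta2) @ rot_y(theta3)  # rotation block of T0s (assumed)
    return R_0s @ np.asarray(f_s, dtype=float)

# Example: 10 N pushed along the sensor x axis with the arm half bent.
f_0 = force_sensor_to_back([10.0, 0.0, 0.0], 0.0, np.deg2rad(30), np.deg2rad(45))
print(f_0)
```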
Subsequently, the processing unit 101 substitutes the coordinate-transformed force data into the force impedance control model to obtain the desired acceleration, velocity, and position vectors, and then converts these into the desired joint position data. In an embodiment, the processing unit 101 substitutes the transformed force F_0 into the following formula of a force impedance control model (for example, a model following N. Hogan's force impedance control):

M(ẍ_d − ẍ) + C(ẋ_d − ẋ) + K(x_d − x) = F_0    (1)

where ẍ_d, ẋ_d, and x_d denote the desired acceleration, velocity, and position vectors of the origin of the coordinate system O0; ẍ, ẋ, and x denote the acceleration, velocity, and position vectors of the origin of the coordinate system O0 at the current time point as measured by the exoskeleton robot arm 201; the parameter M represents the ideal inertia of the robot, the parameter C represents its ideal damping, and the parameter K represents its ideal stiffness, and these parameters are determined from a model of the device. Solving equation (1) thus yields the desired acceleration, velocity, and position vectors ẍ_d, ẋ_d, x_d.
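To illustrate the role of equation (1), the sketch below shows one way the desired motion could be obtained numerically from the measured force at each control period. It is an illustrative discrete-time integration with diagonal M, C, K and made-up parameter values, not the patent's solver; it tracks the deviation e = x_d − x and advances it under the measured force.

```python
import numpy as np

class ImpedanceModel:
    """Minimal discrete-time solver for  M*e'' + C*e' + K*e = F_0,
    where e = x_d - x is the deviation of the desired position from the
    measured position (illustrative sketch with diagonal M, C, K)."""

    def __init__(self, M, C, K, dt):
        self.M = np.diag(M).astype(float)
        self.C = np.diag(C).astype(float)
        self.K = np.diag(K).astype(float)
        self.dt = dt
        self.e = np.zeros(3)       # position deviation x_d - x
        self.e_dot = np.zeros(3)   # velocity deviation

    def step(self, f0, x_measured):
        """Advance one control period and return the new desired position x_d."""
        e_ddot = np.linalg.solve(self.M, f0 - self.C @ self.e_dot - self.K @ self.e)
        self.e_dot += e_ddot * self.dt
        self.e += self.e_dot * self.dt
        return x_measured + self.e

# Example: a constant 5 N pull along x gradually shifts the desired position forward.
model = ImpedanceModel(M=[2.0, 2.0, 2.0], C=[20.0, 20.0, 20.0], K=[50.0, 50.0, 50.0], dt=0.01)
x_d = model.step(np.array([5.0, 0.0, 0.0]), x_measured=np.zeros(3))
print(x_d)
```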
Subsequently, the processing unit 101 substitutes the obtained desired position vector into a formula that expresses the position of the origin of the sensor coordinate system in the coordinate system fixed relative to the back of the human body in terms of the joint angles and link lengths of the exoskeleton robot arm 103, in order to obtain the desired joint position data. For example, according to an embodiment of the present disclosure, the position (x40, y40, z40) of the origin of the Os coordinate system in the O0 coordinate system can be expressed through the angles of the joints 204, 205, 206 of the exoskeleton robot arm 201 and the lengths of its links:

(x40, y40, z40) = f(l1, l2, l3, θ1, θ2, θ3)    (2)

(the explicit forward-kinematics expressions are given as an image in the original), where l1, l2, l3 are respectively the lengths of the first, second, and third links, and θ1, θ2, θ3 are respectively the shoulder internal/external rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle. In this example, the desired position vector x_d can be substituted into equation (2), and the inverse kinematics solution then yields the desired positions θ1, θ2, θ3 of the three joints. These desired joint positions can be used as inputs to control the motion of the drive unit (not shown in FIG. 2) of the exoskeleton robot arm 201, so that the exoskeleton robot arm follows the intended motion of the human body or assists the arm.
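As a sketch of how such an inverse kinematics step might be carried out numerically: the patent does not specify a solver, so the damped least-squares iteration below is an assumption, and the forward_kinematics function is a placeholder chain standing in for the image equation (2), not the actual geometry of the arm.

```python
import numpy as np

def forward_kinematics(theta, l1, l2, l3):
    """Placeholder for equation (2): position of the sensor-frame origin in O0.
    This simple rotation-plus-planar chain is assumed for illustration only."""
    t1, t2, t3 = theta
    r = l2 * np.sin(t2) + l3 * np.sin(t2 + t3)
    z = -(l1 + l2 * np.cos(t2) + l3 * np.cos(t2 + t3))
    return np.array([r * np.cos(t1), r * np.sin(t1), z])

def inverse_kinematics(x_desired, theta0, l1, l2, l3, iters=50, damping=1e-3):
    """Damped least-squares inverse kinematics with a numerical Jacobian."""
    theta = np.array(theta0, dtype=float)
    for _ in range(iters):
        err = x_desired - forward_kinematics(theta, l1, l2, l3)
        if np.linalg.norm(err) < 1e-6:
            break
        J = np.zeros((3, 3))
        h = 1e-6
        for j in range(3):
            d = np.zeros(3)
            d[j] = h
            J[:, j] = (forward_kinematics(theta + d, l1, l2, l3) -
                       forward_kinematics(theta - d, l1, l2, l3)) / (2 * h)
        theta += np.linalg.solve(J.T @ J + damping * np.eye(3), J.T @ err)
    return theta

# Example: solve for the joint angles that reach a nearby desired point.
theta_d = inverse_kinematics(np.array([0.1, 0.2, -0.5]), [0.1, 0.3, 0.3], 0.15, 0.30, 0.30)
print(np.rad2deg(theta_d))
```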
According to an embodiment of the present disclosure, the parameters M, K, and C of the force impedance control model in equation (1) are not fixed but are determined dynamically using an adaptive control algorithm. For example, in one example, the values of the parameters M, K, and C may first be computed by the adaptive control algorithm based on the received force and position vectors, and the computed parameter values may then be used to calculate the desired position vector. In another example, a lookup table of parameters M, K, C against the corresponding force and position vectors may be maintained; upon receiving a force and position vector, the device 200 can look up this table to obtain the corresponding M, K, C values and use them to calculate the desired position vector.
According to yet another embodiment of the present disclosure, the parameters M, K, and C of the force impedance control model in equation (1) can be adjusted via feedback. For example, a user wearing the exoskeleton robot arm 201 may indicate that the calculated desired position vector consistently lags behind the user's intention; based on this feedback, the device 200 can recalculate and adjust the values of the parameters M, K, and C. Alternatively, in yet another example, the exoskeleton robot arm 201 may further include a pressure sensor for measuring the pressure between the arm and the exoskeleton robot arm. In this example, the device 200 can determine from the pressure changes sensed by the pressure sensor whether the motion of the exoskeleton robot arm 201 matches the user's intention, and adjust the parameters M, K, and C of the force impedance control model depending on the result of that determination.
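The two preceding paragraphs leave the adaptation law unspecified. The sketch below is one hypothetical way to combine them: a coarse lookup table keyed on force magnitude supplies baseline M, C, K values, and a simple pressure-feedback rule softens the impedance when the measured interaction pressure suggests the arm is resisting the user. The table values, the pressure threshold, and the scaling factor are all illustrative assumptions.

```python
import numpy as np

# Baseline impedance parameters keyed by force-magnitude band (illustrative values).
PARAM_TABLE = [
    (5.0,  {"M": 2.0, "C": 30.0, "K": 80.0}),   # gentle interaction: stiffer tracking
    (20.0, {"M": 2.0, "C": 20.0, "K": 50.0}),   # moderate effort
    (1e9,  {"M": 1.5, "C": 12.0, "K": 25.0}),   # strong effort: more compliant
]

def select_parameters(f0, pressure, pressure_limit=40.0, soften=0.7):
    """Pick M, C, K from the lookup table and soften them if the
    arm-to-exoskeleton pressure indicates the robot is fighting the user."""
    magnitude = float(np.linalg.norm(f0))
    for upper, params in PARAM_TABLE:
        if magnitude <= upper:
            chosen = dict(params)
            break
    if pressure > pressure_limit:   # feedback rule: reduce damping and stiffness
        chosen["C"] *= soften
        chosen["K"] *= soften
    return chosen

print(select_parameters(np.array([12.0, 3.0, 0.0]), pressure=55.0))
```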
FIG. 3 is a flowchart of a method 300 for recognizing the motion intention of, and assisting, a human upper limb based on a three-dimensional force sensor according to the present disclosure.
As shown in FIG. 3, the method 300 includes, at 310, acquiring in real time the three-dimensional force data generated between the exoskeleton robot arm and the upper limb of the human body. For example, as described above in connection with FIG. 1, the three-dimensional force sensor 102 senses the force generated by the limb in real time.
At 320, the method 300 includes acquiring the position data of each joint of the exoskeleton robot arm at the current time point. For example, as described above in connection with FIGS. 1-2, the exoskeleton robot arm 103 is mounted on the upper limb of the human body, senses the position data of its joints (the robot arm joints 1-3 of FIG. 1, the joints 204, 205, 206 of FIG. 2, and so on), and transmits this data to the receiving unit 104. According to an embodiment of the present disclosure, the data sensed by the exoskeleton robot arm is sensed by angle sensors on the exoskeleton robot arm. For example, as described above in connection with FIG. 2, angle sensors are mounted on the joints 204, 205, 206 for sensing the angular position data of those joints.
At 330, the method 300 further includes calculating, from the three-dimensional force data and the position data at the current time point, the desired position data of each joint of the exoskeleton robot arm at the next time point. For example, as described above in connection with FIG. 1 or FIG. 2, the processing unit 101 processes the received data to obtain the desired position data of each joint of the exoskeleton robot arm at the next time point. In an embodiment, the calculation includes: performing a coordinate transformation on the three-dimensional force data; substituting the coordinate-transformed force data into the force impedance control model to obtain the desired acceleration, velocity, and position vectors; and converting the obtained desired acceleration, velocity, and position vectors into the desired joint position data. In an embodiment, as described above in connection with FIGS. 1-2, the coordinate transformation of the three-dimensional force data includes transforming the three-dimensional force data from the sensor coordinate system to the coordinate system fixed relative to the back of the human body by means of a homogeneous transformation matrix; and converting the obtained desired acceleration, velocity, and position vectors into the desired joint position data includes substituting the obtained desired position vector into the formula, expressed in terms of the joint angles and link lengths of the exoskeleton robot arm, for the position of the origin of the sensor coordinate system in the coordinate system fixed relative to the back of the human body. For example, equation (2) gives the corresponding formula.
At 340, the method 300 includes generating the control signal for driving the exoskeleton robot arm according to the position data of each joint of the exoskeleton robot arm at the current time point and the position data desired at the next time point. According to an embodiment of the present disclosure, the control signal is calculated through the force impedance control model, as shown in equation (1) above. According to a further embodiment of the present disclosure, the parameters of the force impedance control model are determined dynamically using an adaptive control algorithm. According to yet another embodiment of the present disclosure, the parameters of the force impedance control model can be adjusted via feedback. For example, as described above in connection with FIG. 2, the parameters of the force impedance control model can be adjusted through a lookup table, dynamic calculation, user feedback, and the like.
At 350, the method 300 further includes controlling, according to the control signal, the exoskeleton robot arm to perform the corresponding action and assist the upper limb of the human body. For example, as described above in connection with FIG. 1, the processing unit 101 transmits the control signal to the exoskeleton robot arm 103 to control its motion. According to an embodiment of the present disclosure, the control of the exoskeleton robot arm is implemented via the drive unit of the exoskeleton robot arm.
In an embodiment, the time interval between the current time point and the next time point depends on the time required for the calculation performed by the processing unit 101 and on the transmission time of the three-dimensional force data and the joint position data. As described above in connection with FIGS. 1-2, this time interval is greater than or equal to the sum of the time required for the calculation and the transmission time.
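Steps 310-350 can be read as one control loop executed every period. The following sketch strings the earlier fragments together, reusing force_sensor_to_back, forward_kinematics, inverse_kinematics, and ImpedanceModel from the sketches above; the read_force, read_joint_angles, and send_joint_targets functions are hypothetical stand-ins for hardware interfaces that the patent does not name.

```python
import time
import numpy as np

def control_loop(read_force, read_joint_angles, send_joint_targets,
                 impedance, link_lengths, period=0.02):
    """One possible composition of steps 310-350 (illustrative only).

    read_force()          -> 3-vector force in the sensor frame Os   (step 310)
    read_joint_angles()   -> current joint angles theta1..theta3     (step 320)
    send_joint_targets(t) -> drives the exoskeleton joints           (step 350)
    impedance             -> object with a step(f0, x) method, e.g. ImpedanceModel
    """
    l1, l2, l3 = link_lengths
    while True:
        start = time.monotonic()
        f_s = np.asarray(read_force())                        # 310: 3-D force in Os
        theta = np.asarray(read_joint_angles())               # 320: current joint positions
        f_0 = force_sensor_to_back(f_s, *theta)               # coordinate transformation
        x_now = forward_kinematics(theta, l1, l2, l3)         # current end position in O0
        x_d = impedance.step(f_0, x_now)                      # 330: desired position from eq. (1)
        theta_d = inverse_kinematics(x_d, theta, l1, l2, l3)  # 330: desired joints via eq. (2)
        send_joint_targets(theta_d)                           # 340-350: control signal to drive unit
        # Keep the period no shorter than computation plus transmission time.
        time.sleep(max(0.0, period - (time.monotonic() - start)))
```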
Those skilled in the art will appreciate that the order of operations of the steps described above is given for purposes of example only, and the steps may be performed in any suitable order.
The previous description of the present disclosure is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to the present disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the present disclosure. Thus, the present disclosure is not intended to be limited to the examples and designs described herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

  1. A method for recognizing the motion intention of, and assisting, an upper limb of a human body, an exoskeleton robot arm being mounted on the upper limb of the human body, characterized in that the method comprises the following steps:
    collecting, in real time, three-dimensional force data generated between the exoskeleton robot arm and the upper limb of the human body;
    acquiring position data of each joint of the exoskeleton robot arm at a current time point;
    calculating, from the three-dimensional force data and the position data at the current time point, desired position data of each joint of the exoskeleton robot arm at a next time point;
    generating a control signal for driving the exoskeleton robot arm according to the position data of each joint of the exoskeleton robot arm at the current time point and the position data desired at the next time point; and
    controlling, according to the control signal, the exoskeleton robot arm to perform a corresponding action to assist the upper limb of the human body.
  2. The method of claim 1, characterized in that the calculating comprises:
    performing a coordinate transformation on the three-dimensional force data;
    substituting the coordinate-transformed three-dimensional force data into a force impedance control model to obtain desired acceleration, velocity, and position vectors of each joint of the exoskeleton robot arm at the next time point; and
    converting the obtained desired acceleration, velocity, and position vectors into the desired position data, wherein the control signal includes the obtained desired position data.
  3. The method of claim 2, characterized in that performing the coordinate transformation on the three-dimensional force data comprises transforming the three-dimensional force data from the sensor coordinate system to a coordinate system fixed relative to the back of the human body by means of a homogeneous transformation matrix, and converting the obtained desired acceleration, velocity, and position vectors into the desired position data comprises substituting the obtained desired position vector into a position correspondence formula, expressed in terms of the joint angles and link lengths of the exoskeleton robot arm, for the position of the origin of the sensor coordinate system in the coordinate system fixed relative to the back of the human body, in order to obtain the desired position data.
  4. The method of claim 3, characterized in that the coordinate transformation is F_0 = T0s · F_s, where F_s is the representation of the three-dimensional force data in the sensor coordinate system Os, F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, and T0s is the homogeneous transformation matrix from the Os coordinate system to the O0 coordinate system (the explicit matrix is given as an image in the original and is expressed in terms of l1, l2, l3 and θ1, θ2, θ3), where l1, l2, l3 are respectively the lengths of the first link above the shoulder joint of the exoskeleton robot arm, the second link between the shoulder joint and the elbow joint, and the third link below the elbow joint, and θ1, θ2, θ3 are respectively the shoulder internal/external rotation angle, the shoulder flexion/extension angle, and the elbow flexion/extension angle; the obtained desired position vector is denoted x_d, and the position correspondence formula is the forward-kinematics relation (x40, y40, z40) = f(l1, l2, l3, θ1, θ2, θ3) given as an image in the original.
  5. The method of claim 1, characterized in that the time interval between the current time point and the next time point depends on the time required for the calculation and on the transmission time of the three-dimensional force data and the position data, wherein the time interval is greater than or equal to the sum of the time required for the calculation and the transmission time.
  6. The method of claim 2, characterized in that the force impedance control model is M(ẍ_d − ẍ) + C(ẋ_d − ẋ) + K(x_d − x) = F_0, where F_0 is the representation of the three-dimensional force data in the coordinate system O0 fixed relative to the back of the human body, ẍ_d, ẋ_d, x_d denote the desired acceleration, velocity, and position vectors of the origin of the coordinate system O0, and ẍ, ẋ, x denote the acceleration, velocity, and position vectors of the origin of the coordinate system O0 at the current time point as measured by the exoskeleton robot arm, and wherein the parameter M represents the ideal inertia of the exoskeleton robot arm, the parameter C represents the ideal damping of the exoskeleton robot arm, and the parameter K represents the ideal stiffness of the exoskeleton robot arm, and wherein these parameters are dynamically determined using an adaptive control algorithm.
  7. The method of claim 6, characterized in that the parameters of the force impedance control model can be dynamically adjusted via feedback.
  8. A device for recognizing the motion intention of, and assisting, an upper limb of a human body, characterized in that the device comprises:
    an exoskeleton robot arm mounted on the upper limb of the human body, the exoskeleton robot arm being capable of sensing position data of each of its joints at a current time point;
    a three-dimensional force sensor that collects, in real time, three-dimensional force data generated between the exoskeleton robot arm and the upper limb of the human body;
    a receiving unit configured to receive the data sensed by the three-dimensional force sensor and the exoskeleton robot arm; and
    a processing unit configured to:
    calculate, from the three-dimensional force data and the position data at the current time point, desired position data of each joint of the exoskeleton robot arm at a next time point;
    generate a control signal for driving the exoskeleton robot arm according to the position data of each joint of the exoskeleton robot arm at the current time point and the position data desired at the next time point; and
    control, according to the control signal, the exoskeleton robot arm to perform a corresponding action to assist the upper limb of the human body.
  9. The device of claim 8, characterized in that the exoskeleton robot arm includes angle sensors, wherein the position data of each joint of the exoskeleton robot arm at the current time point is sensed by the angle sensors.
  10. The device of claim 8, characterized in that the robot arm comprises a left robot arm and a right robot arm, and wherein the two robot arms are connected together by a back bracket.
PCT/CN2017/076273 2016-06-29 2017-03-10 Human upper limb motion intention recognition and assistance method and device WO2018000854A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610494243.4A CN107538469A (en) 2016-06-29 2016-06-29 The method and device of motion intention identification and power-assisted is carried out to human upper limb
CN201610494243.4 2016-06-29

Publications (1)

Publication Number Publication Date
WO2018000854A1 true WO2018000854A1 (en) 2018-01-04

Family

ID=60785282

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/076273 WO2018000854A1 (en) 2016-06-29 2017-03-10 Human upper limb motion intention recognition and assistance method and device

Country Status (2)

Country Link
CN (1) CN107538469A (en)
WO (1) WO2018000854A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109498375B (en) * 2018-11-23 2020-12-25 电子科技大学 Human motion intention recognition control device and control method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103203748A (en) * 2013-04-08 2013-07-17 布法罗机器人科技(苏州)有限公司 Exoskeleton robot controlling system and method
CN104797385A (en) * 2012-12-19 2015-07-22 英特尔公司 Adaptive exoskeleton, devices and methods for controlling the same
US20150217444A1 (en) * 2014-01-13 2015-08-06 Massachusetts Institute Of Technology Wearable Robot Assisting Manual Tasks
CN105108760A (en) * 2015-08-14 2015-12-02 上海申磬产业有限公司 Control method of wearable type power-assisted exoskeleton upper limb mechanism

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110327048B (en) * 2019-03-11 2022-07-15 浙江工业大学 Human upper limb posture reconstruction system based on wearable inertial sensor
CN110327048A (en) * 2019-03-11 2019-10-15 浙江工业大学 A kind of human upper limb posture reconstruction system based on wearable inertial sensor
CN111452024A (en) * 2019-05-14 2020-07-28 成都智慧果易科技有限公司 Be applied to upper limbs bearing helping hand ectoskeleton equipment
CN111062247A (en) * 2019-11-07 2020-04-24 郑州大学 Human body movement intention prediction method oriented to exoskeleton control
CN111062247B (en) * 2019-11-07 2023-05-26 郑州大学 Human motion intention prediction method for exoskeleton control
CN111515934A (en) * 2020-05-09 2020-08-11 中国人民解放军32286部队50分队 Wearable individual equipment maintenance exoskeleton system and control method thereof
CN113043248A (en) * 2021-03-16 2021-06-29 东北大学 Transportation and assembly whole-body exoskeleton system based on multi-source sensor and control method
CN113043248B (en) * 2021-03-16 2022-03-11 东北大学 Transportation and assembly whole-body exoskeleton system based on multi-source sensor and control method
CN113478462B (en) * 2021-07-08 2022-12-30 中国科学技术大学 Method and system for controlling intention assimilation of upper limb exoskeleton robot based on surface electromyogram signal
CN113478462A (en) * 2021-07-08 2021-10-08 中国科学技术大学 Method and system for controlling intention assimilation of upper limb exoskeleton robot based on surface electromyogram signal
CN113946132A (en) * 2021-10-18 2022-01-18 湖南大学 Multi-dimensional force sensor-based multifunctional integrated adjusting device, adjusting method and readable storage medium
CN113946132B (en) * 2021-10-18 2024-03-12 湖南大学 Multi-functional integrated adjusting device based on multi-dimensional force sensor, adjusting method and readable storage medium
CN114734426A (en) * 2022-03-11 2022-07-12 中国科学院自动化研究所 Hand exoskeleton structure control method and device, electronic equipment and storage medium
CN114734426B (en) * 2022-03-11 2024-05-24 中国科学院自动化研究所 Hand exoskeleton structure control method and device, electronic equipment and storage medium
CN114750137A (en) * 2022-05-19 2022-07-15 合肥工业大学 RBF network-based upper limb exoskeleton robot motion control method
CN114952804A (en) * 2022-07-13 2022-08-30 山东中科先进技术有限公司 Exoskeleton integrated power assisting device and method
CN117182930A (en) * 2023-11-07 2023-12-08 山东捷瑞数字科技股份有限公司 Four-axis mechanical arm binding method, system, equipment and medium based on digital twin
CN117182930B (en) * 2023-11-07 2024-02-13 山东捷瑞数字科技股份有限公司 Four-axis mechanical arm binding method, system, equipment and medium based on digital twin

Also Published As

Publication number Publication date
CN107538469A (en) 2018-01-05

Similar Documents

Publication Publication Date Title
WO2018000854A1 (en) Human upper limb motion intention recognition and assistance method and device
CN108187310B (en) Feel that the limb motion of information and posture information is intended to understand and upper-limbs rehabilitation training robot and its control method based on power
JP5761832B2 (en) Operation assistance device and tuning control method for operation assistance device
He et al. Development of a novel autonomous lower extremity exoskeleton robot for walking assistance
WO2018196227A1 (en) Evaluation method, device, and system for human motor capacity
US20070150105A1 (en) Legged mobile robot controller, legged mobile robot and legged mobile robot control method
CN111773027B (en) Flexibly-driven hand function rehabilitation robot control system and control method
TWI549655B (en) Joint range of motion measuring apparatus and measuring method thereof
JP5974668B2 (en) Manipulation system
KR102193771B1 (en) Wearable robot and method for controlling the same
KR101761314B1 (en) Vital signal measurement and control method thereof
CN106003053A (en) Teleoperation passive robot control system and control method thereof
CN106074073B (en) A kind of control system and rehabilitation training strategy of lower limb rehabilitation robot
WO2017197886A1 (en) Method and device for solving problem of strange configuration of exoskeleton robot shoulder joint
US20210200311A1 (en) Proxy controller suit with optional dual range kinematics
JP2019209392A (en) Haptic visualization device, robot and haptic visualization program
KR20130113063A (en) Human power amplification robot estimating user's intension by detecting joint torque and control method thereof
JP6781453B2 (en) Standing motion support method by tuning control using robotic wear, computer program for standing motion support, and robotic wear
CN109453505B (en) Multi-joint tracking method based on wearable device
JP2014004655A (en) Manipulation system
JP2018121697A (en) Adaptation information generation device and adaptation information generation method for movement assisting device
JP2015221073A (en) Rehabilitation apparatus, control method and control program
CN212421309U (en) Remote control device of foot type robot
KR100895692B1 (en) Wearable robot arm and position moving method of wearable robot arm
Kiguchi et al. A study of a 4DOF upper-limb power-assist intelligent exoskeleton with visual information for perception-assist

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17818853

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17818853

Country of ref document: EP

Kind code of ref document: A1