CN114227689B - Robot motion control system and motion control method thereof - Google Patents
- Publication number
- CN114227689B CN114227689B CN202111654304.6A CN202111654304A CN114227689B CN 114227689 B CN114227689 B CN 114227689B CN 202111654304 A CN202111654304 A CN 202111654304A CN 114227689 B CN114227689 B CN 114227689B
- Authority
- CN
- China
- Prior art keywords
- motion
- robot
- data
- user
- action
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
- B25J13/00—Controls for manipulators
- B25J13/006—Controls for manipulators by means of a wireless system for controlling one or several manipulators
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- Manipulator (AREA)
Abstract
The application provides a robot motion control system and a motion control method thereof. The robot motion control system comprises a robot and a motion control device, and a communication connection channel is established between a first communication module arranged in the robot and a second communication module arranged in the motion control device. The robot receives, through the communication connection channel, user motion data measured by the motion control device, and controls each motion component in the robot to execute a corresponding motion according to the user motion data, wherein the user motion data is data generated by changes in the motion of the user's limbs. The motion control device collects the user motion data through a configured motion data measuring instrument and transmits it to the robot through the communication connection channel so as to control the motion of the robot. Based on this robot motion control system, the robot can be remotely controlled in a first-person manner.
Description
Technical Field
The present application relates to the field of robots, and more particularly, to a robot motion control system and a motion control method thereof, and an apparatus and a storage medium for executing the motion control method.
Background
With the rapid development of artificial intelligence technology, robots are widely applied across industries. A robot is an intelligent machine capable of semi-autonomous or fully autonomous operation that performs corresponding actions under human control and command. At present, human control of robots is basically realized through programming: the robot runs a preset program that makes it execute the corresponding actions. However, when a robot is controlled in this way, its actions are fixed once programming and compilation are complete; changing the robot's actions requires modifying the program, so flexibility is poor and modification is difficult. Moreover, remote control of the robot in a first-person manner cannot be achieved through pre-programming, and the interactivity between the robot and the person is difficult to realize.
Disclosure of Invention
In view of this, the embodiments of the present application provide a robot action control system and an action control method thereof, which enable remote control of a robot in a first-person manner and enhance the flexibility of the robot's action control.
A first aspect of an embodiment of the present application provides a robot motion control system, including a robot and a motion control device, where a communication connection channel is established between a first communication module set in the robot and a second communication module set in the motion control device, where:
the robot receives user action data measured by the action control device through the communication connection channel and controls all action components in the robot to execute corresponding actions according to the user action data, wherein the user action data is action data generated by user limb action change;
the motion control device collects user motion data through a configured motion data measuring instrument, and transmits the user motion data to the robot through the communication connection channel so as to perform motion control on the robot.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the motion data measurement apparatus configured in the motion control device includes head-mounted VR glasses, in which an inertial measurement sensor is installed and configured to obtain first rotation angle data generated by the rotation of the user's head.
With reference to the first aspect, in a second possible implementation manner of the first aspect, the motion data measurement apparatus configured in the motion control device further includes an operation seat having a pivot function, and an angle sensor is installed on a pivot chassis of the operation seat, for acquiring second rotation angle data generated by a body rotation motion of a user.
With reference to the first aspect, in a third possible implementation manner of the first aspect, the motion data measurement apparatus configured in the motion control device further includes a foot pedal provided with control blocks, each of which is provided with an inclination angle measuring instrument configured to obtain inclination depth data generated by the stepping motion of the user's foot.
With reference to the third possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, three control blocks are disposed in the foot pedal, and the three control blocks are respectively used for controlling a speed of forward motion, a speed of backward motion and making a jumping motion of the robot.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, the motion data measurement apparatus configured in the motion control device further includes a glove part, where a stretch sensor is installed on the glove part, and the stretch sensor is configured to obtain stretch data generated by a hand of a user based on gesture motion.
A second aspect of an embodiment of the present application provides an action control method, including:
receiving user action data measured by the action control device based on the communication connection channel;
according to the user action data, accessing a receiving interface in the robot, and determining a target action part in the robot correspondingly controlled by the user action data;
and configuring the operation parameters of the control equipment corresponding to the target action component according to the user action data so as to control the target action component to execute corresponding actions based on the operation parameters of the control equipment.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the action control method further includes:
inquiring a parameter configuration comparison table preset in the robot according to the user action data, and acquiring the operation parameters of the control equipment corresponding to the target action component from the parameter configuration comparison table based on the corresponding quantitative relation between the user action data and the operation parameters of the control equipment in the parameter configuration comparison table.
A third aspect of an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the action control method according to any implementation of the second aspect when executing the computer program.
A fourth aspect of an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the action control method according to any one of the second aspects.
Compared with the prior art, the embodiments of the present application have the following beneficial effects:
The robot action control system comprises a robot and an action control device, and a communication connection channel is established between a first communication module arranged in the robot and a second communication module arranged in the action control device. The robot receives user action data measured by the action control device through the communication connection channel and controls each action component in the robot to execute a corresponding action according to the user action data, where the user action data is data generated by changes in the motion of the user's limbs. The action control device collects the user action data through the configured motion data measuring instruments and transmits it to the robot through the communication connection channel so as to control the robot's actions. Based on this robot action control system, the robot can be remotely controlled in a first-person manner.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of a basic structure of a robot motion control system according to an embodiment of the present application;
fig. 2 is a schematic diagram of a detailed structure of a robot motion control system according to an embodiment of the present application;
fig. 3 is a schematic diagram of a robot motion control system according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a basic method of an action control method applied to the robot action control system according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an action control device according to an embodiment of the present application;
fig. 6 is a schematic diagram of an electronic device for implementing an action control method according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
In some embodiments of the present application, referring to fig. 1, fig. 1 is a schematic basic structure diagram of a robot motion control system according to an embodiment of the present application. As shown in fig. 1, the robot motion control system specifically includes a robot 10 and a motion control device 20, wherein a first communication module 101 is provided in the robot 10 and a second communication module 201 is provided in the motion control device 20, and a communication connection channel is established between the two modules. In the present embodiment, the robot 10 may receive the user motion data measured by the motion control device 20 through the communication connection channel, and control each motion component in the robot 10 to perform a corresponding motion according to the user motion data. The user motion data is motion data generated by changes in the motion of the user's limbs. By way of example, the first communication module 101 and the second communication module 201 may employ a high-speed communication module such as 5G. The motion control device 20 collects the user motion data through the configured motion data measuring instruments and transmits it to the robot 10 through the communication connection channel, thereby performing motion control on the robot 10. The motion data measuring instruments provided in the motion control device 20 are in contact with the moving parts of the user's body, such as the head, torso, and limbs, so as to monitor changes in limb motion and collect the user's motion data.
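As a rough illustration of the data flow just described, the channel between the two communication modules can be modeled as a simple message queue: the device side (module 201) pushes measured user action data, and the robot side (module 101) pops it. This is a minimal sketch; all names (`UserActionData`, `CommunicationChannel`, the `source` strings) are hypothetical and not taken from the patent, and a real 5G link would of course involve an actual network transport.

```python
from dataclasses import dataclass
from queue import Queue


@dataclass
class UserActionData:
    source: str   # which measuring instrument produced the data
    value: float  # measured quantity (angle, tilt depth, stretch, ...)


class CommunicationChannel:
    """Stand-in for the link between module 201 (device) and module 101 (robot)."""

    def __init__(self) -> None:
        self._queue: Queue = Queue()

    def send(self, data: UserActionData) -> None:
        # Device side: transmit one measurement.
        self._queue.put(data)

    def receive(self) -> UserActionData:
        # Robot side: take the next measurement off the channel.
        return self._queue.get()


channel = CommunicationChannel()
channel.send(UserActionData(source="head_imu", value=30.0))
msg = channel.receive()
print(msg.source, msg.value)  # head_imu 30.0
```

The queue preserves the order of measurements, which matters when successive readings encode a continuous limb motion.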
In some embodiments of the present application, please refer to fig. 2 and fig. 3 together, fig. 2 is a schematic diagram of a refinement structure of a robot motion control system according to an embodiment of the present application; fig. 3 is a schematic diagram of a robot motion control system according to an embodiment of the present application.
As shown in fig. 2 and 3, the motion data measuring instrument configured in the motion control device 20 may specifically include head-mounted Virtual Reality (VR) glasses 202. In this embodiment, an inertial measurement sensor 2021 may be installed in the head VR glasses 202 and used to obtain the first rotation angle data generated by the rotation of the user's head. In the present embodiment, the second communication module 201 provided in the motion control device 20 is integrated into the head VR glasses 202. The inertial measurement sensor 2021 and the second communication module 201 are connected to the head VR glasses 202 through a bus interface such as SPI (Serial Peripheral Interface), USB (Universal Serial Bus), I2C (Inter-Integrated Circuit), or SDIO (Secure Digital Input and Output). Through the head VR glasses 202, live video data collected by the robot 10 may be received via the second communication module 201 and displayed in the glasses, so that the user can see in real time the live video captured by the robot 10. When the user wearing the head VR glasses 202 turns his or her head to the left or right, the rotation data of the head is measured by the inertial measurement sensor 2021 in the head VR glasses 202 and fed back to the robot 10 through the second communication module 201; the controller of the robot 10 then rotates the head of the robot 10 according to the rotation data, thereby changing the orientation of the robot's head.
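The head-rotation mapping above can be sketched in a few lines: the robot's controller accumulates the yaw deltas reported by the inertial sensor onto the robot head's current azimuth. This is an illustrative assumption, not the patent's implementation; the function name and the degree-based convention are hypothetical.

```python
def head_azimuth_command(current_azimuth_deg: float, yaw_delta_deg: float) -> float:
    """Apply a measured head-rotation delta to the robot head's azimuth,
    wrapping the result into [0, 360)."""
    return (current_azimuth_deg + yaw_delta_deg) % 360.0


print(head_azimuth_command(350.0, 20.0))  # 10.0  (wraps past north)
```

Accumulating deltas rather than sending absolute orientations keeps the command meaningful even if the user's and robot's reference frames differ at startup.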
As shown in fig. 2 and 3, the motion data measuring instrument provided in the motion control device 20 further includes an operating seat 203 with a pivot function, realized by a rotating chassis. In this embodiment, an angle sensor 2031 is installed on the rotating chassis of the operating seat 203 and is used to acquire the second rotation angle data generated by the rotation of the user's body. The operating seat 203 exchanges data with the head VR glasses 202 through a short-range wireless communication technology such as Bluetooth or Wi-Fi. When the user sitting on the operating seat 203 rotates it to the left or right, the rotation data is measured by the angle sensor 2031, transmitted to the head VR glasses 202 over the short-range wireless link, and then fed back to the robot 10 through the second communication module 201; the controller of the robot 10 rotates the body of the robot 10 according to the rotation data, thereby changing the orientation the robot's body faces. By way of example, the angle sensor 2031 may be a sliding-resistance sensor or a magnetic-encoder sensor.
As shown in fig. 2 and 3, the motion data measuring apparatus provided in the motion control device 20 further includes a foot pedal 204 in which control blocks 2042 are provided. An inclinometer 2041 is mounted on each control block 2042 and is used to acquire the inclination depth data generated by the stepping motion of the user's foot. For example, three control blocks 2042, each with a certain inclination angle, may be provided in the foot pedal 204: one controls the speed of the forward motion of the robot's feet, one controls the speed of their backward motion, and one triggers a jumping motion. In this embodiment, the foot pedal 204 exchanges data with the head VR glasses 202 through a short-range wireless communication technology such as Bluetooth or Wi-Fi. When the user steps firmly on a control block, its inclination angle changes; the inclinometer 2041 on the control block 2042 measures the resulting inclination depth value, which is transmitted to the head VR glasses 202 over the short-range wireless link and then fed back to the robot 10 through the second communication module 201. The controller of the robot 10 controls the feet of the robot 10 according to the inclination depth value, changing the forward or backward speed of the robot's feet or making it jump.
Specifically, when the control block 2042 stepped on by the user is the one controlling the forward speed of the robot's feet, the inclination depth value measured by the inclinometer 2041 is proportional to the forward speed: the deeper the depression, the faster the robot 10 moves forward. When it is the block controlling the backward speed, the inclination depth value is likewise proportional to the backward speed. When it is the block controlling the jumping motion, the robot 10 performs a jump once the inclination depth value measured by the inclinometer 2041 reaches a preset threshold.
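The pedal semantics above (speed proportional to depth, jump on a threshold) can be written down directly. This is a sketch under assumed values: the normalization to [0, 1], `MAX_SPEED`, `JUMP_THRESHOLD`, and the command names are all hypothetical, not taken from the patent.

```python
FORWARD, BACKWARD, JUMP = "forward", "backward", "jump"
JUMP_THRESHOLD = 0.8  # hypothetical normalized tilt depth that triggers a jump
MAX_SPEED = 1.5       # hypothetical top walking speed, m/s


def pedal_command(block: str, tilt_depth: float):
    """Translate a control block's normalized tilt depth (0..1) into a
    robot command: speed is proportional to depth; jump fires above a
    threshold."""
    depth = max(0.0, min(1.0, tilt_depth))  # clamp noisy sensor readings
    if block == FORWARD:
        return ("walk_forward", depth * MAX_SPEED)
    if block == BACKWARD:
        return ("walk_backward", depth * MAX_SPEED)
    if block == JUMP:
        return ("jump", None) if depth >= JUMP_THRESHOLD else ("idle", None)
    raise ValueError(f"unknown control block: {block}")


print(pedal_command(FORWARD, 0.5))  # ('walk_forward', 0.75)
```

Clamping the raw reading before scaling keeps a mis-calibrated inclinometer from commanding an out-of-range speed.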
As shown in fig. 2 and 3, the motion data measuring apparatus configured in the motion control device 20 further includes a glove component 205 on which stretch sensors 2051 are mounted; the stretch sensors 2051 are used to obtain the stretch data generated by the user's hand during gesture actions. In this embodiment, the glove component 205 exchanges data with the head VR glasses 202 through a short-range wireless communication technology such as Bluetooth or Wi-Fi. When the user wearing the glove component 205 bends or stretches the fingers, the stretch data of each finger is measured by the stretch sensors 2051, transmitted to the head VR glasses 202 over the short-range wireless link, and then fed back to the robot 10 through the second communication module 201. The controller of the robot 10 controls the bending or stretching of each finger of the robot's hand according to the stretch data, thereby reproducing the user's hand gestures.
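A per-finger mapping from stretch readings to joint angles might look like the following. The linear mapping, the 0-to-1 normalization, and the 90-degree maximum bend are illustrative assumptions; the patent does not specify the sensor scale or the hand's joint limits.

```python
def finger_joint_angles(stretch_values, max_bend_deg: float = 90.0):
    """Map normalized stretch-sensor readings (0 = finger straight,
    1 = fully bent) onto bend angles for the robot hand's fingers."""
    return [max(0.0, min(1.0, s)) * max_bend_deg for s in stretch_values]


# One reading per finger, thumb to little finger (hypothetical values):
print(finger_joint_angles([0.0, 0.5, 1.0, 0.2, 0.2]))
# [0.0, 45.0, 90.0, 18.0, 18.0]
```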
In some embodiments of the present application, referring to fig. 4, fig. 4 is a schematic flow chart of a basic method of an action control method applied to the robot action control system according to an embodiment of the present application. The details are as follows:
step S41: and receiving user action data measured by the action control device based on the communication connection channel.
In this embodiment, a communication connection channel is established between the first communication module provided in the robot and the second communication module provided in the motion control device. The user performs actions while wearing or using the motion data measuring instruments configured in the motion control device. As the user moves, each measuring instrument monitors the change in the user's limb motion, measures the action data generated, and transmits it to the second communication module; the data is then forwarded over the communication connection channel to the first communication module, so that the robot receives the user action data measured by the motion control device.
Step S42: and accessing the receiving interface in the robot according to the user action data, and determining a target action part in the robot correspondingly controlled by the user action data.
In this embodiment, before sending user action data over the communication connection channel, the motion control device attaches a data header identifying the data source, in which the address information of the originating motion data measuring instrument is recorded. When the robot receives user action data, it parses the data header to obtain the address information of the measuring instrument, selects the receiving interface applicable to that address, and admits the user action data into the robot through the selected interface. The robot provides a plurality of receiving interfaces for admitting data, and the applicable range of each receiving interface is preset using the address information of the motion data measuring instruments as the criterion. Therefore, upon receiving user action data, the robot compares the address information recorded in the data header with the address information configured for each receiving interface, and thereby selects the interface through which the data is admitted. In this embodiment, each receiving interface has a one-to-one matching relationship with an action component in the robot; specifically, each receiving interface is configured with the code identifier of its matching action component.
After the user action data is accessed to the robot through the selected receiving interface, traversing the coding identification of each action component of the robot according to the coding identification of the selected receiving interface, and determining the target action component in the robot correspondingly controlled by the user action data.
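The header-based dispatch in steps S41 and S42 can be sketched as a lookup from source address to target action component. The addresses, component identifiers, and packet layout below are hypothetical stand-ins for whatever addressing scheme a concrete implementation would use.

```python
# Hypothetical mapping: each receiving interface is keyed by the address of
# the measuring instrument it accepts, and carries the code identifier of
# the action component it controls.
RECEIVING_INTERFACES = {
    "addr:head_imu":   "head",
    "addr:seat_angle": "torso",
    "addr:pedal":      "legs",
    "addr:glove":      "hand",
}


def dispatch(packet: dict) -> str:
    """Read the data header, select the receiving interface by the source
    address recorded there, and return the code identifier of the target
    action component."""
    source_addr = packet["header"]["source_address"]
    try:
        return RECEIVING_INTERFACES[source_addr]
    except KeyError:
        raise ValueError(f"no receiving interface accepts {source_addr}")


print(dispatch({"header": {"source_address": "addr:pedal"}, "payload": 0.4}))
# legs
```

Rejecting unknown source addresses, rather than guessing a component, mirrors the patent's use of address information as an admission criterion.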
Step S43: and configuring the operation parameters of the control equipment corresponding to the target action component according to the user action data so as to control the target action component to execute corresponding actions based on the operation parameters of the control equipment.
In this embodiment, each action component of the robot has a corresponding control device, and by adjusting the operation parameters of the control device, the posture or state of the action component can be changed, such as its orientation or the amplitude of its motion. Because the user action data is generated by changes in the user's limb actions, when the robot action control system is used to control the robot in a first-person manner, the operation parameters of the control device corresponding to the target action component can be configured according to the received user action data, so that the target action component is controlled to perform the corresponding action based on those operation parameters, thereby changing its posture or state. For example, if the user action data is the first rotation angle data generated by the rotation of the user's head, the azimuth parameter of the control device that rotates the robot's head can be configured from the first rotation angle data; the control device then operates according to the configured azimuth parameter, rotating the robot's head to the corresponding orientation.
In some embodiments of the present application, for the control devices in the robot that drive each action component, a corresponding quantitative relationship between user action data and control-device operation parameters is preset, forming a parameter configuration comparison table. In this embodiment, after obtaining the user action data, the robot configures the operation parameters of the control device corresponding to the target action component by querying this table: based on the quantitative relationship recorded in the table, the operation parameters corresponding to the received user action data are read out. For example, assume the user action data is the inclination depth generated by the user stepping on the control block that governs the forward speed of the robot's feet; the control device corresponding to the feet is a motor, and the motor's rotation speed parameter controls the forward speed.
The parameter configuration comparison table records the quantitative relationship between inclination depth and motor speed, for example as a curve or a numerical correspondence table provided with the motor device. In that case, the motor speed parameter corresponding to the inclination depth value recorded in the user action data can be read from the curve or correspondence table, and the obtained motor speed is configured as the operation parameter of the motor device driving the robot's feet forward. The robot then runs the motor at that speed, so that its feet move forward at the speed corresponding to the configured motor speed parameter.
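A numerical correspondence table of the kind just described is naturally implemented as a piecewise-linear lookup. The breakpoints below are invented for illustration; a real table would come from the motor vendor's curve or from calibration.

```python
# Hypothetical inclination-depth -> motor-rpm comparison table
# (normalized depth, motor speed in rpm):
TILT_TO_RPM = [(0.0, 0.0), (0.25, 300.0), (0.5, 600.0), (1.0, 1200.0)]


def motor_rpm_for_tilt(depth: float) -> float:
    """Read the motor speed parameter for a given inclination depth,
    linearly interpolating between the table's breakpoints and clamping
    outside its range."""
    pts = TILT_TO_RPM
    if depth <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if depth <= x1:
            # Linear interpolation within the bracketing segment.
            return y0 + (y1 - y0) * (depth - x0) / (x1 - x0)
    return pts[-1][1]


print(motor_rpm_for_tilt(0.375))  # 450.0
```

Clamping at the table's ends means an over-range sensor reading saturates the speed instead of extrapolating past the motor's rated rpm.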
In the motion control method provided by this embodiment, the user action data measured by the action control device is received over the communication connection channel; the target action component in the robot controlled by the user action data is determined according to the receiving interface in the robot that the data arrives on; and the operation parameters of the control device corresponding to the target action component are configured according to the user action data, so that the target action component executes the corresponding action based on those operation parameters. In this way the robot is remotely controlled from a first-person perspective, and the action control of each action component of the robot is realized by the movement of the corresponding part of the user's body, which enhances the interactivity between the robot and the person.
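The three-step flow summarized above (receive the action data, resolve the target action component from the receiving interface, configure the control device) might be sketched as follows; this is a hedged illustration, and all names (`ActionData`, `COMPONENT_REGISTRY`, `handle_user_action`) are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class ActionData:
    interface_id: str  # receiving interface the data arrived on
    value: float       # measured user-action value (e.g. tilt depth)

# Hypothetical mapping from receiving interface to robot action component.
COMPONENT_REGISTRY = {
    "pedal_forward": "feet_forward_motor",
    "head_imu": "head_servo",
}

def handle_user_action(data: ActionData) -> str:
    # Step 1: the data has already been received over the channel.
    # Step 2: determine the target action component from the interface.
    component = COMPONENT_REGISTRY[data.interface_id]
    # Step 3: configure the control device and trigger the action.
    return f"{component} configured with parameter {data.value}"
```

The dispatch is driven entirely by the receiving interface, which is what lets each part of the user's body map to a distinct action component of the robot.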
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process is determined by its function and internal logic, and does not limit the implementation of the embodiments of the present application.
In some embodiments of the present application, refer to fig. 5, which is a schematic diagram of a motion control apparatus according to an embodiment of the present application. As shown in fig. 5, the motion control apparatus includes: a data receiving module 51, an action component determining module 52, and an action control module 53. The data receiving module 51 is configured to receive the user action data measured by the action control device over the communication connection channel. The action component determining module 52 is configured to determine the target action component in the robot controlled by the user action data, according to the receiving interface in the robot that the data arrives on. The action control module 53 is configured to set the operation parameters of the control device corresponding to the target action component according to the user action data, so as to control the target action component to execute the corresponding action based on those operation parameters.
In some embodiments of the present application, the motion control apparatus further includes a parameter configuration module, where the parameter configuration module is configured to query a parameter configuration comparison table preset in the robot according to the user motion data, and obtain, from the parameter configuration comparison table, an operation parameter of the control device corresponding to the target motion component based on a corresponding quantitative relationship between the user motion data and the operation parameter of the control device in the parameter configuration comparison table.
In some embodiments of the present application, refer to fig. 6, which is a schematic diagram of an electronic device for implementing an action control method according to an embodiment of the present application. As shown in fig. 6, the electronic device 6 of this embodiment includes: a processor 61, a memory 62, and a computer program 63 stored in the memory 62 and executable on the processor 61, for example an intelligent inspection program for an inspection robot. When the processor 61 executes the computer program 63, the steps of the action control method embodiments described above are implemented. Alternatively, when executing the computer program 63, the processor 61 performs the functions of the modules/units in the above-described apparatus embodiments.
Illustratively, the computer program 63 may be partitioned into one or more modules/units, which are stored in the memory 62 and executed by the processor 61 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specified functions, the segments describing the execution of the computer program 63 in the electronic device 6. For example, the computer program 63 may be split into:
the data receiving module is used for receiving the user action data measured by the action control device based on the communication connection channel;
the action part determining module is used for determining a target action part in the robot correspondingly controlled by the user action data according to the receiving interface of the robot accessed by the user action data;
and the action control module is used for configuring the operation parameters of the control equipment corresponding to the target action component according to the user action data so as to control the target action component to execute the corresponding action based on the operation parameters of the control equipment.
The electronic device may include, but is not limited to, the processor 61 and the memory 62. It will be appreciated by those skilled in the art that fig. 6 is merely an example of the electronic device 6 and does not limit it; the electronic device 6 may include more or fewer components than shown, combine certain components, or use different components. For example, the electronic device may further include input-output devices, network access devices, a bus, and the like.
The processor 61 may be a central processing unit (Central Processing Unit, CPU), but may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 62 may be an internal storage unit of the electronic device 6, such as a hard disk or a memory of the electronic device 6. The memory 62 may also be an external storage device of the electronic device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card) provided on the electronic device 6. Further, the memory 62 may include both an internal storage unit and an external storage device of the electronic device 6. The memory 62 is used to store the computer program as well as other programs and data required by the electronic device. The memory 62 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated. In practical applications, the functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit; the integrated units may be implemented in the form of hardware or of software functional units. In addition, the specific names of the functional units and modules are only for distinguishing them from each other and do not limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not detailed or described in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on this understanding, all or part of the flow of the methods of the above embodiments may be implemented by a computer program instructing related hardware; the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and so forth. It should be noted that the content included in the computer readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in certain jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (8)
1. A robot motion control system, characterized by comprising a robot and a motion control device, wherein a communication connection channel is established between a first communication module arranged in the robot and a second communication module arranged in the motion control device, and wherein:
the robot receives user action data measured by the action control device through the communication connection channel and controls all action components in the robot to execute corresponding actions according to the user action data, wherein the user action data is action data generated by user limb action change;
the motion control device collects user motion data through a configured motion data measuring instrument, and transmits the user motion data to the robot through the communication connection channel to perform motion control on the robot, wherein the motion data measuring instrument configured by the motion control device comprises a foot pedal provided with three control blocks, and the three control blocks are respectively provided with an inclination angle measuring instrument for acquiring inclination angle depth data generated by stepping motion of the foot of a user so as to respectively control the forward motion speed, the backward motion speed and the jumping motion of the robot.
2. The robot motion control system of claim 1, wherein the motion data measuring instrument configured in the motion control device comprises head-mounted VR glasses having an inertial measurement sensor mounted therein for acquiring first rotation angle data generated by a rotation motion of the user's head.
3. The robot motion control system of claim 1, wherein the motion data measuring instrument provided in the motion control device further comprises an operation seat having a pivot function, and wherein the pivot chassis of the operation seat is provided with an angle sensor for acquiring second rotation angle data generated by a user's body rotation motion.
4. The robot motion control system of claim 1, wherein the motion data measuring instrument configured in the motion control device further comprises a glove member, and wherein the glove member is provided with a stretch sensor for acquiring stretch data generated by a hand of a user based on gesture motion.
5. A motion control method, wherein the motion control method is applied to the robot motion control system according to any one of claims 1 to 4, and comprises:
receiving user action data measured by the action control device based on the communication connection channel;
according to the user action data, accessing a receiving interface in the robot, and determining a target action part in the robot correspondingly controlled by the user action data;
and configuring the operation parameters of the control equipment corresponding to the target action component according to the user action data so as to control the target action component to execute corresponding actions based on the operation parameters of the control equipment.
6. The motion control method according to claim 5, wherein the step of configuring the operation parameters of the control device corresponding to the target motion component according to the user motion data, so as to control the target motion component to perform the corresponding motion based on the operation parameters, includes:
inquiring a parameter configuration comparison table preset in the robot according to the user action data, and acquiring the operation parameters of the control equipment corresponding to the target action component from the parameter configuration comparison table based on the corresponding quantitative relation between the user action data and the operation parameters of the control equipment in the parameter configuration comparison table.
7. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the action control method according to claim 5 or 6 when the computer program is executed.
8. A computer-readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the action control method according to claim 5 or 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111654304.6A CN114227689B (en) | 2021-12-30 | 2021-12-30 | Robot motion control system and motion control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114227689A CN114227689A (en) | 2022-03-25 |
CN114227689B true CN114227689B (en) | 2023-11-17 |
Family
ID=80744953
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111654304.6A Active CN114227689B (en) | 2021-12-30 | 2021-12-30 | Robot motion control system and motion control method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114227689B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101534316A (en) * | 2008-03-13 | 2009-09-16 | 洪发生 | Remote service system |
CN101890719A (en) * | 2010-07-09 | 2010-11-24 | 中国科学院深圳先进技术研究院 | Robot remote control device and robot system |
CN110328648A (en) * | 2019-08-06 | 2019-10-15 | 米召礼 | A kind of man-machine working machine moved synchronously |
CN111203874A (en) * | 2019-12-26 | 2020-05-29 | 深圳市优必选科技股份有限公司 | Robot control method, device, electronic device and storage medium |
CN111452046A (en) * | 2020-03-31 | 2020-07-28 | 佛山科学技术学院 | Virtual reality-based explosive-handling robot system, control method and storage medium |
CN112621715A (en) * | 2020-12-08 | 2021-04-09 | 深圳市迈步机器人科技有限公司 | Upper limb exoskeleton control method and control device based on voice input |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6754364B2 (en) * | 2015-08-25 | 2020-09-09 | 川崎重工業株式会社 | Robot system |
KR102543212B1 (en) * | 2015-10-26 | 2023-06-14 | (주)한화 | System and method for controlling robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11061407B2 (en) | Biped robot equivalent trajectory generating method and biped robot using the same | |
CN107838917A (en) | Robot simulation apparatus | |
CN102024316B (en) | Wireless intelligent sensing method, device and system | |
CN111061363A (en) | Virtual reality system | |
CN110405750B (en) | Motion control method and device of robot and robot | |
CN114227689B (en) | Robot motion control system and motion control method thereof | |
CN114001696B (en) | Three-dimensional scanning system, working precision monitoring method and three-dimensional scanning platform | |
CN214751405U (en) | Multi-scene universal edge vision motion control system | |
Prasad et al. | Hand gesture controlled robot | |
US20230333541A1 (en) | Mobile Brain Computer Interface | |
CN109313483A (en) | A kind of device interacted with reality environment | |
CN110091323B (en) | Intelligent equipment, robot control method and device with storage function | |
CN114888809B (en) | Robot control method and device, computer readable storage medium and robot | |
CN116999291A (en) | Rehabilitation robot control method based on multi-source information perception and electronic equipment | |
CN107291224B (en) | Equipment control method and equipment control device | |
CN115984533A (en) | Capture method, system, computing device, and computer storage medium | |
CN109976357A (en) | A kind of automatic driving control system and method | |
WO2019193574A1 (en) | System and method for heterogenous data collection and analysis in a deterministic system | |
US12073020B2 (en) | Head-mounted display, unlocking method and non-transitory computer readable storage medium thereof | |
CN115602032B (en) | Digestive endoscopy operation training system based on virtual reality | |
CN113610901B (en) | Binocular motion capture camera control device and all-in-one equipment | |
CN109833029B (en) | Sleep staging method, system and terminal equipment | |
CN108563335A (en) | Virtual reality exchange method, device, storage medium and electronic equipment | |
JP7048151B2 (en) | Motion judgment device, motion judgment method, and program | |
CN111191536A (en) | Motion capture system and method based on 5G communication technology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||