CN117607810A - Sensor calibration method for robot, and storage medium - Google Patents
- Publication number
- CN117607810A (application CN202311441483.4A)
- Authority
- CN
- China
- Prior art keywords
- sensor
- distance
- robot
- target object
- detection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52004—Means for monitoring or calibrating
Abstract
The embodiments of the present application provide a sensor calibration method for a robot, a robot, and a storage medium. The robot includes a first sensor and a second sensor, both of which are distance sensors, and the ranging accuracy of the first sensor is higher than that of the second sensor. The method includes the following steps: determining the distance between the robot and a target object, as detected by the first sensor, as a first distance; acquiring sensor data of the target object from the second sensor; determining, according to the first distance, a second distance between the target object and the position on the robot corresponding to the second sensor; and calibrating the detection parameters and/or the detection distance of the second sensor according to the second distance and the sensor data. Because the second sensor is calibrated using a distance detected by the more accurate first sensor, the accuracy of the second sensor's distance detection is improved.
Description
Technical Field
The present disclosure relates to the field of robots, and in particular to a sensor calibration method for a robot, a robot, and a storage medium.
Background
A robot needs to be equipped with multiple distance sensors to implement functions such as obstacle avoidance, edge following, and fall prevention. High-precision distance sensors are preferred, but they are usually expensive, so a lower-precision sensor may be selected for certain scenarios.
In the related art, the distances corresponding to various sensor parameters are measured before the robot leaves the factory (for example, different voltages or different currents are mapped to different distances), and this mapping is then fixed in the machine as shipped. However, as a low-precision sensor is used, its accuracy degrades with the aging of devices and structures, so the deviation between the factory-calibrated distance for a given voltage or current and the true distance grows ever larger, until the sensor's function fails and it can no longer be used.
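Such a factory calibration is essentially a fixed mapping from a raw electrical reading to a distance. A minimal Python sketch of such a lookup (the table values and the function name are illustrative assumptions, not taken from the patent):

```python
import bisect

# Hypothetical factory calibration: measured (voltage, distance_mm) pairs,
# sorted by voltage, fixed into the robot at shipment.
FACTORY_TABLE = [(0.4, 300.0), (0.9, 200.0), (1.5, 120.0), (2.2, 60.0), (3.0, 25.0)]

def voltage_to_distance(voltage, table=FACTORY_TABLE):
    """Linearly interpolate a distance (mm) from a raw sensor voltage."""
    volts = [v for v, _ in table]
    if voltage <= volts[0]:
        return table[0][1]
    if voltage >= volts[-1]:
        return table[-1][1]
    i = bisect.bisect_right(volts, voltage)
    (v0, d0), (v1, d1) = table[i - 1], table[i]
    t = (voltage - v0) / (v1 - v0)
    return d0 + t * (d1 - d0)
```

Once such a table is fixed and never updated, any drift in the sensor's electrical response shifts every distance derived from it, which is the problem the present calibration method addresses.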
Disclosure of Invention
The present application provides a sensor calibration method for a robot, a robot, and a storage medium, with the aim of calibrating a sensor on the robot so as to improve its accuracy.
In a first aspect, an embodiment of the present application provides a sensor calibration method for a robot, where the robot includes a first sensor and a second sensor, both of which are distance sensors, and the ranging accuracy of the first sensor is higher than that of the second sensor; the method includes:
determining the distance between the robot and a target object, as detected by the first sensor, as a first distance;
acquiring sensor data of the target object from the second sensor;
determining, according to the first distance, a second distance between the target object and the robot corresponding to the second sensor;
and calibrating the detection parameters of the second sensor and/or the detection distance of the second sensor according to the second distance and the sensor data.
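Taken together, these four steps might be sketched as follows; the sensor interfaces and the simple additive offset model are assumptions made for illustration, not the patent's implementation:

```python
class FirstSensor:
    """Stand-in for the high-accuracy distance sensor (e.g. a PSD or dTOF)."""
    def read_distance(self):
        return 100.0  # mm

class SecondSensor:
    """Stand-in for the lower-accuracy sensor being calibrated."""
    def __init__(self):
        self.calibration = {}  # raw reading -> calibrated distance (mm)
    def read_raw(self):
        return 1.7  # e.g. a photocurrent, in mA
    def update_calibration(self, raw_value, true_distance):
        self.calibration[raw_value] = true_distance

def calibrate_second_sensor(first_sensor, second_sensor, offset_mm):
    """Sketch of steps S110-S140 of the method."""
    first_distance = first_sensor.read_distance()              # S110
    sensor_data = second_sensor.read_raw()                     # S120
    second_distance = first_distance + offset_mm               # S130 (assumed offset model)
    second_sensor.update_calibration(sensor_data, second_distance)  # S140
    return second_distance
```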
In a second aspect, an embodiment of the present application provides a robot, where the robot includes a first sensor and a second sensor, the first sensor and the second sensor are both distance sensors, and positions and/or detection directions of the first sensor and the second sensor on the robot are different;
the robot further includes a processor and a memory for storing a computer program; the processor is configured to execute the computer program and, in doing so, to implement the steps of the sensor calibration method for a robot described above.
In a third aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the steps of the method described above.
The embodiments of the present application provide a sensor calibration method for a robot, a robot, and a storage medium. The robot includes a first sensor and a second sensor, both distance sensors, with the ranging accuracy of the first sensor higher than that of the second sensor. The method includes: determining the distance between the robot and a target object, as detected by the first sensor, as a first distance; acquiring sensor data of the target object from the second sensor; determining, according to the first distance, a second distance between the target object and the robot corresponding to the second sensor; and calibrating the detection parameters and/or the detection distance of the second sensor according to the second distance and the sensor data. By calibrating the second sensor with the first distance detected by the first sensor, the accuracy of the second sensor's distance detection can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure of embodiments of the present application.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present application; other drawings can be derived from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic flow chart of a sensor calibration method of a robot according to an embodiment of the present application;
FIG. 2 is a schematic block diagram of a cleaning robot in one embodiment;
FIG. 3 is a schematic view of a sensor of a cleaning robot in one embodiment;
FIG. 4 is a schematic view of a sensor of a cleaning robot in another embodiment;
FIG. 5 is a schematic diagram of a sensor calibration method in one embodiment;
FIG. 6 is a schematic diagram of a sensor calibration method in another embodiment;
FIG. 7 is a schematic diagram of a correspondence relationship between adjustment distance and reception parameters in an embodiment;
fig. 8 is a schematic block diagram of a robot provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art from the present disclosure without inventive effort fall within the scope of the present disclosure.
The flow diagrams depicted in the figures are merely illustrative: not all of the illustrated elements and operations/steps are required, nor must they be performed in the order described. For example, some operations/steps may be further divided, combined, or partially merged, so the actual order of execution may change according to the situation.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The following embodiments and their features may be combined with each other where no conflict arises.
Referring to fig. 1, fig. 1 is a flow chart of a sensor calibration method of a robot according to an embodiment of the present application.
For example, the robot may be a cleaning robot, a service robot, or the like, and may also be a pet robot or similar. A cleaning robot is a device designed for cleaning, including but not limited to a vacuum cleaner, a floor scrubber, a wet-dry vacuum, a sweeping robot, a mopping robot, a combined sweeping-and-mopping robot, and the like.
For convenience of explanation, the embodiments of the present application will mainly be described with reference to a robot as a cleaning robot.
Fig. 2 is a schematic block diagram of a cleaning robot in an embodiment. The cleaning robot includes a robot body, a driving motor 102, a sensor unit 103, a controller 104, a cleaning member 105, a traveling unit 106, a memory 107, a communication unit 108, an interaction unit 109, an energy storage unit 110, and the like.
The sensor unit 103 provided on the robot body includes various sensors, such as a lidar, a collision sensor, a distance sensor, a drop sensor, a counter, and a gyroscope. For example, the lidar is arranged on top of the robot body; in operation it acquires surrounding environment information, such as the distance and angle of an obstacle relative to the lidar. A camera may be used in place of a lidar: by analyzing an obstacle in the image captured by the camera, the distance and angle of the obstacle relative to the camera can likewise be obtained. The collision sensor includes, for example, a collision housing and a trigger sensor. When the cleaning robot collides with an obstacle via the collision housing, the housing moves toward the interior of the robot and compresses an elastic buffer member. After the housing has moved a certain distance inward, it contacts the trigger sensor, which is triggered and generates a signal that can be sent to the controller 104 in the robot body for processing. Once the robot has moved away from the obstacle, the collision housing returns to its original position under the action of the elastic buffer member. Thus the collision sensor both detects obstacles and buffers the impact. The distance sensor may specifically be an infrared detection sensor and can be used to detect the distance from an obstacle to the sensor. A distance sensor may be arranged on a side of the robot body, so that the distance from an obstacle near that side of the cleaning robot to the sensor can be measured.
The distance sensor may also be an ultrasonic distance sensor, a laser distance sensor, a TOF (Time-of-Flight) sensor, a photoelectric position sensor, or a depth sensor, for example a dTOF (direct Time-of-Flight) sensor. The drop sensor is arranged at the bottom edge of the robot body; when the cleaning robot moves to the edge of the floor, the drop sensor can detect the risk of the robot falling from a height, so that a corresponding anti-fall reaction can be performed, for example stopping, or moving away from the drop position. The drop sensor may be a ground-penetrating sensor, a cliff sensor, or the like. A counter and a gyroscope are also provided inside the robot body. The counter is used to measure the distance the cleaning robot has traveled. The gyroscope is used to detect the angle through which the cleaning robot has rotated, so that the robot's orientation can be determined.
The controller 104 is provided inside the robot main body, and the controller 104 is used to control the cleaning robot to perform a specific operation. The controller 104 may be, for example, a central processing unit (Central Processing Unit, CPU), a Microprocessor (Microprocessor), or the like. As shown in fig. 2, the controller 104 is electrically connected to the energy storage unit 110, the memory 107, the driving motor 102, the traveling unit 106, the sensor unit 103, the interaction unit 109, the cleaning member 105, and the like to control these components.
The cleaning members 105 may be used to clean the floor, and there may be one or more of them. A cleaning member 105 comprises, for example, a mop, which may be at least one of the following: a rotary mop, a flat mop, a roller mop, a crawler mop, and so on, though it is not limited to these. The mop is arranged at the bottom of the robot body, specifically toward the rear of the bottom. Taking a rotary mop as an example, a driving motor 102 is arranged inside the robot body, two rotating shafts extend from the bottom of the body, and the mops are fitted onto the shafts. The driving motor 102 can rotate the shafts, which in turn rotate the mops.
The traveling unit 106 is a component related to movement of the cleaning robot, and as shown in fig. 3 and 4, the traveling unit 106 includes, for example, a driving wheel 1061 and a universal wheel 1062. The universal wheel 1062 and the driving wheel 1061 cooperate to effect steering and movement of the cleaning robot.
A memory 107 is provided on the robot body, and a program is stored on the memory 107, which when executed by the controller 104, realizes a corresponding operation. The memory 107 is also used to store parameters for use by the cleaning robot. The Memory 107 includes, but is not limited to, a magnetic disk Memory, a compact disk read Only Memory (CD-ROM), an optical Memory, and the like.
A communication unit 108 provided on the robot main body, the communication unit 108 for allowing the cleaning robot to communicate with external devices; for example with a terminal or with a base station. Wherein the base station is a cleaning device for use with a cleaning robot.
The interaction unit 109 is provided on the robot main body, and a user can interact with the cleaning robot through the interaction unit 109. The interaction unit 109 includes, for example, at least one of a touch screen, a switch button, a speaker, and the like. For example, the user can control the cleaning robot to start or stop by pressing a switch button.
The energy storage unit 110 is disposed inside the robot body, and the energy storage unit 110 is used to provide power for the cleaning robot.
The robot body is further provided with a charging part for acquiring power from an external device to charge the energy storage unit 110 of the cleaning robot.
It should be understood that the cleaning robot described in the embodiments is only a specific example and does not limit the embodiments of the present invention; other specific implementations are possible. In other implementations, the cleaning device may have more or fewer components than the cleaning robot shown in FIG. 2. For example, the cleaning device may include a clean-water tank for storing clean water and/or a recovery tank for storing dirt. The device may deliver clean water from the tank to the mop and/or the floor, wetting the mop and cleaning the floor with the wet mop, and it may collect floor dirt or dirty sewage into the recovery tank. The cleaning device may also deliver clean water from the tank to the cleaning member to wash it, with the dirty sewage produced by washing the cleaning member likewise conveyed to the recovery tank.
The following describes in detail a sensor calibration method of a robot provided in an embodiment of the present application.
Specifically, as shown in fig. 3 and 4, the robot includes a first sensor 10 and a second sensor 20, and the first sensor 10 and the second sensor 20 are distance sensors.
In some embodiments, the ranging accuracy of the first sensor 10 is higher than that of the second sensor 20. The embodiments of the present application can calibrate the second sensor 20 against the ranging result of the first sensor 10, thereby improving the ranging accuracy of the second sensor 20, preventing its function from being lost over time, and ensuring the reliability of its long-term operation.
Illustratively, the first sensor 10 is at least one of the following: a PSD (Position Sensitive Detector) sensor, a TOF (Time-of-Flight) sensor, a lidar (Light Detection and Ranging) sensor, and the like; the second sensor 20 is at least one of the following: an infrared ranging sensor, an ultrasonic ranging sensor, and the like. A PSD sensor determines the distance to an obstacle by detecting how far the centroid of the reflected emitted light is offset on the PSD element, and its ranging accuracy is high.
For example, the sensor calibration method of the embodiments of the present application may be performed, and the second sensor 20 calibrated, when a preset calibration condition is met. For example, the second sensor 20 is calibrated periodically according to a preset calibration period (also called an interval-duration threshold, such as one month). This is not limiting; the second sensor 20 may instead be calibrated, for example, each time the robot is switched on.
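The preset calibration condition can be as simple as an elapsed-time check combined with a power-on flag. A hedged sketch (the period value and the function name are illustrative assumptions):

```python
from datetime import datetime, timedelta

# Assumed interval-duration threshold: one month, as in the example above.
CALIBRATION_PERIOD = timedelta(days=30)

def should_calibrate(last_calibrated, now, just_powered_on=False):
    """Return True when the preset calibration condition is met:
    either the interval-duration threshold has elapsed since the last
    calibration, or the robot has just been switched on (if that
    policy is enabled), or no calibration has ever been recorded."""
    if just_powered_on:
        return True
    if last_calibrated is None:
        return True
    return now - last_calibrated >= CALIBRATION_PERIOD
```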
As shown in fig. 1, the sensor calibration method of the robot according to an embodiment of the present application includes steps S110 to S140.
S110, determining the distance between the robot detected by the first sensor and the target object as a first distance.
The target object is an object in the detection direction of the first sensor 10. For example, if the detection direction of the first sensor 10 is to the right, an object on the right side is the target object; if the detection direction of the first sensor 10 is toward the ground, the ground below the first sensor 10 may be the target object.
Alternatively, the first distance may simply be the distance to the target object detected by the robot when the sensor calibration method begins; that is, the first distance need not be set in advance.
Alternatively, the first distance may be a value within a preset range, or a preset value. For example, when the distance between the robot and the target object detected by the first sensor 10 is not within the preset range or does not equal the preset value, the robot may be controlled to move so as to bring that distance into the preset range or to the preset value. The preset range or preset value may be determined from the distance range or distance value over which the higher-accuracy first sensor 10 detects best, or from the distance range or value at which the first sensor 10's reading triggers the robot to perform a corresponding functional task. In these cases the first sensor 10 can determine the first distance more accurately, and the calibration effect on the second sensor 20 is improved.
The first distance may be the distance between the first sensor 10 and the target object, or the distance between some position on the robot and the target object. For example, the distance between the center of the robot and the target object may be determined from the distance between the first sensor 10 and the target object, as detected by the first sensor 10, together with the known distance between the first sensor 10 and the center of the robot.
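The conversion from the sensor's own reading to a robot-referenced distance is a fixed geometric offset, and the preset-range check is a simple interval test. A minimal sketch (the function names are illustrative, and the offset model assumes the detection axis passes through the robot center):

```python
def center_distance(sensor_reading_mm, sensor_to_center_mm):
    """Distance from the robot center to the target object, from the
    first sensor's reading plus the known sensor-to-center mounting
    distance, assuming the detection axis passes through the center."""
    return sensor_reading_mm + sensor_to_center_mm

def within_preset_range(distance_mm, lo_mm, hi_mm):
    """True when a detected distance lies inside the first sensor's
    high-accuracy preset range, so calibration may proceed; otherwise
    the robot should move to bring the distance into range."""
    return lo_mm <= distance_mm <= hi_mm
```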
S120, acquiring sensor data of the target object from the second sensor.
For example, the second sensor 20 may output sensor data when its detection direction faces the target object, such as the magnitude of the photocurrent of an infrared obstacle-avoidance sensor or an infrared cliff sensor. Alternatively, the second sensor 20 may output a value that directly characterizes distance. The distance between the second sensor 20 and the target object can in principle be determined from the sensor data, but when the second sensor 20 has become inaccurate due to aging or the like, that distance can no longer be determined accurately from the data.
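For a sensor whose raw output is, say, a photocurrent, the data becomes a distance only through a stored mapping, and it is exactly that mapping which drifts with aging. A single reference pair from the first sensor can refresh the relevant entry; the per-point table below is an assumed data structure for illustration, not the patent's:

```python
def recalibrate_point(table, photocurrent_ma, reference_distance_mm, tol=0.05):
    """Replace (or add) the (photocurrent, distance) table entry whose
    photocurrent is within `tol` mA of the observed raw value, using
    the high-accuracy reference distance as the new ground truth."""
    for i, (cur, _) in enumerate(table):
        if abs(cur - photocurrent_ma) <= tol:
            table[i] = (cur, reference_distance_mm)
            return table
    # No nearby entry: record the observation as a new calibration point.
    table.append((photocurrent_ma, reference_distance_mm))
    table.sort()
    return table
```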
In some embodiments, the first sensor 10 and the second sensor 20 are each disposed at a peripheral side of the robot body, and the detection directions of the first sensor 10 and the second sensor 20 on the robot are different.
Illustratively, as shown in fig. 3, the first sensor 10 is disposed at the left or right side of the robot body; the second sensor 20 is arranged at the front side of the robot body.
In other embodiments, the detection direction of both the first sensor 10 and the second sensor 20 is downward, such as toward the ground or a table top.
As shown in fig. 4, the first sensor 10 and the second sensor 20 are disposed at different positions of the bottom of the robot body, for example, each are disposed near the edge of the robot body, and the detection direction is downward.
It should be noted that, in other embodiments, the positions and/or the detection directions of the first sensor 10 and the second sensor 20 may also be different from those in fig. 3 and 4. For example, the first sensor 10 and the second sensor 20 are provided at the front side and the rear side of the robot body, respectively.
In some embodiments, step S120, acquiring sensor data of the target object from the second sensor, includes: controlling the robot to perform posture adjustment so that the detection direction of the second sensor faces the target object; and acquiring the sensor data of the target object from the second sensor when its detection direction faces the target object.
The first sensor 10 and the second sensor 20 differ in position and/or detection direction on the robot, and posture adjustment can make the detection direction of the second sensor 20 face the target object. The robot's posture may be adjusted, for example, by controlling its traveling unit 106, such as the driving wheel 1061 and the universal wheel 1062.
Optionally, "the detection direction of the second sensor faces the target object" means that the target object lies within the detection angle range of the second sensor, which at least guarantees that the second sensor can detect the target object. Preferably, the target object should lie within the portion of that angle range where the ranging accuracy is higher; when the target object is within this higher-accuracy angle range, the second sensor can be relied upon to detect its distance accurately.
In some embodiments, referring to fig. 3, the first sensor 10 and the second sensor 20 are disposed on the peripheral side of the robot body, and the detection directions of the first sensor 10 and the second sensor 20 on the robot are different; the controlling the robot to perform posture adjustment so that the detection direction of the second sensor is toward the target object includes: the robot is controlled to rotate counterclockwise or clockwise so that the detection direction of the second sensor is directed toward the target object.
Illustratively, the controlling the robot to perform posture adjustment so that the detection direction of the second sensor 20 is toward the target object includes: the robot is controlled to rotate counterclockwise or clockwise by a first preset angle so that the detection direction of the second sensor 20 is directed toward the target object. The first preset angle of the counterclockwise or clockwise rotation of the robot may be set according to the position and/or the detection direction of the first sensor 10, the second sensor 20.
For example, referring to FIG. 5 in conjunction with FIG. 3, the first sensor 10 is disposed on the left or right side of the robot body, and the second sensor 20 on the front side. When the first sensor 10 is on the right side and the second sensor 20 on the front side, controlling the robot to rotate clockwise by a first preset angle, for example between 75 and 110 degrees, such as 90 degrees, can point the detection direction of the second sensor 20 toward the target object. The first preset angle is not limited to 90 degrees: any angle is acceptable after which the detection direction of the second sensor 20 faces the target object and the second sensor 20 can detect it; the first preset angle may, for example, be 45 degrees.
Optionally, when the robot is controlled to adjust its posture so that the detection direction of the second sensor faces the target object, the detection direction of the second sensor is pointed at the same position on the target object as was detected when the first sensor measured the first distance between the robot and the target object. For example, when the target object is irregular, this prevents distance errors caused by the two sensors detecting different positions on the target object.
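The first preset angle is simply the angular difference between the two sensors' mounting directions. A sketch under the assumption that sensor headings are measured clockwise from the robot's front (both the convention and the function name are illustrative):

```python
def first_preset_angle(first_heading_deg, second_heading_deg):
    """Clockwise rotation (degrees) that brings the second sensor's
    detection direction onto the first sensor's current detection
    direction; headings are measured clockwise from the robot front."""
    return (first_heading_deg - second_heading_deg) % 360.0
```

With the first sensor on the right side (heading 90) and the second on the front (heading 0), this yields the 90-degree clockwise rotation of the example above.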
In other embodiments, the detection direction of the first sensor 10 and the second sensor 20 are the same and the position on the robot body is different. When the first sensor 10 and the second sensor 20 can each detect a target object such as the ground or the wall, the robot may not perform posture adjustment, but is not limited thereto.
Illustratively, the acquiring sensor data of the target object by the second sensor includes: controlling the robot to perform pose adjustment so that the detection direction of the second sensor faces the position on the target object and is the same as the position on the target object when the first distance between the robot and the target object is detected by the first sensor, wherein the pose adjustment comprises position adjustment and/or pose adjustment; and acquiring sensor data of the second sensor on the target object when the detection direction of the second sensor is towards the position on the target object and the first distance between the robot and the target object detected by the first sensor is towards the position on the target object. For example, it is possible to prevent the two sensors from detecting a distance error caused by different positions on the target object when the target object is irregular.
For example, when the first sensor 10 and the second sensor 20 are both disposed on the peripheral side of the robot body and their detection directions are the same, for example forward, the robot is controlled to perform position adjustment so that the detection direction of the second sensor faces the same position on the target object as the position at which the first sensor detected the first distance.
For example, referring to fig. 4, the first sensor and the second sensor are disposed at different positions on the bottom of the robot body, and their detection directions are downward. The acquiring the sensor data of the second sensor on the target object includes: controlling the robot to perform posture adjustment so that the robot body is suspended above the target object at the position corresponding to the second sensor 20, wherein the posture adjustment includes counterclockwise rotation or clockwise rotation; and acquiring the sensor data of the second sensor on the target object while the robot body is suspended above the target object at the position corresponding to the second sensor. By controlling the robot to perform posture adjustment, the detection direction of the second sensor 20 can also be directed toward the target object. Optionally, during this posture adjustment the detection direction of the second sensor may face the same position on the target object as the position at which the first sensor detected the first distance; in this way, when the target object is irregular, for example when the ground is uneven, distance errors caused by the two sensors detecting different positions on the target object can be prevented.
For example, when the robot is controlled to perform posture adjustment so that the robot body is suspended above the target object at the position corresponding to the second sensor, the robot may be controlled to rotate clockwise by a second preset angle or counterclockwise by a third preset angle. The second preset angle and the third preset angle may be set according to the positions and/or detection directions of the first sensor 10 and the second sensor 20.
For example, referring to fig. 6 in conjunction with fig. 4, when four cliff sensors are disposed at the left front, right front, left rear, and right rear of the robot, respectively, the robot may be controlled to rotate clockwise by approximately 45° or counterclockwise by approximately 315° so that the detection direction of the left-front cliff sensor is directed toward the target object; clockwise by approximately 135° or counterclockwise by approximately 225° for the left-rear cliff sensor; clockwise by approximately 225° or counterclockwise by approximately 135° for the right-rear cliff sensor; and clockwise by approximately 315° or counterclockwise by approximately 45° for the right-front cliff sensor. The second preset angle and the third preset angle are not limited to these values; any rotation is acceptable provided that, after the robot rotates by the second preset angle or the third preset angle, the detection direction of the cliff sensor faces the target object and the cliff sensor can detect it.
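Under the assumption that each cliff sensor sits at a multiple of 45° from the robot's forward axis (consistent with the angles above), the clockwise/counterclockwise rotation pair for each sensor can be sketched as follows; the sensor names and helper are illustrative, not part of the patent:

```python
# Hypothetical lookup of the clockwise rotation (in degrees) that points each
# cliff sensor's body position toward the target object; the equivalent
# counterclockwise rotation is 360 minus the clockwise one.
CLIFF_SENSOR_CW_DEG = {
    "left_front": 45,
    "left_rear": 135,
    "right_rear": 225,
    "right_front": 315,
}

def rotation_options(sensor):
    """Return (clockwise_deg, counterclockwise_deg) for a cliff sensor."""
    cw = CLIFF_SENSOR_CW_DEG[sensor]
    return cw, 360 - cw
```

Either member of the pair produces the same final heading, so a controller may pick whichever rotation is shorter.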
S130, determining a second distance between the robot corresponding to the second sensor and the target object according to the first distance.
Because the first sensor 10 has higher ranging accuracy, the first distance it determines is more accurate, and so is the second distance derived from that first distance. The second sensor 20 may then be calibrated based on the second distance.
In some embodiments, the determining a second distance between the robot corresponding to the second sensor 20 and the target object according to the first distance includes: determining the second distance between the robot corresponding to the second sensor 20 and the target object according to the sum of the first distance and a preset adjustment distance. For example, the second distance may be represented as L+S, where L represents the first distance and S represents the adjustment distance; the adjustment distance may be a positive number or a negative number.
By determining the second distance according to the adjustment distance and the first distance, the second distance between the robot and the target object after the posture adjustment in step S120 can be accurately determined.
Illustratively, the first sensor and the second sensor are each disposed on a peripheral side of the robot body, and their detection directions on the robot are different. Determining the second distance between the robot corresponding to the second sensor and the target object according to the first distance includes: determining the second distance according to the sum of the first distance and a preset first adjustment distance, where the preset first adjustment distance is determined at least according to the positions of the first sensor and the second sensor on the robot.
When the first adjustment distance is determined from the positions of the first sensor and the second sensor on the robot, it may be determined, for example, from the difference between the radial distances of the first sensor and the second sensor from the robot housing. Referring to fig. 3, the positions of the first sensor 10 and the second sensor 20 on the robot are different; in particular, their radial distances from the robot housing are inconsistent, so a distance difference exists between the first distance from the first sensor 10 to the target object before the posture adjustment of step S120 and the actual distance from the second sensor 20 to the target object after the posture adjustment. Because the sensor positions are fixed once the robot is designed, this distance difference can be determined in advance; the first adjustment distance is determined from the distance difference, and the second distance between the robot corresponding to the second sensor 20 and the target object is determined from the sum of the first distance and the preset first adjustment distance.
Illustratively, the controlling the robot to make the pose adjustment includes: and controlling the robot to rotate in situ so as to reduce the change of the position relationship between the robot and the target object and improve the accuracy of the first adjustment distance.
Optionally, the preset first adjustment distance may also account for the position change of the robot during the posture adjustment. For example, if the robot does not rotate exactly in situ during the posture adjustment, the first adjustment distance may be determined from the sum of the position change amount during the posture adjustment and the radial distance difference between the two sensors and the robot housing. Based on this first adjustment distance, the actual distance between the second sensor 20 and the target object, that is, the second distance, can be determined more accurately.
Illustratively, the detection directions of the first sensor and the second sensor are the same while their positions on the robot body differ. As shown in fig. 4, the detection directions of the first sensor and the second sensor are downward, though this is not limiting; they may also be forward or rightward, for example. Determining the second distance between the robot corresponding to the second sensor and the target object according to the first distance includes: determining the second distance according to the sum of the first distance and a preset second adjustment distance, where the preset second adjustment distance is determined at least according to the positions of the first sensor and the second sensor on the robot. Referring to fig. 4, the positions of the first sensor 10 and the second sensor 20 on the robot are different; in particular, their distances from the bottom of the robot are inconsistent, and the second adjustment distance can be determined from the height difference between the two sensors once the robot is designed, that is, once the sensor positions are fixed. Alternatively, when the first sensor 10 and the second sensor 20 are both disposed on the peripheral side of the robot body with the same detection direction, for example forward, their radial distances from the robot housing are inconsistent, and the second adjustment distance can be determined from the difference between the radial distances of the two sensors from the robot housing. The second distance between the robot corresponding to the second sensor 20 and the target object may then be determined according to the sum of the first distance and the preset second adjustment distance.
In some embodiments, the first distance and the second distance may be distances between the center of the robot and the target object; when the distance from the robot center to the target object is unchanged during the posture adjustment, such as in-situ rotation, the first distance may be directly determined as the second distance.
It should be noted that the order of the step S120 and the step S130 is not limited.
And S140, calibrating the detection parameters of the second sensor and/or the detection distance of the second sensor according to the second distance and the sensor data.
By calibrating the detection parameters of the second sensor 20, the sensor data of the second sensor 20 when the detection direction is towards the target object can be adjusted so that the sensor data corresponds to the accurate second distance determined in step S130; thereby enabling calibration of the second sensor 20.
In some embodiments, the second sensor 20 comprises a transmitting device, such as an infrared transmitting tube, and a receiving device, such as an infrared receiving tube; of course, this is not limiting, and the transmitting device may, for example, be an ultrasonic transmitting tube and the receiving device an ultrasonic receiving tube.
The detection waves (such as infrared light or ultrasonic waves) emitted by the emitting device to the object are reflected by the object and then received by the receiving device; the sensor data includes a reception parameter of the probe wave received by the reception device, and the probe parameter includes a transmission parameter of the probe wave transmitted by the transmission device. For example, the emission parameters of the infrared emission tube comprise emission intensity and/or infrared light frequency, and the receiving parameters of the infrared receiving tube comprise photocurrent intensity; the transmitting parameters of the ultrasonic transmitting tube comprise ultrasonic frequency, and the receiving parameters of the ultrasonic receiving tube comprise ultrasonic transmission time.
Illustratively, the calibration of the detection parameters of the second sensor 20 in step S140 according to the second distance and the sensor data includes the following steps S141 to S143.
Step S141, determining a target receiving parameter corresponding to the second distance based on a preset correspondence between distance and receiving parameter. For example, the target receiving parameter corresponding to the second distance may be a photocurrent intensity X of the infrared receiving tube; that is, when the photocurrent intensity of the infrared receiving tube is X, the distance between the robot corresponding to the second sensor 20 and the target object may be determined to equal the second distance. However, when the accuracy of the second sensor 20 is degraded by aging or the like, the photocurrent intensity of the infrared receiving tube may not equal X even though the distance between the robot corresponding to the second sensor 20 and the target object equals the second distance.
Step S142, when the detection direction of the second sensor 20 is towards the target object, adjusting the current emission parameter of the second sensor 20 according to the current reception parameter of the second sensor 20 and the target reception parameter, so that the difference between the current reception parameter of the second sensor 20 and the target reception parameter is less than or equal to a difference threshold; for example, after the gesture of the robot is adjusted in step S120, the emission intensity and/or the infrared light frequency of the infrared emission tube may be adjusted so that the photocurrent intensity of the infrared receiving tube is equal to X.
Step S143, determining the current emission parameter of the second sensor 20 as the target emission parameter of the second sensor 20 when the difference between the current receiving parameter of the second sensor 20 and the target receiving parameter is less than or equal to the difference threshold; the second sensor 20 then operates with this target emission parameter. In this way, when the infrared transmitting tube of the second sensor 20 transmits infrared light with the target emission parameter, the photocurrent intensity of the infrared receiving tube at the second distance is at least guaranteed to equal X, and photocurrent intensities at other distances can likewise be made to correspond to their distances, thereby calibrating the second sensor 20.
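A minimal sketch of steps S141 to S143, assuming a lookup table from distance to expected photocurrent and a set of candidate emission levels; the interfaces `set_emission` and `read_photocurrent` are hypothetical stand-ins for the sensor driver, not from the patent:

```python
def calibrate_emission(second_distance, distance_to_current, set_emission,
                       read_photocurrent, emission_levels, diff_threshold):
    """Step S141: look up the target photocurrent for the known second
    distance. Step S142: try emission levels until the measured photocurrent
    is within diff_threshold of the target. Step S143: return that level as
    the target emission parameter (None if no level qualifies)."""
    target = distance_to_current[second_distance]
    for level in emission_levels:
        set_emission(level)
        if abs(read_photocurrent() - target) <= diff_threshold:
            return level
    return None
```

A simulated receiving tube whose photocurrent scales linearly with the emission level suffices to exercise the search.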
In other embodiments, the second sensor 20 includes a transmitting device and a receiving device, and the probe wave transmitted by the transmitting device to the object is received by the receiving device after being reflected by the object; the sensor data includes a reception parameter of the probe wave received by the reception device.
Illustratively, the calibrating the detection parameter of the second sensor 20 according to the second distance and the sensor data in step S140 includes the following steps S144 to S145.
And step S144, according to the second distance and the receiving parameter in the sensor data, adjusting the corresponding relation between the preset distance and the receiving parameter so that the receiving parameter in the sensor data corresponds to the second distance in the adjusted corresponding relation between the distance and the receiving parameter.
As shown in fig. 7, taking as an example the second sensor 20 including an ultrasonic receiving tube whose receiving parameter includes the ultrasonic transmission time, the calibration of the second sensor 20 is achieved by adjusting the correspondence between the detection distance of the second sensor 20 and the ultrasonic transmission time it detects.
For example, step S144 adjusts the corresponding relationship between the preset distance and the receiving parameter according to the second distance and the receiving parameter in the sensor data, including: based on the corresponding relation between the preset distance and the receiving parameter, determining the distance corresponding to the receiving parameter in the sensor data as the detection distance before calibration; determining the ratio of the second distance to the detection distance before calibration, and multiplying the distance corresponding to each receiving parameter in the corresponding relation by the ratio; or determining a difference value between the second distance and the detection distance before calibration, and adding the difference value to the distance corresponding to each receiving parameter in the corresponding relation. For example, a ratio or a difference for calibrating the distance-receiving parameter curve is determined according to the accurate second distance and the actual receiving parameter of the second sensor 20, and the distance-receiving parameter curve is corrected according to the ratio or the difference, so as to realize the calibration of the second sensor 20.
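The ratio and difference adjustments described above can be sketched as follows, with the correspondence modeled as a mapping from receiving parameter to distance (a hypothetical representation; the patent does not fix a data structure):

```python
def adjust_correspondence(curve, measured_param, second_distance, mode="ratio"):
    """Step S144: adjust a {receiving_parameter: distance} table so that the
    measured receiving parameter maps to the known, accurate second distance.
    mode="ratio" multiplies every distance by second/before;
    mode="difference" adds (second - before) to every distance."""
    before = curve[measured_param]  # detection distance before calibration
    if mode == "ratio":
        k = second_distance / before
        return {p: d * k for p, d in curve.items()}
    delta = second_distance - before
    return {p: d + delta for p, d in curve.items()}
```

With the adjusted table in hand, step S145 reduces to an ordinary lookup of the detection distance from the current receiving parameter.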
Taking the second sensor 20 as an ultrasonic receiving tube, the receiving parameters include the transmission time of ultrasonic waves as an example, and the adjusted corresponding relationship between the distance and the receiving parameters can be represented as a distance-transmission time curve represented by a solid line in fig. 7; the adjusted correspondence between the distance and the transmission time may be used to accurately determine the detection distance of the second sensor 20.
Step S145, determining the detection distance of the second sensor 20 according to the reception parameter of the second sensor 20 based on the adjusted correspondence between the distance and the reception parameter.
In other embodiments, exemplary calibrating the detection parameter of the second sensor 20 and the detection distance of the second sensor 20 according to the second distance and the sensor data in step S140 includes: step S146, determining a target receiving parameter corresponding to the second distance based on the corresponding relation between the preset distance and the receiving parameter; step S147, when the detection direction of the second sensor 20 is towards the target object, adjusting the current emission parameter of the second sensor 20 according to the current reception parameter of the second sensor 20 and the target reception parameter, so as to make the difference between the current reception parameter of the second sensor 20 and the target reception parameter reach a minimum value; step S148, according to the current receiving parameter and the second distance when the difference reaches the minimum value, the corresponding relation between the preset distance and the receiving parameter is adjusted, so that the current receiving parameter when the difference reaches the minimum value corresponds to the second distance in the adjusted corresponding relation between the distance and the receiving parameter; step S149, determining the detection distance of the second sensor 20 according to the reception parameter of the second sensor 20 based on the adjusted correspondence between the distance and the reception parameter.
For example, when the minimum value of the difference between the current receiving parameter of the second sensor 20 and the target receiving parameter is greater than the difference threshold, the second sensor 20 may not be completely calibrated by adjusting the transmitting parameter, and the calibration of the second sensor 20 may also be implemented by combining the corresponding relationship between the adjusting distance and the receiving parameter.
In some embodiments, the determining that the distance between the robot detected by the first sensor and the target object is a first distance includes: and when the distance between the robot detected by the first sensor and the target object can trigger the robot to execute a corresponding preset function, determining the distance detected by the first sensor as a first distance. In other words, the first distance may be a distance determination that is capable of triggering the robot to perform a corresponding preset function.
Illustratively, the first sensor 10 comprises an edge sensor and the second sensor 20 comprises an obstacle avoidance sensor. The edge sensor generally adopts a distance sensor with higher precision, such as a PSD sensor. The obstacle avoidance sensor generally adopts a distance sensor with lower precision, such as an infrared obstacle avoidance sensor comprising an infrared transmitting tube and an infrared receiving tube: the infrared transmitting tube emits infrared light, the infrared light is reflected back by an obstacle and received by the infrared receiving tube, and the distance to the obstacle is judged according to the received photocurrent intensity.
Because the robot is generally designed so that the first sensor has higher accuracy within the distance range that can trigger the robot to perform the corresponding preset function, the distance at which the first sensor triggers the robot to perform its corresponding preset function may in these cases be determined as the first distance. The first sensor 10 can then determine the first distance more accurately, so the calibration effect on the second sensor 20 is improved.
Optionally, when the distance between the robot and the target object detected by the first sensor can trigger the robot to execute the edge steering function corresponding to the edge sensor, determining the distance detected by the first sensor as a first distance.
For example, the distance that can trigger the robot to perform the edgewise steering function may be an edgewise distance. The robot can perform edgewise work according to the edgewise distance during movement, such as edgewise cleaning along obstacles such as walls. When the robot performs an edgewise task, the distance between the left or right side of the robot and the target object is determined by the edge sensor, and the traveling direction of the robot is controlled according to that distance so that the distance between the left or right side of the robot and the target object remains approximately equal to the edgewise distance. For example, when the robot performs left-side edgewise work, the robot is controlled to move closer to the target object when the distance between its left side and the target object is larger than the edgewise distance, and to move away from the target object when that distance is smaller than the edgewise distance.
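The left-side edge-following rule above can be sketched as a simple steering decision; the function name and dead-band tolerance are illustrative additions:

```python
def edgewise_steering(left_distance, edgewise_distance, tolerance=0.01):
    """Steering decision for left-side edge following: return +1 to steer
    toward the target object (too far), -1 to steer away (too close), or
    0 to hold course within the tolerance band."""
    if left_distance > edgewise_distance + tolerance:
        return 1
    if left_distance < edgewise_distance - tolerance:
        return -1
    return 0
```

A small tolerance band keeps the controller from oscillating when the measured distance hovers near the edgewise distance.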
When the second sensor is calibrated, the calibrated second sensor is expected to have higher precision within the distance range that can trigger the robot to perform the corresponding preset function. In these cases, the distance that triggers the robot to perform the preset function corresponding to the second sensor may be determined as the first distance, which improves the calibration effect on the second sensor 20; for example, the detection precision of the second sensor 20 near the obstacle avoidance distance can be better improved.
Illustratively, the first sensor 10 comprises an edge sensor and the second sensor 20 comprises an obstacle avoidance sensor. And when the distance between the robot detected by the first sensor and the target object can trigger the robot to execute the obstacle avoidance function corresponding to the obstacle avoidance sensor, determining the distance detected by the first sensor as a first distance.
For example, the distance that can trigger the robot to perform the obstacle avoidance function may be an obstacle avoidance distance. The robot keeps away the barrier according to keeping away the barrier distance in the removal in-process, can prevent that the robot from hitting the preceding target object. For example, the distance between the front side of the robot and the target object is determined through the obstacle avoidance sensor, and the robot is controlled to adjust the travelling direction when the distance is smaller than or equal to the obstacle avoidance distance so as to enable the robot to be far away from the target object.
For example, referring to fig. 4, the first sensor 10 includes a ground penetrating sensor disposed at a front side of the robot body; the second sensor 20 includes cliff sensors, for example, a plurality of cliff sensors may be provided in the left front, right front, left rear, and right rear of the robot, respectively. The ground penetrating sensor usually adopts a sensor with higher precision, such as a TOF sensor, calculates the distance through the flight time of light, and judges the distance of the obstacle. The cliff sensor usually adopts a sensor with lower precision, such as an infrared cliff sensor, and comprises an infrared transmitting tube and an infrared receiving tube, wherein the infrared transmitting tube emits infrared light, the infrared light is reflected back by an obstacle and is received by the infrared receiving tube, and the distance between the infrared transmitting tube and the obstacle is judged according to the received photocurrent intensity.
When the distance between the robot detected by the first sensor and the target object can trigger the robot to execute the anti-falling function, determining the distance detected by the first sensor as a first distance.
For example, the distance that can trigger the robot to perform the anti-falling function may be a suspension distance threshold. For example, the method further comprises: determining, through the ground penetrating sensor, the distance between the position of the robot body corresponding to the ground penetrating sensor and the object below, and controlling the robot to perform the anti-falling function when that distance is greater than or equal to the suspension distance threshold, such as controlling the robot to stop moving forward and/or controlling the robot to move backward. This can prevent the robot from falling while moving forward.
For example, the method further comprises: and determining the distance between the robot body and the object below the cliff sensor at the position corresponding to the cliff sensor through the cliff sensor, and controlling the robot to execute a falling prevention function when the distance is greater than or equal to the suspension distance threshold, for example, controlling the robot to stop moving towards the direction corresponding to the cliff sensor and/or controlling the robot to move towards the direction opposite to the direction corresponding to the cliff sensor, wherein the direction corresponding to the cliff sensor is determined according to the direction of the center of the robot body towards the cliff sensor. The robot can be prevented from falling down when backing or turning.
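The cliff-sensor reaction just described can be sketched as follows, assuming the direction corresponding to a cliff sensor is expressed as a bearing in degrees from the body center toward that sensor (the representation and names are illustrative, not from the patent):

```python
def fall_prevention_retreat(distance_below, suspension_threshold,
                            sensor_bearing_deg):
    """When the measured distance to the object below reaches the suspension
    distance threshold, stop motion toward the sensor and return the opposite
    retreat bearing; otherwise return None (no action needed)."""
    if distance_below < suspension_threshold:
        return None
    return (sensor_bearing_deg + 180) % 360
```

For a right-rear cliff sensor at bearing 135°, a triggered reading thus yields a retreat bearing of 315°, away from the drop.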
The robot is triggered to perform the corresponding anti-falling function when the distance between the ground penetrating sensor or the cliff sensor and the object below is greater than or equal to the suspension distance threshold; in other words, the suspension distance threshold is a distance that can trigger the robot to perform the anti-falling function. The robot is generally designed so that the ground penetrating sensor has higher precision within the distance range corresponding to the suspension distance threshold; calibrating the cliff sensor when the distance between the ground penetrating sensor and the object below is greater than or equal to the suspension distance threshold therefore improves the calibration effect on the cliff sensor, for example by better improving its detection precision near the suspension distance threshold. Alternatively, when calibrating the cliff sensor, the calibrated cliff sensor is expected to have higher precision within the distance range that can trigger the robot to perform the corresponding anti-falling function; determining as the first distance the distance that triggers the anti-falling function corresponding to the second sensor likewise improves the calibration effect on the cliff sensor, for example by better improving its detection precision near the suspension distance threshold.
In some embodiments, the steps of the sensor calibration method are performed when the robot is performing a preset task corresponding to the first sensor and the elapsed time since the last calibration of the second sensor is greater than or equal to a time interval threshold. By calibrating while the robot performs a preset task, sensor calibration is more efficient than specially controlling the robot to perform a dedicated calibration task; the preset task can continue after the calibration, and the ranging accuracy of the sensor during the preset task is improved.
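The trigger condition above reduces to two checks; the helper below is a sketch with assumed units of seconds:

```python
def should_calibrate(task_running, seconds_since_last_calibration,
                     interval_threshold_seconds):
    """Run the calibration steps only while the preset task corresponding to
    the first sensor is executing and the previous calibration of the second
    sensor is at least the interval threshold old (e.g. one month)."""
    return task_running and (
        seconds_since_last_calibration >= interval_threshold_seconds)
```

Gating on both conditions keeps calibration opportunistic: it piggybacks on work the robot is already doing instead of scheduling a separate task.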
Illustratively, the first sensor comprises an edge sensor and the second sensor comprises an obstacle avoidance sensor; and executing the sensor calibration method when the robot executes the edge task and the interval time length with the last calibration of the obstacle avoidance sensor is greater than or equal to an interval time length threshold value.
Referring to fig. 5 in conjunction with fig. 3, the steps of the sensor calibration method are performed when the robot performs an edgewise cleaning task and the elapsed time since the last calibration of the obstacle avoidance sensor is greater than or equal to an interval threshold, for example one month. For example, when the first distance is equal to a preset distance, the robot is controlled to perform posture adjustment so that the detection direction of the second sensor 20 is directed toward the target object, and the subsequent steps S120 to S140 are performed. By performing calibration while the robot performs the edgewise task, sensor calibration is more efficient than specially controlling the robot to perform an independent calibration task; the edgewise task can continue after the calibration, improving obstacle avoidance accuracy during the edgewise task.
Illustratively, the first sensor comprises a ground penetrating sensor and the second sensor comprises a cliff sensor; the sensor calibration method is performed when the ground penetrating sensor determines that the distance between the position of the robot body corresponding to the ground penetrating sensor and the object below is greater than or equal to a preset suspension distance threshold, and the elapsed time since the last calibration of the cliff sensor is greater than or equal to an interval threshold.
For example, referring to fig. 6 in conjunction with fig. 4, when the ground penetrating sensor determines that the distance between the location corresponding to the ground penetrating sensor and the object below the ground penetrating sensor is greater than or equal to the suspension distance threshold, and the interval time length with the cliff sensor last calibrated is greater than or equal to the interval time length threshold, the steps of the sensor calibration method are executed.
According to the sensor calibration method of the robot of the embodiments of the present application, the robot comprises a first sensor and a second sensor, both of which are distance sensors, wherein the distance measurement precision of the first sensor is higher than that of the second sensor. The method comprises the following steps: determining a distance between the robot and a target object detected by the first sensor as a first distance; acquiring sensor data of the target object by the second sensor; determining a second distance between the robot corresponding to the second sensor and the target object according to the first distance; and calibrating the detection parameters of the second sensor and/or the detection distance of the second sensor according to the second distance and the sensor data. By calibrating the second sensor with the first distance detected by the more precise first sensor, the accuracy of the distance detection of the second sensor can be improved; for example, functional failure caused by aging of the second sensor's components and structure can be prevented.
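The four steps summarised above can be sketched in code. This is a minimal illustration under assumed names and a toy signal model; `FirstSensor`, `SecondSensor`, the fixed mounting offset, and the inverse-signal distance model are all hypothetical and are not taken from the application:

```python
class FirstSensor:
    """Stand-in for the higher-precision distance sensor (hypothetical)."""
    def read_distance(self) -> float:
        return 0.50  # metres, e.g. from a laser rangefinder

class SecondSensor:
    """Stand-in for the lower-precision sensor being calibrated (hypothetical)."""
    def __init__(self):
        self.scale = 1.0  # calibration factor to be adjusted
    def read_raw(self) -> float:
        return 80.0  # e.g. received signal strength, arbitrary units
    def distance_from_raw(self, raw: float) -> float:
        # Toy inverse model: a stronger echo means a closer object.
        return self.scale * (40.0 / raw)

MOUNTING_OFFSET_M = 0.03  # assumed preset adjustment distance between the sensors

def calibrate(first: FirstSensor, second: SecondSensor) -> float:
    first_distance = first.read_distance()                # step 1: reference distance
    raw = second.read_raw()                               # step 2: raw sensor data
    second_distance = first_distance + MOUNTING_OFFSET_M  # step 3: derive second distance
    # Step 4: rescale the second sensor so that its converted reading
    # matches the reference second distance.
    second.scale *= second_distance / second.distance_from_raw(raw)
    return second_distance

first, second = FirstSensor(), SecondSensor()
second_distance = calibrate(first, second)
```

After the call, the second sensor's conversion maps its current raw reading to the reference second distance, which is the essence of steps S110 to S140.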
Referring to fig. 8 in combination with the above embodiments, fig. 8 is a schematic block diagram of a robot according to an embodiment of the present application.
The robot comprises a first sensor 10 and a second sensor 20, wherein the first sensor 10 and the second sensor 20 are distance sensors, and the positions and/or detection directions of the first sensor 10 and the second sensor 20 on the robot are different.
The robot may further comprise a walking unit, the component responsible for the robot's movement, for example comprising driving wheels and universal wheels; the universal wheels and the driving wheels cooperate to realize the steering and movement of the robot. Of course, the walking unit is not limited thereto and may also be, for example, of a crawler type or a foot type.
The robot further includes: a processor 301 and a memory 302, the memory 302 being for storing a computer program.
Illustratively, the processor 301 and the memory 302 are connected by a bus 303, for example an I2C (Inter-Integrated Circuit) bus.
Specifically, the processor 301 may be a Micro-controller Unit (MCU), a central processing Unit (Central Processing Unit, CPU), a digital signal processor (Digital Signal Processor, DSP), or the like.
Specifically, the memory 302 may be a Flash chip, a Read-Only Memory (ROM), a magnetic disk, an optical disk, a U-disk, a removable hard disk, or the like.
The processor 301 is configured to execute a computer program stored in the memory 302, and implement the steps of the sensor calibration method of the robot according to the embodiment of the present application when the computer program is executed.
The specific principles and implementation manners of the robot provided in the embodiments of the present application are similar to those of the foregoing embodiments, and are not repeated here.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the steps of the method of any of the embodiments described above.
The computer readable storage medium may be an internal storage unit of the robot according to any one of the foregoing embodiments, for example, a hard disk or a memory of the robot. The computer readable storage medium may also be an external storage device of the robot, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the robot.
It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
It should also be understood that the term "and/or" as used in this application and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (14)
1. The sensor calibration method of the robot is characterized in that the robot comprises a first sensor and a second sensor, the first sensor and the second sensor are distance sensors, and the distance measurement precision of the first sensor is higher than that of the second sensor; the method comprises the following steps:
determining a distance between the robot detected by the first sensor and a target object as a first distance;
acquiring sensor data of the target object by the second sensor;
determining a second distance between the robot corresponding to the second sensor and the target object according to the first distance;
and calibrating the detection parameters of the second sensor and/or the detection distance of the second sensor according to the second distance and the sensor data.
2. The sensor calibration method according to claim 1, wherein the first sensor and the second sensor are each provided on a peripheral side of the robot body, the detection directions of the first sensor and the second sensor on the robot are different, and the acquiring sensor data of the target object by the second sensor comprises:
controlling the robot to perform posture adjustment so that the detection direction of the second sensor faces the target object, wherein the posture adjustment comprises anticlockwise rotation or clockwise rotation;
and acquiring sensor data of the target object by the second sensor when the detection direction of the second sensor faces the target object.
3. The sensor calibration method according to claim 2, wherein the determining a second distance between the robot corresponding to the second sensor and the target object according to the first distance comprises:
determining a second distance between the robot corresponding to the second sensor and the target object according to the sum of the first distance and a preset first adjustment distance;
the preset first adjustment distance is determined at least according to the positions of the first sensor and the second sensor on the robot.
4. The sensor calibration method according to claim 1, wherein the detection directions of the first sensor and the second sensor are the same and their positions on the robot body are different, and the determining a second distance between the robot corresponding to the second sensor and the target object according to the first distance comprises:
determining a second distance between the robot corresponding to the second sensor and the target object according to the sum of the first distance and a preset second adjustment distance;
the preset second adjustment distance is determined at least according to the positions of the first sensor and the second sensor on the robot.
5. The sensor calibration method of claim 4, wherein the acquiring sensor data of the target object by the second sensor comprises:
controlling the robot to perform pose adjustment so that the detection direction of the second sensor faces the same position on the target object as the position faced when the first sensor detected the first distance between the robot and the target object, wherein the pose adjustment comprises position adjustment and/or posture adjustment;
and acquiring sensor data of the target object by the second sensor when the detection direction of the second sensor faces that position on the target object.
6. The sensor calibration method according to claim 1, wherein the second sensor includes a transmitting device and a receiving device, and a probe wave transmitted to an object by the transmitting device is received by the receiving device after being reflected by the object; the sensor data comprises receiving parameters of detection waves received by the receiving device, and the detection parameters comprise transmitting parameters of the detection waves transmitted by the transmitting device;
the calibrating the detection parameter of the second sensor according to the second distance and the sensor data comprises the following steps:
determining a target receiving parameter corresponding to the second distance based on a corresponding relation between the preset distance and the receiving parameter;
when the detection direction of the second sensor faces the target object, adjusting the current transmitting parameter of the second sensor according to the current receiving parameter of the second sensor and the target receiving parameter, so that the difference value between the current receiving parameter of the second sensor and the target receiving parameter is smaller than or equal to a difference value threshold;
and when the difference value between the current receiving parameter of the second sensor and the target receiving parameter is smaller than or equal to the difference value threshold value, determining the current transmitting parameter of the second sensor as the target transmitting parameter of the second sensor.
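A hedged sketch of this closed loop: the emission power is nudged until the receiving parameter lands within the difference value threshold of the target value looked up for the second distance. The proportional echo model, the step size, and all names below are invented for illustration and are not prescribed by the claim:

```python
class FakeSensor:
    """Toy second sensor: received strength is proportional to transmit power
    (an assumption; a real sensor's response curve would differ)."""
    def __init__(self, tx_power: float = 1.0, gain: float = 2.0):
        self.tx_power = tx_power
        self.gain = gain
    def receive(self) -> float:
        return self.gain * self.tx_power

def tune_emission(sensor: FakeSensor, target_rx: float,
                  diff_threshold: float = 0.5, max_iters: int = 100) -> float:
    """Adjust the transmitting parameter until the receiving parameter is
    within diff_threshold of the target; the final tx_power then becomes
    the target transmitting parameter of the sensor."""
    for _ in range(max_iters):
        rx = sensor.receive()
        if abs(rx - target_rx) <= diff_threshold:
            return sensor.tx_power
        # Nudge emission up when the echo is too weak, down when too strong.
        sensor.tx_power += 0.1 * (target_rx - rx)
    raise RuntimeError("emission tuning did not converge")

sensor = FakeSensor()
target_tx = tune_emission(sensor, target_rx=10.0)
```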
7. The sensor calibration method according to claim 1, wherein the second sensor includes a transmitting device and a receiving device, and a probe wave transmitted to an object by the transmitting device is received by the receiving device after being reflected by the object; the sensor data includes a reception parameter of the probe wave received by the reception device;
the calibrating the detection distance of the second sensor according to the second distance and the sensor data comprises the following steps:
according to the second distance and the receiving parameter in the sensor data, adjusting the corresponding relation between the preset distance and the receiving parameter so that the receiving parameter in the sensor data corresponds to the second distance in the adjusted corresponding relation between the distance and the receiving parameter;
and determining the detection distance of the second sensor according to the receiving parameter of the second sensor based on the adjusted corresponding relation between the distance and the receiving parameter.
8. The method for calibrating a sensor according to claim 7, wherein the adjusting the correspondence between the preset distance and the receiving parameter according to the second distance and the receiving parameter in the sensor data comprises:
based on the corresponding relation between the preset distance and the receiving parameter, determining the distance corresponding to the receiving parameter in the sensor data as the detection distance before calibration;
determining the ratio of the second distance to the detection distance before calibration, and multiplying the distance corresponding to each receiving parameter in the corresponding relation by the ratio; or determining a difference value between the second distance and the detection distance before calibration, and adding the difference value to the distance corresponding to each receiving parameter in the corresponding relation.
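The two adjustment variants in this claim amount to rescaling or shifting every entry of the distance lookup table so that the measured receiving parameter maps to the reference second distance. A sketch with an assumed three-entry table (the values and the convention that a stronger echo means a shorter distance are illustrative assumptions):

```python
def recalibrate_table(table: dict, raw_rx: float, second_distance: float,
                      mode: str = "ratio") -> dict:
    """table maps receiving parameter -> distance. Return an adjusted copy in
    which raw_rx maps to second_distance: mode="ratio" multiplies every
    distance by (second distance / pre-calibration detection distance);
    mode="offset" adds their difference to every distance instead."""
    pre_cal = table[raw_rx]  # detection distance before calibration
    if mode == "ratio":
        k = second_distance / pre_cal
        return {rx: d * k for rx, d in table.items()}
    if mode == "offset":
        delta = second_distance - pre_cal
        return {rx: d + delta for rx, d in table.items()}
    raise ValueError(f"unknown mode: {mode}")

# Assumed correspondence: stronger echo -> shorter distance.
table = {100.0: 0.2, 80.0: 0.4, 60.0: 0.8}
adjusted = recalibrate_table(table, raw_rx=80.0, second_distance=0.5)
```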
9. The sensor calibration method according to claim 1, wherein the second sensor includes a transmitting device and a receiving device, and a probe wave transmitted to an object by the transmitting device is received by the receiving device after being reflected by the object; the sensor data comprises receiving parameters of detection waves received by the receiving device, and the detection parameters comprise transmitting parameters of the detection waves transmitted by the transmitting device;
the calibrating the detection parameter of the second sensor and the detection distance of the second sensor according to the second distance and the sensor data comprises the following steps:
determining a target receiving parameter corresponding to the second distance based on a corresponding relation between the preset distance and the receiving parameter;
when the detection direction of the second sensor faces the target object, the current emission parameter of the second sensor is adjusted according to the current receiving parameter of the second sensor and the target receiving parameter, so that the difference value between the current receiving parameter of the second sensor and the target receiving parameter reaches the minimum value;
according to the current receiving parameter when the difference value reaches the minimum value and the second distance, adjusting the corresponding relation between the preset distance and the receiving parameter so that the current receiving parameter at the minimum difference value corresponds to the second distance in the adjusted corresponding relation between the distance and the receiving parameter;
and determining the detection distance of the second sensor according to the receiving parameter of the second sensor based on the adjusted corresponding relation between the distance and the receiving parameter.
10. The sensor calibration method according to any one of claims 1-9, wherein the determining a distance between the robot detected by the first sensor and the target object as a first distance comprises:
and when the distance between the robot detected by the first sensor and the target object can trigger the robot to execute a corresponding preset function, determining the distance detected by the first sensor as a first distance.
11. The sensor calibration method of claim 10, wherein the first sensor comprises an edge sensor and the second sensor comprises an obstacle avoidance sensor; when the distance between the robot detected by the first sensor and the target object can trigger the robot to execute a corresponding preset function, determining the distance detected by the first sensor as a first distance includes: determining that the distance detected by the first sensor is a first distance when the distance between the robot and a target object detected by the first sensor can trigger the robot to perform an edge-wise steering function or an obstacle avoidance function; or alternatively
The first sensor comprises a ground penetrating sensor, and the second sensor comprises a cliff sensor; when the distance between the robot detected by the first sensor and the target object can trigger the robot to execute a corresponding preset function, determining the distance detected by the first sensor as a first distance includes: when the distance between the robot detected by the first sensor and the target object can trigger the robot to execute the anti-falling function, determining the distance detected by the first sensor as a first distance.
12. The sensor calibration method according to any one of claims 1-9, characterized in that the steps of the sensor calibration method are performed when the robot performs a preset task corresponding to the first sensor and the interval since the second sensor was last calibrated is greater than or equal to an interval threshold.
13. A robot, characterized in that the robot comprises a first sensor and a second sensor, wherein the first sensor and the second sensor are distance sensors, and the positions and/or detection directions of the first sensor and the second sensor on the robot are different;
The robot further includes: a processor and a memory for storing a computer program; the processor is configured to execute the computer program and to implement the steps of the sensor calibration method of a robot according to any of the claims 1-12 when the computer program is executed.
14. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed by a processor, causes the processor to carry out the steps of the sensor calibration method of a robot according to any one of claims 1-12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311441483.4A CN117607810A (en) | 2023-10-31 | 2023-10-31 | Sensor calibration method for robot, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117607810A true CN117607810A (en) | 2024-02-27 |
Family
ID=89945140
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |