CN107943025A - Robot trapped-state detection method and escape processing method - Google Patents
Robot trapped-state detection method and escape processing method
- Publication number
- CN107943025A CN201711098354.4A CN201711098354A
- Authority
- CN
- China
- Prior art keywords
- robot
- grid
- escape
- unit
- grid cell
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The present invention relates to a robot trapped-state detection method and an escape processing method. By searching the grid map, the robot can determine which grid cells it has already traversed, and then judge whether these traversed cells directly connect the current grid cell to the target grid cell. If they do, the robot has a route to the target location point and is not trapped; if they do not, the robot cannot reach the target location point and is trapped. Judging whether the robot is trapped from the grid map in this way keeps the cost low. In addition, after the robot is determined to be trapped, an escape path is searched for in the grid map and the robot walks along it; if the robot meets an obstacle or reaches the boundary of the local map before escaping, it selects a different walking mode according to its current state until it finally escapes.
Description
Technical field
The present invention relates to the field of robots, and in particular to a robot trapped-state detection method and an escape processing method.
Background technology
When a sweeping robot walks in a complex environment, it sometimes enters a narrow region and, restricted by the surrounding objects, becomes trapped there and cannot get out. For this problem, the prior art proposes a trapped-state detection and escape method for a sweeping robot in which four ultrasonic sensors and four ranging sensors are evenly arranged around the body to detect obstacle information; if the ultrasonic sensors and ranging sensors determine that the robot is surrounded by obstacles, a camera module is used to find an escape path. Although this method can solve the problem of the robot being surrounded, the additional ultrasonic and ranging sensors increase the cost of the robot, making it unsuitable for wide application.
Summary of the invention
To solve the above problems, the present invention provides a robot trapped-state detection method and an escape processing method that can determine whether the robot is trapped without adding extra sensors and, once the robot is determined to be trapped, can escape effectively. The specific technical scheme of the present invention is as follows:
A robot trapped-state detection method includes the following steps:
searching the grid map and determining the grid cells the robot has traversed;
judging whether there is a grid path from the current grid cell to the target grid cell composed entirely of grid cells the robot has traversed;
if yes, determining that the robot is not trapped;
if no, determining that the robot is trapped;
wherein the current grid cell is the grid cell corresponding to the robot's current location point, and the target grid cell is the grid cell corresponding to the target location point the robot wants to reach.
Further, searching the grid map and determining the grid cells the robot has traversed includes the following steps:
taking the current grid cell as the reference cell, searching along the predetermined directions and judging whether each grid cell adjacent to the reference cell is a grid cell the robot has traversed;
if yes, recording the traversed grid cell, and then taking each recorded traversed grid cell that has not yet served as a reference cell as the new reference cell, continuing to search along the predetermined directions and judge whether the grid cells adjacent to it have been traversed, and so on, until the boundary of the grid map is reached;
if no, not recording it;
wherein each traversed grid cell is recorded only once.
Further, the predetermined directions are the four directions up, down, left and right of the reference cell, or the eight directions up, up-right 45°, right, down-right 45°, down, down-left 45°, left and up-left 45° of the reference cell.
Further, after the step of judging that there is no grid path from the current grid cell to the target grid cell composed entirely of traversed grid cells, and before the step of determining whether the robot is trapped, the method further includes the following steps:
counting the recorded traversed grid cells that are directly adjacent to the current grid cell, and the recorded traversed grid cells that are connected to the current grid cell through traversed grid cells;
calculating the area of the counted traversed grid cells;
judging whether the area is smaller than a preset area;
if no, proceeding to the step of determining that the robot is not trapped;
if yes, proceeding to the step of determining that the robot is trapped.
Further, the preset area is greater than or equal to 1.5 square metres and less than or equal to 2 square metres.
A robot escape processing method includes the following steps:
step 1: based on the above robot trapped-state detection method, determining that the robot is trapped;
step 2: based on the grid map, taking the robot's current position as the centre point, searching for paths in preset directions and counting the obstacle cells and cliff cells contained in each searched path;
step 3: selecting the path containing the smallest number of obstacle cells and cliff cells as the escape path;
step 4: walking along the escape path until an obstacle is met or the boundary of the local map is reached;
step 5: based on the above robot trapped-state detection method, judging again whether the robot is trapped;
if no, the robot has escaped successfully;
if yes, judging whether the robot met an obstacle or reached the boundary of the local map; if it met an obstacle, proceeding to step 6; if it reached the boundary of the local map, proceeding to step 7;
step 6: walking along the obstacle, stopping when the straight-line distance from the location point where the obstacle was met exceeds a first preset distance, and then judging, based on the above robot trapped-state detection method, whether the robot is trapped; if yes, returning to step 2; if no, the robot has escaped successfully;
step 7: continuing to walk forward, stopping when the straight-line distance from the centre point exceeds a second preset distance, or when the straight-line distance from the centre point is less than or equal to the second preset distance but an obstacle is met, and then judging, based on the above robot trapped-state detection method, whether the robot is trapped; if yes, returning to step 2; if no, the robot has escaped successfully;
wherein an obstacle cell is a grid cell in which the robot detected an obstacle, a cliff cell is a grid cell in which the robot detected a cliff, and the local map is an area map in the grid map with a set length and a set width.
Further, in step 6 and step 7, if the robot is judged to be trapped, based on the above robot trapped-state detection method, a consecutive number of times reaching a preset count, it is determined that the robot cannot escape, and the robot stops running and reports an error.
Further, the preset directions are the eight directions up, up-right 45°, right, down-right 45°, down, down-left 45°, left and up-left 45° of the centre point.
Further, in step 3, if there are two or more paths containing the same, smallest number of obstacle cells and cliff cells, the path whose direction makes the smallest angle with the current heading is selected as the escape path; if there are still two or more paths with the same smallest angle, the first such path counted clockwise from the robot's current heading is selected as the escape path.
Further, the first preset distance in step 6 is greater than or equal to 1.2 metres and less than or equal to 1.4 metres; the second preset distance in step 7 is twice the set length.
The beneficial effects of the present invention are as follows. By searching the grid map, the robot can determine which grid cells it has already traversed and then judge whether these traversed cells directly connect the current grid cell to the target grid cell. If they do, the robot has a route to the target location point and is not trapped; if they do not, the robot cannot reach the target location point and is trapped. This method of judging whether the robot is trapped from the grid map needs no extra detection sensors, so the cost is low. In addition, after the robot is determined to be trapped, an escape path is searched for in the grid map and the robot walks along it; if the robot meets an obstacle or reaches the boundary of the local map before escaping, it selects a different walking mode according to its current state until it finally escapes. Searching for the escape path in the grid map and adopting different walking modes according to the robot's state during the escape solves the trapped-robot problem well without a camera and without large amounts of image-data analysis and processing; the escape cost is low and the escape effect is good.
Brief description of the drawings
Fig. 1 is a schematic diagram of searching a grid path according to the present invention.
Fig. 2 is a schematic diagram of calculating the trapped area of the robot according to the present invention.
Fig. 3 is a flow chart of an embodiment of the robot trapped-state detection method of the present invention.
Fig. 4 is a flow chart of an embodiment of the robot escape processing method of the present invention.
Detailed description of the embodiments
The embodiments of the present invention are further described below with reference to the accompanying drawings:
The robot of the present invention is a type of smart household appliance that can walk automatically in certain situations by relying on a degree of artificial intelligence. The robot body is equipped with various sensors that can detect the travel distance, walking angle, body state, obstacles and so on; for example, when it meets a wall or another obstacle it turns by itself and, according to different settings, walks along different routes in a planned way. The robot of the present invention includes the following structure: a robot body capable of autonomous walking with driving wheels, a human-machine interaction interface arranged on the body, and an obstacle detection unit arranged on the body. An inertial sensor, including an accelerometer and a gyroscope, is arranged inside the body; the driving wheels are equipped with odometers (usually code discs) for detecting the travel distance of the driving wheels; and a control module is provided that can process the parameters of the related sensors and output control signals to the execution unit.
The robot trapped-state detection method of the present invention includes the following steps: searching the grid map and determining the grid cells the robot has traversed; judging whether there is a grid path from the current grid cell to the target grid cell composed entirely of grid cells the robot has traversed; if yes, determining that the robot is not trapped; if no, determining that the robot is trapped. The current grid cell is the grid cell corresponding to the robot's current location point, and the target grid cell is the grid cell corresponding to the target location point the robot wants to reach. The grid map is a map constructed from grid cells; a grid cell is a virtual cell, and every cell has the same area. While walking, the robot marks the grid cells it has walked through in the grid map as traversed cells, marks the grid cells in which it detects obstacles as obstacle cells, marks the cells in which it detects cliffs as cliff cells, and so on. With the method of the invention, by searching the grid map the robot can determine which grid cells have been traversed and then judge whether these traversed cells directly connect the current grid cell to the target grid cell. If they do, the robot has a route to the target location point and is not trapped; if they do not, the robot cannot reach the target location point and is trapped. This method of judging whether the robot is trapped from the grid map needs no extra detection sensors and its cost is low.
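As a concrete illustration of this detection step, the minimal Python sketch below (not part of the patent text) stores the grid map as a dictionary from cell coordinates to a state label and runs a breadth-first search that may only pass through traversed cells; the state names, the function name and the 4-connected neighbourhood are assumptions made for illustration.

```python
from collections import deque

# Assumed cell states for this sketch; the patent only names the concepts.
TRAVERSED, OBSTACLE, CLIFF = "traversed", "obstacle", "cliff"

def is_trapped(grid_map, current, target):
    """Return True when no grid path made up only of traversed cells
    connects the current grid cell to the target grid cell."""
    frontier = deque([current])
    visited = {current}
    while frontier:
        cell = frontier.popleft()
        if cell == target:
            return False                     # a traversed-cell path exists: not trapped
        x, y = cell
        for dx, dy in ((0, 1), (0, -1), (-1, 0), (1, 0)):   # up, down, left, right
            nxt = (x + dx, y + dy)
            if nxt not in visited and grid_map.get(nxt) == TRAVERSED:
                visited.add(nxt)
                frontier.append(nxt)
    return True                              # target unreachable through traversed cells
```

In this sketch the target grid cell is assumed to lie on a cell the robot has traversed before (for example its starting or charging position); `grid_map.get` returns `None` for unknown cells, so unexplored space is treated the same way as obstacle and cliff cells.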
Preferably, searching the grid map and determining the grid cells the robot has traversed includes the following steps: taking the current grid cell as the reference cell, searching along the predetermined directions and judging whether each grid cell adjacent to the reference cell is a grid cell the robot has traversed; if yes, recording the traversed grid cell, and then taking each recorded traversed grid cell that has not yet served as a reference cell as the new reference cell, continuing to search along the predetermined directions and judge whether the grid cells adjacent to it have been traversed, and so on, until the boundary of the grid map is reached; if no, not recording it. Each traversed grid cell is recorded only once. Recording this information makes it easy to calculate the trapped area of the robot later. As shown in Fig. 1, the grids marked X represent obstacle cells or cliff cells, and the grids marked with other letters represent grid cells the robot has traversed. Grid A is the current grid cell and grid Z is the target grid cell. This embodiment is explained with the four directions up, down, left and right of the reference cell as the predetermined directions. First, with grid A as the reference cell, its adjacent grid cells up, down, left and right are searched; they are grid B, grid F, grid D and grid H, all of which are grid cells the robot has traversed, so they are recorded. Next, with grid B as the reference cell, its adjacent grid cells up, down, left and right are searched; they are grid X, grid A, grid C and grid X. The two grid X cells are obstacle cells or cliff cells and are not recorded; grid A and grid C are traversed grid cells, but grid A was already recorded when it served as the reference cell, so it is not recorded again and only grid C is recorded. Similarly, grid F, grid D and grid H each serve in turn as the reference cell, and the traversed grid cells that have not yet been recorded are recorded. Then, with grid C (adjacent to grid B) as the reference cell, its adjacent grid cells up, down, left and right are searched; they are grid X, grid D, grid J and grid B. Grid X is an obstacle cell or cliff cell and is not recorded; grid D and grid B have already been recorded and are not recorded again, so only grid J is recorded. Similarly, the cells adjacent to grid F, grid D and grid H that have not yet served as reference cells are taken as reference cells and searched. The search continues in this way until the boundary of the grid map is reached, completing the search of the whole grid map. Finally, it can be seen from the figure that there is a grid path from grid A to grid Z composed of grid cells the robot has traversed. Of course, the figure only marks the states of some of the grid cells for illustration; there may be other grid paths from grid A to grid Z, which are not described here. This way of searching grid cells can quickly find a grid path from the current location to the target location with high accuracy.
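The reference-cell expansion walked through above can be written as the breadth-first flood fill below, a sketch under the same assumptions as the earlier one (it reuses the TRAVERSED constant): every newly recorded traversed cell later serves as the reference cell exactly once, and each cell is recorded at most once. The set it returns is the record of traversed cells used later for the area check.

```python
from collections import deque

def collect_traversed_region(grid_map, current, neighbours):
    """Record every traversed grid cell reachable from the current grid cell,
    letting each recorded cell serve as the reference cell exactly once."""
    recorded = {current}                 # each traversed grid cell is recorded only once
    pending = deque([current])           # cells still waiting to act as the reference cell
    while pending:
        rx, ry = pending.popleft()       # the current reference cell
        for dx, dy in neighbours:
            cell = (rx + dx, ry + dy)
            # Obstacle cells, cliff cells and unknown cells are not recorded.
            if cell not in recorded and grid_map.get(cell) == TRAVERSED:
                recorded.add(cell)
                pending.append(cell)
    return recorded
```

Stopping at the map boundary is implicit: cells outside the map are absent from `grid_map`, so the expansion never records them.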
Preferably, the predetermined directions are the four directions up, down, left and right of the reference cell, or the eight directions up, up-right 45°, right, down-right 45°, down, down-left 45°, left and up-left 45° of the reference cell. The above embodiment is explained with the four directions up, down, left and right of the reference cell as the predetermined directions. Searching in four directions means searching the grid cells adjacent to the four sides of the reference cell, while searching in eight directions additionally searches the grid cells adjacent to the four corners of the reference cell; the search principle is the same as in the above embodiment and is not repeated here. Searching grid cells through adjacent sides and/or adjacent corners in this way makes the path search more accurate and avoids path-search errors.
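For reference, the two neighbourhoods can be written as the offset tables below and passed to the search sketch above; the coordinate convention (x increasing to the right, y increasing upward) is an assumption of these sketches.

```python
# Four directions: the cells adjacent to the four sides of the reference cell.
NEIGHBOURS_4 = ((0, 1), (1, 0), (0, -1), (-1, 0))           # up, right, down, left

# Eight directions: additionally the cells adjacent to the four corners.
NEIGHBOURS_8 = ((0, 1), (1, 1), (1, 0), (1, -1),
                (0, -1), (-1, -1), (-1, 0), (-1, 1))         # up, up-right 45°, ..., up-left 45°

# Example: enumerate the connected traversed region with 4-connectivity.
# region = collect_traversed_region(grid_map, current_cell, NEIGHBOURS_4)
```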
Preferably, as shown in Fig. 3, after the step of judging that there is no grid path from the current grid cell to the target grid cell composed entirely of traversed grid cells, and before the step of determining whether the robot is trapped, the method further includes the following steps: counting the recorded traversed grid cells that are directly adjacent to the current grid cell, and the recorded traversed grid cells that are connected to the current grid cell through traversed grid cells; calculating the area of the counted traversed grid cells; judging whether the area is smaller than a preset area; if no, proceeding to the step of determining that the robot is not trapped; if yes, proceeding to the step of determining that the robot is trapped. As shown in Fig. 2, grid A is the current grid cell and grid Z is the target grid cell. Because the robot, searching from grid A in the way described in the above embodiment, cannot find a grid path of traversed cells that reaches the target grid cell, it needs to confirm further how large the region it can currently move in is. That is, by counting the recorded traversed grid cells that are directly adjacent to the current grid cell or connected to it through traversed grid cells (the grid B cells shown in the figure), the area of the recorded traversed grid cells (the region in which the robot can move) is calculated. If the area formed by the grid B cells is smaller than the preset area, the robot is determined to be trapped; otherwise, the robot is determined not to be trapped. In actual walking, the situation shown in Fig. 2 should not arise; the reason the robot can appear to be surrounded by obstacle cells or cliff cells is erroneous map information caused by the robot's walking error. In a real environment there must be some entrance through which the robot reached grid A; it is just that this entrance is too small and the surrounding obstacles are too many, so even a slight error in the robot's navigation makes the entrance mistaken for an obstacle cell. During the subsequent escape, the robot keeps updating the grid map according to its actual walking, and so can still find an escape path. In general, when the robot is in a small region dense with obstacles, it is difficult for it to walk out of the region, and it is regarded as trapped. If the region in which the robot can move is relatively large but still no path to the target grid cell can be found, the cause is mainly map error and the robot is not really blocked in. This way of further deciding whether the robot is trapped based on the area in which it can move improves the accuracy of trapped-state detection.
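Continuing the same sketch (it reuses `collect_traversed_region` and `NEIGHBOURS_4` from above), the area confirmation can be expressed as follows; the cell edge length and the helper name are assumptions for illustration, while the default threshold of 1.8 square metres matches the optimum given in the next paragraph.

```python
def confirm_trapped_by_area(grid_map, current, cell_size_m=0.2,
                            preset_area_m2=1.8, neighbours=NEIGHBOURS_4):
    """After no traversed-cell path to the target was found, treat the robot as
    trapped only if the connected traversed region around it is small."""
    region = collect_traversed_region(grid_map, current, neighbours)
    area_m2 = len(region) * cell_size_m * cell_size_m   # every grid cell has the same area
    return area_m2 < preset_area_m2                     # small movable region: really trapped
```

A return value of False corresponds to the map-error case discussed above: the robot can still move in a region larger than the preset area, so it is not treated as trapped even though no path to the target was found.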
Preferably, the preset area is greater than or equal to 1.5 square metres and less than or equal to 2 square metres. If the preset area is set too large or too small, the area check loses its effect and does not improve the accuracy of detecting whether the robot is trapped. The optimal value is 1.8 square metres.
The robot escape processing method of the present invention, as shown in Fig. 4, includes the following steps. Step 1: based on the above robot trapped-state detection method, determine that the robot is trapped. Step 2: based on the grid map, taking the robot's current position as the centre point, search for paths in the preset directions and count the obstacle cells and cliff cells contained in each searched path. Step 3: select the path containing the smallest number of obstacle cells and cliff cells as the escape path. Step 4: walk along the escape path until an obstacle is met or the boundary of the local map is reached. Step 5: based on the above robot trapped-state detection method, judge again whether the robot is trapped; if no, the robot has escaped successfully; if yes, judge whether the robot met an obstacle or reached the boundary of the local map; if it met an obstacle, proceed to step 6; if it reached the boundary of the local map, proceed to step 7. Step 6: walk along the obstacle and stop when the straight-line distance from the location point where the obstacle was met exceeds a first preset distance, then judge, based on the above robot trapped-state detection method, whether the robot is trapped; if yes, return to step 2; if no, the robot has escaped successfully. Step 7: continue to walk forward and stop when the straight-line distance from the centre point exceeds a second preset distance, or when the straight-line distance from the centre point is less than or equal to the second preset distance but an obstacle is met, then judge, based on the above robot trapped-state detection method, whether the robot is trapped; if yes, return to step 2; if no, the robot has escaped successfully. An obstacle cell is a grid cell in which the robot detected an obstacle; a cliff cell is a grid cell in which the robot detected a cliff; the local map is an area map in the grid map with a set length and a set width, and the set length and set width can be chosen according to the actual situation and may also be set to the same value. Preferably, the set length is 3 metres or 2 metres and the set width is 2 metres or 1 metre. After the robot is determined to be trapped, the method searches the grid map and takes the path with the fewest obstacle cells and cliff cells as the escape path, which reduces the chance of the robot meeting an obstacle or a cliff and improves the escape efficiency. When the robot walks along the escape path and meets an obstacle or reaches the boundary of the local map without having escaped, it either follows the obstacle or continues to walk forward for a certain distance; if it still has not escaped, it returns to step 2, searches for a new escape path, and continues in the same way until it finally escapes. Searching for the escape path in the grid map and adopting different walking modes according to the robot's state during the escape solves the trapped-robot problem well without a camera and without large amounts of image-data analysis and processing; the escape cost is low and the escape effect is good.
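Steps 2 and 3 of this procedure can be sketched as the ray scan below, which reuses the OBSTACLE and CLIFF constants from the earlier sketches; the angle convention (degrees clockwise from "up"), the ray length and the function names are assumptions for illustration, and the radius `max_cells` stands in for the extent of the local map.

```python
# The eight preset directions, keyed by their angle measured clockwise from "up".
DIRECTIONS_8 = {0: (0, 1), 45: (1, 1), 90: (1, 0), 135: (1, -1),
                180: (0, -1), 225: (-1, -1), 270: (-1, 0), 315: (-1, 1)}

def count_obstacles_along(grid_map, centre, step, max_cells):
    """Count the obstacle and cliff cells met along one search direction."""
    x, y = centre
    hits = 0
    for _ in range(max_cells):
        x, y = x + step[0], y + step[1]
        if grid_map.get((x, y)) in (OBSTACLE, CLIFF):
            hits += 1
    return hits

def candidate_escape_angles(grid_map, centre, max_cells=15):
    """Steps 2 and 3: search the eight preset directions from the centre point and
    keep the direction(s) containing the fewest obstacle and cliff cells."""
    costs = {angle: count_obstacles_along(grid_map, centre, step, max_cells)
             for angle, step in DIRECTIONS_8.items()}
    best = min(costs.values())
    return [angle for angle, cost in costs.items() if cost == best]
```

When several directions tie on this count, the tie-breaking rule described further below chooses among them.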
Preferably, in step 6 and step 7, if the robot is judged to be trapped, based on the above robot trapped-state detection method, a consecutive number of times reaching a preset count, it is determined that the robot cannot escape, and the robot stops running and reports an error. The preset count is a natural number greater than or equal to 2, preferably 2 or 3. If the robot still cannot escape after walking in the above escape mode 2 or 3 times, it is likely that something abnormal has happened and the robot cannot escape effectively; in that case it stops running and reports an error so that the user can free it, which avoids the robot walking back and forth in the trapped region and draining the battery.
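The consecutive-failure rule can be wrapped around one escape attempt as in the sketch below; `try_one_escape_round` and `still_trapped` are placeholders standing in for steps 2 to 7 and for the trapped-state detection, whose implementations are not shown here.

```python
def run_escape(try_one_escape_round, still_trapped, preset_times=3):
    """Repeat escape attempts; if the robot is judged trapped `preset_times`
    times in a row, stop running and report an error."""
    consecutive_trapped = 0
    while consecutive_trapped < preset_times:
        try_one_escape_round()          # steps 2-7: pick a path, walk, follow or continue
        if not still_trapped():         # trapped-state detection after the attempt
            return "escaped"
        consecutive_trapped += 1
    return "stopped: report error"      # robot cannot escape; wait for the user
```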
Preferably, the preset directions are the eight directions up, up-right 45°, right, down-right 45°, down, down-left 45°, left and up-left 45° of the centre point. These eight directions point from the centre point towards the grid cells adjacent to the four corners and four sides of the grid cell where the centre point lies. By searching the state of the grid cells in all these directions, an escape path with fewer obstacles and cliffs can be found; walking along this escape path improves the escape efficiency and avoids the low walking efficiency caused by frequently hitting obstacles or detouring around cliffs.
Preferably, in step 3, if there are two or more paths containing the same, smallest number of obstacle cells and cliff cells, the path whose direction makes the smallest angle with the current heading is selected as the escape path; if there are still two or more paths with the same smallest angle, the first such path counted clockwise from the robot's current heading is selected as the escape path. Because the path search may find several paths containing the same, smallest number of obstacle cells and cliff cells, the path whose direction makes the smallest angle with the current heading is selected as the escape path; in this way the robot only needs to adjust its heading slightly before continuing, which is relatively efficient. If the angles are also the same, for example the directions 45° to the left and 45° to the right of the current heading, which make the same angle with the current heading and contain the same, smallest number of obstacle cells and cliff cells, then the first path counted clockwise (here the path corresponding to the direction 45° to the right of the current heading) is selected as the escape path. In this way the final escape path can be selected and determined quickly, avoiding a random choice when several paths are available and making the robot's walking more purposeful.
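Under the same angle convention as the ray-scan sketch, the tie-breaking rule can be written as follows; it first minimises the deviation from the current heading and then, among equal deviations, takes the first candidate counted clockwise from the heading.

```python
def pick_escape_angle(candidate_angles, heading):
    """Choose the escape direction among equal-cost candidates:
    smallest angle to the current heading, then first one clockwise from it."""
    def deviation(angle):                      # smallest angle with the current heading
        d = abs(angle - heading) % 360
        return min(d, 360 - d)
    def clockwise_rank(angle):                 # 0 = straight ahead, 90 = to the right, ...
        return (angle - heading) % 360
    return min(candidate_angles, key=lambda a: (deviation(a), clockwise_rank(a)))

# Example matching the text: with heading 0°, candidates 45° (right) and 315° (left)
# both deviate by 45°; the clockwise rank selects 45°, i.e. 45° to the right.
```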
Preferably, the first preset distance in step 6 is greater than or equal to 1.2 metres and less than or equal to 1.4 metres, and the second preset distance in step 7 is twice the set length. The first preset distance and the second preset distance can be adjusted according to the actual situation; if these values are set too large, the robot easily moves far away from the target location, and if they are set too small, it cannot escape effectively. The optimal values are 1.3 metres for the first preset distance and 3 metres for the second preset distance.
The above embodiments are only a full disclosure of the present invention and are not intended to limit it; any replacement with equivalent technical features that is based on the inventive concept of the present invention and requires no creative work shall be regarded as falling within the scope of disclosure of this application.
Claims (10)
1. A robot trapped-state detection method, characterised by including the following steps:
searching the grid map and determining the grid cells the robot has traversed;
judging whether there is a grid path from the current grid cell to the target grid cell composed entirely of grid cells the robot has traversed;
if yes, determining that the robot is not trapped;
if no, determining that the robot is trapped;
wherein the current grid cell is the grid cell corresponding to the robot's current location point, and the target grid cell is the grid cell corresponding to the target location point the robot wants to reach.
2. The method according to claim 1, characterised in that searching the grid map and determining the grid cells the robot has traversed includes the following steps:
taking the current grid cell as the reference cell, searching along the predetermined directions and judging whether each grid cell adjacent to the reference cell is a grid cell the robot has traversed;
if yes, recording the traversed grid cell, and then taking each recorded traversed grid cell that has not yet served as a reference cell as the new reference cell, continuing to search along the predetermined directions and judge whether the grid cells adjacent to it have been traversed, and so on, until the boundary of the grid map is reached;
if no, not recording it;
wherein each traversed grid cell is recorded only once.
3. The method according to claim 2, characterised in that the predetermined directions are the four directions up, down, left and right of the reference cell, or the eight directions up, up-right 45°, right, down-right 45°, down, down-left 45°, left and up-left 45° of the reference cell.
4. The method according to claim 2, characterised in that, after the step of judging that there is no grid path from the current grid cell to the target grid cell composed entirely of traversed grid cells, and before the step of determining whether the robot is trapped, the method further includes the following steps:
counting the recorded traversed grid cells that are directly adjacent to the current grid cell, and the recorded traversed grid cells that are connected to the current grid cell through traversed grid cells;
calculating the area of the counted traversed grid cells;
judging whether the area is smaller than a preset area;
if no, proceeding to the step of determining that the robot is not trapped;
if yes, proceeding to the step of determining that the robot is trapped.
5. The method according to claim 4, characterised in that the preset area is greater than or equal to 1.5 square metres and less than or equal to 2 square metres.
6. A robot escape processing method, characterised by including the following steps:
step 1: based on the robot trapped-state detection method according to any one of claims 1 to 4, determining that the robot is trapped;
step 2: based on the grid map, taking the robot's current position as the centre point, searching for paths in preset directions and counting the obstacle cells and cliff cells contained in each searched path;
step 3: selecting the path containing the smallest number of obstacle cells and cliff cells as the escape path;
step 4: walking along the escape path until an obstacle is met or the boundary of the local map is reached;
step 5: based on the robot trapped-state detection method according to any one of claims 1 to 4, judging again whether the robot is trapped;
if no, the robot has escaped successfully;
if yes, judging whether the robot met an obstacle or reached the boundary of the local map; if it met an obstacle, proceeding to step 6; if it reached the boundary of the local map, proceeding to step 7;
step 6: walking along the obstacle, stopping when the straight-line distance from the location point where the obstacle was met exceeds a first preset distance, and then judging, based on the robot trapped-state detection method according to any one of claims 1 to 4, whether the robot is trapped; if yes, returning to step 2; if no, the robot has escaped successfully;
step 7: continuing to walk forward, stopping when the straight-line distance from the centre point exceeds a second preset distance, or when the straight-line distance from the centre point is less than or equal to the second preset distance but an obstacle is met, and then judging, based on the robot trapped-state detection method according to any one of claims 1 to 4, whether the robot is trapped; if yes, returning to step 2; if no, the robot has escaped successfully;
wherein an obstacle cell is a grid cell in which the robot detected an obstacle, a cliff cell is a grid cell in which the robot detected a cliff, and the local map is an area map in the grid map with a set length and a set width.
7. The method according to claim 6, characterised in that, in step 6 and step 7, if the robot is judged to be trapped, based on the robot trapped-state detection method according to any one of claims 1 to 4, a consecutive number of times reaching a preset count, it is determined that the robot cannot escape, and the robot stops running and reports an error.
8. The method according to claim 6 or 7, characterised in that the preset directions are the eight directions up, up-right 45°, right, down-right 45°, down, down-left 45°, left and up-left 45° of the centre point.
9. The method according to claim 6 or 7, characterised in that, in step 3, if there are two or more paths containing the same, smallest number of obstacle cells and cliff cells, the path whose direction makes the smallest angle with the current heading is selected as the escape path; if there are still two or more paths with the same smallest angle, the first such path counted clockwise from the robot's current heading is selected as the escape path.
10. The method according to claim 6 or 7, characterised in that the first preset distance in step 6 is greater than or equal to 1.2 metres and less than or equal to 1.4 metres, and the second preset distance in step 7 is twice the set length.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711098354.4A CN107943025B (en) | 2017-11-09 | 2017-11-09 | Processing method for robot escape |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711098354.4A CN107943025B (en) | 2017-11-09 | 2017-11-09 | Processing method for robot escape |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107943025A true CN107943025A (en) | 2018-04-20 |
CN107943025B CN107943025B (en) | 2020-12-15 |
Family
ID=61933564
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711098354.4A Active CN107943025B (en) | 2017-11-09 | 2017-11-09 | Processing method for robot escape |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107943025B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050192707A1 (en) * | 2004-02-27 | 2005-09-01 | Samsung Electronics Co., Ltd. | Dust detection method and apparatus for cleaning robot |
US20110153081A1 (en) * | 2008-04-24 | 2011-06-23 | Nikolai Romanov | Robotic Floor Cleaning Apparatus with Shell Connected to the Cleaning Assembly and Suspended over the Drive System |
US20110054686A1 (en) * | 2009-08-25 | 2011-03-03 | Samsung Electronics Co., Ltd. | Apparatus and method detecting a robot slip |
EP2721987A2 (en) * | 2012-10-18 | 2014-04-23 | LG Electronics, Inc. | Method of controlling automatic cleaner |
CN105739500A (en) * | 2016-03-29 | 2016-07-06 | 海尔优家智能科技(北京)有限公司 | Interaction control method and device of intelligent sweeping robot |
Non-Patent Citations (2)
Title |
---|
Liu Jie: "Research on global path planning of robots based on environment maps", China Master's Theses Full-text Database (electronic journal) * |
Shan Jianhua: "A real-time path planning method with fixed turning angles for overcoming dead loops", Control Engineering of China * |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108693880A (en) * | 2018-05-15 | 2018-10-23 | 北京石头世纪科技有限公司 | Intelligent mobile equipment and its control method, storage medium |
CN110687903A (en) * | 2018-06-19 | 2020-01-14 | 速感科技(北京)有限公司 | Mobile robot trapped judging method and device and motion control method and device |
CN110687903B (en) * | 2018-06-19 | 2022-07-08 | 速感科技(北京)有限公司 | Mobile robot trapped judging method and device and motion control method and device |
CN109343521A (en) * | 2018-09-27 | 2019-02-15 | 深圳乐动机器人有限公司 | Method for a robot to clean a room, and robot |
CN111035317A (en) * | 2018-10-12 | 2020-04-21 | 北京奇虎科技有限公司 | Method and device for detecting and processing local predicament and computing equipment |
CN111166240A (en) * | 2018-11-09 | 2020-05-19 | 北京奇虎科技有限公司 | Method, device and equipment for setting cleaning forbidden zone and storage medium |
CN109528089A (en) * | 2018-11-19 | 2019-03-29 | 珠海市微半导体有限公司 | Method, device and chip for a trapped cleaning robot to continue walking |
CN109528089B (en) * | 2018-11-19 | 2021-03-23 | 珠海市一微半导体有限公司 | Method, device and chip for continuously walking trapped cleaning robot |
CN111493748A (en) * | 2019-01-31 | 2020-08-07 | 北京奇虎科技有限公司 | Robot cleaning execution method, device and computer readable storage medium |
CN110485502A (en) * | 2019-07-17 | 2019-11-22 | 爱克斯维智能科技(苏州)有限公司 | A kind of excavator intelligent walking system, excavator and control method |
CN110448241A (en) * | 2019-07-18 | 2019-11-15 | 广东宝乐机器人股份有限公司 | Robot trapped-state detection and escape method |
CN110448241B (en) * | 2019-07-18 | 2021-05-18 | 华南师范大学 | Robot trapped detection and escaping method |
WO2021008611A1 (en) * | 2019-07-18 | 2021-01-21 | 广东宝乐机器人股份有限公司 | Robot trapping detection and de-trapping method |
CN110464262A (en) * | 2019-07-30 | 2019-11-19 | 广东宝乐机器人股份有限公司 | Escape method for a sweeping robot |
CN110464262B (en) * | 2019-07-30 | 2021-05-14 | 广东宝乐机器人股份有限公司 | Method for getting rid of difficulties of sweeping robot |
CN110968099A (en) * | 2019-12-17 | 2020-04-07 | 小狗电器互联网科技(北京)股份有限公司 | Robot trapped detection method and robot |
CN111002346A (en) * | 2019-12-17 | 2020-04-14 | 小狗电器互联网科技(北京)股份有限公司 | Robot trapped detection method and robot |
CN110968099B (en) * | 2019-12-17 | 2024-05-07 | 小狗电器互联网科技(北京)股份有限公司 | Robot trapped detection method and robot |
CN111002346B (en) * | 2019-12-17 | 2021-05-14 | 小狗电器互联网科技(北京)股份有限公司 | Robot trapped detection method and robot |
CN110908388B (en) * | 2019-12-17 | 2023-08-11 | 小狗电器互联网科技(北京)股份有限公司 | Robot trapped detection method and robot |
CN110908388A (en) * | 2019-12-17 | 2020-03-24 | 小狗电器互联网科技(北京)股份有限公司 | Robot trapped detection method and robot |
WO2021208530A1 (en) * | 2020-04-14 | 2021-10-21 | 北京石头世纪科技股份有限公司 | Robot obstacle avoidance method, device, and storage medium |
CN111904346A (en) * | 2020-07-09 | 2020-11-10 | 深圳拓邦股份有限公司 | Method and device for getting rid of difficulties of sweeping robot, computer equipment and storage medium |
CN111904346B (en) * | 2020-07-09 | 2021-08-24 | 深圳拓邦股份有限公司 | Method and device for getting rid of difficulties of sweeping robot, computer equipment and storage medium |
CN114812535A (en) * | 2021-01-19 | 2022-07-29 | 扬智科技股份有限公司 | Trapped state detection method and mobile platform |
CN113156956A (en) * | 2021-04-26 | 2021-07-23 | 珠海市一微半导体有限公司 | Robot navigation method, chip and robot |
CN113156956B (en) * | 2021-04-26 | 2023-08-11 | 珠海一微半导体股份有限公司 | Navigation method and chip of robot and robot |
CN113219975A (en) * | 2021-05-08 | 2021-08-06 | 珠海市一微半导体有限公司 | Route optimization method, route planning method, chip and robot |
CN113110499B (en) * | 2021-05-08 | 2024-02-23 | 珠海一微半导体股份有限公司 | Determination method of traffic area, route searching method, robot and chip |
CN113219975B (en) * | 2021-05-08 | 2024-04-05 | 珠海一微半导体股份有限公司 | Route optimization method, route planning method, chip and robot |
CN113110499A (en) * | 2021-05-08 | 2021-07-13 | 珠海市一微半导体有限公司 | Judging method of passing area, route searching method, robot and chip |
CN115437388A (en) * | 2022-11-09 | 2022-12-06 | 成都朴为科技有限公司 | Escape method and device for an omnidirectional mobile robot |
CN115437388B (en) * | 2022-11-09 | 2023-01-24 | 成都朴为科技有限公司 | Escape method and device for an omnidirectional mobile robot |
Also Published As
Publication number | Publication date |
---|---|
CN107943025B (en) | 2020-12-15 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |
CP03 | Change of name, title or address | Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong; Patentee after: Zhuhai Yiwei Semiconductor Co.,Ltd.; Country or region after: China. Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province; Patentee before: AMICRO SEMICONDUCTOR Co.,Ltd.; Country or region before: China |