
CN112051841B - Obstacle boundary generation method and device - Google Patents

Obstacle boundary generation method and device Download PDF

Info

Publication number
CN112051841B
CN112051841B (application CN201910484306.1A)
Authority
CN
China
Prior art keywords
obstacle
point set
boundary
current position
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910484306.1A
Other languages
Chinese (zh)
Other versions
CN112051841A (en)
Inventor
陶思含
黄玉刚
周国扬
郑鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Sumec Intelligent Technology Co Ltd
Original Assignee
Nanjing Sumec Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Sumec Intelligent Technology Co Ltd filed Critical Nanjing Sumec Intelligent Technology Co Ltd
Priority to CN201910484306.1A priority Critical patent/CN112051841B/en
Publication of CN112051841A publication Critical patent/CN112051841A/en
Application granted granted Critical
Publication of CN112051841B publication Critical patent/CN112051841B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An obstacle boundary generation method and device record the coordinates of an obstacle whenever a self-walking device encounters one, group obstacle coordinates whose mutual distances meet a requirement into obstacle point sets, generate an obstacle boundary from each point set, and update the map accordingly. The map information of the self-walking device can thus be updated synchronously, driven by the device's own responses to obstacles during operation. The invention accurately updates obstacle boundaries within the map so that the robot's walking can be controlled precisely.

Description

Obstacle boundary generation method and device
Technical Field
The invention relates to the technical field of self-walking control, and in particular to a method for delimiting obstacles during self-walking.
Background
A mobile robot comprises a navigation control module and a motor driving module that together realize the self-walking function. The navigation control module outputs control signals to the motor driving module according to the map information stored on the robot, and the motor driving module drives the motors accordingly, so that the robot walks autonomously within the map range.
Self-walking control of this kind depends on map information. To achieve simultaneous localization and mapping, the map must therefore be updated continuously while the robot moves. An important part of this is detecting obstacles and updating them in the map.
Existing mobile robots can generally obtain their current relative or absolute position through various positioning technologies, and can determine through various sensors whether an obstacle is present at the current position. However, such sensing data cannot determine the overall contour of an obstacle. Map-based self-walking control requires accurate obstacle contour information to be stored in the map in order to avoid obstacles. The prior art can only trigger steering or a similar response on encountering an obstacle; it cannot obtain the required obstacle boundary information, so the map is not updated accurately, and a robot walking according to such map information is in practice blocked by obstacles.
Disclosure of Invention
To address the shortcomings of the prior art, the invention provides an obstacle boundary generation method and device that determine the positional relations among the coordinate points of an obstacle while the robot walks, facilitating updates to the robot's map. The invention adopts the following technical scheme.
First, the present invention provides an obstacle boundary generation method for a self-walking device, comprising the steps of: recording the current position coordinates when an obstacle is encountered; grouping the current position coordinates into an obstacle point set; and updating the obstacle boundary according to the coordinates in the obstacle point set when the number of coordinates in the set reaches a preset value T.
Optionally, in the above obstacle boundary generation method, whether to group the current position coordinates into an obstacle point set is determined by whether the coordinates are located within that point set's region: if the current position coordinates are located within the obstacle point set region, they are grouped into the set; otherwise they are not.
Optionally, in the above obstacle boundary generation method, the obstacle point set region comprises the circles of radius N (a threshold) centered on each coordinate in the obstacle point set.
Optionally, in the above obstacle boundary generation method, the distance between the current position coordinates and the obstacle point set into which they are grouped does not exceed the threshold N.
Optionally, in the above obstacle boundary generation method, the distance between the current position coordinates and an obstacle point set is the smallest distance between the current position coordinates and any coordinate in the set.
Optionally, in the above obstacle boundary generation method, if the distance between the current position coordinates and at least one coordinate in an obstacle point set is within the threshold N, the coordinates are grouped into that set; otherwise they are not.
Optionally, in the above obstacle boundary generation method, when grouping the current position coordinates: if their distance to every obstacle point set exceeds the threshold N, or if they are not located within the region of any set, or if their distance to every coordinate in every set exceeds the threshold N, a new obstacle point set is created and the current position coordinates are grouped into it.
Optionally, in the above method for generating an obstacle boundary, each coordinate corresponds to a map grid within a map range of a working area where the self-walking device is located.
Optionally, in the above obstacle boundary generation method, the step of updating the obstacle boundary according to the coordinates in the obstacle point set comprises: setting the obstacle boundary to all map grid cells on or inside the boundary line of the obstacle point set region, or to the boundary of the region enclosed by the coordinates in the set.
Optionally, in the above method for generating an obstacle boundary, the area surrounded by the obstacle boundary includes all coordinates in the corresponding obstacle point set.
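As a minimal illustration, the claimed steps can be sketched in Python. The names `record_obstacle`, `point_sets`, and `boundary_ready`, as well as the values of N and T, are assumptions for illustration, not the patent's implementation:

```python
import math

N = 3.0  # distance threshold for grouping (illustrative value)
T = 5    # coordinate count that triggers boundary generation (illustrative value)

point_sets = []  # each element is a list of (x, y) obstacle coordinates

def record_obstacle(coord):
    """Group a newly recorded obstacle coordinate into an obstacle point set.

    The distance to a set is taken as the smallest distance to any coordinate
    in it; if no set lies within the threshold N, a new set is created.
    """
    for pts in point_sets:
        if min(math.dist(coord, p) for p in pts) <= N:
            pts.append(coord)
            return pts
    point_sets.append([coord])
    return point_sets[-1]

def boundary_ready(pts):
    """The obstacle boundary is updated once a set holds T coordinates."""
    return len(pts) >= T
```

In this sketch, `record_obstacle` would be called each time the obstacle detection unit triggers and the positioning unit returns the current position coordinates.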
Meanwhile, the invention also provides an obstacle boundary generating device for a self-walking device, comprising: an obstacle detection unit that triggers a signal when an obstacle is encountered, the signal triggering the recording of the device's current position coordinates; and a data processing unit that groups the current position coordinates into an obstacle point set and updates the obstacle boundary according to the coordinates in the set when their number reaches a preset value T.
Optionally, the obstacle boundary generating device further includes a map data storage module, configured to store and/or update the obstacle boundary.
Optionally, in the obstacle boundary generating device, the self-walking device includes: a mower provided with a self-walking unit.
Optionally, in the above obstacle boundary generating device, the obstacle detecting unit includes: the device comprises one or any combination of a collision sensor, a Hall sensor, an ultrasonic sensor, an infrared sensor, a motor current detection module and a rotation speed change detection module.
Optionally, the obstacle boundary generating device further includes a positioning unit, configured to be triggered by the obstacle detecting unit to obtain the current position coordinate of the self-walking device.
Optionally, in the obstacle boundary generating device, the positioning unit includes: one or any combination of RTK positioning sensor, DGPS positioning sensor, UWB positioning sensor and inertial navigation sensor.
Advantageous effects
When the self-walking device encounters an obstacle, the coordinates of the obstacle are recorded; obstacle coordinates whose distances meet the requirement are then grouped into an obstacle point set, and an obstacle boundary is generated from each point set to update the map. The map information of the self-walking device can thus be updated synchronously, driven by the device's responses to obstacles during operation. The invention updates the map automatically during operation, without manual intervention. Moreover, compared with manually entering obstacle coordinates and similar information, it updates obstacle boundaries within the map accurately, so that the robot's walking can be controlled precisely.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and constitute a part of this specification; they illustrate the invention together with its embodiments and do not limit it. In the drawings:
FIG. 1 is a schematic diagram of the overall architecture of a self-walking device to which the present invention is applied;
FIG. 2 is a flow chart of the obstacle boundary generation method of the present invention;
FIG. 3 is a first graph of map updates performed during operation of a robot to which the present invention is applied;
FIG. 4 is a second graph of map updates performed during operation of a robot to which the present invention is applied;
FIG. 5 is a third graph of map updates performed during operation of a robot to which the present invention is applied;
FIG. 6 is a fourth graph of map updates performed during operation of a robot to which the present invention is applied;
FIG. 7 is a fifth graph of map updates performed during operation of a robot to which the present invention is applied;
FIG. 8 is a sixth graph of map updates performed during operation of a robot to which the present invention is applied;
FIG. 9 is a seventh graph of map updating performed during operation of a robot to which the present invention is applied;
FIG. 10 is an eighth graph of map updates performed during operation of a robot to which the present invention is applied;
fig. 11 is a ninth graph of map update performed during operation of the robot to which the present invention is applied.
Detailed Description
In order to make the purpose and technical solutions of the embodiments of the present invention more clear, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present invention. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments, which can be made by a person skilled in the art without creative efforts, based on the described embodiments of the present invention fall within the protection scope of the present invention.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
In the present invention, "and/or" means that the items may exist alone or together.
In a specific implementation, the present invention may provide a self-walking device, such as a self-walking mowing robot, a self-walking sweeping robot, or another robot with a self-walking function. Referring to fig. 1, the robot may include:
the obstacle detection unit, which may include one or any combination of a collision sensor, a Hall sensor, an ultrasonic sensor, an infrared sensor, a motor current detection module, and a rotation speed change detection module. The obstacle detection unit triggers a signal when an obstacle is encountered, and this signal triggers the recording of the current position coordinates of the self-walking device;
the data processing unit, which, referring to fig. 1, interacts with the obstacle detection unit, the positioning unit, the map data storage module and the navigation control module. It groups the current position coordinates acquired by the positioning unit into an obstacle point set, updates the obstacle boundary according to the coordinates in the set when their number reaches a preset value T, and stores the boundary in a suitable storage element; the navigation control module then outputs control signals according to the stored map information so that the motor driving module drives the self-walking device accordingly.
In a more specific implementation, the mowing robot may be configured to further include:
a map data storage module for storing and/or updating obstacle boundaries and map information similar to those shown in fig. 3 to 11;
and the positioning unit, which is triggered by the obstacle detection unit to acquire the current position coordinates of the self-walking device. The positioning unit may specifically include one or any combination of an RTK positioning sensor, a DGPS positioning sensor, a UWB positioning sensor and an inertial navigation sensor. The generated position coordinates can be stored as an array and correspond to a point within the area stored by the map data storage module. Point coordinates are typically an abscissa and an ordinate, though a polar or other coordinate system may also be used. The inertial navigation sensor senses inertial data while the self-walking device walks and improves positioning and navigation accuracy based on that data.
The navigation control module can be arranged to exchange data with the data processing unit; it can read the inertial navigation sensor, fuse its data with the map data in the map data storage module, and send corresponding navigation control instructions to the motor driving module.
In particular, the self-walking device may update the boundary of the obstacle in the map in the manner shown in fig. 2. The method comprises the following steps:
recording the current position coordinates, i.e. the obstacle point coordinates, when an obstacle is encountered;
grouping the current position coordinates into an obstacle point set. Whether to group the coordinates into a set may be decided by whether they lie within that set's region: if the current position coordinates are located within the obstacle point set region, they are grouped into the set; otherwise they are not.
And updating the boundary of the obstacle according to the coordinates in the obstacle point set when the number of the coordinates in the obstacle point set reaches a preset value T.
In the above process, in a specific implementation, the obstacle point set region comprises the circles of radius N (a threshold) centered on each coordinate in the set. The distance between the current position coordinates and the set into which they are grouped does not exceed the threshold N; that is, the smallest distance between the current position coordinates and the coordinates in that set does not exceed N.
In the above process, referring to the case of fig. 10, when grouping the current position coordinates: if their distance to every obstacle point set exceeds the threshold N (equivalently, if they lie within no set's region, or their distance to every coordinate in every set exceeds N), a new obstacle point set is created and the current position coordinates are grouped into it. If the distance between the current position coordinates and at least one coordinate in a set is within the threshold N, the coordinates are grouped into that set; otherwise they are not.
Specifically, an obstacle boundary generation process in a self-walking process is taken as an example.
As shown in fig. 3, while a self-walking device such as the robot is running, the two-dimensional coordinates of an encountered obstacle point are recorded in the cache; the point is marked X in fig. 3. Referring to fig. 4, if the robot encounters an obstacle again during subsequent walking, it determines whether the new obstacle point lies within a circle of radius N centered on the previous obstacle point; as shown in fig. 4, the distance threshold may be set to N = 3. If the new obstacle point lies within this range, its coordinates are recorded and, as shown in fig. 5, X and X2 are combined into the same obstacle point set. Then, as shown in fig. 6, the obstacle-point decision range is updated to the area jointly covered by circles of radius N centered on every point in the set. Obstacle points continue to be judged by the above steps until the map shown in fig. 7 is obtained. An obstacle boundary is then generated once the number of obstacle points detected in a point set reaches a preset value T. Taking T = 5 as an example, the generated obstacle boundary is the whole area marked by the circles in fig. 7. The map is updated according to this area: the obstacle boundary in fig. 8 is updated to the blank portion shown in fig. 9, which includes all coordinates in the corresponding obstacle point set. Specifically, each coordinate may correspond to a map grid cell within the map of the working area where the self-walking device is located; in this correspondence, the boundary range is either all map grid cells on or inside the boundary line of the obstacle point set region, or the boundary of the region enclosed by the coordinates in the set.
The boundary may also be an inscribed or circumscribed polygon of the corresponding obstacle point set region.
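The grid-based variant of this boundary generation, all grid cells on or inside the boundary line of the point-set region, can be sketched as follows. The function name, the cell size, and the cell-center membership test are assumptions for illustration:

```python
import math

def boundary_cells(point_set, radius, grid_size=1.0):
    """Return every map-grid cell whose center lies within `radius` of any
    coordinate in the point set, i.e. the cells covered by the union of
    circles of radius N that forms the obstacle point-set region."""
    xs = [p[0] for p in point_set]
    ys = [p[1] for p in point_set]
    cells = set()
    gx0 = math.floor((min(xs) - radius) / grid_size)
    gx1 = math.ceil((max(xs) + radius) / grid_size)
    gy0 = math.floor((min(ys) - radius) / grid_size)
    gy1 = math.ceil((max(ys) + radius) / grid_size)
    for gx in range(gx0, gx1 + 1):
        for gy in range(gy0, gy1 + 1):
            cx, cy = (gx + 0.5) * grid_size, (gy + 0.5) * grid_size
            if any(math.dist((cx, cy), p) <= radius for p in point_set):
                cells.add((gx, gy))
    return cells
```

For the other variant, the boundary of the region enclosed by the coordinates themselves, a convex hull of the point set could be computed instead.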
In the above-described process, if a plurality of obstacles are encountered in the working area, the determination and generation of the obstacle boundaries may be performed by:
after encountering a different obstacle point with a distance exceeding the threshold value N, generating a new two-dimensional array of obstacle coordinates in the buffer memory, and recording the coordinates of the point as a22 in fig. 10. Thus, the original obstacle point A1 (x 1, y 1) belongs to a point set (the point set includes a plurality of dynamic two-dimensional arrays A1[ (x 1, y 1)).
Then, as shown in fig. 11, when the robot encounters a next obstacle point within the threshold N of point set A1, it records the coordinates (x2, y2) of that point, marked a21, and stores them in point set A1, giving A1[(x1, y1), (x2, y2)].
If the robot encounters a next obstacle point outside the threshold N of point set A1, the coordinates (x2, y2) of that point, marked a22, are recorded, and a new point set is created as a dynamic two-dimensional array A2[(x2, y2)], into which a22 is grouped. Once the number of coordinate points in any point set reaches the preset value T, the boundary of the corresponding obstacle can be obtained, in the manner shown in figs. 7 to 9, from the distribution of the coordinates in each point set in turn.
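The bookkeeping with dynamic arrays A1, A2, ... can be sketched as follows. This is a hypothetical illustration: the dict of named sets, the function name, and the threshold value are assumptions:

```python
import math

N = 3.0  # grouping threshold (illustrative value)

# Named point sets kept as dynamic arrays, as in the A1/A2 example above.
sets = {"A1": [(1.0, 1.0)]}

def add_point(coord):
    """Append coord to the first point set lying within threshold N of it,
    or create a new set (A2, A3, ...) otherwise; returns the set's name."""
    for name, pts in sets.items():
        if min(math.dist(coord, p) for p in pts) <= N:
            pts.append(coord)
            return name
    name = "A%d" % (len(sets) + 1)
    sets[name] = [coord]
    return name
```

Once `len(sets[name])` reaches the preset count T, the boundary for that set would be generated as described for figs. 7 to 9.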
The foregoing describes embodiments of the invention in specific detail, but it is not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the invention, all of which fall within its scope.

Claims (15)

1. An obstacle boundary generating method for a self-walking device, comprising the steps of: recording the current position coordinates when encountering an obstacle;
grouping the current position coordinates into an obstacle point set whose distance from them does not exceed a threshold N;
and updating the boundary of the obstacle according to the coordinates in the obstacle point set when the number of the coordinates in the obstacle point set reaches a preset value T.
2. The obstacle boundary generation method according to claim 1, wherein whether to group the current position coordinates into an obstacle point set is determined by whether the coordinates are located within the obstacle point set region;
if the current position coordinates are located within the obstacle point set region, they are grouped into that set;
otherwise they are not.
3. The obstacle boundary generation method according to claim 2, wherein the obstacle point set region includes a region around each coordinate in the obstacle point set having a radius of a threshold value N.
4. The obstacle boundary generation method of claim 3, wherein the distance between the current position coordinates and the obstacle point set is the smallest distance between the current position coordinates and any coordinate in the set.
5. The obstacle boundary generation method according to claim 1, wherein if the distance between the current position coordinates and at least one coordinate in an obstacle point set is within a threshold N, the coordinates are grouped into that set;
otherwise they are not.
6. The obstacle boundary generation method according to any one of claims 1 to 5, wherein, when grouping the current position coordinates into an obstacle point set, if their distance to every obstacle point set exceeds the threshold N, or if they are not located within the region of any set, or if their distance to every coordinate in every set exceeds the threshold N,
a new obstacle point set is created and the current position coordinates are grouped into it.
7. The obstacle boundary generation method as claimed in claim 6, wherein each of the coordinates corresponds to a map grid within a map range of a work area in which the self-walking device is located.
8. The obstacle boundary generation method as claimed in claim 7, wherein the step of updating the obstacle boundary based on coordinates within the obstacle point set comprises: updating the obstacle boundary to be the whole map grid on or in the boundary line of the obstacle point set region or updating the obstacle boundary to be the boundary of the region surrounded by the coordinates in the obstacle point set.
9. The obstacle boundary generation method as claimed in claim 8, wherein the area surrounded by the obstacle boundary includes all coordinates in the set of obstacle points to which it corresponds.
10. An obstacle boundary generating device for a self-walking apparatus, comprising:
the obstacle detection unit is used for triggering a signal when encountering an obstacle, and the signal can trigger and record the current position coordinates of the self-walking equipment;
and a data processing unit for grouping the current position coordinates into an obstacle point set whose distance from them does not exceed a threshold N, and updating the obstacle boundary according to the coordinates in the set when their number reaches a preset value T.
11. The obstacle boundary generation device of claim 10, further comprising a map data storage module to store and/or update the obstacle boundary.
12. The obstacle boundary generating apparatus according to claim 10, wherein the self-walking device comprises: a mower provided with a self-walking unit.
13. The obstacle boundary generating apparatus according to claim 10, wherein the obstacle detecting unit includes: the device comprises one or any combination of a collision sensor, a Hall sensor, an ultrasonic sensor, an infrared sensor, a motor current detection module and a rotation speed change detection module.
14. The obstacle boundary generating apparatus according to claim 10, further comprising a positioning unit operable to acquire current position coordinates of the self-walking device triggered by the obstacle detecting unit.
15. The obstacle boundary generating apparatus according to claim 14, wherein the positioning unit comprises one of, or any combination of, an RTK positioning sensor, a DGPS positioning sensor, a UWB positioning sensor, and an inertial navigation sensor.
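The flow claimed above — group each obstacle-trigger coordinate into a point set by a distance threshold N, then recompute that set's boundary once it holds T points — can be sketched as follows. This is an illustrative reading, not the patent's implementation: the names (`ObstacleMapper`, `on_obstacle_signal`, `convex_hull`) are hypothetical, and the convex hull stands in for "the boundary of the region enclosed by the coordinates in the obstacle point set".

```python
import math

def cross(o, a, b):
    """2D cross product of vectors OA and OB; > 0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain: a boundary enclosing every point in the set."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

class ObstacleMapper:
    def __init__(self, n_threshold=1.0, t_preset=5):
        self.N = n_threshold   # distance threshold N (claim 10)
        self.T = t_preset      # preset point count T (claim 10)
        self.point_sets = []   # one list of coordinates per obstacle
        self.boundaries = []   # boundary polygons of completed sets

    def on_obstacle_signal(self, coord):
        """Record a position triggered by the obstacle detection unit."""
        for point_set in self.point_sets:
            # Join the set if the new coordinate lies within N of it.
            if any(math.dist(coord, p) <= self.N for p in point_set):
                point_set.append(coord)
                if len(point_set) >= self.T:
                    self.boundaries.append(convex_hull(point_set))
                return
        # Otherwise this trigger starts a new obstacle point set.
        self.point_sets.append([coord])
```

A grid-map variant of claim 8 would instead rasterize the hull and mark every cell on or inside it as obstacle boundary; the polygon form above corresponds to the second alternative in that claim.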
CN201910484306.1A 2019-06-05 2019-06-05 Obstacle boundary generation method and device Active CN112051841B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910484306.1A CN112051841B (en) 2019-06-05 2019-06-05 Obstacle boundary generation method and device

Publications (2)

Publication Number Publication Date
CN112051841A CN112051841A (en) 2020-12-08
CN112051841B true CN112051841B (en) 2023-06-02

Family

ID=73608501

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910484306.1A Active CN112051841B (en) 2019-06-05 2019-06-05 Obstacle boundary generation method and device

Country Status (1)

Country Link
CN (1) CN112051841B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112702693B (en) * 2020-12-23 2023-06-23 南京苏美达智能技术有限公司 Map construction method and positioning method of self-walking equipment
CN113110462A (en) * 2021-04-21 2021-07-13 广州极飞科技股份有限公司 Obstacle information processing method and device and operating equipment
CN114291083B (en) * 2022-01-12 2023-07-25 未岚大陆(北京)科技有限公司 Self-moving device control method, device, system, medium and self-moving device
CN116399330B (en) * 2023-05-29 2023-08-15 未岚大陆(北京)科技有限公司 Map modification method, map modification device, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6769788B2 (en) * 2016-09-07 2020-10-14 ボッシュ株式会社 Processing equipment and processing method for generating obstacle information around the moving object
CN107340768B (en) * 2016-12-29 2020-08-28 珠海市一微半导体有限公司 Path planning method of intelligent robot
CN109744945B (en) * 2017-11-08 2020-12-04 杭州萤石网络有限公司 Method, device and system for determining regional attributes and electronic equipment
CN108507578B (en) * 2018-04-03 2021-04-30 珠海市一微半导体有限公司 Navigation method of robot
CN109645897A (en) * 2019-01-10 2019-04-19 轻客小觅智能科技(北京)有限公司 A kind of obstacle detection method and system of sweeper


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant