US20110082585A1 - Method and apparatus for simultaneous localization and mapping of mobile robot environment
- Publication number: US20110082585A1
- Application number: US 12/873,018
- Authority: US (United States)
- Prior art keywords: robot, physical environment, particles, map, data acquisition
- Legal status: Abandoned (the status listed is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
- G05D3/00—Control of position or direction
- B25J11/0085—Manipulators for service tasks; Cleaning
- B25J9/0003—Home robots, i.e. small robots for domestic use
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- Y10S901/01—Mobile robot
- Y10S901/47—Optical sensing device
Description
- The present application claims the benefit of co-pending U.S. provisional application Ser. No. 61/238,597, filed Aug. 31, 2009, entitled “Computation Optimization Techniques for Simultaneous Localization and Mapping,” the disclosure of which is incorporated by reference herein in its entirety.
- Aspects of the present invention relate to mobile robots, and more particularly to the mapping of environments in which mobile robots operate, to facilitate movement of mobile robots within those environments.
- As a system that enables a mobile robot to map its environment and maintain working data of its position within that map, simultaneous localization and mapping (SLAM) is both accurate and versatile. Its reliability and suitability for a variety of applications make it a useful element for imparting a robot with some level of autonomy.
- Typically, however, SLAM techniques tend to be computationally intensive and thus their efficient execution often requires a level of processing power and memory capacity that may not be cost effective for some consumer product applications.
- For developers facing the low-cost production targets necessary to compete in the consumer market, it is unlikely that an economical hardware environment will include processing and memory capacity capable of adequately supporting a robust SLAM system. It therefore is imperative that developers seek ways to facilitate efficient execution of the core SLAM algorithms within the limits of the computational capacities they have. Generally, such optimization schemes would seek to use processing power and system bandwidth judiciously, which might mean simplifying some of the SLAM algorithms in ways that do not critically compromise their performance, or reducing input data size or bandwidth.
- Four concepts are outlined herein, each intended to enable a SLAM system to maintain efficiency when it is operating on a platform that provides limited processor power and/or memory capacity. Some of these optimization methods may reside entirely in software, or may require some element of hardware support to function properly.
- FIG. 1 depicts a block diagram showing some features according to the invention.
- FIG. 2 depicts a flow chart showing some features according to the invention, corresponding to certain aspects of FIG. 1.
- FIG. 3 depicts a block diagram showing some other features according to the invention.
- FIG. 4 depicts a flow chart showing some other features according to the invention, corresponding to certain aspects of FIG. 3.
- FIG. 5 depicts an example of particle weight distribution for a localization iteration process.
- FIG. 6 depicts a further example of particle weight distribution for a localization iteration process.
- FIG. 7 depicts yet a further example of particle weight distribution for a localization iteration process.
- FIG. 8 depicts localized and delocalized states based on verification particle distribution data.
- FIG. 9 depicts a block diagram showing some other features according to the invention.
- FIG. 10 depicts a flow chart showing some other features of the invention, corresponding to certain aspects of FIG. 9.
- FIG. 11 depicts an example of orientation of a mobile robot in its physical environment.
- FIG. 12 depicts a further example of orientation of a mobile robot in its physical environment.
- FIG. 13 depicts a flow chart showing some other features of the invention, corresponding to certain aspects of FIG. 1.
- FIG. 14 depicts one scenario of movement and orientation of a mobile robot in its physical environment.
- FIG. 15 depicts a further scenario of movement and orientation of a mobile robot in its physical environment.
- FIG. 16 depicts yet a further scenario of movement and orientation of a mobile robot in its physical environment.
- Localization requires regularly updating a robot's pose (position and angle) within its environment. The frequency with which this is done can affect overall system performance, depending on how often data must be processed as a result of an update operation. Minimizing computational load is essential to providing a SLAM system that can function effectively in a low-cost hardware environment.
- According to one feature of the invention, computational load may be reduced by eliminating robot position updates when it appears that the robot has become delocalized, in which case the updates likely would be erroneous anyway.
- FIG. 1 is a diagram depicting aspects of the just-mentioned feature in a mobile robotic system 100. In FIG. 1, data acquisition system 110 generates data regarding the environment of mobile robot 120. This data becomes input data to processing apparatus 130. From this data, processing apparatus 130 generates a map or model of the mobile robot's environment (block 132). Processing apparatus 130 also may contain a separate function (block 134) that monitors the generation or updating of the map for any shift in map elements beyond a threshold limit. If such an occurrence is detected, the processing apparatus (block 136) responds by executing instructions to suspend or modify the use of data from data acquisition system 110. A sensing unit 140 also may monitor the data acquisition system 110 for a loss in preferred orientation of the data acquisition system 110 for data generation. If sensing unit 140 detects a loss in orientation, processing apparatus 130 will respond by executing instructions to suspend or modify use of data generated by the data acquisition system 110. Mobile robot 120 may be connected to processing apparatus 130. The sensing unit 140, if present, may be attached to the mobile robot 120. Data acquisition system 110 may be attached to the mobile robot 120 as well, or alternatively may be separate.
- FIG. 2 shows a flow of operation of the system depicted in FIG. 1. In FIG. 2, at block 201, the data acquisition system generates data regarding the robot's physical environment, yielding the generated data at block 202. At block 203, the orientation of the data acquisition system is monitored to see whether the data acquisition system is maintaining its preferred orientation with respect to the robot's physical environment (e.g., whether the data acquisition system is tilting, has tipped over, or otherwise displays an orientation other than one in which the robot can function within its physical environment). At block 204, if the preferred orientation is not lost, then at block 205, the generated data is used to generate or update the map of the robot's physical environment. On the other hand, if at block 204 the preferred orientation is lost, then at block 206, the map generation is suspended, or the map is modified. After either block 205 or block 206, flow returns to the top of FIG. 2 to generate data and monitor the orientation of the data acquisition system.
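- The monitoring loop of FIG. 2 can be summarized in a few lines of code. The following is a minimal sketch, not the patented implementation; acquire_scan, orientation_ok, and update_map are hypothetical stand-ins for the data acquisition system, the sensing unit, and the map generator.

```python
import random

def acquire_scan():
    """Hypothetical stand-in for the data acquisition system (blocks 201-202)."""
    return [random.uniform(0.2, 5.0) for _ in range(360)]

def orientation_ok():
    """Hypothetical stand-in for the sensing unit's orientation check
    (blocks 203-204); real hardware might poll an accelerometer here."""
    return random.random() > 0.05

def update_map(scan):
    """Placeholder for map generation or update (block 205)."""

def mapping_loop(iterations=100):
    for _ in range(iterations):
        scan = acquire_scan()
        if orientation_ok():
            update_map(scan)   # block 205: preferred orientation held
        # block 206: otherwise the scan is simply not used for mapping
```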
- FIG. 3 is a diagram showing other features of the invention. In FIG. 3, map generation apparatus 310 provides a map of a mobile device's environment for localization of the mobile device within that environment. A delocalization detection apparatus 320 uses the map information to determine the position of the device. Particle generation apparatus 322 generates particles representing potential poses of the mobile device. Particle weight assignment apparatus 324 assigns to each particle a weight representing its likelihood of accuracy relative to other particles. Separately, an erroneous particle generation apparatus 326 generates particles such that their corresponding weights as generated by particle weight assignment apparatus 324 will be low, representing a low probability of correctly indicating the mobile device's position. A particle weight comparison apparatus 328 compares the weights of the erroneous particles with the weights of the particles generated by the particle generation apparatus 322 and either confirms that the device is accurately localized or determines that delocalization has occurred.
- The method may operate as follows (a code sketch after the list illustrates these steps):
- 1) Erroneous position and inclination particles may be introduced to the set of tracking particles. The erroneous particles, also referred to later as verification particles, may be selected in a way that they likely will not introduce additional error into the current estimate of the robot's position and inclination.
- 2) Typically, erroneous particles have low weights, which may correspond generally to their low probability of accurately representing the robot's current position. If the erroneous particles have weights that are not uniformly low, but rather may be a distribution or some combination of low and high weights, then this may imply that the robot has become delocalized.
- 3) If it is determined that the robot likely is delocalized, then updating its position within the map of its surroundings may be suspended until the weights of the erroneous particles return to a more uniform distribution of low values.
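- A minimal sketch of steps 1) and 2) follows. It assumes particles are (x, y, theta) tuples and that a weighting function already exists; the angular offsets follow the ±30° to ±60° values suggested later in this description, and the simple best-weight comparison is only one plausible reading of step 2 (an index-averaging test, closer to what FIG. 8 plots, is sketched further below).

```python
import math

def make_verification_particles(pose, offsets_deg=(30, 40, 50, 60)):
    """Step 1: copies of the motion-model pose with deliberately large
    angular errors, so their weights should come out uniformly low."""
    x, y, theta = pose
    return [(x, y, theta + sign * math.radians(d))
            for d in offsets_deg for sign in (1.0, -1.0)]

def appears_delocalized(tracking_weights, verification_weights, margin=2.0):
    """Step 2: if deliberately wrong poses score nearly as well as the
    tracked ones, no hypothesis is meaningfully better than a known-bad
    one. `margin` is an assumed tuning constant."""
    return max(tracking_weights) < margin * max(verification_weights)
```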
- FIG. 4 depicts a flow of operation of the system depicted in FIG. 3. In FIG. 4, at block 401, the existing map may be used or updated as appropriate. At block 402, particles are generated, either anew or iteratively, the iteratively generated particles being added to the existing particle set. At block 403, weights are assigned to each particle. At block 404, erroneous particles are generated, and at block 405, the erroneous particles have weights assigned to them. At block 406, the weights of the erroneous particles are compared to those of the original particle set to determine whether delocalization has occurred. At block 407, a check for delocalization is made. If delocalization has not occurred, then similarly to block 205 in FIG. 2, map generation and updating continues. If delocalization has occurred, then similarly to block 206 in FIG. 2, map generation is suspended or modified.
- There are precautionary reasons why this procedure is implemented in a SLAM system, and it may afford other advantages beyond computational load reduction. Suspension of mapping when delocalization is detected may avoid corrupting the map. Also, once delocalization is detected, additional actions can be enabled to improve the likelihood that the robot will re-localize, such as increasing the number of particles in the set or employing looser error models. Depending on the severity of the delocalization, other actions might be taken aside from those related to recovery. For example, the robot might stop or restart its run.
- 1) For each particle:
-
- a) Apply an ideal motion model (e.g., odometry).
- b) Apply position and angle (x,y,θ) adjustments drawn from error model distributions.
- c) Evaluate with respect to the current map to compute weight.
- 2) Resample particles proportional to computed weights.
- A typical localization iteration based on the above process might yield the particle weight distribution illustrated in
FIG. 5 . - In
FIG. 5, the distribution of particles, sorted by weight, appears as a curve, indicating a mix of particles of low, middle and high weights. The particles with higher weights (those at the upper left side of the distribution) have a proportionally higher probability of accurately representing the robot's pose relative to other particles lower on the sorted distribution of weights. When the particles are indexed by their weights, a particle's index number may indicate its relative position with respect to other particles regarding its probability of accurately representing the robot's pose (position and angle). Within such a framework, particle 1 has the highest probability of accuracy and all subsequent particles (i.e., particles 2, 3, 4, etc.) have sequentially lower probabilities of accuracy in their pose.
- The goal of introducing erroneous particles is to identify when the particles with higher probability of representing the robot's pose are not much better than particles with the lowest probability of representing the robot's pose. In such a circumstance, the implication is that most or all potential poses are bad, and therefore the robot has little or no reliable information regarding its actual whereabouts within its environment. By definition, the robot is delocalized.
- The process of assessing the state of localization involves introducing additional test particles whose pose is deliberately erroneous in order to set a baseline weight for comparison to better particles.
- It is often observed that particle evaluation is most sensitive to angular errors. Small changes in robot angle, for example, can translate to large errors in distance measurements as the distance from the robot to an object in its surrounding environment increases. Large angular errors can have similar distributions of laser readings in terms of distance, but they may dramatically reduce the overall weight of the full particle set.
- Typically, the particles representing candidate location angles with the highest weights are fairly close to an ideal motion model. Recognizing this, a generally effective approach to delocalization detection is to introduce erroneous particles at the center of the ideal motion model with large offsets to the angle (e.g., ±30°, 40°, 50°, 60°, etc.).
- If the robot is properly localized, the erroneous particles will reside relatively close together at the end of the sorted distribution curve that contains the lowest weighted particles, as shown in
FIG. 6 . - In
FIG. 6 , the erroneous particles, referred to here as verification particles for their purpose, are clustered together on the lower right end of the curve, each having a weight that is closer to zero than the particles comprising the rest of the sorted distribution. - If the robot is delocalized, many normal particles will have low weights, and many of these are likely to have weights lower than some of the erroneous or verification particles, as seen in
FIG. 7 . - In
FIG. 7 , some erroneous (verification) particles reside at the far right side of the distribution, but other erroneous particles are scattered through the rest of the particle set. As more particles known to be erroneous have weights that exceed other, non-verification particles, it becomes increasingly likely that the robot has delocalized. - The actual determination of delocalization can be done in any of a variety of ways, including by examining the mean index value of the erroneous (verification) particles. In a localized condition, most or all of the erroneous particles will reside relatively close together at the bottom of the index, since they generally will have the lowest weights. Averaging the indices of the erroneous particles in a localized case will yield a large number relative to the size of the total set of particles, including both erroneous and non-erroneous particles.
- In a delocalized state, however, the erroneous particles are scattered through the distribution curve, and an indexing of particles in order of their weight will yield a set of erroneous particles whose averaged index is not necessarily high with respect to the size of the total set of particles. Generally, an average of verification particle indices that remains constant and high in value with respect to total particle set size reflects a localized condition. An average that falls in value or begins to fluctuate in value may indicate a delocalized condition.
- Both of these states, localized and delocalized, are depicted in the plots of the averaged verification particle data in
FIG. 8 . In this graph, the plotted data are the averaged verification particle indices. Forlocalization iterations 1 through 600, the averaged data are high and relatively constant, which is consistent with a localized state. Shortly afteriteration 600, the average value drops significantly and then recovers; in this particular data set, this drop corresponds to an engineer picking the robot up from the floor and moving it to a different location. Like the previous drops in index average, the return of the average to a high, stable number indicates that the robot likely recovered from the event. - At a point on the graph between 800 and 1000 localization iterations the data begins to fluctuate greatly. The lack of consistency in the average and the range of its variability are indicative of a delocalized condition. Unlike the previous, large delocalization, the robot likely was unable to recover from this delocalization as indicated by the data's continuing instability through the end of the data set.
- Determining that the robot has delocalized relies on comparing the averaged erroneous particle index to a threshold number. The threshold number can be decided a priori during coding, but it is typically beneficial to include some hysteresis in the evaluation of whether a robot is localized. For example, looking at the latter portion of the data set illustrated in
FIG. 8 , the variability of the averaged verification particle indices reaches a high number several times, but, in each instance, it drops again after only a few iterations. A proper evaluation of whether a robot has recovered from a delocalization event should not look only at instantaneous values, but also should evaluate whether the averaged index returns to a high value and remains stable at a high value for a period of time sufficient to demonstrate that the robot likely has successfully re-localized. The necessary minimum duration can also be defined in the code. - One of the challenges confronting a robot engaged in creation and update of maps of its surroundings is the potential mix of static and dynamic elements within its surroundings. While it is generally expected that most of a robot's surroundings will remain fixed, a robot should be prepared to function within an environment in which people, pets, etc. may be moving.
- Newly encountered, unmapped space may contain a mix of dynamic and static elements. Making a distinction between the robot's identification of potentially dynamic areas of the map and those that are static is essential for building useful and accurate maps for the robot to use.
- In an embodiment, the issue of distinguishing between static (permanent) elements of the robot's surroundings and dynamic (transient) elements may be addressed in the following way:
-
- 1) The robot may create an abstraction of its environment (a map) within a grid-space of cells available in memory, each cell containing a number that indicates a relative probability of whether the space within the cell is empty or occupied. These values may range from, for example, zero (empty) to 254 (occupied), with an initial condition value within every cell of 127 (i.e., a value in the middle of the spectrum).
- 2) A spatial sensor, most conveniently a laser rangefinder, may scan the robot's surroundings, measuring distances to boundaries and other objects. This data stream may provide the base information from which the robot can determine the probability that a cell is occupied or not. For example, if the spatial sensor measures a distance to a wall, the occupancy probability that the cell on the robot-generated map corresponding to that point along the wall is occupied increases while the occupancy probability for all the cells along the measurement vector between the robot and the wall decreases (because the wall was the first object detected). With repeated measurement from the spatial sensor, the probabilities may become more certain.
- 3) If a cell currently identified as empty has an occupancy probability that is changing (e.g., appearing suddenly to be occupied), it may signify a potentially dynamic area of the map.
- 4) If such cells are detected, they may be marked so as to not be updated with regard to their likelihood of containing an obstacle while they are dynamic. Similarly, this also can extend to an arbitrary zone surrounding these cells.
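- A compact sketch of this grid scheme appears below. The 0-254 range and the 127 initial value come from item 1); the per-observation step size and the exact rule used to flag a cell as dynamic are assumptions for illustration only.

```python
EMPTY, INITIAL, OCCUPIED = 0, 127, 254

class OccupancyGrid:
    def __init__(self, width, height, step=8):
        self.cells = [[INITIAL] * width for _ in range(height)]
        self.step = step            # assumed per-observation increment
        self.dynamic = set()        # cells whose updates are suspended

    def observe(self, x, y, hit):
        """Fold in one reading for cell (x, y): hit=True if the beam ended
        here (obstacle), False if it passed through (free space)."""
        if (x, y) in self.dynamic:
            return                  # item 4: no updates while dynamic
        old = self.cells[y][x]
        if hit and old < INITIAL - self.step:
            # item 3: a cell believed empty suddenly reads occupied --
            # flag the area as dynamic instead of updating it
            self.dynamic.add((x, y))
            return
        delta = self.step if hit else -self.step
        self.cells[y][x] = max(EMPTY, min(OCCUPIED, old + delta))
```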
-
- FIG. 9 is a diagram of a system containing other features of the invention. In FIG. 9, a data acquisition system 910 generates data regarding the physical environment of a mobile device such as a robot. The data generated by the data acquisition system provides input to a map/model processing apparatus 920. The map/model processing apparatus 920 generates and maintains a map in a cell-based grid form (block 922) and assigns a probability of occupancy to each cell (block 924) based on the data received from the data acquisition system. Additionally, the map/model processing unit monitors individual cells (block 926) for changes in their probability of occupancy. Based on the detection of such changes, the processing unit determines if any cells are dynamic. If cells are determined to be dynamic, they are marked accordingly (block 928). Mapping or updating of such cells is suspended for the period that they are in a dynamic state.
- FIG. 10 depicts a flow of operation of the embodiment shown in FIG. 9. In FIG. 10, at block 1001, the data acquisition system generates data regarding the robot's physical environment, yielding the generated data at block 1002. At block 1003, the generated data is used to generate or update the map of the robot's physical environment. At block 1004, probabilities of occupancy for each cell in the grid map are assigned or updated. At block 1005, it is determined whether probabilities of occupancy of any of the cells are changing. If they are not, then flow returns to block 1001. If they are, then at block 1006, the cells whose probabilities of occupancy are changing are marked as dynamic so that they are not updated while probability of occupancy is changing. Flow then returns to block 1001.
- Accurate delineation of a robot's surroundings as part of mapping and localization requires maintaining the orientations of the sensors generating spatial data in congruence with the presiding surfaces of the surrounding geometry. For a robot operating inside a building or similar enclosure, this means that a sensor collecting information in two dimensions would preferably maintain its plane of detection as parallel to the floor, since the floor would define the dominant plane of motion available to a robot traversing it.
- Because floors may have areas of uneven surface or surface discontinuities, or because objects resting on the floor may introduce non-uniformities in a robot's available travel surface, it is possible that a sensor collecting spatial data may not maintain consistent orientation with the presiding surfaces of the surrounding geometry, which can lead to erroneous delineation of the robot's surroundings.
-
FIGS. 11-12 illustrate the potential problem encountered by a robot collecting spatial data without an ability to detect when its sensor has lost parallel orientation with the floor. In the upper illustration, the robot is traveling away from a physical boundary at A and toward a physical boundary at B. A sensor mounted on the robot in this example is collecting spatial data in a horizontal plane indicated by the thin line positioned at a height near the top of the robot. In the lower illustration, the robot begins traversing an obstacle which tilts the robot backward. If the robot does not recognize that it is no longer collecting data in a plane that is accordant with the surrounding geometry, then the spatial construction developed from the sensor data will not match the actual geometry defined by the robot's surroundings. In this case, the data collection plane's forward incline will distort the previously determined position of the wall at B to one further out, at B′. The backward decline on the data collection plane results in its intersection with the floor, creating the impression that a boundary exists behind the robot at A′ rather than at the further position of A. - Often, wheel slip accompanies tilt when a robot traverses a substantive irregularity in a floor surface. This can be particularly problematic if it occurs when the robot is collecting its first data on a new area (e.g., when the robot has turned a corner into an unmapped space) since the distorted image may be incorporated into the map.
- For a robot using the continuous generation of spatial boundary information to provide updates to a map, erroneous data generated during a tilt event can propagate into mapping or localization algorithms. The potential results may include some degree of mapping corruption, which frequently can lead to delocalization.
- Consequently, it is important to provide a strategy to identify and address tilt conditions during normal operation, and two approaches to same are described below. These approaches are designed such that they can be used separately or together in potential reinforcement.
- Typically, dynamic areas created by people, pets or objects moved or in use by a person will present a dynamic area to mark, one that usually is limited in its footprint. However, if the dynamic area is spread along a relatively wide area, then this may represent a different scenario. For example, if a map boundary area shifts suddenly or moves in a way that many, possibly contiguous cells are tagged as active, then it may be likely that the robot has tilted. In such a case, the spatial sensor's detection plane may be angled such that a portion of the floor near the robot is read as a boundary, as indicated in the example described earlier. When the robot identifies that a dynamic area involves an area larger than would be created by people, pets or moving objects in relative proportion with the former, then the updating of the map may be suspended.
-
- FIG. 13 depicts a flow of operation of a system as depicted in FIG. 1, with the variant that tilt of the robot is detected and addressed in software. At block 1301, the data acquisition system generates data regarding the robot's physical environment, yielding the generated data at block 1302. At block 1303, the generated data is used to generate or update the map of the robot's physical environment. At block 1304, a check is made to see whether any element of the map (e.g., a map boundary area) has shifted beyond a threshold limit. If not, then at block 1305, map generation or update continues. However, if at block 1304 there has been a shift beyond the threshold limit, then at block 1306, the map generation is suspended, or the map is modified. In this aspect, the instruction to suspend or modify is generated within the processing apparatus, and does not originate from the sensing unit. After either block 1305 or block 1306, flow returns to data generation, so that further checks can be made to see whether the map elements have returned to within threshold limits.
- It should be noted that instructions to suspend or modify the use of generated data for mapping need not come solely from the sensing unit or from within the processing apparatus. These respective features of the system depicted in
FIG. 1 may operate concurrently. - Detection of motion may rely on spatial scanning done by, for example, a laser rangefinder, which may continuously scan a robot's surroundings. When scanning indicates that consecutive distance readings show “dynamic” movement, the spatial distance represented by an aggregate distance, or by a distance differential, may be compared to a pre-defined threshold value. If the difference between the first to the last distance measurement is larger than the threshold, it may be concluded that the robot is tilted.
FIG. 14 provides an example of such a scenario. Consider the robot at location A moving through a room and passing a doorway into an adjoining room. Assume that the robot employs a planar spatial sensor enabling it to delineate the physical limits of its surroundings. Such a sensor likely would detect, through the open doorway, some portion of the wall of the adjoining room, which, in the example case, may yield the detected length of wall segment B. If one side of the advancing robot encounters an obstacle such as, for example, a thick rug, that results in the robot straddling the object (e.g., the left wheel(s) may be raised by the rug while the right wheel(s) continue to roll on the floor), then the robot's sensing plane likely will tilt toward its right side. Depending on room geometry and degree of tilt, it is possible that the portion of the sensing plane that had been detecting the wall of the adjoining room at B now would intersect the floor of the adjoining room at the much closer location of B′. In such a case, as the robot updates the map of its surroundings, the data may show the wall boundary shift suddenly from B to B′ while other boundaries show little or no variation in position. For a robot monitoring sudden changes in consecutive cells—from empty cells at B′ during level operation to occupied cells at B′ when the robot is tilting—the determination that a tilt event has occurred may be based on a comparison between the physical length represented by the consecutive, newly-"occupied" cells and a pre-defined threshold. If the represented distance, or distance differential, meets or exceeds the threshold, it may be concluded that the robot has tilted and map updating may be suspended.
- Detection of tilt in hardware may involve the use of an accelerometer or similar component that detects changes in the orientation of the component's mounting surface.
- With this approach, data generated by the spatial scanner may be supplemented by data regarding changes in orientation. With this latter data set providing contextual verification for the spatial sensor's data, information collected while the tilt-detecting component indicates that the spatial sensor has lost its preferred orientation may be discarded. In a typical embodiment, this data is discarded before it is processed by any localization or mapping software.
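A minimal sketch of this gating, assuming a quasi-static accelerometer reading per scan (so gravity dominates the measurement); the 3 degree limit and tuple format are illustrative assumptions:

```python
import math

def tilt_angle_rad(ax, ay, az):
    """Angle between measured gravity and the sensor's vertical (z) axis.
    With the robot level and at rest, gravity lies along z and this is ~0."""
    return math.atan2(math.hypot(ax, ay), az)

def gate_scans(scans, accel_samples, max_tilt_rad=math.radians(3.0)):
    """Discard any scan captured while tilt exceeded the limit, before the
    localization or mapping software ever processes it."""
    return [scan for scan, (ax, ay, az) in zip(scans, accel_samples)
            if tilt_angle_rad(ax, ay, az) <= max_tilt_rad]

# A level reading (0, 0, 9.81) passes; a ~7 degree reading (1.2, 0, 9.7) is dropped.
kept = gate_scans(["scan0", "scan1"], [(0.0, 0.0, 9.81), (1.2, 0.0, 9.7)])
print(kept)  # ['scan0']
```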
- As depicted in FIG. 15, a robot uses a sensor generating 2D spatial information in a horizontal plane from the robot's surroundings. The dotted line indicates the sensing perimeter, created by the spatial sensing plane intersecting objects surrounding the robot. This perimeter informs the robot of nearby obstacles and of the boundaries presented by walls and doors.
- As depicted in FIG. 16, if the robot traverses a low obstacle, such as the door frame shown in FIG. 16, or an uneven surface, then the robot may lose its parallel disposition with respect to the floor. As a result, a sensor fixed to the robot collecting spatial information regarding the robot's surroundings may collect data at an angle away from horizontal. The dotted line in FIG. 16 shows the intersection of the spatial sensor's plane of detection with object surfaces surrounding the robot. With the robot tilted, the generated spatial data becomes erroneous. The calculated distance to the wall in front of the robot becomes distorted as the detection plane at B′ intersects the wall at a higher point, but, more critically, the detection plane's intersection with the floor behind the robot would incorrectly report a linear boundary at A′.
- Several features and aspects of the present invention have been illustrated and described in detail with reference to particular embodiments by way of example only, and not by way of limitation. Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, operation, or other characteristic described in connection with the embodiment may be included in at least one implementation of the invention. However, the appearance of the phrase "in one embodiment" or "in an embodiment" in various places in the specification does not necessarily refer to the same embodiment. It is envisaged that the ordinarily skilled person could use any or all of the above embodiments individually, or in any compatible combination or permutation. Those of skill in the art will appreciate that alternative implementations and various modifications to the disclosed embodiments are within the scope and contemplation of the present disclosure. Therefore, it is intended that the invention be considered as limited only by the scope of the appended claims.
Claims (25)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/873,018 US20110082585A1 (en) | 2009-08-31 | 2010-08-31 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
US14/067,705 US8903589B2 (en) | 2009-08-31 | 2013-10-30 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
US14/543,508 US9678509B2 (en) | 2009-08-31 | 2014-11-17 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
US15/602,012 US20170255203A1 (en) | 2009-08-31 | 2017-05-22 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US23859709P | 2009-08-31 | 2009-08-31 | |
US12/873,018 US20110082585A1 (en) | 2009-08-31 | 2010-08-31 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/067,705 Division US8903589B2 (en) | 2009-08-31 | 2013-10-30 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110082585A1 true US20110082585A1 (en) | 2011-04-07 |
Family
ID=42941963
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/873,018 Abandoned US20110082585A1 (en) | 2009-08-31 | 2010-08-31 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
US14/067,705 Active US8903589B2 (en) | 2009-08-31 | 2013-10-30 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
US14/543,508 Active US9678509B2 (en) | 2009-08-31 | 2014-11-17 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
US15/602,012 Abandoned US20170255203A1 (en) | 2009-08-31 | 2017-05-22 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/067,705 Active US8903589B2 (en) | 2009-08-31 | 2013-10-30 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
US14/543,508 Active US9678509B2 (en) | 2009-08-31 | 2014-11-17 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
US15/602,012 Abandoned US20170255203A1 (en) | 2009-08-31 | 2017-05-22 | Method and apparatus for simultaneous localization and mapping of mobile robot environment |
Country Status (10)
Country | Link |
---|---|
US (4) | US20110082585A1 (en) |
EP (1) | EP2473890B1 (en) |
JP (2) | JP2013503404A (en) |
KR (1) | KR101362961B1 (en) |
CN (2) | CN102576228A (en) |
AU (1) | AU2010286429B2 (en) |
CA (2) | CA2772636A1 (en) |
HK (1) | HK1211352A1 (en) |
NZ (1) | NZ598500A (en) |
WO (1) | WO2011026119A2 (en) |
Families Citing this family (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5429986B2 (en) * | 2009-12-25 | 2014-02-26 | 株式会社Ihiエアロスペース | Mobile robot remote environment recognition apparatus and method |
JP5452442B2 (en) * | 2010-10-25 | 2014-03-26 | 株式会社日立製作所 | Robot system and map updating method |
BR112014032713A2 (en) | 2012-06-27 | 2017-06-27 | Pentair Water Pool & Spa Inc | pool cleaner with a laser range finder method and system |
US9020637B2 (en) * | 2012-11-02 | 2015-04-28 | Irobot Corporation | Simultaneous localization and mapping for a mobile robot |
CN103903253B (en) * | 2012-12-28 | 2017-06-27 | 联想(北京)有限公司 | A kind of movable termination localization method and system |
CN103901774B (en) * | 2012-12-28 | 2017-02-08 | 联想(北京)有限公司 | Efficient robust SLAM coordinating method and system based on multiple sensors |
JP2014203145A (en) * | 2013-04-02 | 2014-10-27 | パナソニック株式会社 | Autonomous mobile apparatus |
CN103412565B (en) * | 2013-05-17 | 2016-01-27 | 浙江中控研究院有限公司 | A kind of robot localization method with the quick estimated capacity of global position |
KR101505129B1 (en) | 2013-08-19 | 2015-03-23 | 부경대학교 산학협력단 | Method for location recognization using system for location recognization and mapping using laser scanner |
US9886036B2 (en) * | 2014-02-10 | 2018-02-06 | John Bean Technologies Corporation | Routing of automated guided vehicles |
CN103984037B (en) * | 2014-04-30 | 2017-07-28 | 深圳市墨克瑞光电子研究院 | The mobile robot obstacle detection method and device of view-based access control model |
US9259838B1 (en) | 2014-07-24 | 2016-02-16 | Google Inc. | Systems and methods for ground plane estimation |
WO2016019390A1 (en) * | 2014-08-01 | 2016-02-04 | Locuslabs Ip | Image-based object location system and process |
FR3025325B1 (en) * | 2014-09-01 | 2016-12-30 | Valeo Schalter & Sensoren Gmbh | DEVICE AND METHOD FOR LOCALIZATION AND MAPPING |
US10660496B2 (en) * | 2014-09-24 | 2020-05-26 | Samsung Electronics Co., Ltd. | Cleaning robot and method of controlling the cleaning robot |
CN104597900A (en) * | 2014-12-02 | 2015-05-06 | 华东交通大学 | Electromagnetism-like mechanism optimization based FastSLAM method |
EP3234721B1 (en) * | 2014-12-17 | 2021-11-24 | Husqvarna AB | Multi-sensor, autonomous robotic vehicle with mapping capability |
US10444760B2 (en) | 2014-12-17 | 2019-10-15 | Husqvarna Ab | Robotic vehicle learning site boundary |
TWI548891B (en) * | 2015-01-12 | 2016-09-11 | 金寶電子工業股份有限公司 | Positioning system for sweeper and positioning method using for the positioning system |
CN104858871B (en) * | 2015-05-15 | 2016-09-07 | 珠海市一微半导体有限公司 | Robot system and self-built map thereof and the method for navigation |
CN106325266A (en) * | 2015-06-15 | 2017-01-11 | 联想(北京)有限公司 | Spatial distribution map building method and electronic device |
DE102015111613A1 (en) | 2015-07-17 | 2017-01-19 | Still Gmbh | Method for detecting obstacles in an industrial truck |
CN106584451B (en) * | 2015-10-14 | 2019-12-10 | 国网智能科技股份有限公司 | automatic transformer substation composition robot and method based on visual navigation |
US10093021B2 (en) * | 2015-12-02 | 2018-10-09 | Qualcomm Incorporated | Simultaneous mapping and planning by a robot |
CN105892461B (en) * | 2016-04-13 | 2018-12-04 | 上海物景智能科技有限公司 | A kind of matching and recognition method and system of robot local environment and map |
US9996944B2 (en) * | 2016-07-06 | 2018-06-12 | Qualcomm Incorporated | Systems and methods for mapping an environment |
US10274325B2 (en) * | 2016-11-01 | 2019-04-30 | Brain Corporation | Systems and methods for robotic mapping |
CN108107882B (en) * | 2016-11-24 | 2021-07-06 | 中国科学技术大学 | Automatic calibration and detection system of service robot based on optical motion tracking |
KR102758179B1 (en) * | 2017-03-30 | 2025-01-23 | 크라운 이큅먼트 코포레이션 | Warehouse mapping tools |
US10394246B2 (en) | 2017-03-31 | 2019-08-27 | Neato Robotics, Inc. | Robot with automatic styles |
CN106919174A (en) * | 2017-04-10 | 2017-07-04 | 江苏东方金钰智能机器人有限公司 | A kind of bootstrap technique of intelligently guiding robot |
US10761541B2 (en) * | 2017-04-21 | 2020-09-01 | X Development Llc | Localization with negative mapping |
US10551843B2 (en) | 2017-07-11 | 2020-02-04 | Neato Robotics, Inc. | Surface type detection for robotic cleaning device |
US10918252B2 (en) | 2017-07-27 | 2021-02-16 | Neato Robotics, Inc. | Dirt detection layer and laser backscatter dirt detection |
CA3073151C (en) | 2017-08-16 | 2021-02-16 | Sharkninja Operating Llc | Robotic vacuum |
AU2018320867B2 (en) | 2017-08-22 | 2024-08-01 | Pentair Water Pool And Spa, Inc. | Algorithm for a pool cleaner |
US10583561B2 (en) | 2017-08-31 | 2020-03-10 | Neato Robotics, Inc. | Robotic virtual boundaries |
GB2567944A (en) | 2017-08-31 | 2019-05-01 | Neato Robotics Inc | Robotic virtual boundaries |
JP2019047848A (en) * | 2017-09-07 | 2019-03-28 | パナソニックIpマネジメント株式会社 | Autonomous travel vacuum cleaner, and cumulative floor surface probability update method |
CN107728616B (en) * | 2017-09-27 | 2019-07-02 | 广东宝乐机器人股份有限公司 | The map creating method and mobile robot of mobile robot |
KR102629762B1 (en) * | 2017-10-02 | 2024-01-29 | 소니그룹주식회사 | Environmental information update device, environmental information update method and program |
WO2019068214A1 (en) * | 2017-10-03 | 2019-04-11 | Intel Corporation | Grid occupancy mapping using error range distribution |
WO2019068222A1 (en) * | 2017-10-06 | 2019-04-11 | Qualcomm Incorporated | Concurrent relocation and reinitialization of vslam |
US10638906B2 (en) | 2017-12-15 | 2020-05-05 | Neato Robotics, Inc. | Conversion of cleaning robot camera images to floorplan for user interaction |
WO2019216578A1 (en) * | 2018-05-11 | 2019-11-14 | Samsung Electronics Co., Ltd. | Method and apparatus for executing cleaning operation |
US11243540B2 (en) | 2018-05-17 | 2022-02-08 | University Of Connecticut | System and method for complete coverage of unknown environments |
US11194335B2 (en) | 2018-07-10 | 2021-12-07 | Neato Robotics, Inc. | Performance-based cleaning robot charging method and apparatus |
US11157016B2 (en) | 2018-07-10 | 2021-10-26 | Neato Robotics, Inc. | Automatic recognition of multiple floorplans by cleaning robot |
DE102018121365A1 (en) | 2018-08-31 | 2020-04-23 | RobArt GmbH | EXPLORATION OF A ROBOT APPLICATION AREA BY AN AUTONOMOUS MOBILE ROBOT |
US11272823B2 (en) | 2018-08-31 | 2022-03-15 | Neato Robotics, Inc. | Zone cleaning apparatus and method |
CN118014912A (en) * | 2018-10-15 | 2024-05-10 | 科沃斯机器人股份有限公司 | Method, device and storage medium for correcting environment map |
CN109191027A (en) * | 2018-11-09 | 2019-01-11 | 浙江国自机器人技术有限公司 | A kind of robot calling method, system, equipment and computer readable storage medium |
CN109531592B (en) * | 2018-11-30 | 2022-02-15 | 佛山科学技术学院 | Book checking robot based on visual SLAM |
EP3731130B1 (en) * | 2019-04-23 | 2024-06-05 | Continental Autonomous Mobility Germany GmbH | Apparatus for determining an occupancy map |
US11250576B2 (en) * | 2019-08-19 | 2022-02-15 | Toyota Research Institute, Inc. | Systems and methods for estimating dynamics of objects using temporal changes encoded in a difference map |
US11327483B2 (en) * | 2019-09-30 | 2022-05-10 | Irobot Corporation | Image capture devices for autonomous mobile robots and related systems and methods |
CN110852211A (en) * | 2019-10-29 | 2020-02-28 | 北京影谱科技股份有限公司 | Neural network-based method and device for filtering obstacles in SLAM |
WO2021125510A1 (en) * | 2019-12-20 | 2021-06-24 | Samsung Electronics Co., Ltd. | Method and device for navigating in dynamic environment |
US11880209B2 (en) * | 2020-05-15 | 2024-01-23 | Samsung Electronics Co., Ltd. | Electronic apparatus and controlling method thereof |
CN111928866B (en) * | 2020-09-27 | 2021-02-12 | 上海思岚科技有限公司 | Robot map difference updating method and device |
KR102735054B1 (en) * | 2020-11-12 | 2024-11-27 | 주식회사 유진로봇 | Apparatus and method for editing 3D SLAM data |
EP4245474A1 (en) * | 2020-11-12 | 2023-09-20 | Yujin Robot Co., Ltd. | Robot system |
CN112698345B (en) * | 2020-12-04 | 2024-01-30 | 江苏科技大学 | Laser radar robot simultaneous positioning and map building optimization method |
CN112581613B (en) * | 2020-12-08 | 2024-11-01 | 纵目科技(上海)股份有限公司 | Grid map generation method, system, electronic equipment and storage medium |
WO2022190324A1 (en) * | 2021-03-11 | 2022-09-15 | 株式会社Fuji | Moving system and management device |
WO2022197544A1 (en) * | 2021-03-15 | 2022-09-22 | Omron Corporation | Method and apparatus for updating an environment map used by robots for self-localization |
CN112985417B (en) * | 2021-04-19 | 2021-07-27 | 长沙万为机器人有限公司 | Pose correction method for particle filter positioning of mobile robot and mobile robot |
KR20230017060A (en) * | 2021-07-27 | 2023-02-03 | 삼성전자주식회사 | Robot and controlling method thereof |
US20230320551A1 (en) | 2022-04-11 | 2023-10-12 | Vorwerk & Co. Interholding Gmb | Obstacle avoidance using fused depth and intensity from nnt training |
WO2024232795A1 (en) * | 2023-05-11 | 2024-11-14 | Telefonaktiebolaget Lm Ericsson (Publ) | Stopping generation of a map of an environment |
KR20240175382A (en) | 2023-06-13 | 2024-12-20 | 한국로봇융합연구원 | Apparatus and method for verifying robot movement performance |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0126497D0 (en) * | 2001-11-03 | 2002-01-02 | Dyson Ltd | An autonomous machine |
JP2006239844A (en) * | 2005-03-04 | 2006-09-14 | Sony Corp | Obstacle avoiding device, obstacle avoiding method, obstacle avoiding program and mobile robot device |
JP2007041657A (en) * | 2005-07-29 | 2007-02-15 | Sony Corp | Moving body control method, and moving body |
KR100843085B1 (en) * | 2006-06-20 | 2008-07-02 | 삼성전자주식회사 | Grid map preparation method and device of mobile robot and method and device for area separation |
US7587260B2 (en) * | 2006-07-05 | 2009-09-08 | Battelle Energy Alliance, Llc | Autonomous navigation system and method |
US7801644B2 (en) * | 2006-07-05 | 2010-09-21 | Battelle Energy Alliance, Llc | Generic robot architecture |
US8996172B2 (en) * | 2006-09-01 | 2015-03-31 | Neato Robotics, Inc. | Distance sensor system and method |
CN100449444C (en) * | 2006-09-29 | 2009-01-07 | 浙江大学 | A Method for Simultaneous Localization and Map Construction of Mobile Robots in Unknown Environments |
US7613673B2 (en) * | 2006-10-18 | 2009-11-03 | The Boeing Company | Iterative particle reduction methods and systems for localization and pattern recognition |
KR100809352B1 (en) * | 2006-11-16 | 2008-03-05 | 삼성전자주식회사 | Particle Filter-based Attitude Estimation Method and Apparatus |
JP2009169845A (en) * | 2008-01-18 | 2009-07-30 | Toyota Motor Corp | Autonomous mobile robot and map updating method |
KR101538775B1 (en) * | 2008-09-12 | 2015-07-30 | 삼성전자 주식회사 | Apparatus and method for localization using forward images |
KR101503903B1 (en) * | 2008-09-16 | 2015-03-19 | 삼성전자 주식회사 | Apparatus and method for building map used in mobile robot |
2010
- 2010-08-31 KR KR1020127006404A patent/KR101362961B1/en active IP Right Grant
- 2010-08-31 CN CN2010800459012A patent/CN102576228A/en active Pending
- 2010-08-31 WO PCT/US2010/047358 patent/WO2011026119A2/en active Application Filing
- 2010-08-31 CN CN201510047904.4A patent/CN104699099B/en active Active
- 2010-08-31 JP JP2012527100A patent/JP2013503404A/en active Pending
- 2010-08-31 NZ NZ598500A patent/NZ598500A/en not_active IP Right Cessation
- 2010-08-31 CA CA2772636A patent/CA2772636A1/en not_active Abandoned
- 2010-08-31 AU AU2010286429A patent/AU2010286429B2/en active Active
- 2010-08-31 CA CA2859112A patent/CA2859112C/en active Active
- 2010-08-31 US US12/873,018 patent/US20110082585A1/en not_active Abandoned
- 2010-08-31 EP EP10760486.0A patent/EP2473890B1/en active Active
2013
- 2013-10-30 US US14/067,705 patent/US8903589B2/en active Active
- 2013-12-05 JP JP2013251735A patent/JP5837553B2/en active Active
2014
- 2014-11-17 US US14/543,508 patent/US9678509B2/en active Active
2015
- 2015-12-07 HK HK15112036.4A patent/HK1211352A1/en unknown
2017
- 2017-05-22 US US15/602,012 patent/US20170255203A1/en not_active Abandoned
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5363305A (en) * | 1990-07-02 | 1994-11-08 | Nec Research Institute, Inc. | Navigation system for a mobile robot |
US5202661A (en) * | 1991-04-18 | 1993-04-13 | The United States Of America As Represented By The Secretary Of The Navy | Method and system for fusing data from fixed and mobile security sensors |
US5793934A (en) * | 1994-06-22 | 1998-08-11 | Siemens Aktiengesellschaft | Method for the orientation, route planning and control of an autonomous mobile unit |
US5957984A (en) * | 1994-09-06 | 1999-09-28 | Siemens Aktiengesellschaft | Method of determining the position of a landmark in the environment map of a self-propelled unit, the distance of the landmark from the unit being determined dynamically by the latter |
US20040076324A1 (en) * | 2002-08-16 | 2004-04-22 | Burl Michael Christopher | Systems and methods for the automated sensing of motion in a mobile robot using visual data |
US20040167669A1 (en) * | 2002-12-17 | 2004-08-26 | Karlsson L. Niklas | Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system |
US20050171637A1 (en) * | 2004-01-30 | 2005-08-04 | Funai Electric Co., Ltd. | Self-running cleaner with collision obviation capability |
US20060235585A1 (en) * | 2005-04-18 | 2006-10-19 | Funai Electric Co., Ltd. | Self-guided cleaning robot |
US20070061043A1 (en) * | 2005-09-02 | 2007-03-15 | Vladimir Ermakov | Localization and mapping system and method for a robotic device |
US20080109126A1 (en) * | 2006-03-17 | 2008-05-08 | Irobot Corporation | Lawn Care Robot |
US20080027591A1 (en) * | 2006-07-14 | 2008-01-31 | Scott Lenser | Method and system for controlling a remote vehicle |
US20100049391A1 (en) * | 2008-08-25 | 2010-02-25 | Murata Machinery, Ltd. | Autonomous moving apparatus |
Non-Patent Citations (1)
Title |
---|
Sheng Fu et al, "SLAM for Mobile Robots Using Laser Range Finder and Monocular Vision", IEEE, 2007 * |
Cited By (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8209144B1 (en) * | 2009-09-15 | 2012-06-26 | Google Inc. | Accurate alignment of multiple laser scans using a template surface |
US8209143B1 (en) * | 2009-09-15 | 2012-06-26 | Google Inc. | Accurate alignment of multiple laser scans using a template surface |
US20120195491A1 (en) * | 2010-07-21 | 2012-08-02 | Palo Alto Research Center Incorporated | System And Method For Real-Time Mapping Of An Indoor Environment Using Mobile Robots With Limited Sensing |
US10962376B2 (en) | 2011-09-30 | 2021-03-30 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
US9404756B2 (en) * | 2011-09-30 | 2016-08-02 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
US9952053B2 (en) * | 2011-09-30 | 2018-04-24 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
US20150261223A1 (en) * | 2011-09-30 | 2015-09-17 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
US9218003B2 (en) * | 2011-09-30 | 2015-12-22 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
US20160069691A1 (en) * | 2011-09-30 | 2016-03-10 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
US20170052033A1 (en) * | 2011-09-30 | 2017-02-23 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
US9361591B2 (en) | 2012-10-29 | 2016-06-07 | Electronics And Telecommunications Research Institute | Apparatus and method for building map of probability distribution based on properties of object and system |
KR101807484B1 (en) | 2012-10-29 | 2017-12-11 | 한국전자통신연구원 | Apparatus for building map of probability distrubutition based on properties of object and system and method thereof |
US9677890B2 (en) | 2013-01-10 | 2017-06-13 | Intel Corporation | Positioning and mapping based on virtual landmarks |
WO2014110204A1 (en) * | 2013-01-10 | 2014-07-17 | Intel Corporation | Positioning and mapping based on virtual landmarks |
US20140350839A1 (en) * | 2013-05-23 | 2014-11-27 | Irobot Corporation | Simultaneous Localization And Mapping For A Mobile Robot |
US9037396B2 (en) * | 2013-05-23 | 2015-05-19 | Irobot Corporation | Simultaneous localization and mapping for a mobile robot |
US9304001B2 (en) | 2013-07-03 | 2016-04-05 | Samsung Electronics Co., Ltd | Position recognition methods of autonomous mobile robots |
US9566706B2 (en) | 2014-01-14 | 2017-02-14 | Samsung Electronics Co., Ltd. | Robot and control method thereof |
US20160378115A1 (en) * | 2015-06-25 | 2016-12-29 | Hyundai Motor Company | System and method for writing occupancy grid map of sensor centered coordinate system using laser scanner |
US9827994B2 (en) * | 2015-06-25 | 2017-11-28 | Hyundai Motor Company | System and method for writing occupancy grid map of sensor centered coordinate system using laser scanner |
US10549430B2 (en) * | 2015-08-28 | 2020-02-04 | Panasonic Intellectual Property Corporation Of America | Mapping method, localization method, robot system, and robot |
WO2017116492A1 (en) * | 2015-12-31 | 2017-07-06 | Olney Guy | Method for integrating parallel streams of related sensor data generating trial responses without prior knowledge of data meaning or the environment being sensed |
WO2017148730A1 (en) | 2016-03-03 | 2017-09-08 | Kuka Roboter Gmbh | Method for updating an occupancy map and autonomous vehicle |
DE102016203547A1 (en) | 2016-03-03 | 2017-09-07 | Kuka Roboter Gmbh | Method for updating an occupancy card and autonomous vehicle |
US9864377B2 (en) | 2016-04-01 | 2018-01-09 | Locus Robotics Corporation | Navigation using planned robot travel paths |
US10310511B2 (en) | 2016-04-20 | 2019-06-04 | Toyota Jidosha Kabushiki Kaisha | Automatic driving control system of mobile object |
WO2018094272A1 (en) * | 2016-11-18 | 2018-05-24 | Robert Bosch Start-Up Platform North America, LLC, Series 1 | Robotic creature and method of operation |
US11625870B2 (en) | 2017-07-31 | 2023-04-11 | Oxford University Innovation Limited | Method of constructing a model of the motion of a mobile device and related systems |
US10429847B2 (en) | 2017-09-22 | 2019-10-01 | Locus Robotics Corp. | Dynamic window approach using optimal reciprocal collision avoidance cost-critic |
US10386851B2 (en) | 2017-09-22 | 2019-08-20 | Locus Robotics Corp. | Multi-resolution scan matching with exclusion zones |
US10365656B2 (en) | 2017-11-22 | 2019-07-30 | Locus Robotics Corp. | Robot charger docking localization |
US10761539B2 (en) | 2017-11-22 | 2020-09-01 | Locus Robotics Corp. | Robot charger docking control |
US11320828B1 (en) | 2018-03-08 | 2022-05-03 | AI Incorporated | Robotic cleaner |
US11254002B1 (en) | 2018-03-19 | 2022-02-22 | AI Incorporated | Autonomous robotic device |
US11454981B1 (en) | 2018-04-20 | 2022-09-27 | AI Incorporated | Versatile mobile robotic device |
US20190329407A1 (en) * | 2018-04-30 | 2019-10-31 | Beijing Jingdong Shangke Information Technology Co., Ltd. | System and method for multimodal mapping and localization |
US10807236B2 (en) * | 2018-04-30 | 2020-10-20 | Beijing Jingdong Shangke Information Technology Co., Ltd. | System and method for multimodal mapping and localization |
US11340079B1 (en) | 2018-05-21 | 2022-05-24 | AI Incorporated | Simultaneous collaboration, localization, and mapping |
US11548159B1 (en) * | 2018-05-31 | 2023-01-10 | AI Incorporated | Modular robot |
US11199853B1 (en) | 2018-07-11 | 2021-12-14 | AI Incorporated | Versatile mobile platform |
US20200050205A1 (en) * | 2018-08-07 | 2020-02-13 | Cnh Industrial America Llc | System and method for updating a mapped area |
US20200265621A1 (en) * | 2019-02-14 | 2020-08-20 | Faro Technologies, Inc. | System and method of scanning two dimensional floorplans using multiple scanners concurrently |
US10891769B2 (en) * | 2019-02-14 | 2021-01-12 | Faro Technologies, Inc | System and method of scanning two dimensional floorplans using multiple scanners concurrently |
CN109900267A (en) * | 2019-04-12 | 2019-06-18 | 哈尔滨理工大学 | A kind of mobile robot map-building based on slam and autonomous searching system |
CN110174894A (en) * | 2019-05-27 | 2019-08-27 | 小狗电器互联网科技(北京)股份有限公司 | Robot and its method for relocating |
TWI736960B (en) * | 2019-08-28 | 2021-08-21 | 財團法人車輛研究測試中心 | Synchronous positioning and mapping optimization method |
US12039674B2 (en) | 2020-09-18 | 2024-07-16 | Apple Inc. | Inertial data management for extended reality for moving platforms |
US20220269273A1 (en) * | 2021-02-23 | 2022-08-25 | Hyundai Motor Company | Apparatus for estimating position of target, robot system having the same, and method thereof |
CN113478480A (en) * | 2021-06-22 | 2021-10-08 | 中建三局集团有限公司 | Trajectory planning method for transverse arm material distributing machine |
US11944876B2 (en) * | 2022-05-30 | 2024-04-02 | Tennibot Inc. | Autonomous tennis assistant systems |
DE102023204536A1 (en) | 2023-05-15 | 2024-11-21 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for locating a mobile device |
Also Published As
Publication number | Publication date |
---|---|
JP5837553B2 (en) | 2015-12-24 |
CA2772636A1 (en) | 2011-03-03 |
NZ598500A (en) | 2013-11-29 |
HK1211352A1 (en) | 2016-05-20 |
US20170255203A1 (en) | 2017-09-07 |
WO2011026119A3 (en) | 2011-06-16 |
US20140058610A1 (en) | 2014-02-27 |
CN102576228A (en) | 2012-07-11 |
JP2013503404A (en) | 2013-01-31 |
EP2473890A2 (en) | 2012-07-11 |
AU2010286429B2 (en) | 2013-11-28 |
AU2010286429A1 (en) | 2012-04-05 |
WO2011026119A2 (en) | 2011-03-03 |
JP2014078254A (en) | 2014-05-01 |
CA2859112C (en) | 2017-08-15 |
CN104699099B (en) | 2018-03-23 |
US8903589B2 (en) | 2014-12-02 |
US20150105964A1 (en) | 2015-04-16 |
KR20120043096A (en) | 2012-05-03 |
CA2859112A1 (en) | 2011-03-03 |
US9678509B2 (en) | 2017-06-13 |
KR101362961B1 (en) | 2014-02-12 |
CN104699099A (en) | 2015-06-10 |
EP2473890B1 (en) | 2014-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8903589B2 (en) | Method and apparatus for simultaneous localization and mapping of mobile robot environment | |
JP2014078254A5 (en) | ||
CN111708047B (en) | Robot positioning evaluation method, robot and computer storage medium | |
KR101762504B1 (en) | Method for detecting floor obstacle using laser range finder | |
JP2018526748A (en) | System having an autonomous mobile robot and an autonomous mobile robot base station, an autonomous mobile robot base station, a method for an autonomous mobile robot, and an automatic docking method for an autonomous mobile robot to a base station | |
CN110865393A (en) | Positioning method and system based on laser radar, storage medium and processor | |
US11656083B2 (en) | Autonomous tunnel navigation with a robotic system | |
Nobili et al. | Predicting alignment risk to prevent localization failure | |
CN110471086A (en) | A kind of radar survey barrier system and method | |
CN114237243A (en) | Anti-falling method and device for mobile robot, electronic equipment and storage medium | |
KR20220000328A (en) | Apparatus and method for recognizing location of mobile robot based on spatial structure information using laser reflection intensity | |
CN111671360B (en) | Sweeping robot position calculating method and device and sweeping robot | |
Batavia et al. | Obstacle detection in smooth high curvature terrain | |
EP3865910A1 (en) | System and method of correcting orientation errors | |
CN112344966B (en) | Positioning failure detection method and device, storage medium and electronic equipment | |
CN115164882B (en) | Laser distortion removal method, device and system and readable storage medium | |
US20250044453A1 (en) | 3d sub-grid map-based robot pose estimation method and robot using the same | |
KR20200048918A (en) | Positioning method and apparatus thereof | |
CN118962666A (en) | Methods for locating mobile devices | |
CN117179637A (en) | Carpet detection method, carpet detection device, cleaning robot, and storage medium | |
CN117539234A (en) | Positioning success judging method based on road sign, chip and robot | |
JP2024158196A (en) | Inspection route setting system and inspection route setting method | |
CN115808684A (en) | Target optimization method, device, equipment and storage medium | |
Tanaka | Multiscan-based map optimizer for RFID map-building with low-accuracy measurements | |
CN117434934A (en) | Robot motion control method based on TOF module |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NEATO ROBOTICS, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SOFMAN, BORIS;ERMAKOV, VLADIMIR;EMMERICH, MARK;AND OTHERS;SIGNING DATES FROM 20101120 TO 20101214;REEL/FRAME:025696/0116 |
AS | Assignment |
Owner name: SQUARE 1 BANK, NORTH CAROLINA Free format text: SECURITY AGREEMENT;ASSIGNOR:NEATO ROBOTICS, INC.;REEL/FRAME:032382/0669 Effective date: 20120824 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
AS | Assignment |
Owner name: NEATO ROBOTICS, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:SQUARE 1 BANK;REEL/FRAME:034905/0429 Effective date: 20150206 |