WO2016013095A1 - Autonomous Moving Device
- Publication number
- WO2016013095A1 (PCT/JP2014/069635)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- road surface
- data group
- data
- avoidance
- information
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/217—Validation; Performance evaluation; Active pattern learning techniques
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
Definitions
- The present invention relates to a technique for determining obstacle avoidance when a moving body such as a robot or an automobile moves autonomously.
- Conventionally, sensors such as cameras, laser sensors, and ultrasonic sensors are mounted on moving bodies such as robots and automobiles, and obstacle avoidance decisions are made based on their sensor values.
- Patent Document 1 (Japanese Patent Application Laid-Open No. 2004-42148) discloses, for a battery-driven mobile robot that autonomously moves through a moving environment: an environment map storage unit that stores a map of the moving environment in which known obstacles are described; self-position and direction detection means for detecting the robot's position and orientation; obstacle detection means for detecting unknown obstacles; and moving means that moves the robot according to a movement command and, on receiving an avoidance command, moves the robot until the obstacle detection means no longer detects the unknown obstacle.
- It further discloses route planning means that refers to the environment map to plan a route avoiding the known obstacles from the detected self-position to the destination, gives the planned route to the moving means as a movement command, and, when the obstacle detection means detects an unknown obstacle, sends an avoidance command to the moving means, re-plans the route with reference to the map, and gives the re-planned route to the moving means as a movement command.
- Patent Document 2 (Japanese Patent Application Laid-Open No. 2012-187698) expresses route information as a sequence whose elements are paths and landmarks, a path connecting the start and end points, with motions such as forward, backward, and spin turn specified after each mark.
- Because the fixed route information can be changed by teaching without modifying the software, and any problem can be corrected by re-teaching, the required effort is greatly reduced.
- By building a library of route information, the know-how of visually impaired persons and walking trainers who walk the routes independently on a daily basis can be accumulated.
- One conventional obstacle avoidance technique detects the height of objects from the road surface with, for example, a distance sensor, and generates an action that avoids any area whose height exceeds a certain value.
- Another technique mounts an obstacle detection sensor such as a camera on the moving body and compares a database, in which information on objects that may need to be avoided (such as people or cars) is registered in advance, with the object information detected by the obstacle detection sensor, to determine whether the detected object is an obstacle registered in the database.
- The mobile robot of Patent Document 1 stores an environment map in which known obstacles are described, compares each detected obstacle with the environment map to determine whether it is a known obstacle described in the map or an unknown one, and, if it is unknown, newly describes it in the environment map.
- By repeating this operation, the mobile robot can reach its destination with high probability even in a moving environment containing many known or unknown obstacles.
- The traveling robot of Patent Document 2, through its re-run traveling and its teaching and control methods, temporarily stops when approaching a detected obstacle and checks whether there is space to avoid it on the right or left side. If there is space, the robot asks the user whether to avoid the obstacle or wait until it withdraws; if there is no space, it continues to pause until the obstacle withdraws. In this way, an action can be generated for an obstacle even outdoors.
- An object of the present invention is to provide an autonomous mobile device that, in a moving body autonomously moving indoors and outdoors, can appropriately determine whether an object on the road surface should be avoided while reducing the burden on the user.
- A representative autonomous mobile device of the present invention comprises: a database that stores information on objects that must be avoided and information on objects that need not be avoided when an autonomously moving body travels; an object detection unit that detects objects on the road surface around the moving body using a sensor provided on the moving body; a collation unit that collates a detected object on the road surface against the objects stored in the database; a determination input unit that, when the collation finds no data in the database identical or similar to the object on the road surface, accepts from an operator a determination input on whether the object must be avoided; and a motion generation unit that instructs the next motion of the moving body based on the collation result of the collation unit or on the avoidance necessity determination result input through the determination input unit.
- According to the present invention, it is possible to provide an autonomous mobile device that, in a moving body autonomously moving indoors and outdoors, can appropriately determine whether an object on the road surface should be avoided while reducing the burden on the user.
- FIG. 1 is a block diagram showing the configuration of the moving body and the database management system according to Embodiment 1.
- FIG. 2 is a diagram showing the database for avoidance determination according to Embodiment 1.
- FIG. 3 is a flowchart of the collation unit according to Embodiment 1.
- FIG. 4 is a flowchart of the determination input unit according to Embodiment 1.
- FIGS. 5 to 9 are diagrams showing display examples of the monitor of the determination input unit according to Embodiment 1.
- In the autonomous mobile device of the present invention, a moving body such as an automobile or a robot that autonomously moves indoors or outdoors is equipped with a sensor for detecting objects on the road surface. The device determines, much as a person would, whether each object on the road surface needs to be avoided and generates the next action based on that determination, realizing autonomous movement with improved affinity with people.
- FIG. 1 shows the configuration of a moving body 200 and the configuration of a database management system 201 according to the present embodiment.
- The moving body 200 is equipped with a sensor 110, and various control programs are executed by its arithmetic processing unit 114.
- In FIG. 1, the programs executed by the arithmetic processing unit 114 are represented as functional blocks: specifically, an object detection unit 111, a collation unit 112, and a motion generation unit 117.
- The database management system 201 includes a database 113 that stores road-surface object information, including whether avoidance is required, and an arithmetic processing unit 118 that executes various programs.
- The programs executed by the arithmetic processing unit 118 are likewise represented as functional blocks in FIG. 1: specifically, a determination input unit 115 and a database creation unit 116.
- The moving body 200 and the database management system 201 together constitute the autonomous mobile device 202.
- The moving body 200 and the database management system 201 can communicate with each other at all times by wireless communication means.
- The communication means is preferably a wireless LAN (Local Area Network), but other communication means may be used.
- In addition to the sensor 110, the moving body 200 is equipped with means for calculating its own position.
- One such means is wheel odometry, in which encoders are installed on the wheel portions of the moving body and the relative coordinates of the moving body with respect to an arbitrary reference point are calculated from the wheel rotation.
- Another is gyro odometry, which uses a gyroscope or an inertial measurement unit (IMU: Inertial Measurement Unit).
- Because the self-position error of wheel odometry and gyro odometry grows with the distance traveled, map matching may be used: measurements taken with a distance sensor such as a camera, laser, or radar are matched against an environment map to calculate the self-position.
- A satellite positioning system (NSS: Navigation Satellite System) may also be used.
- Assuming the Markov property for the observation values of the sensors installed on the moving body and for the movement amount of the moving body, the results of odometry, map matching, and satellite positioning may be fused by a Kalman filter.
- SLAM (Simultaneous Localization and Mapping), which estimates the self-position while building the environment map, may also be used.
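- As a minimal illustration of the fusion described above, the following Python sketch combines a wheel-odometry displacement (prediction) with an absolute fix from GNSS or map matching (correction) using a linear Kalman filter. The linear model, identity observation, and noise values are illustrative assumptions, not the patent's specification.

```python
import numpy as np

class PositionFuser:
    """Toy 2D position Kalman filter: odometry predicts, fixes correct."""

    def __init__(self, q=0.05, r=1.0):
        self.x = np.zeros(2)        # estimated position (x, y) in metres
        self.P = np.eye(2) * 10.0   # estimate covariance
        self.Q = np.eye(2) * q      # process noise: odometry drift per step
        self.R = np.eye(2) * r      # measurement noise: GNSS / map matching

    def predict(self, odom_delta):
        """Dead-reckoning step: apply an odometry displacement."""
        self.x = self.x + odom_delta
        self.P = self.P + self.Q    # uncertainty grows with travel, as noted

    def update(self, z):
        """Correction step with an absolute position measurement z."""
        S = self.P + self.R                  # innovation covariance (H = I)
        K = self.P @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(2) - K) @ self.P

fuser = PositionFuser()
fuser.predict(np.array([0.10, 0.02]))  # one wheel-odometry step
fuser.update(np.array([0.12, 0.00]))   # one GNSS or map-matching fix
```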
- The object detection unit 111 detects objects on the road surface using the sensor 110.
- The sensor 110 is preferably a distance sensor, such as a camera, laser, or radar, that can detect the height of an object from the road surface and its position relative to the moving body.
- The collation unit 112 collates the object on the road surface detected by the object detection unit 111 against the database 113 managed by the database management system 201 and checks whether the stored data contain a match for the detected object. If matching data exist, whether to avoid the detected object is determined from the avoidance required/unnecessary information of the matching data, and the process moves to the motion generation unit 117. If no matching data exist, the process moves to the determination input unit 115. Details of the database 113 are described with reference to FIG. 2, and details of the collation unit 112 with reference to FIG. 3.
- The determination input unit 115 outputs the collation result of the collation unit 112 to a monitor, and the operator of the database management system 201 makes the avoidance determination for the detected object on the road surface based on the displayed collation result.
- The operator is preferably a person familiar with the autonomous mobile device 202. After the operator makes the avoidance determination, the result is transmitted to the database creation unit 116 and to the motion generation unit 117 of the moving body 200.
- The database creation unit 116 receives the avoidance determination result from the determination input unit 115 and updates the database 113 accordingly.
- The motion generation unit 117 determines whether to avoid the detected road-surface object based on the collation result of the collation unit 112 or the avoidance determination result of the determination input unit 115; if avoidance is necessary, an avoidance action is generated, and if not, the moving body continues traveling without avoiding.
- FIG. 2 shows the details of the database 113 of FIG. 1.
- The database 113 includes an avoidance-required class D101, in which information on objects that must be avoided is registered, and an avoidance-unnecessary class D102, in which information on objects that need not be avoided is registered.
- The avoidance-required class D101 contains data groups D103, each grouping similar objects that must be avoided.
- The avoidance-unnecessary class D102 contains data groups D104, each grouping similar objects that need not be avoided.
- Each data group D103 and D104 is composed of representative data D105 and sub-data D106 associated with the representative data D105.
- The form of the representative data D105 and the sub-data D106 depends on the sensor 110 of FIG. 1.
- For example, the representative data D105 and the sub-data D106 are captured images of objects on the road surface or feature information of the captured images.
- The feature information is obtained, for example, by detecting feature-point positions on the captured image with a Harris detector, a SUSAN (Smallest Univalue Segment Assimilating Nucleus) detector, FAST (Features from Accelerated Segment Test), or the like, and then extracting a feature amount for each feature point by applying ORB (Oriented BRIEF) or the like. Any feature-point detection method and any feature-amount extraction method may be used, as long as feature points can be detected and their feature amounts extracted.
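- As one concrete (and, per the passage above, interchangeable) choice, the following sketch extracts keypoints and descriptors with OpenCV's ORB. The image path is a placeholder, not from the patent.

```python
import cv2

# Illustrative feature extraction for a captured road-surface image
# using ORB (FAST keypoints plus oriented BRIEF descriptors).
img = cv2.imread("road_object.png", cv2.IMREAD_GRAYSCALE)
assert img is not None, "replace the placeholder path with a real image"

orb = cv2.ORB_create(nfeatures=500)            # detector + descriptor
keypoints, descriptors = orb.detectAndCompute(img, None)

# 'descriptors' (N x 32, binary) is the kind of "feature information"
# that could be stored as representative data D105 or sub-data D106.
print(len(keypoints), None if descriptors is None else descriptors.shape)
```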
- The representative data D105 is the datum for which the sample average of the degree of coincidence, calculated from the feature information against all data in the data group D103, is maximized.
- The degree of coincidence is obtained, for example, by calculating the Euclidean distance, the Mahalanobis distance, the correlation coefficient, or the coefficient of determination between the feature amounts of the two pieces of data being collated; any measure that quantifies similarity may be used.
- The sample average of the degree of coincidence is attached to each datum, and this information is used by the determination input unit 115 of FIG. 1.
- An ID is assigned to each piece of representative data D105 and sub-data D106.
- IDs may be assigned in any way, as long as whether each datum belongs to the avoidance-required or avoidance-unnecessary class and the association between representative data and sub-data remain clear.
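- Since the representative data is defined as the group element whose sample average of coincidence with all group data is maximal, it can be computed as a medoid. The Python sketch below uses an inverse-Euclidean coincidence, one of the measures the text permits; the flat feature-vector layout is an assumption for illustration.

```python
import numpy as np

def coincidence(a, b):
    """Degree of coincidence between two feature vectors: a monotone
    transform of the Euclidean distance (one allowed choice)."""
    return 1.0 / (1.0 + np.linalg.norm(np.asarray(a) - np.asarray(b)))

def choose_representative(group):
    """Index of the group element whose sample average of coincidence
    with all data in the group is maximal (the rule for D105)."""
    means = [np.mean([coincidence(g, h) for h in group]) for g in group]
    return int(np.argmax(means))

# Stand-in feature vectors for one data group; real data would come
# from the feature extraction performed by the object detection side.
group = [np.random.rand(32) for _ in range(5)]
rep_index = choose_representative(group)
```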
- FIG. 3 shows details of the collation unit 112 of FIG. 1.
- Step S100 collates the object on the road surface detected by the object detection unit 111 of FIG. 1 against all representative data registered in the database 113 of FIG. 1 and calculates their degrees of coincidence. It then extracts the representative data whose calculated degree of coincidence is greater than a threshold value M1.
- Step S101 collates the detected object on the road surface against the sub-data of all representative data extracted in step S100 and calculates their degrees of coincidence. It then counts the sub-data whose degree of coincidence exceeds a threshold value M2 and calculates, for each data group, the matching ratio that the counted sub-data occupy in the group.
- Step S102 extracts, as candidate data groups, the data groups whose matching ratio calculated in step S101 exceeds a threshold value P, and stores the representative-data IDs of the extracted candidate data groups as candidate data group IDs (S105).
- Step S103 selects, from the candidate data groups extracted in step S102, the data group whose matching ratio calculated in step S101 is largest.
- Step S104 checks whether the maximum degree of coincidence over all data of the data group selected in step S103 is larger than a threshold value M3. If it is, the avoidance necessity is determined from the class to which the data group belongs and the process moves to the motion generation unit 117; if it is smaller than the threshold value M3, the candidate data group IDs (S105) are transmitted to the determination input unit 115 and the process moves to the determination input unit 115.
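- For illustration, steps S100 to S104 can be restated as the following cascade, reusing coincidence() from the earlier sketch. The dictionary layout of the groups and the way the thresholds M1, M2, P, and M3 are passed are assumptions, not the patent's specification.

```python
def match_object(obj, groups, M1, M2, P, M3):
    """Cascaded collation of a detected road-surface object (S100-S104).
    'groups' maps a representative-data ID to {'rep': vector,
    'subs': [vectors], 'avoid': bool}.
    Returns ('decided', avoid_flag) or ('ask_operator', candidate_ids)."""
    # S100: collate against representative data only; keep matches > M1.
    candidates = [gid for gid, g in groups.items()
                  if coincidence(obj, g['rep']) > M1]

    # S101/S102: within those groups, count sub-data matching above M2
    # and keep groups whose matching ratio exceeds P as candidates.
    ratios = {}
    for gid in candidates:
        subs = groups[gid]['subs']
        hits = sum(coincidence(obj, s) > M2 for s in subs)
        ratio = hits / len(subs) if subs else 0.0
        if ratio > P:
            ratios[gid] = ratio
    if not ratios:
        return 'ask_operator', []          # no candidate data group (S105)

    # S103: select the candidate group with the largest matching ratio.
    best = max(ratios, key=ratios.get)

    # S104: accept only if some datum of that group matches above M3.
    data = [groups[best]['rep']] + groups[best]['subs']
    if max(coincidence(obj, d) for d in data) > M3:
        return 'decided', groups[best]['avoid']
    return 'ask_operator', list(ratios)    # candidate data group IDs (S105)
```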
- FIG. 4 shows details of the determination input unit 115 of FIG. 1. FIGS. 5, 6, and 7 show the monitor on which the determination input unit 115 of FIG. 1 displays the collation result of the collation unit 112 to the operator.
- Step S200 receives the candidate data group IDs (S105) of FIG. 3 and, based on them, displays the corresponding candidate data groups stored in the database 113 of FIG. 2 on the monitor, together with the object on the road surface detected by the object detection unit 111 of FIG. 1. For example, as shown in FIG. 5, the representative data M102 and sub-data M103 of the candidate data groups M101 are visualized on the monitor M100, and the visualized road-surface object is displayed at M104. When there are many candidate data groups M101, the M101 area can be scrolled up, down, left, and right. Furthermore, each visualized datum M102 and M103 of the candidate data groups can be selected.
- In step S201 the operator checks whether the candidate data groups displayed on the monitor include a data group of the same kind as the object on the road surface detected by the object detection unit 111 of FIG. 1. If such a data group exists, the operator selects, from the representative data M102 or sub-data M103 of the candidate data groups M101 in FIG. 5, the data identical to the road-surface object M104 and clicks the "Select" button M105. If no such data group exists, the operator clicks the "None" button M106.
- Step S202 moves to step S203 when the operator confirmed in step S201 that a data group of the same kind as the road-surface object exists among the candidate data groups, and to step S206 when no such data group exists.
- Step S203, when the existence of the same data group was confirmed in step S202, calculates the degrees of coincidence between the detected road-surface object and the data of that group, and computes their sample average.
- Step S204 moves to step S205 when the sample average of the degree of coincidence of the road-surface object calculated in step S203 occupies the central position within the same data group, and to step S208 otherwise.
- Step S205, when the road-surface object occupies the central position in step S204, moves the representative data of the data group to the sub-data and makes the road-surface object the new representative data.
- Assuming the sub-data IDs are ai-1 to ai-j, the ID of the original representative data moved to the sub-data is changed from ai to ai-(j+1).
- In step S206, when it was confirmed in step S202 that no data group identical to the road-surface object exists, the operator determines whether the road-surface object must be avoided.
- Step S207 generates a new data group in the avoidance-required class D101 of FIG. 2 when the operator determined in step S206 that the road-surface object must be avoided, and adds the object as the representative data of the new data group; when the operator determined that avoidance is unnecessary, a new data group is generated in the avoidance-unnecessary class D102 of FIG. 2 and the object is added as its representative data.
- Assuming the representative-data IDs of the existing data groups are a1 to ai, the representative-data ID of the new data group is set to a(i+1).
- At this time, a window M200 announcing the generation of a new data group is displayed, as shown in FIG. 6.
- The window M200 shows an "Avoidance Required" button and an "Avoidance Unnecessary" button, and the operator clicks the button matching his or her determination. After the click, the newly added data group M201 is displayed.
- Step S208 adds the road-surface object to the sub-data of the same data group when, in step S204, the sample average of its degree of coincidence does not occupy the central position in the group.
- The ID of the road-surface object added to the sub-data is ai-(j+1).
- At this time, the road-surface object M300 is displayed added to the same data group M301, as shown in FIG. 7.
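- Steps S204, S205, and S208 could be realized as below, reusing coincidence() from the earlier sketch and reading the "central position" test as a medoid check (the object becomes representative when it would have the highest sample-average coincidence in the group). Both that reading and the group layout are assumptions consistent with how the representative data D105 is defined.

```python
import numpy as np

def add_to_group(obj, group):
    """Insert an operator-confirmed road-surface object into its data
    group (S204/S205/S208). 'group' is {'rep': vector, 'subs': [vectors]}."""
    members = [group['rep']] + group['subs'] + [obj]
    means = [np.mean([coincidence(m, n) for n in members]) for m in members]
    if int(np.argmax(means)) == len(members) - 1:
        # S205: the new object is the most central element, so it becomes
        # the representative; the old representative moves to the sub-data
        # (its ID changing from ai to ai-(j+1)).
        group['subs'].append(group['rep'])
        group['rep'] = obj
    else:
        # S208: otherwise the object is appended as sub-data ai-(j+1).
        group['subs'].append(obj)
```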
- As described above, the autonomous mobile device of the present invention has a database organized into avoidance-required and avoidance-unnecessary classes and determines whether a detected object on the road surface should be avoided. Unlike conventional autonomous mobile devices, it can make avoidance decisions similar to those of humans and thereby realize movement with improved affinity with people.
- Because the database 113 is divided into the avoidance-required class and the avoidance-unnecessary class, the device can determine not only that a detected road-surface object must be avoided but also that avoidance is unnecessary. Furthermore, when collating a detected object against the database, collating first against the representative data removes the need to collate against all data, which reduces the collation processing load.
- In addition, since the operator determines whether avoidance is necessary, avoidance determinations can be obtained for a wide variety of objects on the road surface.
- Because the results determined by the operator are reflected in the database one after another, the number of road-surface objects whose avoidance cannot be determined automatically decreases over time.
Description
111 Object detection unit
112 Collation unit
117 Motion generation unit
200 Moving body
113 Database
115 Determination input unit
116 Database creation unit
201 Database management system
202 Autonomous mobile device
Claims (5)
- 1. An autonomous mobile device comprising:
a database that stores information on objects that must be avoided and information on objects that need not be avoided when an autonomously moving body travels;
an object detection unit that detects objects on the road surface around the moving body using a sensor provided on the moving body;
a collation unit that collates a detected object on the road surface against the objects stored in the database;
a determination input unit that, when the collation by the collation unit finds no data in the database identical or similar to the object on the road surface, accepts from an operator a determination input on whether the object on the road surface must be avoided; and
a motion generation unit that instructs the next motion of the moving body based on the collation result of the collation unit or on the avoidance necessity determination result input through the determination input unit.
- 2. The autonomous mobile device according to claim 1, wherein
the database is composed of an avoidance-required class in which information on objects that must be avoided is stored and an avoidance-unnecessary class in which information on objects that need not be avoided is stored, a feature amount by which the features of objects can be compared being defined for the information of each object, and
the avoidance-required class and the avoidance-unnecessary class each have, as elements, data groups composed of representative data, which is the information of the object representing the data group, and sub-data, which is information of objects similar in features to the representative data and associated with that representative data.
- 3. The autonomous mobile device according to claim 2, wherein
the collation unit
collates the road-surface object detected by the object detection unit against each piece of representative data in the database and extracts a plurality of pieces of representative data similar to the road-surface object,
collates the road-surface object against the sub-data associated with each extracted piece of representative data and extracts, as candidate data groups, the data groups in which the proportion of sub-data similar to the road-surface object is high,
transmits, when the feature amount of any representative data or sub-data in the extracted candidate data groups is within a predetermined difference from the feature amount of the road-surface object, the avoidance-required or avoidance-unnecessary class to which that candidate data group belongs to the motion generation unit, and
transmits, when the feature amounts of all representative data and sub-data in the extracted candidate data groups differ from the feature amount of the road-surface object by the predetermined difference or more, information on the candidate data groups to the determination input unit; and
the determination input unit displays the information on the candidate data groups and the road-surface object to the operator and accepts from the operator a determination input on whether the road-surface object must be avoided.
- 4. The autonomous mobile device according to claim 3, wherein the determination input unit
displays, upon receiving the information on the candidate data groups from the collation unit, the candidate data groups and the road-surface object on a monitor,
accepts from the operator an indication of whether a data group of the same kind as the road-surface object exists among the candidate data groups displayed on the monitor,
adds, when a data group of the same kind as the road-surface object exists, the information of the road-surface object to that data group and determines the necessity of avoiding the road-surface object based on the avoidance-required or avoidance-unnecessary class to which the data group belongs, and
accepts, when no data group of the same kind as the road-surface object exists, an avoidance determination of the road-surface object from the operator, adds the information of the road-surface object to the database as representative data based on the result of that avoidance determination, and determines the necessity of avoidance based on the determination result.
- 5. The autonomous mobile device according to claim 1, wherein
the sensor, the object detection unit, the collation unit, and the motion generation unit are provided on the moving body, and
the database and the determination input unit are provided on a server capable of communicating with the moving body.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016535597A JP6462692B2 (ja) | 2014-07-25 | 2014-07-25 | Autonomous mobile device |
US15/328,783 US10156847B2 (en) | 2014-07-25 | 2014-07-25 | Autonomous moving device |
PCT/JP2014/069635 WO2016013095A1 (ja) | 2014-07-25 | 2014-07-25 | Autonomous mobile device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/069635 WO2016013095A1 (ja) | 2014-07-25 | 2014-07-25 | Autonomous mobile device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016013095A1 true WO2016013095A1 (ja) | 2016-01-28 |
Family
ID=55162650
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/069635 WO2016013095A1 (ja) | 2014-07-25 | 2014-07-25 | 自律移動装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10156847B2 (ja) |
JP (1) | JP6462692B2 (ja) |
WO (1) | WO2016013095A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017224280A (ja) * | 2016-05-09 | 2017-12-21 | ツーアンツ インク.TwoAntz Inc. | 視覚的測位によるナビゲーション装置およびその方法 |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10643377B2 (en) * | 2014-12-22 | 2020-05-05 | Husqvarna Ab | Garden mapping and planning via robotic vehicle |
JP6194520B1 (ja) * | 2016-06-24 | 2017-09-13 | 三菱電機株式会社 | 物体認識装置、物体認識方法および自動運転システム |
US10430657B2 (en) * | 2016-12-12 | 2019-10-01 | X Development Llc | Object recognition tool |
CN109507997B (zh) * | 2017-09-15 | 2021-11-12 | 现代摩比斯株式会社 | 用于自动驾驶的装置、方法和系统 |
US10713940B2 (en) | 2017-10-31 | 2020-07-14 | Waymo Llc | Detecting and responding to traffic redirection for autonomous vehicles |
US10401862B2 (en) * | 2017-10-31 | 2019-09-03 | Waymo Llc | Semantic object clustering for autonomous vehicle decision making |
WO2019089444A1 (en) * | 2017-10-31 | 2019-05-09 | Waymo Llc | Detecting and responding to traffic redirection for autonomous vehicles |
- EP3863743A4 * | 2018-10-09 | 2021-12-08 | Resonai Inc. | SYSTEMS AND METHODS FOR 3D SCENE AUGMENTATION AND RECONSTRUCTION |
AU2020256049B2 (en) * | 2019-04-05 | 2024-02-08 | The Toro Company | Barrier passage system for autonomous working machine |
CN112015178B (zh) * | 2020-08-20 | 2022-10-21 | 中国第一汽车股份有限公司 | 一种控制方法、装置、设备及存储介质 |
US11954914B2 (en) * | 2021-08-02 | 2024-04-09 | Nvidia Corporation | Belief propagation for range image mapping in autonomous machine applications |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004042148A (ja) * | 2002-07-09 | 2004-02-12 | Mitsubishi Heavy Ind Ltd | 移動ロボット |
JP2005092820A (ja) * | 2003-09-19 | 2005-04-07 | Sony Corp | 環境認識装置及び方法、経路計画装置及び方法、並びにロボット装置 |
JP2012187698A (ja) * | 2011-03-08 | 2012-10-04 | Rota Kk | 走行ロボットのやり直し走行、そのティーチング方法および制御方法 |
WO2014091611A1 (ja) * | 2012-12-13 | 2014-06-19 | 株式会社日立製作所 | 自律走行装置 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- FR2984254B1 (fr) * | 2011-12-16 | 2016-07-01 | Renault Sa | Control of autonomous vehicles |
US9523984B1 (en) * | 2013-07-12 | 2016-12-20 | Google Inc. | Methods and systems for determining instructions for pulling over an autonomous vehicle |
Also Published As
Publication number | Publication date |
---|---|
JP6462692B2 (ja) | 2019-01-30 |
US10156847B2 (en) | 2018-12-18 |
US20170212518A1 (en) | 2017-07-27 |
JPWO2016013095A1 (ja) | 2017-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6462692B2 (ja) | Autonomous mobile device | |
EP2927769B1 (en) | Localization within an environment using sensor fusion | |
US11885900B2 (en) | Method and system for tracking a mobile device | |
KR101202695B1 (ko) | 자율 이동 장치 | |
CN109491375A (zh) | 用于自动驾驶车辆的基于驾驶场景的路径规划 | |
JP5892785B2 (ja) | 情報処理装置、及び情報処理方法 | |
Löper et al. | Automated valet parking as part of an integrated travel assistance | |
CN106997688A (zh) | 基于多传感器信息融合的停车场停车位检测方法 | |
US20210012124A1 (en) | Method of collecting road sign information using mobile mapping system | |
CN103558856A (zh) | 动态环境下服务动机器人导航方法 | |
TWI772743B (zh) | 資訊處理裝置以及移動機器人 | |
CN113330279A (zh) | 用于确定车辆的位置的方法和系统 | |
EP3207335B1 (en) | Diverging and converging road geometry generation from sparse data | |
CN113433937B (zh) | 基于启发式探索的分层导航避障系统、分层导航避障方法 | |
Zheng et al. | A hierarchical approach for mobile robot exploration in pedestrian crowd | |
EP4053761A1 (en) | Providing access to an autonomous vehicle based on user's detected interest | |
US20200377111A1 (en) | Trainer system for use with driving automation systems | |
CN114879660A (zh) | 一种基于目标驱动的机器人环境感知方法 | |
TWI426241B (zh) | Self - propelled device for the tracking system | |
JP4427517B2 (ja) | 歩行者動線観測装置、方法およびプログラム | |
Jackermeier et al. | Task-Oriented Evaluation of Indoor Positioning Systems | |
Jafar et al. | Visual features based motion controller for mobile robot navigation | |
Suprapto et al. | Designing an Autonomous Vehicle Using Sensor Fusion Based on Path Planning and Deep Learning Algorithms | |
Bikmaev | Error Estimation of a Strapdown Inertial Navigation System Based on the Results of Road Sign Recognition in a Multidimensional Optical Geophysical Field | |
Cheng | Proprioceptive Localization for Robots |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14898126; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2016535597; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 15328783; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 14898126; Country of ref document: EP; Kind code of ref document: A1 |