WO2018020589A1 - Self-Position Estimation Method and Self-Position Estimation Device - Google Patents
Self-position estimation method and self-position estimation device
- Publication number
- WO2018020589A1 (PCT/JP2016/071922)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target
- gradient
- section
- position data
- vehicle
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1652—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D1/00—Measuring arrangements giving results other than momentary value of variable, of general application
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0248—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- The present invention relates to a self-position estimation method and a self-position estimation apparatus.
- As a technique for detecting the relative position between a known target and a moving body and estimating the position of the moving body, the technique described in Patent Document 1 is known.
- The robot described in Patent Document 1 corrects its self-position estimation result based on the amount of positional deviation between a point environment map, which represents the movable area as point-set data, and ambient environment information, which represents the detection results of a laser range sensor mounted on the robot as point-set data.
- An object of the present invention is to suppress the degradation of position estimation accuracy on a two-dimensional map caused by the difference between the slope distance and the horizontal distance in a gradient section.
- The objects and advantages of the invention are realized and attained by means of the elements and combinations set forth in the claims. Both the foregoing general description and the following detailed description are merely exemplary and explanatory, and should not be construed as limiting the invention as claimed.
- The relative positions of targets existing around the moving body are detected, the relative positions are corrected based on the movement amount of the moving body, and the results are accumulated as target position data.
- The current position of the moving body is then estimated by collating the target position data of targets in sections whose gradient amount is less than a threshold with map information indicating the positions of those targets on a two-dimensional map.
- Refer to FIG. 1. In the following, estimation of the current position of a vehicle is described as an example of a moving body; however, the present invention is not limited to vehicles and can be widely applied to estimating the current position of various moving bodies.
- The vehicle 1 is equipped with a self-position estimation device 2 and a driving support system 3.
- The self-position estimation device 2 includes an imaging device 10, a distance measurement device 11, a wheel speed sensor 12, a steering angle sensor 13, a gyro sensor 14, an acceleration sensor 15, and a self-position estimation circuit 16.
- The imaging device 10 is mounted, for example, inside the cabin of the vehicle 1 and captures images of the area ahead of the vehicle 1.
- The imaging device 10 may be, for example, a wide-angle camera.
- The imaging device 10 outputs the captured images of the area ahead of the vehicle 1 to the self-position estimation circuit 16.
- The distance measurement device 11 is mounted outside the passenger compartment of the vehicle 1, irradiates the area ahead of the vehicle 1 with electromagnetic waves, and detects the reflected waves.
- The distance measurement device 11 may be, for example, a laser range finder.
- The distance measurement device 11 may be mounted, for example, around the hood, bumper, license plate, headlights, or side mirrors of the vehicle 1.
- The distance measurement device 11 outputs its measurement results to the self-position estimation circuit 16.
- The wheel speed sensor 12 generates a predetermined number of wheel speed pulses for each revolution of a wheel of the vehicle 1, and outputs the wheel speed pulses to the self-position estimation circuit 16.
- The steering angle sensor 13 is provided, for example, on the steering column that rotatably supports the steering wheel of the vehicle 1.
- The steering angle sensor 13 detects the current steering angle, that is, the current rotation angle (steering operation amount) of the steering wheel serving as the steering operator, and outputs the detected current steering angle to the self-position estimation circuit 16.
- The gyro sensor 14 detects the yaw rate generated in the vehicle 1 as well as the displacements in the pitch and roll directions, and outputs them to the self-position estimation circuit 16.
- The acceleration sensor 15 detects the lateral G, that is, the acceleration/deceleration in the vehicle-width direction, and the acceleration/deceleration in the front-rear direction generated in the vehicle 1, and outputs the detected lateral G and longitudinal acceleration/deceleration to the self-position estimation circuit 16.
- The self-position estimation circuit 16 is an electronic circuit device including a processor such as a CPU (Central Processing Unit), a storage device, and peripheral components.
- The self-position estimation circuit 16 estimates the current position of the vehicle 1 on the map based on the signals received from the imaging device 10, the distance measurement device 11, the wheel speed sensor 12, the steering angle sensor 13, and the gyro sensor 14, together with two-dimensional map information indicating the positions of known targets on a two-dimensional map. Hereinafter, the current position of the vehicle 1 on the map may be referred to as the "self-position".
- The self-position estimation circuit 16 outputs a self-position signal indicating the self-position to the driving support system 3.
- The driving support system 3 uses the self-position indicated by the self-position signal received from the self-position estimation circuit 16 to assist the driver in driving the vehicle 1.
- One example of driving assistance is the provision of information to the driver, such as warnings.
- The driving support system 3 may control at least one of the type and the intensity of the warning presented to the driver according to the self-position of the vehicle 1.
- Another example of driving assistance is control of the running state of the vehicle 1, including at least one of braking control, acceleration control, and steering control.
- For example, the driving support system 3 may determine whether to generate braking force or driving force in the vehicle 1 according to the self-position of the vehicle 1.
- The self-position estimation circuit 16 includes a target position detection unit 20, a movement amount estimation unit 21, a gradient detection unit 22, a target position accumulation unit 23, a storage device 24, a selection unit 25, a position estimation unit 26, and a map information acquisition unit 27.
- The processor of the self-position estimation circuit 16 realizes the functions of the target position detection unit 20, the movement amount estimation unit 21, the gradient detection unit 22, the target position accumulation unit 23, the selection unit 25, the position estimation unit 26, and the map information acquisition unit 27 by executing a computer program stored in the storage device 24.
- The target position detection unit 20 receives the captured image of the area ahead of the vehicle 1 generated by the imaging device 10, and also receives the measurement results of the distance measurement device 11. Based on these inputs, the target position detection unit 20 detects targets existing around the vehicle 1, for example targets ahead of the vehicle 1. Furthermore, the target position detection unit 20 detects the relative position of each target with respect to the vehicle 1, and outputs a relative position signal indicating the detected relative position to the target position accumulation unit 23.
- Targets may be, for example, lines on the road surface on which the vehicle 1 travels (such as lane markings), curbs on the road shoulder, or guardrails.
- The movement amount estimation unit 21 receives the wheel speed pulses, the current steering angle, and the yaw rate from the wheel speed sensor 12, the steering angle sensor 13, and the gyro sensor 14, respectively. Based on these signals, it estimates by odometry the movement amount ΔP of the vehicle 1 from the time the self-position was estimated in the previous processing cycle up to the present, and outputs a movement amount signal indicating the estimated movement amount ΔP to the target position accumulation unit 23.
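As a rough illustration of this odometry step, the following Python sketch integrates wheel speed pulses and yaw rate over one processing cycle to obtain a movement amount ΔP = (dx, dy, dyaw). The patent does not disclose the computation itself; the constants, the function name, and the midpoint-integration simplification are assumptions made for illustration.

```python
import math

# Hypothetical constants; the patent does not specify them.
PULSES_PER_REV = 48          # wheel speed pulses per wheel revolution
WHEEL_CIRCUMFERENCE_M = 1.9  # driven-wheel circumference [m]

def estimate_movement(pulse_count: int, yaw_rate_rad_s: float,
                      dt_s: float, heading_rad: float):
    """Dead-reckon the movement amount (dx, dy, dyaw) over one cycle."""
    distance = pulse_count / PULSES_PER_REV * WHEEL_CIRCUMFERENCE_M
    dyaw = yaw_rate_rad_s * dt_s
    # Assume travel along the mid-cycle heading (midpoint integration).
    mid_heading = heading_rad + dyaw / 2.0
    dx = distance * math.cos(mid_heading)
    dy = distance * math.sin(mid_heading)
    return dx, dy, dyaw
```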
- The gradient detection unit 22 receives the pitch-direction displacement from the gyro sensor 14.
- Based on the pitch-direction displacement received from the gyro sensor 14, the gradient detection unit 22 detects the gradient amount of the travel path of the vehicle 1, that is, the inclination along the traveling direction of the vehicle 1.
- The gradient detection unit 22 may also receive the captured image of the area ahead of the vehicle 1 generated by the imaging device 10.
- The gradient detection unit 22 may detect the gradient amount of the travel path of the vehicle 1 from the flow of a 3D point cloud obtained by analyzing the captured image.
- The gradient detection unit 22 determines whether the travel path of the vehicle 1 is a gradient section.
- For example, the gradient detection unit 22 may determine that the travel path is a gradient section when the gradient amount of the travel path of the vehicle 1 is equal to or greater than a predetermined threshold.
- The gradient detection unit 22 outputs a determination result signal indicating the determination result to the selection unit 25.
- The target position accumulation unit 23 receives the relative position signal from the target position detection unit 20 and the movement amount signal from the movement amount estimation unit 21.
- The target position accumulation unit 23 stores the relative positions of the targets around the vehicle 1 indicated by the relative position signal in the storage device 24.
- The target position accumulation unit 23 corrects the relative positions of targets accumulated in the past into relative positions with respect to the current position of the vehicle 1, using the elapsed time up to the present and the movement amount ΔP indicated by the movement amount signal. That is, the target position accumulation unit 23 shifts each relative position in the direction opposite to the movement direction of the vehicle 1 by the movement amount ΔP the vehicle traveled during that elapsed time.
- The target position accumulation unit 23 then accumulates the corrected relative positions (hereinafter sometimes referred to as "target position data") in the storage device 24. If target position data has already been accumulated in the storage device 24, the target position accumulation unit 23 updates it using the movement amount ΔP indicated by the movement amount signal: it shifts the relative positions of the accumulated target position data by ΔP in the direction opposite to the movement direction of the vehicle 1, and then overwrites the accumulated target position data with the shifted relative positions.
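A minimal sketch of this accumulate-and-correct step, assuming positions are kept in the vehicle frame: each cycle, previously stored target positions are shifted by the inverse of the movement (dx, dy, dyaw), so they remain expressed relative to the current vehicle position. The record layout and names are illustrative, not from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class TargetPoint:
    x: float  # forward offset from the vehicle [m]
    y: float  # lateral offset from the vehicle [m]

def update_accumulated(points: list[TargetPoint],
                       dx: float, dy: float, dyaw: float) -> None:
    """Re-express stored target positions relative to the new vehicle pose.

    (dx, dy, dyaw) is the movement amount ΔP over the last cycle,
    expressed in the previous vehicle frame.
    """
    cos_y, sin_y = math.cos(dyaw), math.sin(dyaw)
    for p in points:
        # Translate by -ΔP, then rotate into the new heading.
        tx, ty = p.x - dx, p.y - dy
        p.x = cos_y * tx + sin_y * ty
        p.y = -sin_y * tx + cos_y * ty
```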
- The selection unit 25 selects, from the target position data accumulated in the storage device 24, the target position data to be used for estimating the self-position of the vehicle 1.
- The target position data selected for use in self-position estimation is hereinafter referred to as the "selected target position data". The process by which the selection unit 25 selects it is described later.
- The position estimation unit 26 estimates the self-position of the vehicle 1 by collating the selected target position data with the two-dimensional map information acquired by the map information acquisition unit 27.
- The map information acquisition unit 27 acquires map data and two-dimensional map information indicating the positions, on the two-dimensional map, of targets existing in the map data.
- The map information acquisition unit 27 is, for example, a car navigation system or a map database.
- The map information acquisition unit 27 may acquire the two-dimensional map information from outside via a communication system such as wireless communication (road-to-vehicle or vehicle-to-vehicle communication is also possible). In that case, the map information acquisition unit 27 may periodically obtain the latest two-dimensional map information and update the two-dimensional map information it holds.
- The map information acquisition unit 27 may also accumulate, as two-dimensional map information, the position information of targets detected on roads the vehicle 1 has actually traveled.
- The position estimation unit 26 may estimate the self-position of the vehicle 1 by collating the selected target position data with the two-dimensional map information, for example by the following data collation process (see FIG. 3), in which reference sign Si denotes an item of selected target position data.
- The index i is an integer from 1 to N, where N is the number of items of selected target position data.
- The position estimation unit 26 determines a provisional position of the vehicle 1 by correcting the self-position estimated in the previous processing cycle by the movement amount ΔP.
- Assuming that the position of the vehicle 1 on the two-dimensional map is this provisional position, the position estimation unit 26 converts the relative positions of the targets indicated by the selected target position data Si into absolute positions on the two-dimensional map.
- The position estimation unit 26 then selects, for each item of selected target position data Si, the target position information Mj in the two-dimensional map information closest to its absolute position.
- In the example of FIG. 3, position information Mx is closest to selected target position data S1, position information My is closest to S2, and position information Mz is closest to S3.
- The position estimation unit 26 calculates the distance Dij between each item of selected target position data Si and its closest position information Mj, and calculates the average S of the distances Dij using the following equation (1).
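Equation (1) itself is not reproduced in this extraction. From the surrounding description (the average of the distances Dij over the N items of selected target position data), it can be reconstructed as follows; this is a reconstruction from context, not a verbatim copy of the published formula.

```latex
S = \frac{1}{N} \sum_{i=1}^{N} D_{ij}
```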
- The position estimation unit 26 calculates, by numerical analysis, the position and orientation of the vehicle 1 that minimize the average S, and adopts them as the estimated value of the self-position of the vehicle 1.
- The position estimation unit 26 outputs the estimated self-position to the driving support system 3.
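The collation process as a whole can be sketched as follows: project the selected target position data onto the map for a candidate pose, match each point to its nearest map target, and search numerically for the pose that minimizes the average distance S. The greedy local search below merely stands in for whatever numerical analysis the actual implementation uses; all names and step sizes are illustrative.

```python
import math

def average_match_distance(pose, selected, map_targets):
    """Average distance S between projected target data and nearest map targets.

    pose: (x, y, yaw) candidate vehicle pose on the 2D map
    selected: (x, y) target positions relative to the vehicle
    map_targets: (x, y) target positions on the 2D map
    """
    px, py, yaw = pose
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    total = 0.0
    for sx, sy in selected:
        # Convert the relative position to an absolute map position.
        ax = px + cos_y * sx - sin_y * sy
        ay = py + sin_y * sx + cos_y * sy
        total += min(math.hypot(ax - mx, ay - my) for mx, my in map_targets)
    return total / len(selected)

def estimate_pose(provisional_pose, selected, map_targets,
                  step=0.1, iterations=50):
    """Greedy local search around the provisional pose (illustrative only)."""
    best = provisional_pose
    best_s = average_match_distance(best, selected, map_targets)
    moves = [(step, 0, 0), (-step, 0, 0), (0, step, 0),
             (0, -step, 0), (0, 0, 0.01), (0, 0, -0.01)]
    for _ in range(iterations):
        improved = False
        for dx, dy, dyaw in moves:
            cand = (best[0] + dx, best[1] + dy, best[2] + dyaw)
            s = average_match_distance(cand, selected, map_targets)
            if s < best_s:
                best, best_s, improved = cand, s, True
        if not improved:
            break
    return best
```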
- Selection of the selected target position data: next, the process by which the selection unit 25 selects the selected target position data is described. As noted above, the slope distance and the horizontal distance differ in a gradient section. For this reason, the distance between the vehicle 1 and a target indicated by target position data detected before passage through the gradient section may come out longer than the actual horizontal distance. The reason is explained with reference to FIG. 4.
- In FIG. 4, the upper part is a schematic diagram of the travel path of the vehicle 1 including a gradient section Ss and the targets on the travel path.
- Square plots T1 to T7 indicate targets on the road.
- The position of the vehicle 1 in FIG. 4 is the position at a time after passage through the gradient section.
- The middle part is a schematic diagram of the distances between the vehicle 1 and the targets indicated by the target position data stored in the storage device 24 at that time.
- Circular plots S1 to S7 correspond to the target position data of the targets T1 to T7.
- The lower part is a schematic diagram of the distances between the targets T1 to T7 and the vehicle 1 on the two-dimensional map.
- Triangular plots M1 to M7 indicate the positions of the targets T1 to T7 on the map.
- The slope distance is longer than the horizontal distance. Because the target position data of the targets T1 to T7 accumulated in the past has, at the post-passage position of the vehicle 1 in FIG. 4, been corrected using a movement amount ΔP that includes the movement within the gradient section Ss, the distances between the vehicle 1 and the targets indicated by the target position data S3 to S7 of the targets T3 to T7, which lie in the sections before passage through the gradient section Ss, are longer than the corresponding distances on the two-dimensional map (that is, the horizontal distances).
- For example, the differences between the distances from the vehicle 1 to the targets T3 and T4 indicated by the target position data S3 and S4 and the corresponding distances on the two-dimensional map are e3 and e4, respectively, and e4 is larger than e3.
- The relative positions indicated by the target position data S5 to S7 of the targets T5 to T7, detected before the vehicle entered the gradient section Ss, are shifted rearward by e5, the difference between the slope distance and the horizontal distance of the gradient section Ss.
- However, the relative positions among the target position data S5 to S7 themselves do not change.
- On the other hand, the target position data S1 and S2 of the targets T1 and T2 in the flat section after passage through the gradient section Ss are not corrected using the movement amount ΔP estimated in the gradient section Ss. Therefore, there is no difference between the distances from the vehicle 1 indicated by the target position data S1 and S2 and the distances on the two-dimensional map, and the relative positions among the target position data S1 and S2 do not change.
- When such target position data S1 to S7 are collated with the two-dimensional map information, both the target position data S1 and S2 of the flat section after passage through the gradient section, among which the relative positions of targets are unchanged, and the target position data S5 to S7 of the flat section before entry into the gradient section agree well with the position information on the map.
- For this reason, if the self-position is estimated so as to minimize the average S of the distances Dij between the target position data and the position information on the map, a force also acts to shrink the distances between the pre-entry target position data S5 to S7 and the map position information, and the estimation error may fail to become small.
- Moreover, there are more items of target position data from before entry into the gradient section Ss than from after passage through it, so the pre-entry target position data act dominantly and the estimation error may grow. As a result, when the vehicle passes through the gradient section Ss and enters an intersection, the estimation error becomes large in self-position estimation that uses targets around the intersection, such as a stop line. Furthermore, the estimated position calculated when the post-passage target position data S1 and S2 act dominantly differs from the estimated position calculated when the pre-entry target position data S5 to S7 act dominantly, so the error of the estimated self-position may fluctuate and become unstable.
- Reference numeral P1 denotes the estimated position calculated when the post-passage target position data S1 and S2 act dominantly.
- Reference numeral P2 denotes the estimated position calculated when the pre-entry target position data S5 to S7 act dominantly.
- The calculation result may oscillate unstably between P1 and P2.
- The selection unit 25 therefore determines, based on the determination result signal from the gradient detection unit 22, whether the vehicle 1 has passed through the gradient section Ss.
- When the vehicle 1 has passed through the gradient section Ss, the selection unit 25 selects the post-passage target position data S1 and S2 (that is, the target position data of targets in the section from the exit of the gradient section to the current position) as the selected target position data. In other words, the selection unit 25 excludes the pre-passage target position data S3 to S7 from the selected target position data.
- The position estimation unit 26 then estimates the self-position of the vehicle 1 by collating the selected post-passage target position data S1 and S2 with the map positions M1 and M2 of the targets T1 and T2. This is shown in FIG. 6.
- Even if, when the vehicle 1 travels through the gradient section Ss, the distances between the vehicle 1 and the targets indicated by the target position data S3 to S7 detected before passage become longer than the actual horizontal distances, the target position data S3 to S7 can be excluded from position estimation. As a result, the degradation of position estimation accuracy on the two-dimensional map caused by the difference between the slope distance and the horizontal distance in the gradient section can be suppressed.
- The selection unit 25 may also delete the target position data other than the selected target position data (that is, the target position data from before passage through the gradient section) from the storage device 24.
- For example, the selection unit 25 may delete from the storage device 24 the target position data S3 and S4 of the gradient section and the target position data S5 to S7 from before entry into the gradient section.
- The position estimation unit 26 may then estimate the current position of the vehicle 1 by collating the target position data remaining in the storage device 24 with map information indicating the positions of targets on the map. Deleting the pre-passage target position data from the storage device 24 makes effective use of the storage area of the storage device 24.
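A minimal sketch of this selection-and-deletion step, assuming each stored record carries a flag indicating whether it was accumulated after the gradient section was passed (the record layout is an assumption for illustration):

```python
from dataclasses import dataclass

@dataclass
class TargetRecord:
    x: float
    y: float
    after_gradient: bool  # accumulated after the gradient section was passed

def select_after_gradient(storage: list[TargetRecord]) -> list[TargetRecord]:
    """Keep only post-gradient records and drop the rest from storage."""
    selected = [r for r in storage if r.after_gradient]
    storage[:] = selected  # pre-passage data is deleted, freeing the storage area
    return selected
```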
- The selection unit 25 may preferentially select, as selected target position data, targets detected by the target position detection unit 20 after passage through the gradient section whose elapsed time since detection is shorter.
- For example, the selection unit 25 may select the target position data of targets around the current position of the vehicle 1 after passage through the gradient section, e.g., targets within about 20 m of the current position of the vehicle 1.
- The target position data of targets around the current position of the vehicle 1 tends to have high positional accuracy because little error has accumulated through correction using the movement amount ΔP.
- In addition, the position data of lane markings and curbs, which form road boundaries, has high lateral positional accuracy within the travel path.
- Next, the operation of the self-position estimation device 2 according to the first embodiment is described with reference to FIG. 7.
- In step S1, the imaging device 10, the distance measurement device 11, and the target position detection unit 20 detect the relative positions, with respect to the vehicle 1, of targets existing around the vehicle 1.
- The target position detection unit 20 outputs a relative position signal indicating the detected relative positions to the target position accumulation unit 23.
- In step S2, the movement amount estimation unit 21 estimates the movement amount ΔP of the vehicle 1 from the estimation of the self-position of the vehicle 1 in the previous processing cycle up to the present.
- In step S3, the target position accumulation unit 23 stores in the storage device 24 the relative positions of the targets around the vehicle 1 indicated by the relative position signal.
- The target position accumulation unit 23 also corrects the relative positions of targets accumulated in the past into relative positions with respect to the current position of the vehicle 1, using the elapsed time up to the present and the movement amount ΔP indicated by the movement amount signal, and stores the results in the storage device 24 as target position data.
- In step S4, the imaging device 10, the gyro sensor 14, and the gradient detection unit 22 detect the gradient amount of the travel path of the vehicle 1.
- In step S5, the gradient detection unit 22 and the selection unit 25 determine, by the gradient section passage determination process, whether the vehicle 1 is within the gradient section, before entering it, or after passing through it.
- In step S6, the selection unit 25 determines whether the gradient section passage determination process judged the vehicle 1 to be within the gradient section.
- If the vehicle 1 is within the gradient section (step S6: Y), the process proceeds to step S9. If not (step S6: N), the process proceeds to step S7.
- In step S7, the selection unit 25 determines whether the gradient section passage determination process judged the vehicle 1 to have passed through the gradient section. If the vehicle 1 has passed through the gradient section (step S7: Y), the process proceeds to step S8.
- If the vehicle 1 has not passed through the gradient section (step S7: N), that is, if the vehicle 1 has not yet entered it, the process proceeds to step S9.
- In step S8, the selection unit 25 deletes the pre-passage target position data S3 to S7 from the storage device 24. That is, the selection unit 25 selects the post-passage target position data S1 and S2 and leaves them in the storage device 24 as selected target position data.
- In step S9, the position estimation unit 26 collates the selected target position data with the map information and estimates the self-position of the vehicle 1. That is, the current position of the vehicle 1 is estimated by collating the target position data remaining in the storage device 24 with the map information.
- In step S10, the driving support system 3 uses the self-position of the vehicle 1 estimated by the position estimation unit 26 to assist the driver in driving the vehicle 1.
- In step S20, the selection unit 25 determines whether the previous gradient section passage determination process judged the vehicle 1 to be within the gradient section. If so (step S20: Y), the process proceeds to step S24. If not (step S20: N), the process proceeds to step S21.
- In step S21, the gradient detection unit 22 determines whether the gradient amount of the travel path of the vehicle 1 is equal to or greater than a threshold.
- This threshold may be set according to whether the difference between the slope distance and the horizontal distance caused by the gradient is within an allowable range; it may be, for example, 2 degrees. If the gradient amount is equal to or greater than the threshold (step S21: Y), the process proceeds to step S23. If it is less than the threshold (step S21: N), the process proceeds to step S22. In step S22, the selection unit 25 determines that the vehicle 1 has not yet entered the gradient section, and the process ends.
- In step S23, the gradient detection unit 22 determines that the vehicle 1 is within the gradient section, and the process ends.
- In step S24, the gradient detection unit 22 determines whether the gradient amount of the travel path of the vehicle 1 is equal to or greater than the threshold. If so (step S24: Y), the process proceeds to step S23. If it is less than the threshold (step S24: N), the process proceeds to step S25. In step S25, the selection unit 25 determines that the vehicle 1 has just passed through the gradient section, and the process ends.
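Read together, steps S20 to S25 form a small state machine: the state becomes "passed" only when the gradient amount falls back below the threshold after the vehicle has been judged to be within the gradient section. A sketch under that reading (the state names are illustrative):

```python
from enum import Enum, auto

class GradientState(Enum):
    BEFORE_ENTRY = auto()  # step S22: not yet entered the gradient section
    IN_GRADIENT = auto()   # step S23: within the gradient section
    PASSED = auto()        # step S25: has just passed through it

GRADIENT_THRESHOLD_DEG = 2.0  # example threshold from the description

def passage_determination(prev_state: GradientState,
                          gradient_deg: float) -> GradientState:
    """One cycle of the gradient section passage determination (S20-S25)."""
    if prev_state is GradientState.IN_GRADIENT:        # S20: Y -> S24
        if gradient_deg >= GRADIENT_THRESHOLD_DEG:     # S24: Y -> S23
            return GradientState.IN_GRADIENT
        return GradientState.PASSED                    # S24: N -> S25
    if gradient_deg >= GRADIENT_THRESHOLD_DEG:         # S21: Y -> S23
        return GradientState.IN_GRADIENT
    return GradientState.BEFORE_ENTRY                  # S21: N -> S22
```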
- As described above, the imaging device 10 and the distance measurement device 11, serving as target detection sensors, together with the target position detection unit 20, detect the relative positions, with respect to the vehicle 1, of targets existing around the vehicle 1.
- The movement amount estimation unit 21 estimates the movement amount of the vehicle 1.
- The target position accumulation unit 23 corrects the relative positions based on the movement amount of the vehicle 1 and accumulates them as target position data.
- The imaging device 10 and the gyro sensor 14, serving as gradient detection sensors, together with the gradient detection unit 22, detect the gradient of the travel path of the vehicle 1.
- The selection unit 25 selects, from the accumulated target position data, the target position data of targets in the section from the exit of the gradient section to the current position.
- The position estimation unit 26 estimates the current position of the vehicle 1 by collating the selected target position data with map information indicating the positions of targets on the map.
- In this way, the target position data of targets detected before passage through the gradient section can be excluded from position estimation.
- For example, when the vehicle passes through a gradient section and enters an intersection, the self-position can be estimated from accurate target positions around the intersection, free of the distance discrepancy, so the estimation accuracy improves.
- The selection unit 25 may also select the target position data of targets around the current position of the vehicle 1, and the position estimation unit 26 collates the selected target position data with the map information.
- The target position data of targets around the current position of the vehicle 1 tends to have high positional accuracy because little error has accumulated through correction using the movement amount ΔP, so the estimation accuracy improves.
- To improve the accuracy of the estimated position and shorten the processing time, the selection unit 25 may preferentially select some of the post-passage targets as the selected target position data and exclude the rest. For example, the selection unit 25 may preferentially select targets for which the angle between the line connecting the vehicle 1 and the target and the traveling direction of the vehicle 1 is larger. Also, for example, the selection unit 25 may exclude from the selected target position data any target whose distance from the vehicle 1 exceeds a predetermined upper limit. The longer the distance between a target and the vehicle 1, the more easily a gradient section can lie between them, and the more easily the estimation error of the movement amount ΔP grows. The upper limit on the distance between a target and the vehicle 1 may therefore be adjusted according to the allowable range of position estimation error caused by the estimation error of the movement amount ΔP.
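The two filters described here can be sketched as follows, assuming relative positions are given in a vehicle frame with x along the traveling direction; the 20 m cap echoes the earlier example, and both numbers are illustrative rather than taken from the claims:

```python
import math

MAX_RANGE_M = 20.0  # illustrative upper limit on target distance

def prioritize_targets(points: list[tuple[float, float]],
                       keep: int) -> list[tuple[float, float]]:
    """Filter by range, then prefer targets at larger bearing angles.

    points: (x, y) positions relative to the vehicle, x along the heading.
    keep: number of targets to retain as selected target position data.
    """
    in_range = [p for p in points if math.hypot(p[0], p[1]) <= MAX_RANGE_M]
    # Angle between the vehicle-to-target line and the traveling direction.
    by_angle = sorted(in_range,
                      key=lambda p: abs(math.atan2(p[1], p[0])),
                      reverse=True)
    return by_angle[:keep]
```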
- Next, the self-position estimation device 2 of the second embodiment is described. While the vehicle is traveling in a section whose gradient amount remains below the threshold, performing self-position estimation using the target position data of targets in that section suppresses the degradation of position estimation accuracy on the two-dimensional map caused by the difference between the slope distance and the horizontal distance. Therefore, both in a first section with a gradient amount below the threshold traveled by the vehicle 1 before entering the gradient section, and in a second section with a gradient amount below the threshold traveled by the vehicle 1 after passing through it, the self-position of the vehicle 1 can be detected with high accuracy. Consequently, the relative position between the self-position of the vehicle 1 estimated in the first section and that estimated in the second section can be calculated with high accuracy.
- The self-position estimation circuit 16 of the third embodiment uses the relative position between the self-position estimated in the first section and the self-position estimated in the second section to correct the target position data of the targets in the first section.
- To this end, the self-position estimation circuit 16 includes a correction unit 28.
- The processor of the self-position estimation circuit 16 realizes the function of the correction unit 28 by executing a computer program stored in the storage device 24.
- In the first section, which has a gradient amount below the threshold and is traveled by the vehicle 1 before entering the gradient section, the position estimation unit 26 estimates a first position of the vehicle 1 before entry by collating the target position data of the targets in the first section with the map information.
- The position estimation unit 26 outputs the first position to the driving support system 3 and the correction unit 28.
- The correction unit 28 attaches the information on the first position of the vehicle 1 estimated in the first section to the target position data of the targets in the first section and stores it in the storage device 24.
- In the second section, which has a gradient amount below the threshold and is traveled by the vehicle 1 after passing through the gradient section, the position estimation unit 26 estimates a second position of the vehicle 1 after passage by collating the target position data of the targets in the second section with the map information, and outputs the second position to the correction unit 28. The correction unit 28 corrects the target position data of the targets in the first section based on the relative position between the first position and the second position.
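A minimal sketch of this correction: the pose change implied by the two map-based fixes (the first and second positions) replaces the odometry-based shift accumulated while crossing the gradient section, so first-section data can be re-expressed consistently in the second-section vehicle frame. The 2D rigid-transform formulation and all names are assumptions for illustration.

```python
import math

def correct_first_section(points, first_pose, second_pose):
    """Re-express first-section target data in the second-section frame.

    points: [x, y] target positions stored relative to the vehicle at the
            first (pre-entry) position
    first_pose, second_pose: (x, y, yaw) map-based estimates of the vehicle
            at the first and second positions
    """
    x1, y1, yaw1 = first_pose
    x2, y2, yaw2 = second_pose
    # Relative pose of the second position as seen from the first frame.
    dx, dy = x2 - x1, y2 - y1
    c, s = math.cos(-yaw1), math.sin(-yaw1)
    tx = c * dx - s * dy
    ty = s * dx + c * dy
    dyaw = yaw2 - yaw1
    cy, sy = math.cos(dyaw), math.sin(dyaw)
    for p in points:
        # Shift by the map-based relative translation, rotate by -dyaw.
        ux, uy = p[0] - tx, p[1] - ty
        p[0] = cy * ux + sy * uy
        p[1] = -sy * ux + cy * uy
```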
- After correcting the target position data of the targets in the first section, the position estimation unit 26 estimates the second position of the vehicle 1 after passage through the gradient section by collating the corrected target position data and the target position data of the targets in the second section with the map information.
- The position estimation unit 26 outputs the second position estimated after the correction of the target position data to the driving support system 3.
- The position estimation unit 26 also attaches the information on the second position estimated after the correction to the target position data of the targets in the second section and stores it in the storage device 24.
- In step S35, the selection unit 25 determines whether the gradient section passage determination process judged the vehicle 1 to be within the gradient section.
- If the vehicle 1 is within the gradient section (step S35: Y), the process proceeds to step S43. If not (step S35: N), the process proceeds to step S36.
- In step S36, the selection unit 25 determines whether the gradient section passage determination process judged the vehicle 1 to have passed through the gradient section.
- If the vehicle 1 has passed through the gradient section (step S36: Y), the process proceeds to step S37.
- If the vehicle 1 has not passed through the gradient section (step S36: N), that is, if the vehicle 1 has not yet entered it, the process proceeds to step S43.
- In step S37, the selection unit 25 deletes the target position data of targets in the gradient section from the storage device 24. That is, the selection unit 25 selects the target position data of targets outside the gradient section (the targets from before entry into the gradient section and the targets from after passage through it) as the selected target position data.
- In other words, the selection unit 25 is not limited to the target position data of targets after passage through the gradient section; it selects as selected target position data the target position data of targets in any section, other than the gradient section, whose gradient amount is below the threshold.
- It is not necessary to select all targets outside the gradient section as selected target position data; only the target position data needed to estimate the self-position of the vehicle 1 by collation with the map information acquired by the map information acquisition unit 27 may be selected.
- In step S38, the selection unit 25 selects the target position data of the targets in the second section after passage through the gradient section.
- In step S39, the position estimation unit 26 collates the target position data selected in step S38 with the two-dimensional map information and estimates the second position of the vehicle 1.
- In step S40, the correction unit 28 reads from the storage device 24 the information on the first position of the vehicle 1 estimated in the first section, which was stored attached to the target position data of the targets in the first section before entry into the gradient section. The correction unit 28 corrects the target position data of the targets in the first section based on the relative position between the first position and the second position.
- In step S41, the position estimation unit 26 collates the target position data remaining in the storage device 24 (that is, the target position data corrected in step S40 and the target position data of the targets in the second section) with the map information, and estimates the second position of the vehicle 1.
- The correction unit 28 attaches the information on the second position of the vehicle 1 estimated in step S41 to the target position data of the targets in the second section and stores it in the storage device 24.
- The process of step S42 is the same as the process of step S10 of FIG. 7.
- The process of step S43 is the same as the process of step S9 of FIG. 7. After step S43, the process proceeds to step S42.
- As described above, before entry into the gradient section, the position estimation unit 26 estimates the first position of the vehicle 1 by collating the target position data of the targets in the first section, which has a gradient amount below the threshold, with the map information. After passage through the gradient section, it estimates the second position of the vehicle 1 by collating the target position data of the targets in the second section, which has a gradient amount below the threshold, with the map information.
- The correction unit 28 corrects the target position data of the targets in the first section based on the relative position between the first position and the second position.
- The position estimation unit 26 then estimates the self-position of the vehicle 1 by collating the corrected target position data and the target position data of the targets in the second section with the map information.
- In this way, the position estimation unit 26 estimates the self-position of the vehicle 1 by collating with the map information not only the target position data of the targets in the second section after passage through the gradient section, but the target position data of targets in any section whose gradient amount is below the threshold. This suppresses the degradation of position estimation accuracy on the two-dimensional map caused by the difference between the slope distance and the horizontal distance among targets in the gradient section. Moreover, since the target position data of targets from before entry into the gradient section can be reused, the accuracy of self-position estimation improves.
- The target position data of targets in a short undulation need not be excluded from the selected target position data.
- For example, while the selection unit 25 excludes from the selected target position data the target position data of targets in a gradient section such as the approach to a bridge or an expressway, it need not exclude the target position data of targets in a section whose gradient amount is about 1 to 2 degrees and whose transit time is short.
- If a section whose gradient amount is equal to or greater than the threshold is shorter than a predetermined length, the selection unit 25 may select the target position data of targets in that section as selected target position data.
- Conversely, if such a section is equal to or longer than the predetermined length, the selection unit 25 may exclude the target position data of targets in that section from the selected target position data. In that case, the target position data of targets in sections with a gradient below the threshold, other than that section, are selected as the selected target position data. The same applies to the first embodiment.
- Since the target position data of targets in short gradient sections is still selected, the target position data of gradient sections long enough to affect the self-position estimation accuracy can be excluded appropriately.
- The predetermined length may be set, for example, based on the travel time for traversing a section with a gradient equal to or greater than the threshold; for example, it may be set to a length corresponding to 3 seconds or more of travel. The predetermined length may also be set based on the distance of the section with a gradient equal to or greater than the threshold, for example to 30 m or more. The predetermined length may further be set dynamically so that it becomes shorter as the gradient amount of the travel path of the vehicle 1 becomes larger. In this way, the measurement error caused by the difference between the slope distance and the horizontal distance can be kept within a desired allowable range regardless of the magnitude of the gradient amount.
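One way to realize the dynamic setting can be sketched under the assumption that the allowable error is a cap on the slope-versus-horizontal distance discrepancy accumulated over the section, which for a section of slope length L and gradient θ is L(1 − cos θ). The formula and the constant are illustrative, not from the patent.

```python
import math

ALLOWABLE_ERROR_M = 0.5  # illustrative cap on the distance discrepancy

def predetermined_length(gradient_deg: float) -> float:
    """Longest gradient section whose discrepancy stays within the cap.

    Solves L * (1 - cos(theta)) <= ALLOWABLE_ERROR_M for L; larger
    gradients therefore yield shorter permitted section lengths.
    """
    shrink = 1.0 - math.cos(math.radians(gradient_deg))
    if shrink <= 0.0:
        return float("inf")  # flat road: no length limit needed
    return ALLOWABLE_ERROR_M / shrink
```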
Abstract
Description
特許文献1に記載のロボットは、移動可能な領域を点集合データで示す点環境地図と、ロボットに搭載したレーザーレンジセンサの検出結果を点集合データで示す周囲環境情報との位置ずれ量に基づき、ロボットの自己位置の推定結果を補正する。
本発明は、勾配区間における斜距離と水平距離との間の差に起因する2次元地図上の位置の推定精度の低下を抑制することを目的とする。
本発明の目的及び利点は、特許請求の範囲に示した要素及びその組合せを用いて具現化され達成される。前述の一般的な記述及び以下の詳細な記述の両方は、単なる例示及び説明であり、特許請求の範囲のように本発明を限定するものでないと解するべきである。
(第1実施形態)
(構成)
図1を参照する。以下、移動体の一例として車両の現在位置を推定する場合について説明するが、本発明は車両に限らず様々な移動体の現在位置の推定に広く適用することができる。
車両1には、自己位置推定装置2と運転支援システム3が搭載される。自己位置推定装置2は、撮像装置10と、距離測定装置11と、車輪速センサ12と、操舵角センサ13と、ジャイロセンサ14と、加速度センサ15と、自己位置推定回路16を備える。
距離測定装置11は、車両1の車室外などに取り付けられ、車両1の前方領域に電磁波を照射しその反射波を検出する。距離測定装置11は、例えばレーザレンジファインダであってよい。また、距離測定装置11の取り付け位置は、例えば車両1のボンネット、バンパー、ナンバープレート、ヘッドライト、又はサイドミラーの周辺であってよい。距離測定装置11は、測定結果を自己位置推定回路16へ出力する。
操舵角センサ13は、例えば、車両1のステアリングホイールを回転可能に支持するステアリングコラムに設けられる。操舵角センサ13は、操舵操作子であるステアリングホイールの現在の回転角度(操舵操作量)である現在操舵角を検出する。操舵角センサ13は、検出した現在操舵角を自己位置推定回路16へ出力する。
加速度センサ15は、車両1に発生する車幅方向の加減速度である横G、及び前後方向の加減速度を検出する。加速度センサ15は、検出した横G及び前後方向の加減速度を自己位置推定回路16へ出力する。
自己位置推定回路16は、撮像装置10、距離測定装置11、車輪速センサ12、操舵角センサ13、及びジャイロセンサ14から受信した信号と、既知の物標の2次元地図上の位置を示す2次元地図情報に基づいて、車両1の地図上の現在位置を推定する。以下、車両1の地図上の現在位置を「自己位置」と表記することがある。自己位置推定回路16は、自己位置を示す自己位置信号を運転支援システム3へ出力する。
運転支援の一例は、例えば運転者に対する警報等の情報提供であってよい。運転支援システム3は、車両1の自己位置に応じて運転者に提示する警報の種類及び強度の少なくとも一方を制御してよい。
運転支援の一例は、車両1の制動制御、加速制御及び操舵制御の少なくとも1つを含む車両1の走行状態の制御であってもよい。例えば運転支援システム3は、車両1の自己位置に応じて車両1に制動力及び駆動力のいずれを発生させるのかを決定してよい。
自己位置推定回路16が備えるプロセッサは、記憶装置24に格納されたコンピュータプログラムを実行することにより、物標位置検出部20、移動量推定部21、勾配検出部22、物標位置蓄積部23、選択部25、位置推定部26、及び地図情報取得部27の機能を実現する。
物標位置検出部20は、車両1の前方領域の撮像画像と距離測定装置11の測定結果に基づいて、車両1の周囲に存在する物標を検出する。例えば物標位置検出部20は、車両1の前方に存在する物標を検出する。
さらに、物標位置検出部20は、車両1に対する物標の相対位置を検出する。物標位置検出部20は、検出した相対位置を示す相対位置信号を物標位置蓄積部23へ出力する。
ここで物標とは、例えば車両1が走行する走行路面上の線(車線区分線等)や、路肩の縁石、ガードレール等であってよい。
また、勾配検出部22は、撮像装置10が生成した車両1の前方領域の撮像画像を受信してもよい。勾配検出部22は、撮像画像を解析することにより3D点群のフローに基づいて車両1の走行路の勾配量を検出してもよい。
勾配検出部22は、車両1の走行路が勾配区間であるか否かを判定する。例えば勾配検出部22は、車両1の走行路の勾配量が所定の閾値以上である場合に走行路が勾配区間であると判定してよい。勾配検出部22は、判定結果を示す判定結果信号を選択部25に出力する。
物標位置蓄積部23は、相対位置信号が示す車両1の周囲の物標の相対位置を記憶装置24に蓄積する。
また、物標位置蓄積部23は、過去に蓄積した物標の相対位置を、現在までの経過時間と移動量信号が示す移動量ΔPを用いて、車両1の現在位置に対する相対位置へ補正する。すなわち、物標位置蓄積部23は、現在までの経過時間に車両が移動した移動量ΔPだけ車両1の移動方向と逆方向に相対位置を移動させる。
既に物標位置データが記憶装置24に蓄積されている場合には、物標位置蓄積部23は、移動量信号が示す移動量ΔPを用いて蓄積された物標位置データを更新する。すなわち物標位置蓄積部23は、蓄積された物標位置データの相対位置を、移動量ΔP分だけ車両1の移動方向と逆方向に移動させる。その後、物標位置蓄積部23は、移動量ΔP分だけ相対移動させた相対位置を、蓄積していた物標位置データに上書きする。
選択部25が選択済物標位置データを選択する処理については後述する。
地図情報取得部27は、地図データと、地図データ上に存在する物標の2次元地図上の位置を示す2次元地図情報を取得する。例えば、地図情報取得部27は、カーナビゲーションシステムや地図データベース等である。なお、地図情報取得部27は、無線通信(路車間通信、または、車車間通信でも可)等の通信システムを介して外部から2次元地図情報を取得してもよい。この場合、地図情報取得部27は、定期的に最新の2次元地図情報を入手して、保有する2次元地図情報を更新してもよい。また、地図情報取得部27は、車両1が実際に走行した走路で検出した物標の位置情報を、2次元地図情報として蓄積してもよい。
図3を参照する。参照符号Siは選択済物標位置データを示す。インデックスiは1~Nの整数であり、Nは選択済物標位置データの個数である。
位置推定部26は、前回の処理周期で推定した自己位置を移動量ΔPで補正して車両1の仮位置を決定する。
位置推定部26は、選択済物標位置データSiとこのデータに最も近い位置情報Mjとの距離Dijを算出し、以下の式(1)を用いて距離Dijの平均Sを算出する。
Next, the process in which the selection unit 25 selects the selected target position data is described.
As described above, in a gradient section there is a difference between the slope distance and the horizontal distance. For this reason, the distance between the vehicle 1 and a target indicated by the target position data of a target detected before passing through the gradient section may become longer than the actual horizontal distance. The reason is explained with reference to FIG. 4.
The middle part is a schematic diagram of the distances between the vehicle 1 and the targets indicated by the target position data accumulated in the storage device 24 at a point in time after passing through the gradient section. The circular plots S1 to S7 correspond to the target position data of targets T1 to T7.
The lower part is a schematic diagram of the distances between the vehicle 1 and the targets T1 to T7 on the two-dimensional map. The triangular plots M1 to M7 indicate the positions of the targets T1 to T7 on the map.
For example, for the targets T3 and T4 within the gradient section Ss, the differences between the distances from the vehicle 1 to the targets T3 and T4 indicated by the target position data S3 and S4 and the corresponding distances on the two-dimensional map are e3 and e4, respectively, where e4 is longer than e3.
On the other hand, the target position data S1 and S2 of the targets T1 and T2 in the flat section after passing through the gradient section Ss are not corrected using the movement amount ΔP estimated in the gradient section Ss. For this reason, no difference arises between the distances from the vehicle 1 to the targets T1 and T2 indicated by the target position data S1 and S2 and the corresponding distances on the two-dimensional map. The relative positions between the target position data S1 and S2 also do not change.
For this reason, if the self-position is estimated so as to minimize the average S of the distances Dij between the target position data and the position information on the map, a force also acts to reduce the distances between the position information on the map and the target position data S5 to S7 from before entering the gradient section, so the estimation error may not become small.
Furthermore, the estimated position calculated with the post-gradient-section target position data S1 and S2 acting dominantly differs from the estimated position calculated with the pre-gradient-section target position data S5 to S7 acting dominantly, so the error of the estimated self-position may fluctuate and become unstable.
When the vehicle 1 has passed through the gradient section Ss, the selection unit 25 selects the post-gradient-section target position data S1 and S2 (that is, the target position data of targets in the section from passing through the gradient section to the current position) as the selected target position data. That is, the selection unit 25 excludes the target position data S3 to S7 from before passing through the gradient section from the selected target position data.
Then, the position estimation unit 26 estimates the self-position of the vehicle 1 by matching the selected post-gradient-section target position data S1 and S2 against the positions M1 and M2 of the targets T1 and T2 on the map. This is illustrated in FIG. 6.
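A minimal sketch of this matching step follows, assuming 2D positions and a simple grid search around the tentative position; the patent does not specify the optimizer used to minimize the average S, so the search strategy here is only illustrative.

```python
import numpy as np


def average_match_distance(selected_rel, map_pos, pose):
    """Average nearest-neighbor distance S for a candidate pose.

    selected_rel: (N, 2) target positions relative to the vehicle
    map_pos:      (M, 2) target positions on the 2D map
    pose:         (x, y, yaw) candidate self-position on the map
    """
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    world = selected_rel @ R.T + np.array([x, y])    # S_i placed on the map
    d = np.linalg.norm(world[:, None, :] - map_pos[None, :, :], axis=2)
    return d.min(axis=1).mean()                      # mean of D_ij, eq. (1)


def estimate_pose(selected_rel, map_pos, tentative, span=1.0, step=0.1):
    """Search around the tentative pose for the pose minimizing S."""
    x0, y0, yaw0 = tentative
    best = (np.inf, tentative)
    for dx in np.arange(-span, span + step, step):
        for dy in np.arange(-span, span + step, step):
            for dyaw in np.radians(np.arange(-2.0, 2.5, 0.5)):
                pose = (x0 + dx, y0 + dy, yaw0 + dyaw)
                s_val = average_match_distance(selected_rel, map_pos, pose)
                if s_val < best[0]:
                    best = (s_val, pose)
    return best[1]
```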
The selection unit 25 may also delete target position data other than the selected target position data (that is, the target position data from before passing through the gradient section) from the storage device 24. For example, the selection unit 25 may delete the target position data S3 and S4 within the gradient section and the target position data S5 to S7 from before entering the gradient section from the storage device 24. The position estimation unit 26 may estimate the current position of the vehicle 1 by matching the target position data remaining in the storage device 24 against map information indicating the positions of the targets on the map.
Deleting the target position data from before passing through the gradient section from the storage device 24 makes it possible to use the storage area of the storage device 24 effectively.
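One way to realize this selection and pruning is sketched below, assuming each stored record carries a hypothetical detection timestamp and that the time at which the gradient section was passed is known; neither detail is prescribed by the patent.

```python
from dataclasses import dataclass


@dataclass
class TargetRecord:
    x: float            # relative position in the vehicle frame [m]
    y: float
    detected_at: float  # detection time [s]; hypothetical field


def select_after_gradient(records, gradient_exit_time):
    """Keep only data detected after the gradient section was passed.

    Records from before (and inside) the gradient section are dropped,
    which also frees storage, as described above.
    """
    selected = [r for r in records if r.detected_at >= gradient_exit_time]
    records[:] = selected   # prune the store in place
    return selected
```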
Next, the operation of the self-position estimation device 2 according to the first embodiment is described. Reference is made to FIG. 7.
In step S1, the imaging device 10, the distance measurement device 11, and the target position detection unit 20 detect the relative positions, with respect to the vehicle 1, of targets existing around the vehicle 1. The target position detection unit 20 outputs a relative position signal indicating the detected relative positions to the target position accumulation unit 23.
In step S2, the movement amount estimation unit 21 estimates the movement amount ΔP of the vehicle 1 from when the self-position of the vehicle 1 was estimated in the previous processing cycle up to the present.
In step S3, the target position accumulation unit 23 accumulates, in the storage device 24, the relative positions of the targets around the vehicle 1 indicated by the relative position signal. The target position accumulation unit 23 also corrects the relative positions of targets accumulated in the past into relative positions with respect to the current position of the vehicle 1, using the elapsed time up to the present and the movement amount ΔP indicated by the movement amount signal, and accumulates them in the storage device 24 as target position data.
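A minimal dead-reckoning sketch for step S2 is shown below. It assumes ΔP is integrated from wheel speed and yaw rate samples, consistent with the wheel speed sensor 12 and gyro sensor 14 mentioned above, though the patent does not prescribe this exact formulation.

```python
import math


def integrate_movement(samples, dt):
    """Dead-reckon the movement amount dP = (dx, dy, dyaw) since the
    previous cycle from wheel speed [m/s] and yaw rate [rad/s] samples.

    samples: iterable of (wheel_speed, yaw_rate) taken at interval dt [s].
    """
    dx = dy = dyaw = 0.0
    for v, omega in samples:
        dx += v * math.cos(dyaw) * dt   # forward in the frame at cycle start
        dy += v * math.sin(dyaw) * dt
        dyaw += omega * dt
    return dx, dy, dyaw
```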
In step S5, the gradient detection unit 22 and the selection unit 25 determine, by a gradient section passage determination process, whether the vehicle 1 is within the gradient section, has not yet entered the gradient section, or has passed through the gradient section.
In step S7, the selection unit 25 judges whether the gradient section passage determination process has determined that the vehicle 1 has passed through the gradient section. When the vehicle 1 has passed through the gradient section (step S7: Y), the process proceeds to step S8.
In step S8, the selection unit 25 deletes the target position data S3 to S7 from before passing through the gradient section from the storage device 24. That is, the selection unit 25 selects the post-gradient-section target position data S1 and S2 and leaves it in the storage device 24 as the selected target position data.
In step S10, the driving assistance system 3 uses the self-position of the vehicle 1 estimated by the position estimation unit 26 to provide driving assistance for the driver's operation of the vehicle 1.
In step S22, the selection unit 25 determines that the vehicle 1 has not yet entered the gradient section. The process then ends.
Meanwhile, in step S24, the gradient detection unit 22 judges whether the gradient amount of the traveling road of the vehicle 1 is equal to or greater than the threshold. When the gradient amount is equal to or greater than the threshold (step S24: Y), the process proceeds to step S23. When the gradient amount is less than the threshold (step S24: N), the process proceeds to step S25.
In step S25, the selection unit 25 determines that the vehicle 1 has passed through the gradient section. The process then ends.
(1) The imaging device 10 and the distance measurement device 11 serving as target detection sensors, together with the target position detection unit 20, detect the relative positions, with respect to the vehicle 1, of targets existing around the vehicle 1. The movement amount estimation unit 21 estimates the movement amount of the vehicle 1. The target position accumulation unit 23 corrects the relative positions on the basis of the movement amount of the vehicle 1 and accumulates them as target position data. The imaging device 10 and the gyro sensor 14 serving as gradient detection sensors, together with the gradient detection unit 22, detect the gradient of the traveling road of the vehicle 1. The selection unit 25 selects, from the accumulated target position data, the target position data of targets in the section from passing through the gradient section to the current position. The position estimation unit 26 estimates the current position of the vehicle 1 by matching the selected target position data against map information indicating the positions of the targets on the map.
For example, when the vehicle enters an intersection after passing through a gradient section, the self-position can be estimated on the basis of accurate target positions around the intersection that are free of distance differences, so the estimation accuracy improves.
To improve the accuracy of the estimated position and shorten the processing time, the selection unit 25 may preferentially select some of the targets from after passing through the gradient section as the selected target position data and exclude the rest from the selected target position data. For example, the selection unit 25 may give higher selection priority to a target as the angle between the traveling direction of the vehicle 1 and the straight line connecting the vehicle 1 and the target becomes larger.
As another example, the selection unit 25 may exclude, from the selected target position data, targets whose distance from the vehicle 1 is longer than a predetermined upper limit. The longer the distance between a target and the vehicle 1, the more likely a gradient section is to lie between the target and the vehicle 1, and the more the estimation error of the movement amount ΔP tends to increase. Accordingly, the upper limit on the distance between a target and the vehicle 1 may be adjusted in accordance with the allowable range of position estimation error caused by the estimation error of the movement amount ΔP.
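These two selection policies (prefer large bearing angles, cap the distance) might be combined as in the following sketch; max_range and keep are assumed tuning parameters that are not given in the text.

```python
import math


def prioritize_targets(targets, heading, max_range=50.0, keep=10):
    """Rank post-gradient targets for selection.

    targets:   list of (x, y) relative positions in the vehicle frame
    heading:   vehicle traveling direction [rad] in that frame (often 0)
    max_range: assumed distance upper limit [m]; targets beyond it are excluded
    keep:      assumed number of targets to retain
    """
    scored = []
    for x, y in targets:
        dist = math.hypot(x, y)
        if dist > max_range:
            continue  # far targets are likelier to straddle a gradient
        # Angle between the vehicle-target line and the traveling direction.
        angle = abs(math.atan2(y, x) - heading)
        scored.append((angle, (x, y)))
    # Larger angle -> higher priority, per the selection policy above.
    scored.sort(key=lambda t: t[0], reverse=True)
    return [p for _, p in scored[:keep]]
```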
Next, the self-position estimation device 2 of the second embodiment is described.
While the vehicle is traveling in a section where the gradient amount remains continuously below the threshold, performing self-position estimation using the target position data of targets in that section makes it possible to suppress the degradation of position estimation accuracy on the two-dimensional map caused by the difference between slope distance and horizontal distance.
Accordingly, the self-position of the vehicle 1 can be detected with high accuracy both in a first section, having a gradient amount below the threshold, that the vehicle 1 traveled before entering the gradient section, and in a second section, having a gradient amount below the threshold, that the vehicle 1 travels after passing through the gradient section. The relative position between the self-position of the vehicle 1 estimated in the first section and the self-position of the vehicle 1 estimated in the second section can therefore be calculated with high accuracy.
The self-position estimation circuit 16 of the second embodiment corrects the target position data of targets in the first section using the relative position between the self-position estimated in the first section and the self-position estimated in the second section.
In the first section, having a gradient amount below the threshold, that the vehicle 1 traveled before entering the gradient section, the position estimation unit 26 estimates a first position of the vehicle 1 before entering the gradient section by matching the target position data of targets in the first section against the map information. The position estimation unit 26 outputs the first position to the driving assistance system 3 and a correction unit 28.
The correction unit 28 attaches the information of the first position of the vehicle 1 estimated in the first section to the target position data of the targets in the first section and stores it in the storage device 24.
The correction unit 28 corrects the target position data of the targets in the first section on the basis of the relative position between the first position and a second position.
The position estimation unit 26 outputs the second position estimated after the correction of the target position data to the driving assistance system 3. The position estimation unit 26 attaches the information of the second position estimated after the correction of the target position data to the target position data of the targets in the second section and stores it in the storage device 24.
In step S35, the selection unit 25 judges whether the gradient section passage determination process has determined that the vehicle 1 is within the gradient section. When the vehicle 1 is within the gradient section (step S35: Y), the process proceeds to step S43. When the vehicle 1 is not within the gradient section (step S35: N), the process proceeds to step S36.
When the vehicle 1 has not passed through the gradient section (step S36: N), that is, when the vehicle 1 has not yet entered the gradient section, the process proceeds to step S43.
That is, the selection unit 25 selects the target position data of targets outside the gradient section (that is, the target position data of targets from before entering the gradient section and of targets from after passing through the gradient section) as the selected target position data.
In other words, the selection unit 25 selects, as the selected target position data, the target position data of targets in sections outside the gradient section whose gradient amount is below the threshold, without limiting the selection to the target position data of targets from after passing through the gradient section. Note that it is not necessary to select all targets outside the gradient section as the selected target position data; only the target position data needed to estimate the self-position of the vehicle 1 by matching against the map information acquired by the map information acquisition unit 27 may be selected.
In step S39, the position estimation unit 26 estimates the second position of the vehicle 1 by matching the target position data selected in step S38 against the two-dimensional map information.
In step S40, the correction unit 28 reads from the storage device 24 the information of the first position of the vehicle 1 estimated in the first section, which was stored attached to the target position data of targets in the first section before entering the gradient section. The correction unit 28 corrects the target position data of the targets in the first section on the basis of the relative position between the first position and the second position.
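The correction in step S40 can be pictured as re-anchoring the first-section data through the map frame: place each target on the map using the recorded first position, then express it relative to the newly estimated second position, bypassing the dead-reckoned ΔP accumulated on the slope. A sketch under those assumptions follows; the patent does not give the exact transform, so this is one plausible realization.

```python
import math


def correct_first_section(rel_at_p1, first_pose, second_pose):
    """Re-place first-section target data via the map frame.

    rel_at_p1:   list of (x, y) target positions relative to the vehicle
                 at the time the first position was estimated
    first_pose:  (x, y, yaw) recorded first position on the 2D map
    second_pose: (x, y, yaw) newly estimated second position on the map
    Returns the targets' relative positions seen from the second position.
    """
    def rot(px, py, yaw):
        c, s = math.cos(yaw), math.sin(yaw)
        return c * px - s * py, s * px + c * py

    x1, y1, yaw1 = first_pose
    x2, y2, yaw2 = second_pose
    corrected = []
    for px, py in rel_at_p1:
        gx, gy = rot(px, py, yaw1)              # to map frame via first position
        gx, gy = gx + x1, gy + y1
        rx, ry = rot(gx - x2, gy - y2, -yaw2)   # back into the current frame
        corrected.append((rx, ry))
    return corrected
```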
In step S41, the position estimation unit 26 estimates the second position of the vehicle 1 after passing through the gradient section by matching the target position data remaining in the storage device 24 (that is, the target position data corrected in step S40 and the target position data of targets in the second section) against the map information. The correction unit 28 attaches the information of the second position of the vehicle 1 estimated in step S41 to the target position data of the targets in the second section and stores it in the storage device 24.
The process of step S42 is the same as the process of step S10 in FIG. 7.
The process of step S43 is the same as the process of step S9 in FIG. 7. After step S43, the process proceeds to step S42.
The position estimation unit 26 estimates the first position of the vehicle 1 before entering the gradient section by matching the target position data of targets in the first section, which has a gradient amount below the threshold before entering the gradient section, against the map information. It also estimates the second position of the vehicle 1 after passing through the gradient section by matching the target position data of targets in the second section, which has a gradient amount below the threshold after passing through the gradient section, against the map information. The correction unit 28 corrects the target position data of the targets in the first section on the basis of the relative position between the first position and the second position.
The position estimation unit 26 estimates the self-position of the vehicle 1 by matching the corrected target position data and the target position data of the targets in the second section against the map information.
This makes it possible to suppress the degradation of position estimation accuracy on the two-dimensional map caused by the difference between slope distance and horizontal distance among targets in the gradient section.
In addition, since the target position data of targets from before entering the gradient section can be reused, the accuracy of self-position estimation can be improved.
When the vehicle passes over minor undulations of the kind found anywhere, the target position data of targets on those undulations need not be excluded from the selected target position data.
For example, the selection unit 25 may exclude from the selected target position data the target position data of targets in gradient sections such as bridges and expressway entrance ramps, while not excluding from the selected target position data the target position data of targets on undulations with a gradient amount of, for example, about 1 to 2 degrees and a traversal time of about 2 to 3 seconds.
On the other hand, when a section having a gradient equal to or greater than the threshold continues for a predetermined length or more, the selection unit 25 may exclude the target position data of targets in that section from the selected target position data. That is, it selects, as the selected target position data, the target position data of targets in sections outside that section whose gradient is below the threshold. The same applies to the first embodiment.
In this way, when a section having a gradient equal to or greater than the threshold continues for the predetermined length or more, the target position data of targets in that section is excluded from selection, so target position data of gradient sections that would affect the self-position estimation accuracy can be appropriately excluded.
The predetermined length may be set dynamically so as to become shorter as the gradient amount of the traveling road of the vehicle 1 becomes larger. This makes it possible to keep the measurement error caused by the difference between slope distance and horizontal distance within a desired allowable range regardless of the magnitude of the gradient amount.
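For the dynamic setting of the predetermined length, note that over a slope of length L and grade θ the slope distance exceeds the horizontal distance by roughly L(1 − cos θ). The sketch below inverts this relation for an assumed error tolerance; the tolerance value is illustrative and not taken from the patent.

```python
import math


def required_length(gradient_deg, allowed_error_m=0.5):
    """Gradient-dependent 'predetermined length' sketch.

    Over a slope of length L and grade theta, the slope distance exceeds
    the horizontal distance by roughly L * (1 - cos(theta)), so to keep
    that error within allowed_error_m the section may be tolerated up to:
        L_max = allowed_error_m / (1 - cos(theta))
    allowed_error_m is an assumed tolerance; the patent leaves it open.
    """
    theta = math.radians(gradient_deg)
    shortening = 1.0 - math.cos(theta)
    if shortening <= 0.0:
        return math.inf  # flat road: no length limit needed
    return allowed_error_m / shortening


# Steeper gradients yield a shorter predetermined length:
for g in (1.0, 2.0, 5.0, 10.0):
    print(f"{g:4.1f} deg -> {required_length(g):8.1f} m")
```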
Claims (7)
- A self-position estimation method comprising:
detecting relative positions, with respect to a moving body, of targets existing around the moving body;
estimating a movement amount of the moving body;
correcting the relative positions on the basis of the movement amount of the moving body and accumulating the corrected relative positions as target position data;
detecting a gradient amount of a traveling road of the moving body;
selecting, from the accumulated target position data, target position data of targets in a section whose gradient amount is less than a threshold; and
estimating a current position of the moving body by matching the selected target position data against map information indicating positions of the targets on a two-dimensional map. - The self-position estimation method according to claim 1, wherein target position data of targets in a section from passing through a gradient section having a gradient amount equal to or greater than the threshold to the current position is selected and matched against the map information.
- The self-position estimation method according to claim 2, further comprising:
estimating a first position of the moving body before entering the gradient section by matching target position data of targets in a first section having a gradient amount less than the threshold before entering the gradient section against the map information;
estimating a second position of the moving body after passing through the gradient section by matching target position data of targets in a second section having a gradient amount less than the threshold after passing through the gradient section against the map information; and
correcting the target position data of the targets in the first section on the basis of a relative position between the first position and the second position. - The self-position estimation method according to any one of claims 1 to 3, wherein, when a section having a gradient amount equal to or greater than the threshold continues for a predetermined length or more, target position data of targets in a section having a gradient amount less than the threshold is selected and matched against the map information, and
when a section having a gradient amount equal to or greater than the threshold does not continue for the predetermined length or more, target position data of targets in the section having a gradient amount equal to or greater than the threshold is included in the target position data matched against the map information. - The self-position estimation method according to claim 4, wherein the predetermined length is set shorter as the gradient amount of the traveling road of the moving body becomes larger.
- The self-position estimation method according to any one of claims 1 to 5, wherein target position data of targets around the current position of the moving body is selected and matched against the map information.
- A self-position estimation device comprising:
a target detection sensor that detects relative positions, with respect to a moving body, of targets existing around the moving body;
a wheel speed sensor that detects a wheel speed of the moving body;
a gradient detection sensor that detects a gradient amount of a traveling road of the moving body; and
a self-position estimation circuit that estimates a movement amount of the moving body in accordance with at least a detection result of the wheel speed sensor, corrects the relative positions detected by the target detection sensor on the basis of the movement amount and accumulates the corrected relative positions in a storage device as target position data, selects, from the accumulated target position data, target position data of targets in a section whose gradient amount is less than a threshold, and estimates a current position of the moving body by matching the selected target position data against map information indicating positions of the targets on the two-dimensional map.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/071922 WO2018020589A1 (ja) | 2016-07-26 | 2016-07-26 | 自己位置推定方法及び自己位置推定装置 |
RU2019105128A RU2722356C1 (ru) | 2016-07-26 | 2016-07-26 | Способ оценки собственного положения и устройство оценки собственного положения |
JP2018530245A JP6575686B2 (ja) | 2016-07-26 | 2016-07-26 | 自己位置推定方法及び自己位置推定装置 |
MX2019001092A MX2019001092A (es) | 2016-07-26 | 2016-07-26 | Metodo de estimacion de la posicion propia y dispositivo de estimacion de la posicion propia. |
BR112019001441-1A BR112019001441B1 (pt) | 2016-07-26 | 2016-07-26 | Método de estimativa de autoposição e dispositivo de estimativa de autoposição |
KR1020197004534A KR20190028528A (ko) | 2016-07-26 | 2016-07-26 | 자기 위치 추정 방법 및 자기 위치 추정 장치 |
US16/320,307 US11243080B2 (en) | 2016-07-26 | 2016-07-26 | Self-position estimation method and self-position estimation device |
CA3032068A CA3032068C (en) | 2016-07-26 | 2016-07-26 | Self-position estimation method and self-position estimation device |
EP16910492.4A EP3492871B1 (en) | 2016-07-26 | 2016-07-26 | Self-position estimation method and self-position estimation apparatus |
CN201680087942.5A CN109564098B (zh) | 2016-07-26 | 2016-07-26 | 自身位置推定方法及自身位置推定装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/071922 WO2018020589A1 (ja) | 2016-07-26 | 2016-07-26 | 自己位置推定方法及び自己位置推定装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018020589A1 true WO2018020589A1 (ja) | 2018-02-01 |
Family
ID=61016515
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/071922 WO2018020589A1 (ja) | 2016-07-26 | 2016-07-26 | 自己位置推定方法及び自己位置推定装置 |
Country Status (10)
Country | Link |
---|---|
US (1) | US11243080B2 (ja) |
EP (1) | EP3492871B1 (ja) |
JP (1) | JP6575686B2 (ja) |
KR (1) | KR20190028528A (ja) |
CN (1) | CN109564098B (ja) |
BR (1) | BR112019001441B1 (ja) |
CA (1) | CA3032068C (ja) |
MX (1) | MX2019001092A (ja) |
RU (1) | RU2722356C1 (ja) |
WO (1) | WO2018020589A1 (ja) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3492870B1 (en) * | 2016-07-26 | 2020-07-15 | Nissan Motor Co., Ltd. | Self-position estimation method and self-position estimation device |
EP3842886A4 (en) * | 2018-08-23 | 2022-05-11 | Nsk Ltd. | SELF-PROPELLED DEVICE, AND TRAVEL CONTROL METHOD AND TRAVEL CONTROL PROGRAM FOR SELF-PROPELLED DEVICE |
CN109215136B (zh) | 2018-09-07 | 2020-03-20 | 百度在线网络技术(北京)有限公司 | 一种真实数据增强方法、装置以及终端 |
CN109146898B (zh) * | 2018-09-07 | 2020-07-24 | 百度在线网络技术(北京)有限公司 | 一种仿真数据量增强方法、装置以及终端 |
CN110103823B (zh) * | 2019-05-21 | 2021-06-11 | 东南大学 | 一种基于增强型数字地图的车辆侧翻事前预警方法 |
CN110473417A (zh) * | 2019-06-03 | 2019-11-19 | 浙江吉利控股集团有限公司 | 一种高架桥限速值提示方法、装置及存储介质 |
JP7332403B2 (ja) * | 2019-09-11 | 2023-08-23 | 株式会社東芝 | 位置推定装置、移動体制御システム、位置推定方法およびプログラム |
CN113450407B (zh) * | 2021-05-14 | 2023-10-13 | 东莞市李群自动化技术有限公司 | 定位方法及作业方法、电子设备、轨道设备和存储介质 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5367458A (en) * | 1993-08-10 | 1994-11-22 | Caterpillar Industrial Inc. | Apparatus and method for identifying scanned reflective anonymous targets |
JP4092308B2 (ja) * | 2004-06-02 | 2008-05-28 | トヨタ自動車株式会社 | 境界線検出装置 |
AU2009211435A1 (en) * | 2008-02-04 | 2009-08-13 | Tele Atlas B.V. | Method for map matching with sensor detected objects |
JP2010146202A (ja) * | 2008-12-17 | 2010-07-01 | Toyota Central R&D Labs Inc | 移動体と移動体の位置推定方法 |
JP5441549B2 (ja) * | 2009-07-29 | 2014-03-12 | 日立オートモティブシステムズ株式会社 | 道路形状認識装置 |
JP5278378B2 (ja) * | 2009-07-30 | 2013-09-04 | 日産自動車株式会社 | 車両運転支援装置及び車両運転支援方法 |
JP5321497B2 (ja) * | 2010-02-22 | 2013-10-23 | 株式会社デンソー | 白線認識装置 |
KR20110134633A (ko) | 2010-06-09 | 2011-12-15 | 엠텍비젼 주식회사 | 속도 측정 장치 및 측정된 속도 보정 방법 |
JP5951266B2 (ja) * | 2012-01-27 | 2016-07-13 | 三菱重工業株式会社 | 勾配情報取得方法、勾配情報記憶済記憶媒体を作成する方法、勾配情報取得装置およびプログラム |
WO2014076844A1 (ja) * | 2012-11-19 | 2014-05-22 | 株式会社日立製作所 | 自律移動システムおよび管制装置 |
US9740942B2 (en) * | 2012-12-12 | 2017-08-22 | Nissan Motor Co., Ltd. | Moving object location/attitude angle estimation device and moving object location/attitude angle estimation method |
WO2014115319A1 (ja) * | 2013-01-25 | 2014-07-31 | トヨタ自動車株式会社 | 道路環境認識システム |
RU2621823C1 (ru) * | 2014-02-24 | 2017-06-07 | Ниссан Мотор Ко., Лтд. | Устройство вычисления собственного местоположения и способ вычисления собственного местоположения |
WO2017056484A1 (ja) * | 2015-09-28 | 2017-04-06 | 京セラ株式会社 | 画像処理装置、ステレオカメラ装置、車両及び画像処理方法 |
JP6406226B2 (ja) * | 2015-11-27 | 2018-10-17 | 株式会社デンソー | 車両制御装置 |
US10184799B2 (en) * | 2016-06-13 | 2019-01-22 | The Boeing Company | Systems and methods for targeting objects of interest in denied GPS environments |
GB2552021B (en) * | 2016-07-08 | 2019-08-28 | Jaguar Land Rover Ltd | Improvements in vehicle speed control |
-
2016
- 2016-07-26 EP EP16910492.4A patent/EP3492871B1/en active Active
- 2016-07-26 CN CN201680087942.5A patent/CN109564098B/zh active Active
- 2016-07-26 US US16/320,307 patent/US11243080B2/en active Active
- 2016-07-26 JP JP2018530245A patent/JP6575686B2/ja active Active
- 2016-07-26 RU RU2019105128A patent/RU2722356C1/ru active
- 2016-07-26 BR BR112019001441-1A patent/BR112019001441B1/pt active IP Right Grant
- 2016-07-26 WO PCT/JP2016/071922 patent/WO2018020589A1/ja unknown
- 2016-07-26 KR KR1020197004534A patent/KR20190028528A/ko active IP Right Grant
- 2016-07-26 MX MX2019001092A patent/MX2019001092A/es active IP Right Grant
- 2016-07-26 CA CA3032068A patent/CA3032068C/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003252149A (ja) * | 2002-03-01 | 2003-09-10 | Mitsubishi Electric Corp | 車線認識画像処理装置及びその処理を実行させるためのプログラム |
JP2008250906A (ja) * | 2007-03-30 | 2008-10-16 | Sogo Keibi Hosho Co Ltd | 移動ロボット、自己位置補正方法および自己位置補正プログラム |
JP2012008706A (ja) * | 2010-06-23 | 2012-01-12 | Denso Corp | 道路形状検出装置 |
JP2012103858A (ja) * | 2010-11-09 | 2012-05-31 | Toyota Motor Corp | 障害物認識装置 |
JP2016017758A (ja) * | 2014-07-04 | 2016-02-01 | 日産自動車株式会社 | 走行支援装置及び走行支援方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3492871A4 * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019230098A1 (ja) * | 2018-05-30 | 2019-12-05 | クラリオン株式会社 | 情報処理装置 |
JP2019207214A (ja) * | 2018-05-30 | 2019-12-05 | クラリオン株式会社 | 情報処理装置 |
CN112204348A (zh) * | 2018-05-30 | 2021-01-08 | 歌乐株式会社 | 信息处理装置 |
US20210206391A1 (en) * | 2018-05-30 | 2021-07-08 | Clarion Co., Ltd. | Information processing apparatus |
JP7137359B2 (ja) | 2018-05-30 | 2022-09-14 | フォルシアクラリオン・エレクトロニクス株式会社 | 情報処理装置 |
US11560160B2 (en) | 2018-05-30 | 2023-01-24 | Faurecia Clarion Electronics Co., Ltd. | Information processing apparatus |
CN112204348B (zh) * | 2018-05-30 | 2024-08-27 | 歌乐株式会社 | 信息处理装置 |
JP2019219204A (ja) * | 2018-06-18 | 2019-12-26 | 日産自動車株式会社 | 走行軌跡推定方法及び走行軌跡推定装置 |
JP7084792B2 (ja) | 2018-06-18 | 2022-06-15 | 日産自動車株式会社 | 走行軌跡推定方法及び走行軌跡推定装置 |
JP7358108B2 (ja) | 2019-07-31 | 2023-10-10 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
CN109564098B (zh) | 2020-07-14 |
US20190265040A1 (en) | 2019-08-29 |
US11243080B2 (en) | 2022-02-08 |
EP3492871A1 (en) | 2019-06-05 |
EP3492871A4 (en) | 2019-09-04 |
BR112019001441B1 (pt) | 2023-02-07 |
MX2019001092A (es) | 2019-07-04 |
CA3032068A1 (en) | 2018-02-01 |
RU2722356C1 (ru) | 2020-05-29 |
JPWO2018020589A1 (ja) | 2019-03-28 |
CN109564098A (zh) | 2019-04-02 |
EP3492871B1 (en) | 2020-05-06 |
CA3032068C (en) | 2020-01-14 |
KR20190028528A (ko) | 2019-03-18 |
BR112019001441A2 (pt) | 2019-05-07 |
JP6575686B2 (ja) | 2019-09-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6575686B2 (ja) | 自己位置推定方法及び自己位置推定装置 | |
KR101880013B1 (ko) | 자기 위치 추정 장치 및 자기 위치 추정 방법 | |
JP6575685B2 (ja) | 自己位置推定方法及び自己位置推定装置 | |
JP4724043B2 (ja) | 対象物認識装置 | |
US10289120B2 (en) | Self-position estimation device and self-position estimation method | |
RU2735567C1 (ru) | Способ хранения предысторий движения, способ для выработки модели пути движения, способ для оценки локальной позиции и устройство хранения предысторий движения | |
KR102134841B1 (ko) | 자율주행차의 위치추정시스템 및 그의 위치추정방법 | |
CN110914639B (zh) | 产生和更新至少一个建筑物的至少一个空间的拓扑地图的数据生成方法 | |
JP6747157B2 (ja) | 自己位置推定方法及び自己位置推定装置 | |
KR20190005189A (ko) | 차간 거리 추정 방법 및 차간 거리 추정 장치 | |
JP2021030980A (ja) | 走行支援方法及び走行支援装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018530245 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16910492 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 3032068 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112019001441 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 20197004534 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2016910492 Country of ref document: EP Effective date: 20190226 |
|
ENP | Entry into the national phase |
Ref document number: 112019001441 Country of ref document: BR Kind code of ref document: A2 Effective date: 20190124 |