WO2020008221A1 - Driving support method and driving support device - Google Patents
Driving support method and driving support device
- Publication number
- WO2020008221A1 (PCT/IB2018/000830)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lane boundary
- lane
- vehicle
- driving support
- integrated
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/3815—Road data
- G01C21/3819—Road shape data, e.g. outline of a route
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/32—Structuring or formatting of map data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
Definitions
- the present invention relates to a driving support method and a driving support device.
- Patent Document 1 describes a technique for determining a mismatch between observation information of the real world acquired by a sensor and a map serving as a reference for the surrounding environment.
- Patent Document 1 does not consider the road shape in the range in which the observation information and the map are matched. For this reason, when the road shape within the matched range is only a straight line or a curve of constant curvature, with no distinctive feature in the road shape, the matching may be misaligned.
- An object of the present invention is to provide a driving support method and a driving support device that can appropriately perform matching between road shape information in map data and road shape information detected by a sensor.
- In the driving support method according to one aspect of the present invention, a lane boundary line existing around a vehicle is detected by a sensor, a self-position of the vehicle is calculated, and, based on the self-position, the coordinate system of the detected lane boundary line is converted into the same coordinate system as the map data stored in a storage device.
- The shape information of the lane boundary line included in the map data and the converted lane boundary line are then integrated to generate integrated data: an integration range for integrating the shape information and the coordinate-converted lane boundary line is determined, and the integrated data is generated by associating the shape information with the coordinate-converted lane boundary line so as to include at least the determined integration range.
- According to the present invention, it is possible to provide a driving support method and a driving support device that can appropriately perform matching between road shape information in map data and road shape information detected by a sensor.
- FIG. 9 is an explanatory diagram of an example of a matching process.
- FIG. 9 is an explanatory diagram of an example of a matching result.
- FIG. 9 is an explanatory diagram illustrating an example of a process of determining an integration range. Further drawings are explanatory diagrams of examples of the process of calculating the angle of the lane traveling direction, and a graph showing the distribution of the angles of the lane traveling direction.
- FIG. 9 is an explanatory diagram illustrating an example of a process of determining an integration range.
- FIG. 9 is an explanatory diagram illustrating an example of a process of determining an integration range.
- FIG. 9 is an explanatory diagram illustrating an example of a process of determining an integration range.
- FIG. 9 is an explanatory diagram illustrating an example of a process of determining an integration range.
- The driving support device 1 can be mounted on a vehicle (hereinafter, the vehicle on which the driving support device according to the embodiment of the present invention is mounted is referred to as the “own vehicle”).
- The driving support device 1 includes a storage device 2, a global positioning system (GPS) receiver (GPS sensor) 3, a vehicle sensor 4, a surrounding sensor 5, a processing circuit 6, a display 7, a vehicle control device 8, and an actuator 9.
- The processing circuit 6 can transmit and receive data and signals to and from the storage device 2, the GPS receiver 3, the vehicle sensor 4, the surrounding sensor 5, the display 7, and the vehicle control device 8, by wire or wirelessly, for example over a controller area network (CAN) bus.
- the storage device 2 may be, for example, a semiconductor storage device, a magnetic storage device, an optical storage device, or the like.
- the storage device 2 may be built in the processing circuit 6.
- the storage device 2 includes a map storage unit 10 that stores map data.
- the point data acquired by the surrounding sensor 5 is integrated with the map data stored in the map storage unit 10 by the driving support device 1 according to the embodiment of the present invention.
- the map data stored in the map storage unit 10 may be, for example, high-accuracy map data (hereinafter, simply referred to as “high-accuracy map”).
- High-precision maps are suitable as maps for automatic driving.
- the high-accuracy map is higher-accuracy map data than map data for navigation (hereinafter, simply referred to as “navi map”), and includes more detailed lane-by-lane information than road-by-road information.
- The high-precision map includes, as the lane information, lane node information indicating reference points on a lane reference line (for example, the center line within the lane) and lane link information indicating lane sections between lane nodes.
- the information of the lane node includes the identification number of the lane node, the position coordinates, the number of connected lane links, and the identification number of the connected lane link.
- the lane link information includes information such as the lane link identification number, lane type, lane width, lane shape, lane boundary line type, lane boundary line shape, and lane reference line shape.
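As a concrete picture of the lane node and lane link information listed above, the records could be modeled as follows. The class and field names are assumptions made for illustration only; the patent names the information items but does not specify any encoding.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class LaneNode:
    # Identification number and position coordinates of the lane node
    node_id: int
    position: Tuple[float, float, float]  # e.g. longitude, latitude, altitude
    # The number of connected lane links is implied by the list length
    connected_link_ids: List[int] = field(default_factory=list)

@dataclass
class LaneLink:
    # Identification number and per-lane attributes of the lane link
    link_id: int
    lane_type: str
    lane_width: float
    boundary_line_type: str
    # Shapes as point sequences in the map coordinate system
    lane_shape: List[Tuple[float, float]] = field(default_factory=list)
    boundary_line_shape: List[Tuple[float, float]] = field(default_factory=list)
    reference_line_shape: List[Tuple[float, float]] = field(default_factory=list)
```

A map storage unit holding such records per lane makes it possible to identify the lane in which the host vehicle travels, as the text goes on to note.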
- The high-precision map further includes, as feature information, the types and position coordinates of features such as traffic lights, stop lines, signs, buildings, telephone poles, curbs, and pedestrian crossings existing on or near the lane, together with the identification numbers of the lane nodes and lane links corresponding to the position coordinates of those features.
- Since the high-precision map includes node and link information in lane units, the lane in which the host vehicle travels on the traveling route can be specified.
- the high-precision map has coordinates capable of expressing positions in the lane extension direction and the width direction.
- the high-precision map has coordinates (for example, longitude, latitude, and altitude) that can represent a position in a three-dimensional space, and lanes and the above-mentioned features can be described as shapes in the three-dimensional space.
- a high-accuracy map may have point cloud data of constituent points (boundary points) constituting a lane boundary.
- The map data that the driving support device 1 integrates with the lane boundary data acquired by the surrounding sensor 5 does not necessarily have to be a high-accuracy map; any map data that includes point cloud data of the constituent points constituting the lane boundaries is acceptable.
- When the map data database is managed by a server separate from the driving support device 1, difference data of the updated map data may be obtained through, for example, a telematics service and used to update the map data stored in the map storage unit 10.
- the map data may be acquired by a telematics service such as vehicle-to-vehicle communication or road-to-vehicle communication in accordance with the position where the vehicle is traveling. By using the telematics service, the vehicle does not need to have map data with a large data capacity, and the memory capacity can be reduced.
- Further, updated map data can be obtained, so that actual traveling conditions, such as a change in road structure and the presence or absence of a construction site, can be accurately grasped.
- The map data may be created based on data collected from a plurality of other vehicles other than the own vehicle, so that accurate information can be grasped.
- the GPS receiver 3 receives radio waves from a plurality of navigation satellites, acquires the current position of the own vehicle (own position), and outputs the obtained current position of the own vehicle to the processing circuit 6.
- A global navigation satellite system (GNSS) receiver other than the GPS receiver 3 may be provided.
- the vehicle sensor 4 is a sensor that detects the current position and the running state (behavior) of the own vehicle.
- the vehicle sensor 4 may be, for example, a vehicle speed sensor, an acceleration sensor, a gyro sensor, and a yaw rate sensor, but the type and number of the vehicle sensors 4 are not limited thereto.
- the vehicle speed sensor may detect the vehicle speed based on the wheel speed of the host vehicle and output the detected vehicle speed to the processing circuit 6.
- The acceleration sensor may detect the acceleration in the front-rear direction and the vehicle width direction of the own vehicle, and output the detected acceleration to the processing circuit 6.
- the gyro sensor may detect the angular velocity of the own vehicle and output the detected angular velocity to the processing circuit 6.
- the yaw rate sensor may detect a rotation angular velocity (yaw rate) around a vertical axis passing through the center of gravity of the vehicle, and output the detected yaw rate to the processing circuit 6.
- the surrounding sensor 5 is a sensor that detects the surrounding environment (surrounding situation) of the vehicle.
- the surrounding sensor 5 may be, for example, a camera, a radar, a communication device, or the like, but the type and number of the surrounding sensors 5 are not limited thereto.
- the camera used as the surrounding sensor 5 may be, for example, a CCD camera or the like.
- the camera may be a monocular camera or a stereo camera.
- The camera captures an image of the surrounding environment of the own vehicle and, from the captured image, detects, as data of the surrounding environment, the relative position between the own vehicle and objects such as other vehicles, pedestrians, or bicycles, the distance between such objects and the own vehicle, and road structures such as lane boundary lines (white lines) and curbs, and outputs the detected data of the surrounding environment to the processing circuit 6.
- The radar used as the surrounding sensor 5 may be, for example, a millimeter-wave radar, an ultrasonic radar, a laser range finder (LRF), or the like.
- The radar detects, as data of the surrounding environment of the own vehicle, the relative position of an object and the own vehicle, the distance between the object and the own vehicle, and the relative speed of the object and the own vehicle, and outputs the detected data of the surrounding environment to the processing circuit 6.
- The communication device used as the surrounding sensor 5 may receive data of the surrounding environment of the vehicle through vehicle-to-vehicle communication with another vehicle, road-to-vehicle communication with a roadside device, or communication with a traffic information center, and output the received data of the surrounding environment to the processing circuit 6.
- each of the GPS receiver 3, the vehicle sensor 4, and the surrounding sensor 5 can be used as a sensor capable of detecting an actual (real) road structure around the own vehicle.
- the processing circuit 6 is a controller such as an electronic control unit (ECU) that performs an arithmetic and logic operation of a process necessary for assisting the traveling of the vehicle.
- The processing circuit 6 may include, for example, a processor, a storage device, and an input/output interface (I/F).
- the processor may be an arithmetic logic unit (ALU), a control circuit (control unit), a central processing unit (CPU) including various registers, or the like, or a microprocessor equivalent thereto.
- The storage device built into or external to the processing circuit 6 may be a semiconductor memory, a disk medium, or the like, and may include storage media such as registers, cache memories, and the ROM and RAM used as a main storage device.
- the processor of the processing circuit 6 executes information processing by the driving support device 1 described below by executing a program stored in the storage device in advance.
- The processing circuit 6 may output guidance information based on the map data stored in the map storage unit 10 from an information presentation device such as the display 7 or a speaker, and present it to the occupant.
- the processing circuit 6 may implement driving support of the own vehicle.
- The driving support may be fully automatic driving in which the own vehicle is driven automatically without involvement of an occupant (for example, the driver), or driving assistance that controls at least one of driving, braking, and steering. That is, in this specification, driving support includes both the case where all control of driving, braking, and steering of the own vehicle is executed without occupant involvement and the case where at least one of driving, braking, and steering of the own vehicle is controlled.
- When performing the driving support, the processing circuit 6 generates a traveling route on which the own vehicle is to travel, based on the map data stored in the map storage unit 10.
- the traveling route includes a traveling locus in lanes in addition to a traveling route in road units from the current position of the own vehicle to the destination.
- the travel locus may also include a travel control profile such as a speed profile.
- The processing circuit 6 specifies the position of the own vehicle on the map data and generates a travel route (travel locus) within the lane based on the position of the own vehicle.
- the travel route (travel locus) may be generated so as to pass through the center in the lane, for example.
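A travel locus through the center of the lane, as described above, could be derived from the two lane boundary lines. The sketch below assumes that constituent points of the left and right boundaries are already paired one-to-one; `lane_center_line` is an illustrative helper, not a function named in the patent.

```python
def lane_center_line(left_boundary, right_boundary):
    """Derive a travel locus through the center of the lane as the
    midpoints of paired constituent points of the two lane boundary
    lines. Both arguments are equal-length lists of (x, y) points.
    """
    return [((xl + xr) / 2.0, (yl + yr) / 2.0)
            for (xl, yl), (xr, yr) in zip(left_boundary, right_boundary)]
```

For example, boundaries at y = 1 and y = -1 yield a centerline along y = 0.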
- When generating the traveling route, the processing circuit 6 refers to the map data stored in the map storage unit 10; however, high-accuracy position information of the surrounding environment can also be obtained from the surrounding sensor 5. It is therefore desirable not to use the map data alone, but to create integrated data in which the map data and the surrounding-environment data from the surrounding sensor 5 are integrated, and to generate the traveling route using the integrated data.
- the processing circuit 6 outputs the generated traveling route to the vehicle control device 8.
- the vehicle control device 8 is an ECU that controls traveling of the own vehicle.
- the vehicle control device 8 includes a processor and peripheral components such as a storage device.
- the processor may be, for example, a CPU or an equivalent microprocessor.
- the storage device may include any of a semiconductor storage device, a magnetic storage device, and an optical storage device.
- the storage device may include a register, a cache memory, and a memory such as a ROM and a RAM used as a main storage device.
- the vehicle control device 8 may be realized by a functional logic circuit set in a general-purpose semiconductor integrated circuit.
- the vehicle control device 8 may include a programmable logic device (PLD) such as a field programmable gate array (FPGA).
- the vehicle control device 8 calculates the control amount of the actuator 9 so that the own vehicle travels on the traveling route generated by the processing circuit 6.
- the vehicle control device 8 transmits the calculated control amount to the actuator 9.
- the actuator 9 controls traveling of the own vehicle according to a control signal from the vehicle control device 8.
- the actuator 9 may be, for example, a drive actuator, a brake actuator, and a steering actuator.
- the drive actuator may be, for example, an electronically controlled throttle valve, and controls the accelerator opening of the host vehicle based on a control signal from the vehicle control device 8.
- the brake actuator may be, for example, a hydraulic circuit, and controls a braking operation of the brake of the host vehicle based on a control signal from the vehicle control device 8.
- the steering actuator controls the steering of the own vehicle based on a control signal from the vehicle control device 8.
- The map data stores the position coordinates of a plurality of constituent points P1 constituting one lane boundary line L1 defining the lane L0 in which the vehicle 100 travels, and the position coordinates of a plurality of constituent points P2 constituting the other lane boundary line L2 defining the lane L0.
- While the host vehicle 100 is traveling, the surrounding sensor 5 detects the real-world lane boundary lines corresponding to the lane boundary lines L1 and L2, yielding the position coordinates of a plurality of detection points p1 for the lane boundary line L1 and a plurality of detection points p2 for the lane boundary line L2. In FIG. 2, the traveling direction D0 of the vehicle 100 and the traveling direction D1 of the lane L0 are indicated by arrows.
- The integrated data is generated by matching (collating) the constituent points P1 and P2, which are the shape information of the lane boundary lines L1 and L2 in the map data, with the detection points p1 and p2 detected by the surrounding sensor 5.
- When the lane L0 within the integration range A1 has only a straight road shape, with no characteristic feature in the road shape, the data impose no constraint in the traveling direction D1 of the lane L0, and matching misalignment occurs in the traveling direction D1 of the lane L0.
- If the constituent points P1 and P2 of the map data are corrected so as to reduce the error from the detection points p1 and p2 under such misalignment, the lane boundary lines L3 and L4 formed by the corrected constituent points P3 and P4 may deviate from the real-world lane boundary lines L5 and L6.
- Therefore, the driving support device 1 determines an appropriate integration range when generating the integrated data, including the matching, in consideration of features in the road shape, thereby suppressing matching misalignment and generating appropriate integrated data.
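Why a featureless straight road cannot constrain the matching can be seen with a minimal alignment sketch. The patent does not prescribe a specific matching algorithm; the centroid-difference translation estimate below is only an assumed stand-in for illustration.

```python
def estimate_translation(map_points, detected_points):
    """Least-squares translation aligning paired map constituent points
    with sensor detection points: for paired points, this is simply the
    difference of the two centroids. Illustrative only.
    """
    n = len(map_points)
    mx = sum(p[0] for p in map_points) / n
    my = sum(p[1] for p in map_points) / n
    dx = sum(p[0] for p in detected_points) / n
    dy = sum(p[1] for p in detected_points) / n
    return (dx - mx, dy - my)

# On a straight road all constituent points lie on one line, so shifting
# the map points along that line barely changes the fit error: the
# translation along the traveling direction is unconstrained, which is
# the matching misalignment described above.
```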
- The processing circuit 6 of the driving support device 1 includes, as functional or physical hardware resources, logical blocks such as a self-position calculation unit 11, a boundary line recognition unit 12, an integration range determination unit 13, a matching unit 14, an integration unit 15, and a travel route calculation unit 16.
- the self-position calculation unit 11, the boundary line recognition unit 12, the integration range determination unit 13, the matching unit 14, the integration unit 15, and the travel route calculation unit 16 may be physical logic circuits such as PLDs such as FPGAs. Alternatively, a functional logic circuit which is equivalently set in a general-purpose semiconductor integrated circuit by processing by software may be used.
- The self-position calculation unit 11, the boundary line recognition unit 12, the integration range determination unit 13, the matching unit 14, the integration unit 15, and the travel route calculation unit 16 may be realized by a single piece of hardware, or each may be realized by separate hardware.
- The self-position calculation unit 11, the boundary line recognition unit 12, the integration range determination unit 13, the matching unit 14, the integration unit 15, and the travel route calculation unit 16 may be realized by a car navigation system such as an in-vehicle infotainment (IVI) system, and the vehicle control device 8 may be realized by a driving support system such as an advanced driver-assistance system (ADAS).
- the self-position calculation unit 11 acquires the history (point sequence) of the own position of the own vehicle from the GPS receiver 3, the road-to-vehicle communication, the vehicle-to-vehicle communication, or the like.
- The self-position calculation unit 11 may convert the coordinate system of the own position of the own vehicle in the real world into the same coordinate system as the map data and calculate the own position of the own vehicle on the map data; for example, the position coordinates of the own position in the real world obtained from the GPS receiver 3 are converted into position coordinates of the map coordinate system.
- The self-position calculation unit 11 may also calculate the movement amount and movement direction of the own vehicle from vehicle information (odometry information), such as the wheel speed and the yaw rate of the own vehicle detected by the vehicle sensor 4; that is, the history of the self-position may be calculated by odometry.
- the self-position calculating unit 11 may calculate the traveling locus of the own vehicle by approximating the points constituting the history of the own position with a curve.
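The odometry calculation described above can be sketched as simple planar dead reckoning, where the pose is (x, y, heading), v is a velocity derived from the wheel speed, and the yaw rate comes from the yaw rate sensor. This is an illustrative simplification, not the patent's prescribed method.

```python
import math

def integrate_odometry(pose, v, yaw_rate, dt):
    """Dead-reckoning update of the self-position over one sampling
    interval dt [s], given velocity v [m/s] (from wheel speed) and
    yaw rate [rad/s] (from the vehicle sensor 4).

    pose is (x, y, heading); returns the updated pose.
    """
    x, y, heading = pose
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += yaw_rate * dt
    return (x, y, heading)
```

Accumulating these updates yields the history (point sequence) of the self-position, which may then be approximated with a curve to obtain the traveling locus.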
- the boundary recognition unit 12 recognizes the lane boundary based on the data of the surrounding environment detected by the surrounding sensor 5.
- the boundary line recognition unit 12 recognizes a lane marker such as a white line as a lane boundary line.
- the type of the lane boundary is not particularly limited, and may be any target that is included as a lane boundary in the map data.
- the boundary line recognition unit 12 may recognize structures such as gutters, curbs, guardrails, and columns as lane boundary lines.
- the boundary line recognition unit 12 converts the coordinate system of the recognized lane boundary line into the same coordinate system as the map data based on the self-position calculated by the self-position calculation unit 11.
- For example, based on the own position of the own vehicle from the GPS receiver 3 acquired by the self-position calculation unit 11, the boundary line recognition unit 12 converts the detection points of a lane boundary line, expressed in a local coordinate system whose origin is the own position, into the map coordinate system on which the map data is based.
- The target format is not limited to the map coordinate system as long as it allows alignment; the detection points may be converted to, for example, a relative coordinate system.
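The coordinate conversion from the vehicle-centered local frame into the map frame can be sketched as a 2-D rigid transform. This is an illustrative simplification (the actual map coordinates may be longitude, latitude, and altitude in three dimensions), with the self-position assumed to be (x, y, heading).

```python
import math

def local_to_map(point_local, self_position):
    """Convert a lane-boundary detection point from the local frame
    whose origin is the own position (x forward, y left) into the map
    coordinate system, by rotating through the vehicle heading and
    translating by the self-position.
    """
    lx, ly = point_local
    sx, sy, heading = self_position
    c, s = math.cos(heading), math.sin(heading)
    return (sx + c * lx - s * ly, sy + s * lx + c * ly)
```

For instance, a point 1 m ahead of a vehicle at the map origin heading due "Y" maps to (0, 1) in map coordinates.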
- The map data stores the position coordinates of a plurality of constituent points P11, P12, P13, P14, and P15 forming one lane boundary line L7 that divides the lane L9 in which the vehicle 100 travels, and the position coordinates of a plurality of constituent points P31, P32, P33, P34, and P35 constituting the other lane boundary line L8.
- the broken lines indicating the lane boundary lines L7 and L8 are obtained by interpolating the constituent points P11 to P15 and P31 to P35 with respective curves.
- The X direction is the direction in which east, of the east-west directions of the map coordinate system, is positive, and the Y direction is the direction in which north, of the north-south directions, is positive.
- The lane boundary line L7 has a portion (straight portion) L71 extending in the traveling direction D2 of the lane L9, a left-curve portion (curved portion) L72 connected to the straight portion L71, and a portion (straight portion) L73 connected to the curved portion L72 and extending in the traveling direction D3 of the lane L9, which is orthogonal to the direction D2.
- Likewise, the lane boundary line L8 has a portion (straight portion) L81 extending in the traveling direction D2, a left-curve portion (curved portion) L82 connected to the straight portion L81, and a further straight portion connected to the curved portion L82.
- The surrounding sensor 5 detects the real-world lane boundary lines corresponding to the lane boundary lines L7 and L8 while the vehicle 100 is traveling.
- The boundary line recognition unit 12 recognizes the real-world lane boundary lines corresponding to the lane boundary lines L7 and L8 from the detection result of the surrounding sensor 5, and calculates the position coordinates of the plurality of detection points p11 to p19 at which the real-world lane boundary line corresponding to the lane boundary line L7 is detected and the position coordinates of the plurality of detection points p21 to p29 at which the real-world lane boundary line corresponding to the lane boundary line L8 is detected.
- FIG. 4 illustrates a case where a real-world lane boundary corresponding to a pair of lane boundaries L7 and L8 is recognized.
- FIG. 4 shows detection points p11 to p19 and p21 to p29 behind the host vehicle 100, but detection points may also be acquired where the lane boundary lines L7 and L8 are detected in front of and to the side of the host vehicle 100.
- The integration range determination unit 13 determines the integration range (matching range) used when integrating the lane boundary shape information included in the map data with the lane boundary line detected by the surrounding sensor 5. For example, the integration range determination unit 13 determines the integration range such that the lane boundary includes a plurality of portions having different curvatures or radii of curvature.
- the radius of curvature can be obtained, for example, by approximating the curved portion of the lane boundary line to an arc, and the curvature can be obtained as the reciprocal of the radius of curvature.
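The arc approximation mentioned above can be sketched by fitting the circumscribed circle through three consecutive constituent points; the curvature is the reciprocal of its radius. This is one common way to realize the approximation, offered here as an assumption since the patent does not fix a method.

```python
import math

def curvature_from_three_points(p1, p2, p3):
    """Approximate the curvature of a lane boundary at p2 from three
    consecutive constituent points via the circumscribed circle
    (curvature = 1 / circumradius). Collinear points (a straight
    portion) return 0.0.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Twice the signed triangle area; zero means the points are collinear.
    area2 = (x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)
    if abs(area2) < 1e-12:
        return 0.0
    a = math.dist(p1, p2)
    b = math.dist(p2, p3)
    c = math.dist(p1, p3)
    # Circumradius R = abc / (4 * area), so curvature = 2 * |area2| / (abc).
    return abs(2.0 * area2) / (a * b * c)
```

Three points on a unit circle give curvature 1, while equally spaced points on a line give curvature 0, distinguishing curved portions from straight portions.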
- Each of the plurality of portions may be a straight line portion or a curved portion. That is, the plurality of portions may be a combination of two or more curved portions having different curvatures, or a combination of a straight portion and a curved portion.
- the integrated range determination unit 13 may determine the integrated range to include the two portions L71 and L72 of the lane boundary line L7, or to include the two portions L72 and L73. Alternatively, the integrated range determination unit 13 may determine the integrated range to include the three portions L71, L72, and L73 of the lane boundary line L7.
- the integrated range determination unit 13 may determine the integrated range such that the lane boundary includes a plurality of straight portions having different directions. Further, the integrated range determination unit 13 may determine the integrated range so as to include a plurality of straight portions whose angle difference is equal to or larger than a predetermined threshold (e.g., 30°) and equal to or smaller than a predetermined threshold (e.g., 90°). In other words, the integrated range determination unit 13 may determine the integrated range such that the lane boundary includes a plurality of straight portions having different angles with respect to a predetermined direction (for example, the X direction in FIG. 4). For example, as shown in FIG. 4, the integrated range determination unit 13 may determine the integrated range to include the two straight portions L71 and L73 of the lane boundary line L7, whose directions are orthogonal to each other.
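The angle-difference condition above can be sketched in code. This is an illustrative sketch only: the function names, the representation of a straight portion as a pair of end points, and the use of 30°/90° as default bounds are assumptions for illustration, not part of the disclosed device.

```python
import math

def heading_deg(p, q):
    """Heading of the segment p -> q, in degrees within [0, 180)."""
    ang = math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))
    return ang % 180.0  # a lane line has no preferred direction of travel

def angle_difference_ok(seg_a, seg_b, lo=30.0, hi=90.0):
    """True if the two straight portions differ in direction by lo..hi degrees."""
    da = heading_deg(*seg_a)
    db = heading_deg(*seg_b)
    diff = abs(da - db)
    diff = min(diff, 180.0 - diff)  # smallest angle between the two lines
    return lo <= diff <= hi

# A straight portion along X (like L71) and one along Y (like L73): 90 deg apart.
print(angle_difference_ok(((0, 0), (10, 0)), ((10, 5), (10, 15))))  # True
# Two parallel straight portions: 0 deg apart, not enough shape variation.
print(angle_difference_ok(((0, 0), (10, 0)), ((0, 3), (10, 3))))    # False
```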
- the integrated range determination unit 13 specifies, from the point cloud data of the constituent points constituting the lane boundary of the map data, constituent points corresponding to the respective detection points at which the surrounding sensor 5 has detected the lane boundary.
- the constituent points may be constituent points stored in the map data or virtual constituent points set on the curve by interpolating between the constituent points stored in the map data with a curve.
- the integrated range determination unit 13 evaluates the angle between the predetermined direction and the traveling direction of the lane at each of the specified constituent points.
- the traveling direction of the lane can be obtained from the attitude of the host vehicle, link information included in the map data, and the like.
- the integrated range determination unit 13 determines the integrated range such that the variation in the angles evaluated at the constituent points is equal to or greater than a predetermined threshold.
- a difference value between the data of the angle evaluated at each constituent point and its average value may be obtained, and the difference values may be squared and averaged to obtain the variance.
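The variance described above (differences from the average, squared, then averaged) can be sketched as follows. The function name, sample angle lists, and threshold value are illustrative assumptions only; note that this simple form ignores angle wrap-around at 180°, which a real implementation would need to handle.

```python
def angle_variance(angles_deg):
    """Population variance of the angles (deg^2): the mean of the squared
    differences between each angle and the average angle."""
    mean = sum(angles_deg) / len(angles_deg)
    return sum((a - mean) ** 2 for a in angles_deg) / len(angles_deg)

# Angles evaluated along a straight portion barely vary...
straight = [0.0, 1.0, -1.0, 0.5, -0.5]
# ...while a range spanning a right-angle bend varies strongly.
with_bend = [0.0, 1.0, 30.0, 60.0, 89.0, 90.0]

threshold = 100.0  # hypothetical threshold in deg^2
print(angle_variance(straight) < 1.0)          # True
print(angle_variance(with_bend) >= threshold)  # True
```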
- the integrated range determination unit 13 After specifying the constituent points corresponding to the respective detection points at which the lane boundary has been detected by the surrounding sensor 5, the integrated range determination unit 13 interpolates between the respective constituent points with a curve and calculates the nearest point on the curve for each detected point. A side point may be calculated. Then, the integration range determination unit 13 evaluates the angle between the predetermined direction and the traveling direction of the lane at each of the nearest points, and integrates such that the variation in the angles evaluated at each of the nearest points is equal to or greater than a predetermined threshold. The range may be determined.
- the integrated range determination unit 13 specifies, as the constituent points of the map data corresponding to the detection points p11 to p19 and p21 to p29 at which the surrounding sensor 5 detected the lane boundary, those constituent points lying within a predetermined distance threshold of the detection points p11 to p19 and p21 to p29. As a result, the integrated range determination unit 13 specifies the constituent points P11, P15, P31, and P35 corresponding to the detection points p13, p17, p23, and p27.
- the integrated range determination unit 13 interpolates between the constituent points P11 to P15, including the constituent points P11 and P15, with the curve L7, and between the constituent points P31 to P35, including the constituent points P31 and P35, with the curve L8. Further, the integrated range determination unit 13 calculates, as virtual constituent points, the nearest points P21 to P27 on the curve L7 corresponding to the detection points p11, p12, p14, p15, p16, p18, and p19 for which no corresponding constituent points were specified, and the nearest points P41 to P47 on the curve L8 corresponding to the detection points p21, p22, p24, p25, p26, p28, and p29.
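Calculating the nearest point on the interpolated curve for a given detection point can be sketched by projecting the point onto a polyline through the constituent points. This is a sketch under stated assumptions: linear interpolation stands in for whatever curve interpolation the device actually uses, and all names are illustrative.

```python
def nearest_point_on_polyline(pt, polyline):
    """Project pt onto each segment of the polyline and keep the closest
    foot point, i.e. the nearest point on the interpolated boundary."""
    best, best_d2 = None, float("inf")
    px, py = pt
    for (x1, y1), (x2, y2) in zip(polyline, polyline[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg_len2 = dx * dx + dy * dy
        # Clamp the projection parameter t to stay on the segment.
        t = 0.0 if seg_len2 == 0 else max(
            0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / seg_len2))
        fx, fy = x1 + t * dx, y1 + t * dy
        d2 = (px - fx) ** 2 + (py - fy) ** 2
        if d2 < best_d2:
            best, best_d2 = (fx, fy), d2
    return best

# Constituent points of a boundary running along the X axis, and a detection
# point offset slightly off the line.
constituents = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]
print(nearest_point_on_polyline((3.0, 0.4), constituents))  # (3.0, 0.0)
```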
- the position coordinates of the detection points p11 to p19 and p21 to p29, the constituent points P11 to P15 and P31 to P35, and the nearest points P21 to P27 and P41 to P47, together with the information associating the detection points p11 to p19 and p21 to p29 with the constituent points P11 to P15 and P31 to P35 and the nearest points P21 to P27, are stored in the map storage unit 10 or the like of the storage device 2, and can be used as appropriate when expanding the integration range.
- the integration range determination unit 13 sets an initial value of the integration range, and determines whether or not the variation (variance) of the angles evaluated at each of the constituent points and the nearest point in the integration range is equal to or greater than a predetermined threshold.
- if not, the integration range determination unit 13 expands the integration range until the variation becomes equal to or greater than the predetermined threshold, and then determines the integration range. For example, as shown in FIG. 4, when the integration range A1 is initially determined such that the lane boundary line L7 includes only the straight portion L71 and the lane boundary line L8 includes only the straight portion L81, the variation d1 of the angles at the constituent points P11 and P12 and the nearest points P21 to P24 within the integration range A1 is small, and the integration range determination unit 13 determines that the variation d1 is less than the predetermined threshold.
- the integrated range determination unit 13 therefore expands the integration range A1 such that the lane boundary line L7 includes the three portions L71, L72, and L73 and the lane boundary line L8 includes the three portions L81, L82, and L83.
- the variation d2 of the angles evaluated at the constituent points P11 to P15 and the nearest points P21 to P27 within the expanded integration range A1 increases, for example, as shown in FIG. The integrated range determination unit 13 then determines that the variation in the angles is equal to or greater than the predetermined threshold, and determines the integration range A1 shown in FIG.
- in the above description, the integration range A1 is determined behind the host vehicle 100; however, the integration range may be determined in a region extending to the front of the host vehicle 100.
- the integrated range determination unit 13 may evaluate the angle between the predetermined direction and the traveling direction of the lane at each detection point. Then, the integrated range determination unit 13 may determine the integration range such that the variation (variance) of the angles evaluated at the detection points is equal to or greater than a predetermined threshold.
- the matching unit 14 matches (collates) the shape information of the lane boundary line included in the map data with the information of the lane boundary line detected by the surrounding sensor 5 so as to include at least the integration range determined by the integrated range determination unit 13.
- a known method such as ICP (Iterative Closest Point) can be adopted.
- “matching” here means associating the constituent points of the lane boundary shape information included in the map data with the detection points at which the surrounding sensor 5 detected the lane boundary (in other words, determining combinations of a detection point and the constituent point corresponding to that detection point).
- the matching unit 14 outputs association information (matching result) in which the detected points are associated with the constituent points.
- the matching unit 14 associates the constituent points P11 and P15 and the nearest points P21 to P27 of the lane boundary line L7 included in the map data with the detection points p11 to p17 detected by the surrounding sensor 5. Further, the matching unit 14 associates the constituent points P31 and P35 and the nearest points P41 to P47 of the lane boundary line L8 included in the map data with the detection points p21 to p27 detected by the surrounding sensor 5.
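The association of detection points with map constituent points can be sketched as a nearest-neighbor correspondence step, as in one iteration of ICP. This is an illustrative sketch only; the distance threshold, names, and sample coordinates are assumptions.

```python
def associate(detections, constituents, dist_threshold=1.0):
    """Pair each detection point with the nearest constituent point within a
    distance threshold, mimicking the correspondence step of ICP matching."""
    pairs = {}
    for i, (dx, dy) in enumerate(detections):
        best_j, best_d2 = None, dist_threshold ** 2
        for j, (cx, cy) in enumerate(constituents):
            d2 = (dx - cx) ** 2 + (dy - cy) ** 2
            if d2 <= best_d2:
                best_j, best_d2 = j, d2
        if best_j is not None:  # detections with no close constituent are unmatched
            pairs[i] = best_j
    return pairs

detections = [(0.1, 0.0), (5.2, 0.1), (9.9, -0.1)]   # sensor detection points
constituents = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]  # map constituent points
print(associate(detections, constituents))  # {0: 0, 1: 1, 2: 2}
```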
- based on the matching result by the matching unit 14, the integration unit 15 integrates the lane boundary information detected by the surrounding sensor 5 and converted into the same coordinate system as the map data with the lane boundary shape information included in the map data, thereby generating integrated data.
- “integrated” here means combining the data of the detection points detected by the surrounding sensor 5 with the data of the constituent points of the map data that were associated with those detection points by the matching.
- the integrated data can be used, for example, to generate a travel plan.
- the generated integrated data is stored in the map storage unit 10 of the storage device 2 or another storage device.
- when the integration unit 15 integrates the information of the lane boundary detected by the surrounding sensor 5 and the shape information of the lane boundary included in the map data as they are, it may generate integrated data including the position coordinates of the detection points detected by the surrounding sensor 5 and the position coordinates of the constituent points included in the map data that are associated with those detection points.
- alternatively, the integration unit 15 may correct the lane boundary shape information included in the map data based on the matching result by the matching unit 14, and generate, as the integrated data, the corrected lane boundary shape information and the information on the lane boundary line detected by the surrounding sensor 5.
- the integration unit 15 may optimize the position coordinates of the constituent points of the lane boundary line included in the map data such that the position error between those constituent points and the detection points detected by the surrounding sensor 5 is minimized. For example, the integration unit 15 may calculate the sum of squares of the differences (errors) between the constituent points of the lane boundary line included in the map data and the detection points detected by the surrounding sensor 5, and minimize the calculated sum.
- the integration unit 15 may integrate the data detected by the surrounding sensor 5 and the map data by minimizing the error between the entire point cloud of the detection points detected by the surrounding sensor 5 in the integration range and the entire point cloud of the constituent points of the lane boundary included in the map data in the integration range.
- the integration unit 15 aligns the constituent points P11 and P15 and the nearest points P21 to P27 of the lane boundary line L7 included in the map data shown in FIG. with the detection points p11 to p17 detected by the surrounding sensor 5, by calculating the movement amount M that minimizes the error when the position coordinates of the constituent points P11 and P15 and the nearest points P21 to P27 are moved. Then, the integration unit 15 corrects (updates) the position coordinates of the constituent points included in the map data stored in the map storage unit 10 so as to move them by the movement amount M. The integration unit 15 may correct all the constituent points included in the map data, or only the constituent points within a predetermined range around the own vehicle.
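For the pure-translation case, the movement amount that minimizes the sum of squared errors between associated point pairs has a closed form: the mean of the per-pair differences. A minimal sketch, in which the function names and example coordinates are illustrative assumptions (a full implementation would typically also estimate rotation):

```python
def best_translation(map_points, sensor_points):
    """Translation M minimizing the sum of squared errors between matched map
    constituent points and sensor detection points; for a pure translation the
    optimum is the mean of the per-pair coordinate differences."""
    n = len(map_points)
    mx = sum(s[0] - m[0] for m, s in zip(map_points, sensor_points)) / n
    my = sum(s[1] - m[1] for m, s in zip(map_points, sensor_points)) / n
    return mx, my

def apply_move(points, move):
    """Shift every point by the movement amount."""
    return [(x + move[0], y + move[1]) for x, y in points]

# Map constituent points are offset from the detections by roughly (0.5, -0.2).
map_pts = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]
det_pts = [(0.5, -0.2), (5.5, -0.2), (10.5, -0.2)]
m = best_translation(map_pts, det_pts)
print(m)                      # ≈ (0.5, -0.2)
print(apply_move(map_pts, m)) # map points now aligned with the detections
```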
- the integrated range determination unit 13 may repeatedly determine the integrated range at a predetermined cycle every time the vehicle moves. Further, the matching unit 14 may repeatedly match the detection points and the constituent points in the integrated range determined by the integrated range determining unit 13 at a predetermined cycle every time the vehicle moves.
- the integration unit 15 may execute an integration process such as repeatedly correcting map data using the matching result of the matching unit 14.
- the traveling route calculation unit 16 generates a traveling route for driving the own vehicle based on the integrated data generated by the integration unit 15.
- the traveling route calculation unit 16 outputs the generated traveling route to the vehicle control device 8.
- in step S1, the self-position calculation unit 11 calculates the self-position of the own vehicle based on the GPS receiver 3 or the like.
- the boundary recognition unit 12 detects a lane boundary existing around the host vehicle based on the data of the surrounding environment detected by the surrounding sensor 5.
- the integrated range determining unit 13 converts the coordinate system of the lane boundary detected by the boundary recognition unit 12 into the same coordinate system as the map data.
- in step S4, the integrated range determination unit 13 determines the integration range used when integrating the lane boundary shape information included in the map data stored in the map storage unit 10 of the storage device 2 with the lane boundary line information detected by the surrounding sensor 5. For example, the integrated range determination unit 13 determines the integration range such that the lane boundary includes at least one of a plurality of portions having different curvatures and a plurality of straight portions having different directions.
- in step S41, the integrated range determination unit 13 calculates, for each detection point at which the lane boundary line was detected, the nearest point on the lane boundary line on the map data.
- in step S42, the integrated range determination unit 13 calculates, at each nearest point, the angle of the traveling direction of the lane with respect to the predetermined direction.
- in step S43, the integrated range determination unit 13 determines whether the variation (variance) of the angles at the nearest points in the integration range is equal to or greater than a predetermined threshold.
- if not, in step S44 the integration range is expanded toward the side opposite to the traveling direction of the vehicle, and the process returns to step S43.
- if the variation is determined in step S43 to be equal to or greater than the threshold, the integrated range determination unit 13 determines the integration range and completes the integration range determination process of step S4.
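Steps S41 to S44 can be sketched as a loop that grows the range rearward until the angle variance reaches the threshold. This is an illustrative sketch; representing the range as a count of nearest points ordered rearward from the vehicle, and the threshold value, are assumptions.

```python
def determine_integration_range(angles_by_point, initial_len, threshold):
    """Steps S41-S44 as a loop: angles_by_point lists the lane heading (deg)
    at each nearest point, ordered from the vehicle rearward; the range grows
    rearward until the variance of the included angles reaches the threshold."""
    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    n = initial_len
    while n < len(angles_by_point) and variance(angles_by_point[:n]) < threshold:
        n += 1  # step S44: expand opposite to the traveling direction
    return n    # number of points in the determined integration range

# Headings: a straight stretch near the vehicle, then a bend further back.
angles = [0, 0, 1, 0, 15, 45, 75, 90, 90]
print(determine_integration_range(angles, initial_len=3, threshold=200.0))  # 6
```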
- in step S5, the matching unit 14 matches the road shape information included in the map data with the road shape information detected by the surrounding sensor 5 and coordinate-transformed into the same coordinate system as the map data, so as to include at least the integration range determined by the integrated range determination unit 13.
- in step S6, based on the matching result by the matching unit 14, the integration unit 15 generates integrated data by integrating the road shape information included in the map data with the road shape information detected by the surrounding sensor 5 and coordinate-transformed into the same coordinate system as the map data. The integration unit 15 stores the generated integrated data in the map storage unit 10.
- in step S7, the traveling route calculation unit 16 generates a traveling route for driving the own vehicle based on the integrated data generated by the integration unit 15.
- in step S8, the vehicle control device 8 drives the actuator 9 so that the own vehicle travels along the travel route generated by the traveling route calculation unit 16, thereby performing vehicle control of the own vehicle.
- in step S9, the processing circuit 6 determines whether or not the ignition switch (IGN) of the host vehicle has been turned off. If it is determined that the ignition switch has been turned off, the process ends. Otherwise, the process returns to step S1.
- when generating the integrated data by integrating the lane boundary shape information included in the map data with the lane boundary line information detected by the surrounding sensor 5, the integrated range determination unit 13 determines the integration range such that the lane boundary includes portions having different curvatures or directions.
- the lane boundary shape information included in the map data and the lane boundary information detected by the surrounding sensor 5 are then associated and integrated so as to include at least the determined integration range, generating the integrated data.
- as a result, the integration range can be appropriately determined so that the road shape of the lane within the integration range has distinctive features. The accuracy of matching between the lane boundary shape information included in the map data and the lane boundary line information detected by the surrounding sensor 5 can therefore be improved, and matching can be performed appropriately. Consequently, the shape information of the lane boundary line included in the map data and the information of the lane boundary line detected by the surrounding sensor 5 are appropriately associated with each other and, after being accurately aligned as necessary, the integrated data is properly generated.
- the integrated range determination unit 13 determines the integration range such that the lane boundary includes a plurality of straight portions having different angles with respect to the predetermined direction. As a result, the matching deviation in the traveling direction of the lane that arises when matching uses only a straight lane can be reduced, and the accuracy of the matching can be improved.
- the integrated range determination unit 13 calculates the constituent points corresponding to the respective detection points, evaluates the angle between the predetermined direction and the traveling direction of the lane at each constituent point, and determines the integration range such that the variation (variance) of the evaluated angles is equal to or greater than a predetermined threshold. As a result, the matching deviation in the traveling direction of the lane that arises when matching uses only a straight lane can be reduced, and the accuracy of the matching can be improved.
- the integration range determining unit 13 calculates constituent points corresponding to the respective detection points, interpolates between the respective constituent points with a curve, and calculates the nearest point on the curve for each of the detected points. Then, the integrated range determination unit 13 evaluates the angle between the predetermined direction and the traveling direction of the lane at each nearest point, and determines the integrated range so that the variation in the angle is equal to or larger than a predetermined threshold. Thereby, the traveling direction of the lane at the nearest point can be accurately calculated.
- the integrated range determination unit 13 evaluates the angle between the predetermined direction and the traveling direction of the lane at each detection point, and determines the integration range such that the variation (variance) of the evaluated angles is equal to or greater than a predetermined threshold. Accordingly, when detection points can be detected continuously, the variation (variance) in the traveling direction of the lane at the detection points can be evaluated, the matching deviation in the traveling direction of the lane caused by matching only on a straight road can be reduced, and the matching accuracy can be improved.
- the integrated range determination unit 13 repeatedly determines the integration range, the matching unit 14 repeatedly matches the detection points and the constituent points within the integration range determined by the integrated range determination unit 13, and the integration unit 15 repeatedly corrects the map data using the matching result.
- because the map data is corrected in advance using the previous matching result, the constituent points corresponding to the respective detection points can be accurately identified when determining the integration range.
- the integration unit 15 generates the integrated data so as to include, outside the integration range, a region ahead of the integration range in the traveling direction of the host vehicle.
- integrated data can thereby be generated in advance not only in the integration range where the matching is performed by the matching unit 14 but also in an area through which the host vehicle has not yet traveled.
- the driving support device 1 is applicable to various road shapes. For example, as shown in FIG. 12, consider a case where the lane L9 has a road shape that bends at a right angle.
- the lane boundary line L7 has a linear portion L71 extending in the traveling direction D2, and a linear portion L72 connected to the linear portion L71 and extending in the traveling direction D3 perpendicular to the traveling direction D2.
- the lane boundary line L8 has a linear portion L81 extending in the traveling direction D2, and a linear portion L82 connected to the linear portion L81 and extending in the traveling direction D3 perpendicular to the traveling direction D2.
- the integrated range determining unit 13 determines the integrated range A1 so as to include the two straight line portions L71 and L72 of the lane boundary line L7 and the two straight line portions L81 and L82 of the lane boundary line L8.
- the integrated range determining unit 13 may determine the integrated range A1 based on only one of the lane boundary lines L7 and L8.
- the self-position calculation unit 11 sequentially samples the self-position along the traveling locus of the own vehicle and calculates, for example, as shown in FIG. 13, the point sequence of the self-positions P51 to P56 at which the surrounding sensor 5 was able to detect the lane boundary.
- the integrated range determination unit 13 evaluates the traveling direction of the own vehicle 100 at each of the self-positions P51 to P56 calculated by the self-position calculation unit 11. For example, the integrated range determination unit 13 may use the attitude of the own vehicle 100 acquired by the GPS receiver 3 as the traveling direction of the own vehicle 100. Alternatively, the integrated range determination unit 13 may calculate the traveling direction of the own vehicle 100 by interpolating the history (point sequence) of the self-positions P51 to P56 with the curve L10 and obtaining the tangent of the curve L10 at each of the self-positions P51 to P56.
- the integrated range determination unit 13 may also calculate the traveling direction of the host vehicle 100 using the information on the lane to which the host vehicle 100 belongs.
- the integration range determination unit 13 determines the integration range such that the variation (variance) in the traveling direction of the vehicle 100 at each of the self-positions P51 to P56 is equal to or greater than a predetermined threshold.
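Evaluating the variation in the traveling direction at the sampled self-positions can be sketched as follows, using the segment to the next sampled position as a stand-in for the curve-tangent evaluation described above. The function names, sample tracks, and threshold are illustrative assumptions.

```python
import math

def headings_from_track(positions):
    """Traveling direction (deg) at each self-position, approximated from the
    segment to the next sampled position (a stand-in for the curve tangent)."""
    out = []
    for (x1, y1), (x2, y2) in zip(positions, positions[1:]):
        out.append(math.degrees(math.atan2(y2 - y1, x2 - x1)))
    return out

def heading_variance(positions):
    """Variance (deg^2) of the traveling direction along the track."""
    hs = headings_from_track(positions)
    mean = sum(hs) / len(hs)
    return sum((h - mean) ** 2 for h in hs) / len(hs)

# A track like P51..P56 turning through a right angle varies strongly,
# a straight track does not; the threshold value is only illustrative.
turning = [(0, 0), (5, 0), (10, 0), (10, 5), (10, 10), (10, 15)]
straight = [(0, 0), (5, 0), (10, 0), (15, 0), (20, 0), (25, 0)]
threshold = 100.0
print(heading_variance(turning) >= threshold)   # True
print(heading_variance(straight) >= threshold)  # False
```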
- the own position at the time when the lane boundary is detected is calculated, the traveling direction of the own vehicle at that position is evaluated, and the integration range is determined such that the variation in the traveling direction of the own vehicle is equal to or greater than the threshold.
- the traveling direction of the own vehicle can be easily calculated.
- the traveling direction of the own vehicle can be accurately calculated.
- the traveling direction of the own vehicle can be accurately calculated.
- the integrated range determination unit 13 specifies the constituent points corresponding to the respective detection points detected by the surrounding sensor 5 from among the constituent points forming the shape information of the lane boundary line included in the map data, and evaluates the curvature or radius of curvature of the lane at each specified constituent point. For example, the radius of curvature of the lane at a target constituent point can be calculated by calculating the radius of the circumscribed circle through the position coordinates of the target constituent point and the position coordinates of the two constituent points adjacent to it.
- the integration range is determined so that the variation in the curvature or radius of curvature of the lane at each component point is equal to or greater than a predetermined threshold.
- the integrated range determination unit 13 may specify the constituent points corresponding to the respective detection points detected by the surrounding sensor 5 from among the constituent points forming the shape information of the lane boundary line included in the map data, interpolate between the constituent points with a curve, and calculate the nearest point on the curve for each detection point. Further, the integrated range determination unit 13 evaluates the curvature or radius of curvature of the lane at each of the calculated nearest points; for example, the curvature or radius of curvature of the lane at a target nearest point can be calculated based on the position coordinates of the target nearest point and of the two nearest points adjacent to it. The integrated range determination unit 13 may then determine the integration range such that the variation in the curvature or radius of curvature of the lane at the nearest points is equal to or greater than a predetermined threshold.
- the integrated range determination unit 13 calculates, at the constituent point P11, the curvature of the straight portion L71 of the lane boundary line L7 as the curvature of the lane L9, based on the position coordinates of the nearest point P22, the constituent point P11, and the nearest point P23 of the lane boundary line L7.
- the integrated range determination unit 13 calculates, at the constituent point P13 of the lane boundary line L7, the curvature of the curved portion L72 of the lane boundary line L7 as the curvature of the lane L9, based on the position coordinates of the constituent points P12, P13, and P14 of the lane boundary line L7.
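The circumscribed-circle evaluation can be sketched as follows: the circumradius of three points is R = abc / (4K), where a, b, c are the side lengths and K the triangle area, and the curvature is its reciprocal. Names and sample points are illustrative; collinear points are treated here as zero curvature (a straight portion with infinite radius).

```python
import math

def curvature_from_three_points(p1, p2, p3):
    """Curvature (1/R) of the circumscribed circle through three points;
    returns 0.0 for collinear points (a straight portion)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Twice the signed triangle area; zero means the points are collinear.
    area2 = (x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)
    if abs(area2) < 1e-12:
        return 0.0
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    radius = a * b * c / (2.0 * abs(area2))  # R = abc / (4 * area)
    return 1.0 / radius

# Three points on a circle of radius 10: curvature is 1/10.
pts = [(10.0, 0.0), (0.0, 10.0), (-10.0, 0.0)]
print(round(curvature_from_three_points(*pts), 6))            # 0.1
# Collinear points, as on a straight portion such as L71: curvature 0.
print(curvature_from_three_points((0, 0), (5, 0), (10, 0)))   # 0.0
```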
- the curvature or radius of curvature of the lane is evaluated at each constituent point corresponding to each detection point at which the lane boundary line was detected, and the integration range is determined such that the variation (variance) in the curvature or radius of curvature of the lane is equal to or greater than a predetermined threshold.
- a curve is interpolated between the constituent points corresponding to the detection points at which the lane boundary lines were detected, the nearest point on the curve is calculated for each detection point, the curvature or radius of curvature of the lane is evaluated at each nearest point, and the integration range is determined such that the variation in the curvature or radius of curvature of the lane is equal to or greater than a predetermined threshold. Thereby, the curvature or radius of curvature of the lane can be accurately calculated.
- the integrated range determination unit 13 determines the integration range based on the curvature or radius of curvature of the traveling locus of the host vehicle.
- the self-position calculation unit 11 sequentially samples the self-position along the traveling locus of the own vehicle and calculates, for example, as shown in FIG. 13, the point sequence of the self-positions P51 to P56 at which the surrounding sensor 5 was able to detect the lane boundary.
- the integrated range determination unit 13 evaluates the curvature or radius of curvature of the traveling locus of the vehicle at each of the calculated own positions.
- the integrated range determination unit 13 may calculate the curvature or the radius of curvature of the traveling locus of the own vehicle by interpolating the point sequence of the own position with a curve.
- the integrated range determination unit 13 may calculate the curvature or radius of curvature of the traveling locus using the information on the lane to which the own vehicle belongs.
- the integrated range determination unit 13 determines the integrated range so that the variation in the curvature or radius of curvature of the travel locus of the vehicle at each position is equal to or greater than a predetermined threshold.
- the own position at the time when the lane boundary is detected is sequentially calculated, the curvature of the traveling locus of the own vehicle is evaluated at each own position, and the integration range is determined such that the variation in the curvature of the traveling locus of the own vehicle is equal to or greater than a predetermined threshold.
- by calculating the curvature or radius of curvature of the traveling locus based on the yaw rate detected by a yaw rate sensor, which is one of the surrounding sensors 5 mounted on the own vehicle, the curvature or radius of curvature of the traveling locus of the own vehicle can be easily calculated.
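The yaw-rate-based evaluation follows from the kinematic relation that a vehicle moving at speed v with yaw rate ω follows an arc of radius R = v/ω, i.e. curvature κ = ω/v. A minimal sketch; the function name and example figures are illustrative assumptions.

```python
def curvature_from_yaw_rate(yaw_rate_rad_s, speed_m_s):
    """Curvature of the traveling locus from the measured yaw rate:
    kappa = omega / v, so the radius of curvature is R = v / omega."""
    if speed_m_s <= 0.0:
        raise ValueError("speed must be positive")
    return yaw_rate_rad_s / speed_m_s

# 0.2 rad/s at 10 m/s -> curvature 0.02 1/m, i.e. a 50 m radius of curvature.
kappa = curvature_from_yaw_rate(0.2, 10.0)
print(round(kappa, 10))        # 0.02
print(round(1.0 / kappa, 6))   # 50.0
```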
- the curvature or the radius of curvature of the traveling locus of the own vehicle can be accurately calculated.
- by calculating the curvature or radius of curvature of the traveling locus using the information on the lane to which the vehicle belongs, which is included in the map data, the curvature or radius of curvature of the traveling locus of the own vehicle can be accurately calculated without being affected by variations in the accuracy of the GPS receiver 3 or the yaw rate sensor.
- the integrated range determination unit 13 initially sets the integration range, and if the set integration range includes neither a plurality of portions of the lane boundary lines having different curvatures nor a plurality of straight portions having different directions, expands the integration range in the direction opposite to the traveling direction of the host vehicle (rearward). At this time, the integrated range determination unit 13 thins out the detection points or self-positions within the integration range.
- the thinning interval can be set as appropriate, and may be, for example, every other point or every plural points, at every predetermined distance, or at irregular intervals. The thinned detection points or self-positions are not used at the time of matching by the matching unit 14, and matching is performed using the remaining detection points or self-positions.
- the integrated range determination unit 13 may change the interval at which the detection points or self-positions are thinned out according to the curvature of the lane on the map data. For example, since it is assumed that the larger the curvature of the lane (in other words, the smaller the radius of curvature of the lane), the more features the road shape has, the integrated range determination unit 13 reduces the interval at which the detection points or self-positions are thinned out and performs fine sampling. Conversely, the smaller the curvature of the lane (in other words, the larger the radius of curvature of the lane), the larger the interval at which the detection points or self-positions are thinned out, and the coarser the sampling.
- The integration range determination unit 13 may likewise change the thinning interval according to the length of the lane on the map data. For example, since it is assumed that the longer the lane, the more portions it contains with few road-shape features, the integration range determination unit 13 increases the thinning interval and samples coarsely. Conversely, the shorter the lane, the smaller the thinning interval and the finer the sampling.
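One way to realize the interval selection described in the two paragraphs above is a simple rule that samples feature-rich (sharply curved) lanes finely and long, gently curved lanes coarsely. The thresholds and return values here are illustrative assumptions, not values specified by the patent:

```python
def thinning_interval(lane_curvature_per_m: float, lane_length_m: float) -> int:
    """Return the stride used when thinning detection points / self-positions.

    Larger curvature (smaller radius) -> more road-shape features -> finer
    sampling (smaller stride). Long, nearly straight lanes -> coarser sampling.
    All thresholds below are hypothetical.
    """
    if lane_curvature_per_m > 0.01:   # radius below 100 m: keep every point
        return 1
    if lane_length_m > 500.0:         # long, nearly straight stretch
        return 4                      # keep every 4th point
    return 2                          # default: keep every other point
```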
- The lane boundary lines L7 and L8 each include only one straight line portion, L71 and L81, respectively.
- The integration range A1 is therefore expanded.
- Every other one of the detection points p11 to p19 corresponding to the lane boundary line L7 is thinned out; that is, the detection points p12, p14, p16, and p18 are removed.
- The detection points p21 to p29 corresponding to the lane boundary line L8 can be thinned out in the same manner as the detection points p11 to p19 corresponding to the lane boundary line L7.
- According to the fourth modification of the embodiment of the present invention, thinning out the detection points or self-positions within the integration range when the integration range is expanded makes the matching process more efficient and reduces the calculation load.
- Where the road shape has few features, increasing the thinning interval and sampling coarsely reduces the calculation load efficiently without lowering the matching accuracy.
- Conversely, where the road shape has many features, the accuracy of matching can be improved by reducing the thinning interval and sampling more finely.
- By changing the interval at which the detection points or self-positions are thinned out according to the length of the lane on the map data, a long straight lane with few road-shape features, for example, can be sampled more coarsely.
- As a result, the calculation load can be efficiently reduced without lowering the matching accuracy.
- The range over which the integration unit 15 generates the integrated data by integrating the lane boundary shape information included in the map data with the lane boundary information detected by the surrounding sensor 5 is not limited to the integration range.
- The integration unit 15 may integrate the shape information of the lane boundary included in the map data with the lane boundary information detected by the surrounding sensor 5 over a range that includes an area outside the integration range. For example, as illustrated in FIG. 14, the integration unit 15 may generate the integrated data over a range A3 that includes, outside the integration range A1, a region ahead of the vehicle 100 in its traveling direction. Accordingly, integrated data can be generated in advance even for an area in which the vehicle 100 has not yet traveled and no detection points have been obtained.
- The integration unit 15 may integrate all of the lane boundary shape information included in the map data within the integration range with all of the lane boundary information detected by the surrounding sensor 5. Alternatively, the integration unit 15 may integrate part of the lane boundary shape information included in the map data within the integration range with all of the lane boundary information detected by the surrounding sensor 5. Conversely, the integration unit 15 may integrate all of the lane boundary shape information included in the map data within the integration range with part of the lane boundary information detected by the surrounding sensor 5.
- SYMBOLS: 1 driving support device; 2 storage device; 3 global positioning system (GPS) receiver; 4 vehicle sensor; 5 surrounding sensor; 6 processing circuit; 7 display; 8 vehicle control device; 9 actuator; 10 map storage unit; 11 self-position calculation unit; 12 boundary line recognition unit; 13 integration range determination unit; 14 matching unit; 15 integration unit; 16 travel route calculation unit; 100 host vehicle
Abstract
Description
A driving support device according to an embodiment of the present invention can be mounted on a vehicle (hereinafter, a vehicle equipped with the driving support device according to the embodiment of the present invention is referred to as the "host vehicle"). As shown in FIG. 1, the driving support device 1 according to the embodiment of the present invention includes a storage device 2, a Global Positioning System (GPS) receiver (GPS sensor) 3, a vehicle sensor 4, a surrounding sensor 5, a processing circuit 6, a display 7, a vehicle control device 8, and an actuator 9. The processing circuit 6 can exchange data and signals with the storage device 2, the GPS receiver 3, the vehicle sensor 4, the surrounding sensor 5, the display 7, and the vehicle control device 8 via a wired or wireless connection such as a Controller Area Network (CAN) bus.
Next, an example of the operation (driving support method) of the driving support device 1 according to the embodiment of the present invention will be described with reference to the flowchart of FIG. 10.
According to the embodiment of the present invention, when generating integrated data by integrating the lane boundary shape information included in the map data with the lane boundary information detected by the surrounding sensor 5, the integration range determination unit 13 determines the integration range such that the lane boundary lines include mutually different curvatures or directions. Then, the lane boundary shape information included in the map data and the lane boundary information detected by the surrounding sensor 5 are associated and integrated so as to include at least the determined integration range, thereby generating the integrated data.
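As a minimal sketch of this range determination, the integration range can be grown rearward from the most recent sample until the spread of lane headings inside it reaches a threshold. The list representation, threshold value, and function name are our own assumptions, not the patent's implementation:

```python
def determine_integration_range(headings_rad: list, threshold_rad: float = 0.1):
    """Expand the range rearward until the heading spread meets the threshold.

    `headings_rad` is ordered oldest-to-newest; returns (start, end) indices
    of the determined integration range (end-exclusive).
    """
    end = len(headings_rad)
    for start in range(end - 1, -1, -1):
        window = headings_rad[start:end]
        if max(window) - min(window) >= threshold_rad:
            return start, end
    return 0, end  # spread never reached: the range extends over all samples
```

A range determined this way is guaranteed to contain directionally distinct portions, which is what makes the subsequent map/sensor matching well-conditioned.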
As a first modification of the embodiment of the present invention, a case will be described in which the integration range determination unit 13 determines the integration range based on a variation in the traveling direction of the host vehicle.
As a second modification of the embodiment of the present invention, a case will be described in which the integration range determination unit 13 determines the integration range based on a variation in the curvature of the lane.
As a third modification of the embodiment of the present invention, a case will be described in which the integration range determination unit 13 determines the integration range based on a variation in the curvature or radius of curvature of the traveling locus of the host vehicle.
As a fourth modification of the embodiment of the present invention, a case will be described in which the integration range determination unit 13 thins out detection points or self-positions within the integration range when expanding the integration range.
As described above, the present invention has been described by way of an embodiment, but the statements and drawings forming part of this disclosure should not be understood as limiting the present invention. Various alternative embodiments, examples, and operational techniques will become apparent to those skilled in the art from this disclosure.
Claims (21)
- Detecting, by a sensor, a lane boundary line existing around a vehicle;
calculating a self-position of the vehicle;
converting, based on the self-position, a coordinate system of the detected lane boundary line into the same coordinate system as map data stored in a storage device; and
generating integrated data by integrating shape information of the lane boundary line included in the map data with the coordinate-converted lane boundary line,
the driving support method comprising:
determining an integration range, over which the shape information and the coordinate-converted lane boundary line are integrated, such that the lane boundary line includes at least one of a plurality of portions having mutually different curvatures and a plurality of straight line portions having mutually different directions; and
generating the integrated data by associating the shape information with the coordinate-converted lane boundary line so as to include at least the determined integration range.
- The driving support method according to claim 1, wherein the plurality of straight line portions have mutually different angles with respect to a predetermined direction.
- The driving support method according to claim 2, comprising:
specifying constituent points of the shape information corresponding to respective detection points at which the lane boundary line was detected;
evaluating, at each of the specified constituent points, an angle formed between the predetermined direction and a traveling direction of a lane demarcated by the lane boundary line; and
determining the integration range such that a variation in the angles is equal to or greater than a predetermined threshold.
- The driving support method according to claim 2, comprising:
specifying constituent points of the shape information corresponding to respective detection points at which the lane boundary line was detected;
interpolating between the specified constituent points with a curve;
calculating a nearest point on the curve to each of the detection points;
evaluating, at each of the calculated nearest points, an angle formed between the predetermined direction and a traveling direction of a lane demarcated by the lane boundary line; and
determining the integration range such that a variation in the angles is equal to or greater than a predetermined threshold.
- The driving support method according to claim 2, comprising:
evaluating, at each detection point at which the lane boundary line was detected, an angle formed between the predetermined direction and a traveling direction of a lane demarcated by the lane boundary line; and
determining the integration range such that a variation in the angles is equal to or greater than a predetermined threshold.
- The driving support method according to claim 1 or 2, comprising:
calculating a point sequence of the self-positions at which the lane boundary line was detected;
evaluating a traveling direction of the vehicle at each of the calculated self-positions; and
determining the integration range such that a variation in the traveling direction of the vehicle is equal to or greater than a predetermined threshold.
- The driving support method according to claim 6, wherein the traveling direction of the vehicle is detected by a global positioning system receiver mounted on the vehicle.
- The driving support method according to claim 6, comprising:
interpolating between the calculated self-positions with a curve; and
calculating the traveling direction of the vehicle by obtaining a tangent to the curve.
- The driving support method according to claim 6, wherein the traveling direction of the vehicle is calculated using information, included in the map data, on a lane to which the vehicle belongs.
- The driving support method according to claim 1, comprising:
specifying constituent points of the shape information corresponding to respective detection points at which the lane boundary line was detected;
evaluating, at each of the specified constituent points, a curvature of a lane demarcated by the lane boundary line; and
determining the integration range such that a variation in the curvature of the lane is equal to or greater than a predetermined threshold.
- The driving support method according to claim 1, comprising:
specifying constituent points of the shape information corresponding to respective detection points at which the lane boundary line was detected;
interpolating between the specified constituent points with a curve;
calculating a nearest point on the curve to each of the detection points;
evaluating, at each of the calculated nearest points, a curvature of a lane demarcated by the lane boundary line; and
determining the integration range such that a variation in the curvature of the lane is equal to or greater than a predetermined threshold.
- The driving support method according to claim 1, comprising:
calculating a point sequence of the self-positions at which the lane boundary line was detected;
evaluating a curvature of a traveling locus of the vehicle at each of the calculated self-positions; and
determining the integration range such that a variation in the curvature of the traveling locus is equal to or greater than a predetermined threshold.
- The driving support method according to claim 12, wherein the curvature of the traveling locus is calculated based on a yaw rate detected by a yaw rate sensor mounted on the vehicle.
- The driving support method according to claim 12, wherein the curvature of the traveling locus is calculated by interpolating the point sequence of the self-positions with a curve.
- The driving support method according to claim 12, wherein the curvature of the traveling locus is calculated using information, included in the map data, on a lane to which the vehicle belongs.
- The driving support method according to any one of claims 1 to 15, comprising:
associating the shape information with the coordinate-converted lane boundary line each time the vehicle moves; and
repeatedly correcting the map data using the result of the association.
- The driving support method according to claim 1, wherein, when the integration range is expanded, the detection points at which the lane boundary line was detected within the integration range, or the self-positions calculated when the lane boundary line was detected, are thinned out.
- The driving support method according to claim 17, wherein the thinning interval is changed according to a curvature of a lane demarcated by the lane boundary line within the integration range.
- The driving support method according to claim 17 or 18, wherein the thinning interval is changed according to a length of a lane demarcated by the lane boundary line within the integration range.
- The driving support method according to any one of claims 1 to 19, wherein the integrated data is generated so as to include a region, outside the determined integration range, ahead of the vehicle in the traveling direction.
- A driving support device comprising:
a sensor that detects a lane boundary line existing around a vehicle;
a storage device that stores map data; and
a controller that calculates a self-position of the vehicle, converts, based on the self-position, a coordinate system of the detected lane boundary line into the same coordinate system as the map data, and generates integrated data by integrating shape information of the lane boundary line included in the map data with the coordinate-converted lane boundary line,
wherein the controller determines an integration range, over which the shape information and the coordinate-converted lane boundary line are integrated, such that the lane boundary line includes at least one of a plurality of portions having mutually different curvatures and a plurality of straight line portions having mutually different directions, and generates the integrated data by associating the shape information with the coordinate-converted lane boundary line so as to include at least the determined integration range.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020528523A JP6973964B2 (ja) | 2018-07-04 | 2018-07-04 | 走行支援方法及び走行支援装置 |
US17/256,265 US12123740B2 (en) | 2018-07-04 | 2018-07-04 | Travel assistance method and travel assistance device |
KR1020217003411A KR102611507B1 (ko) | 2018-07-04 | 2018-07-04 | 주행 지원 방법 및 주행 지원 장치 |
PCT/IB2018/000830 WO2020008221A1 (ja) | 2018-07-04 | 2018-07-04 | 走行支援方法及び走行支援装置 |
CN201880095345.6A CN112673230B (zh) | 2018-07-04 | 2018-07-04 | 行驶辅助方法及行驶辅助装置 |
EP18925560.7A EP3819594B1 (en) | 2018-07-04 | 2018-07-04 | Travel assistance method and travel assistance device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2018/000830 WO2020008221A1 (ja) | 2018-07-04 | 2018-07-04 | 走行支援方法及び走行支援装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020008221A1 true WO2020008221A1 (ja) | 2020-01-09 |
Family
ID=69060134
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2018/000830 WO2020008221A1 (ja) | 2018-07-04 | 2018-07-04 | 走行支援方法及び走行支援装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US12123740B2 (ja) |
EP (1) | EP3819594B1 (ja) |
JP (1) | JP6973964B2 (ja) |
KR (1) | KR102611507B1 (ja) |
CN (1) | CN112673230B (ja) |
WO (1) | WO2020008221A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111611958A (zh) * | 2020-05-28 | 2020-09-01 | 武汉四维图新科技有限公司 | 确定众包数据中车道线形状的方法、装置和设备 |
JP2021149321A (ja) * | 2020-03-17 | 2021-09-27 | 本田技研工業株式会社 | 走行制御装置、車両、走行制御方法及びプログラム |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2023013637A (ja) * | 2021-07-16 | 2023-01-26 | トヨタ自動車株式会社 | 車両用操舵ガイドトルク制御装置 |
DE102021210924A1 (de) * | 2021-09-29 | 2023-03-30 | Volkswagen Aktiengesellschaft | Verfahren zum Querführen eines Kraftfahrzeugs auf einer Straße mit wenigstens zwei Fahrspuren sowie Kraftfahrzeug |
EP4160154A1 (en) * | 2021-09-30 | 2023-04-05 | Aptiv Technologies Limited | Methods and systems for estimating lanes for a vehicle |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02151714A (ja) * | 1988-12-05 | 1990-06-11 | Nippondenso Co Ltd | 車両走行位置表示装置 |
JP2014160064A (ja) * | 2013-01-24 | 2014-09-04 | Aisan Technology Co Ltd | 道路地図作成装置、道路地図作成方法、及びナビゲーションシステム |
JP2015152346A (ja) * | 2014-02-12 | 2015-08-24 | 株式会社デンソー | 自車走行位置特定装置及び自車走行位置特定プログラム |
JP2017181870A (ja) | 2016-03-31 | 2017-10-05 | ソニー株式会社 | 情報処理装置及び情報処理サーバ |
JP2017198566A (ja) * | 2016-04-28 | 2017-11-02 | 日立オートモティブシステムズ株式会社 | レーンマーカ認識装置、自車両位置推定装置 |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2923664B2 (ja) * | 1990-02-07 | 1999-07-26 | 本田技研工業株式会社 | 自動走行装置 |
JP4659631B2 (ja) * | 2005-04-26 | 2011-03-30 | 富士重工業株式会社 | 車線認識装置 |
JP2007198566A (ja) * | 2006-01-30 | 2007-08-09 | Ntn Corp | 車輪用軸受装置 |
WO2011110156A2 (de) * | 2010-03-06 | 2011-09-15 | Continental Teves Ag & Co. Ohg | Spurhaltesystem für ein kraftfahrzeug |
DE102010033729B4 (de) * | 2010-08-07 | 2014-05-08 | Audi Ag | Verfahren und Vorrichtung zum Bestimmen der Position eines Fahrzeugs auf einer Fahrbahn sowie Kraftwagen mit einer solchen Vorrichtung |
JP5310674B2 (ja) * | 2010-08-17 | 2013-10-09 | 株式会社デンソー | 車両用挙動制御装置 |
JP2016045144A (ja) * | 2014-08-26 | 2016-04-04 | アルパイン株式会社 | 走行中レーン検出装置及び運転支援システム |
WO2017009923A1 (ja) * | 2015-07-13 | 2017-01-19 | 日産自動車株式会社 | 自己位置推定装置及び自己位置推定方法 |
US10384679B2 (en) * | 2015-09-30 | 2019-08-20 | Nissan Motor Co., Ltd. | Travel control method and travel control apparatus |
JP6798779B2 (ja) | 2015-11-04 | 2020-12-09 | トヨタ自動車株式会社 | 地図更新判定システム |
US10121367B2 (en) * | 2016-04-29 | 2018-11-06 | Ford Global Technologies, Llc | Vehicle lane map estimation |
JP6637400B2 (ja) * | 2016-10-12 | 2020-01-29 | 本田技研工業株式会社 | 車両制御装置 |
JP6690510B2 (ja) * | 2016-11-24 | 2020-04-28 | トヨタ自動車株式会社 | 物体認識装置 |
WO2018104563A2 (en) * | 2016-12-09 | 2018-06-14 | Tomtom Global Content B.V. | Method and system for video-based positioning and mapping |
WO2018126215A1 (en) * | 2016-12-30 | 2018-07-05 | DeepMap Inc. | High definition map updates |
CN106767853B (zh) * | 2016-12-30 | 2020-01-21 | 中国科学院合肥物质科学研究院 | 一种基于多信息融合的无人驾驶车辆高精度定位方法 |
US11560145B2 (en) * | 2017-01-10 | 2023-01-24 | Mitsubishi Electric Corporation | Travel path recognition apparatus and travel path recognition method |
US11254329B2 (en) * | 2017-04-24 | 2022-02-22 | Mobileye Vision Technologies Ltd. | Systems and methods for compression of lane data |
JP6799150B2 (ja) * | 2017-05-25 | 2020-12-09 | 本田技研工業株式会社 | 車両制御装置 |
- 2018
- 2018-07-04 WO PCT/IB2018/000830 patent/WO2020008221A1/ja active Application Filing
- 2018-07-04 KR KR1020217003411A patent/KR102611507B1/ko active IP Right Grant
- 2018-07-04 EP EP18925560.7A patent/EP3819594B1/en active Active
- 2018-07-04 CN CN201880095345.6A patent/CN112673230B/zh active Active
- 2018-07-04 JP JP2020528523A patent/JP6973964B2/ja active Active
- 2018-07-04 US US17/256,265 patent/US12123740B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02151714A (ja) * | 1988-12-05 | 1990-06-11 | Nippondenso Co Ltd | 車両走行位置表示装置 |
JP2014160064A (ja) * | 2013-01-24 | 2014-09-04 | Aisan Technology Co Ltd | 道路地図作成装置、道路地図作成方法、及びナビゲーションシステム |
JP2015152346A (ja) * | 2014-02-12 | 2015-08-24 | 株式会社デンソー | 自車走行位置特定装置及び自車走行位置特定プログラム |
JP2017181870A (ja) | 2016-03-31 | 2017-10-05 | ソニー株式会社 | 情報処理装置及び情報処理サーバ |
JP2017198566A (ja) * | 2016-04-28 | 2017-11-02 | 日立オートモティブシステムズ株式会社 | レーンマーカ認識装置、自車両位置推定装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3819594A4 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021149321A (ja) * | 2020-03-17 | 2021-09-27 | 本田技研工業株式会社 | 走行制御装置、車両、走行制御方法及びプログラム |
JP7377143B2 (ja) | 2020-03-17 | 2023-11-09 | 本田技研工業株式会社 | 走行制御装置、車両、走行制御方法及びプログラム |
CN111611958A (zh) * | 2020-05-28 | 2020-09-01 | 武汉四维图新科技有限公司 | 确定众包数据中车道线形状的方法、装置和设备 |
CN111611958B (zh) * | 2020-05-28 | 2023-08-22 | 武汉四维图新科技有限公司 | 确定众包数据中车道线形状的方法、装置和设备 |
Also Published As
Publication number | Publication date |
---|---|
KR20210027458A (ko) | 2021-03-10 |
JPWO2020008221A1 (ja) | 2021-07-15 |
EP3819594B1 (en) | 2023-07-26 |
EP3819594A4 (en) | 2021-07-28 |
KR102611507B1 (ko) | 2023-12-07 |
US20210262825A1 (en) | 2021-08-26 |
JP6973964B2 (ja) | 2021-12-01 |
CN112673230A (zh) | 2021-04-16 |
US12123740B2 (en) | 2024-10-22 |
EP3819594A1 (en) | 2021-05-12 |
CN112673230B (zh) | 2024-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3358302B1 (en) | Travel control method and travel control device | |
WO2020008221A1 (ja) | 走行支援方法及び走行支援装置 | |
RU2732634C1 (ru) | Способ помощи в путешествии и устройство помощи в путешествии | |
JP7154025B2 (ja) | 地図補正方法及び地図補正装置 | |
JP7123154B2 (ja) | 走行支援方法及び走行支援装置 | |
US11761787B2 (en) | Map information correction method, driving assistance method, and map information correction device | |
EP3660455B1 (en) | Travel assistance method and travel assistance device | |
JP6956268B2 (ja) | 走行環境情報の生成方法、運転制御方法、走行環境情報生成装置 | |
CN114987529A (zh) | 地图生成装置 | |
RU2721436C1 (ru) | Способ помощи вождению и устройство помощи вождению | |
JP4957021B2 (ja) | 車両用地図データ作成装置、及び、車両用地図データ更新装置 | |
RU2776105C1 (ru) | Способ помощи в движении и устройство помощи в движении | |
JP2019056953A (ja) | 車両制御装置、車両制御方法、およびプログラム | |
JP2019056951A (ja) | 周辺監視装置、周辺監視方法、およびプログラム | |
CN114987493A (zh) | 车辆位置识别装置 | |
JP2024064374A (ja) | 地図情報出力装置及び地図情報出力方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18925560 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020528523 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20217003411 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2018925560 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2018925560 Country of ref document: EP Effective date: 20210204 |