CN116890837A - Vehicle control device, vehicle control method, and storage medium - Google Patents
- Publication number: CN116890837A (application CN202310308444.0A)
- Authority: CN (China)
- Legal status: Pending (an assumption, not a legal conclusion; no legal analysis has been performed)
Classifications
- B — Performing operations; transporting
- B60 — Vehicles in general
- B60W — Conjoint control of vehicle sub-units of different type or different function; control systems specially adapted for hybrid vehicles; road vehicle drive control systems for purposes not related to the control of a particular sub-unit
- B60W30/18 — Propelling the vehicle
- B60W30/182 — Selecting between different operative modes, e.g. comfort and performance modes
- B60W40/00 — Estimation or calculation of non-directly measurable driving parameters, e.g. by using mathematical models
- B60W40/02 — Parameters related to ambient conditions
- B60W40/06 — Road conditions
- B60W60/00 — Drive control systems specially adapted for autonomous road vehicles
- B60W60/001 — Planning or execution of driving tasks
Abstract
A vehicle control device, a vehicle control method, and a storage medium that can appropriately change the driving control of a vehicle even when the road dividing line recognized by a camera differs from the content of map information mounted on the vehicle. The vehicle control device includes: an acquisition unit that acquires a camera image capturing the surroundings of the vehicle; a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and the map information, independently of operations by the driver of the vehicle; a mode determination unit that sets the driving mode of the vehicle to a first or second driving mode; a deviation determination unit that determines whether the road dividing line shown in the camera image deviates on one side from the road dividing line shown in the map information; a change amount calculation unit that calculates the amount of change in lane width of the road dividing line shown in the camera image and the amount of change in lane width of the road dividing line shown in the map information; and a travel path generation unit that generates the center line of the travel path along which the vehicle travels in the second driving mode.
Description
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
Conventionally, techniques are known for controlling the travel of a host vehicle based on a road dividing line recognized by a camera mounted on the vehicle. For example, Japanese Patent Application Laid-Open No. 2020-050086 describes a technique in which the vehicle is driven based on the recognized road dividing line, and, when the degree of recognition of the road dividing line does not satisfy a predetermined criterion, based on the track of the preceding vehicle.
The technique described in Japanese Patent Application Laid-Open No. 2020-050086 controls the travel of the host vehicle based on the road dividing line recognized by the camera and the map information mounted on the host vehicle. In this conventional technique, however, when the road dividing line recognized by the camera differs from the content of the map information, the driving control of the vehicle may not be changed appropriately.
Disclosure of Invention
The present invention has been made in view of such circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a storage medium that can appropriately change the driving control of a vehicle even when the road division line recognized by a camera is different from the content of map information mounted on the vehicle.
The vehicle control device, the vehicle control method, and the storage medium of the present invention adopt the following configurations.
(1): A vehicle control device according to an aspect of the present invention includes: an acquisition unit that acquires a camera image capturing the surroundings of a vehicle; a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of operations by the driver of the vehicle; a mode determination unit that sets the driving mode of the vehicle to one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode placing a lighter task on the driver than the first driving mode, at least some of the plurality of driving modes, including at least the second driving mode, being controlled by the driving control unit, the mode determination unit changing the driving mode of the vehicle to a mode with a heavier task when the driver does not perform the task associated with the determined driving mode; a deviation determination unit that determines whether the road division line shown in the camera image deviates on one side from the road division line shown in the map information; a change amount calculation unit that, when such a one-sided deviation is determined to exist, calculates the amount of change in lane width of the road division line shown in the camera image and the amount of change in lane width of the road division line shown in the map information; and a travel path generation unit that generates the center line of the travel path along which the vehicle travels in the second driving mode, based on those two amounts of change.
(2): in the aspect of (1) above, the travel path generation unit generates the center line based on the road division line shown in the camera image when the amount of change in the lane width of the road division line shown in the camera image is smaller than a first threshold.
(3): in the aspect of (1) above, the travel path generation unit generates the center line based on the road division line shown in the map information when the amount of change in the lane width of the road division line shown in the camera image is equal to or larger than a first threshold value and smaller than a second threshold value and the amount of change in the lane width of the road division line shown in the map information is smaller than the first threshold value.
(4): in the aspect of (1) above, the travel path generation unit generates the center line based on the road division line shown in the camera image when the amount of change in the lane width of the road division line shown in the camera image is equal to or greater than a first threshold and less than a second threshold and the amount of change in the lane width of the road division line shown in the map information is equal to or greater than the first threshold.
(5): in the aspect of (1) above, the travel path generation unit generates the center line based on the road division line shown in the map information when the amount of change in the lane width of the road division line shown in the camera image is equal to or greater than a second threshold value and the amount of change in the lane width of the road division line shown in the map information is smaller than the second threshold value.
(6): in the aspect of (1) above, the travel path generation unit generates the center line based on the road division line shown in the camera image when the amount of change in the lane width of the road division line shown in the camera image and the amount of change in the lane width of the road division line shown in the map information are equal to or greater than a second threshold.
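Aspects (2) through (6) together form a decision table over the two thresholds. A minimal Python sketch of that selection logic follows; the function name, argument names, and the idea of returning a source label are illustrative assumptions, not taken from the patent.

```python
def select_centerline_source(dw_cam: float, dw_map: float,
                             th1: float, th2: float) -> str:
    """Choose which road division line the travel-path center line is
    generated from, following aspects (2)-(6).

    dw_cam: amount of change in lane width of the camera-image division line
    dw_map: amount of change in lane width of the map-information division line
    th1, th2: the first and second thresholds (th1 < th2)
    """
    if dw_cam < th1:
        return "camera"                               # aspect (2)
    if dw_cam < th2:                                  # th1 <= dw_cam < th2
        return "map" if dw_map < th1 else "camera"    # aspects (3), (4)
    # dw_cam >= th2
    return "map" if dw_map < th2 else "camera"        # aspects (5), (6)
```

Read this way, the camera-image line wins whenever its own lane-width reading is stable, or whenever the map reading is at least as unstable as the camera's.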
(7): In the aspect of (1) above, when the travel path generation unit has generated the center line based on the road dividing line shown in the camera image in one control cycle, and the deviation determination unit determines in the next control cycle that the road dividing line shown in the camera image deviates on both sides from the road dividing line shown in the map information, the mode determination unit changes the driving mode from the second driving mode to the first driving mode.
(8): In any one of aspects (1) to (7) above, the second driving mode is a driving mode in which the driver is not tasked with holding the operation element that receives steering operations of the vehicle, and the first driving mode is a driving mode in which the driver is tasked at least with holding that operation element.
(9): A vehicle control method according to another aspect of the present invention causes a computer to: acquire a camera image capturing the surroundings of a vehicle; control steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of operations by the driver of the vehicle; set the driving mode of the vehicle to one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode placing a lighter task on the driver than the first driving mode, some of the plurality of driving modes, including at least the second driving mode, being performed by controlling steering and acceleration/deceleration independently of the driver's operations, and change the driving mode of the vehicle to a mode with a heavier task if the driver does not perform the task associated with the determined driving mode; determine whether the road division line shown in the camera image deviates on one side from the road division line shown in the map information; when such a one-sided deviation is determined to exist, calculate the amount of change in lane width of the road division line shown in the camera image and the amount of change in lane width of the road division line shown in the map information; and generate the center line of the travel path along which the vehicle travels in the second driving mode based on those two amounts of change.
(10): A storage medium according to still another aspect of the present invention stores a program that causes a computer to: acquire a camera image capturing the surroundings of a vehicle; control steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of operations by the driver of the vehicle; set the driving mode of the vehicle to one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode placing a lighter task on the driver than the first driving mode, some of the plurality of driving modes, including at least the second driving mode, being performed by controlling steering and acceleration/deceleration independently of the driver's operations, and change the driving mode of the vehicle to a mode with a heavier task if the driver does not perform the task associated with the determined driving mode; determine whether the road division line shown in the camera image deviates on one side from the road division line shown in the map information; when such a one-sided deviation is determined to exist, calculate the amount of change in lane width of the road division line shown in the camera image and the amount of change in lane width of the road division line shown in the map information; and generate the center line of the travel path along which the vehicle travels in the second driving mode based on those two amounts of change.
According to (1) to (10), even when the road dividing line recognized by the camera is different from the content of the map information mounted on the host vehicle, the driving control of the vehicle can be appropriately changed.
Drawings
Fig. 1 is a block diagram of a vehicle system 1 using a vehicle control device according to an embodiment.
Fig. 2 is a functional configuration diagram of the first control unit and the second control unit.
Fig. 3 is a diagram showing an example of the correspondence relationship between the driving mode and the control state and the task of the host vehicle M.
Fig. 4 is a diagram showing an example of a scenario in which the operation of the vehicle control device according to the embodiment is performed.
Fig. 5 is a diagram for explaining a method in which the change amount calculating unit calculates the change amount of the lane width.
Fig. 6 is a diagram for explaining a method in which the action plan generation unit generates the center line RL of the travel path.
Fig. 7 is a diagram showing an example of a table referred to when the action plan generation unit generates the center line RL of the travel path.
Fig. 8 is a flowchart showing an example of a flow of operations executed by the vehicle control device according to the embodiment.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention are described below with reference to the drawings.
[Overall Configuration]
Fig. 1 is a block diagram of a vehicle system 1 using a vehicle control device according to an embodiment. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its driving source is an internal combustion engine such as a diesel or gasoline engine, an electric motor, or a combination thereof. The electric motor operates using power generated by a generator connected to the internal combustion engine, or using power discharged from a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a LIDAR (Light Detection and Ranging) 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, a vehicle sensor 40, a navigation device 50, an MPU (Map Positioning Unit) 60, a driver monitoring camera 70, a driving operation element 80, an automatic driving control device 100, a running driving force output device 200, a braking device 210, and a steering device 220. These devices and apparatuses are connected to each other via a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in Fig. 1 is merely an example; a part of the configuration may be omitted, or another configuration may be added.
The camera 10 is, for example, a digital camera using a solid-state imaging device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. The camera 10 is mounted on an arbitrary portion of the vehicle (hereinafter, the host vehicle M) on which the vehicle system 1 is mounted. When imaging the area ahead, the camera 10 is mounted on the upper part of the front windshield, on the interior rearview mirror, or the like. The camera 10, for example, periodically and repeatedly images the periphery of the host vehicle M. The camera 10 may also be a stereo camera.
The radar device 12 emits radio waves such as millimeter waves to the periphery of the host vehicle M, and detects at least the position (distance and azimuth) of the object by detecting the radio waves (reflected waves) reflected by the object. The radar device 12 is mounted on an arbitrary portion of the host vehicle M. The radar device 12 may also detect the position and velocity of an object by the FM-CW (Frequency Modulated Continuous Wave) method.
The LIDAR 14 irradiates light (or electromagnetic waves with wavelengths close to those of light) around the host vehicle M and measures the scattered light. The LIDAR 14 detects the distance to an object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The LIDAR 14 is mounted on an arbitrary portion of the host vehicle M.
The object recognition device 16 performs sensor fusion processing on detection results detected by some or all of the camera 10, the radar device 12, and the LIDAR14, and recognizes the position, type, speed, and the like of the object. The object recognition device 16 outputs the recognition result to the automatic driving control device 100. The object recognition device 16 may directly output the detection results of the camera 10, the radar device 12, and the LIDAR14 to the automated driving control device 100. The object recognition device 16 may also be omitted from the vehicle system 1.
The communication device 20 communicates with other vehicles existing in the vicinity of the host vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communication), or the like, or communicates with various server devices via a wireless base station.
The HMI 30 presents various information to the occupants of the host vehicle M and accepts input operations by the occupants. The HMI 30 includes various display devices, speakers, buzzers, a touch panel, switches, keys, and the like.
The vehicle sensor 40 includes a vehicle speed sensor that detects the speed of the host vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects the angular velocity about the vertical axis, an azimuth sensor that detects the direction of the host vehicle M, and the like.
The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53. The navigation device 50 holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or a flash memory. The GNSS receiver 51 determines the position of the host vehicle M based on signals received from GNSS satellites. The position of the host vehicle M may be determined or supplemented by an INS (Inertial Navigation System) using the output of the vehicle sensor 40. The navigation HMI 52 includes a display device, speakers, a touch panel, keys, and the like, and may be partially or entirely shared with the HMI 30 described above. The route determination unit 53 determines, with reference to the first map information 54, a route (hereinafter, the route on the map) from the position of the host vehicle M specified by the GNSS receiver 51 (or an arbitrary input position) to a destination input by the occupant using the navigation HMI 52. The first map information 54 is, for example, information that represents the shapes of roads by links representing roads and nodes connected by the links, and may include road curvature, POI (Point Of Interest) information, and the like. The route on the map is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI 52 based on the route on the map, and may be realized by the functions of a terminal device such as a smartphone or tablet held by the occupant. The navigation device 50 may also transmit the current position and the destination to a navigation server via the communication device 20 and acquire a route equivalent to the route on the map from the navigation server.
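Since the first map information represents roads as links between nodes, route determination amounts to a shortest-path search over that graph. The following sketch uses Dijkstra's algorithm over a simple adjacency map; the graph encoding, node names, and the choice of search algorithm are illustrative assumptions, as the patent does not specify how the route determination unit searches.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a node -> [(neighbor, link_length_m)] map.
    Returns (total_length_m, [node, ...]), or (inf, []) if unreachable."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    done = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in done:
            continue
        done.add(node)
        if node == goal:
            # reconstruct the path by walking predecessor links backwards
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        for nbr, length in graph.get(node, []):
            nd = d + length
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), []
```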
The MPU 60 includes, for example, a recommended lane determination unit 61, and holds second map information 62 in a storage device such as an HDD or a flash memory. The recommended lane determination unit 61 divides the route on the map supplied from the navigation device 50 into a plurality of blocks (for example, every 100 m in the vehicle traveling direction), and determines a recommended lane for each block with reference to the second map information 62. The recommended lane determination unit 61 determines which lane from the left to travel in, and, when a branching point exists on the route on the map, determines the recommended lane so that the host vehicle M can travel on a reasonable route toward the branching destination.
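The block division described above can be sketched as follows; this is a hypothetical helper operating on distance along the route, not the MPU's actual interface.

```python
import math

def route_blocks(route_length_m: float, block_m: float = 100.0):
    """Split a route of the given length into consecutive blocks of
    block_m metres along the traveling direction; the last block may
    be shorter than block_m."""
    n = math.ceil(route_length_m / block_m)
    return [(i * block_m, min((i + 1) * block_m, route_length_m))
            for i in range(n)]
```

For a 250 m route with 100 m blocks this yields three blocks, the last covering only the final 50 m; a recommended lane would then be chosen per block.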
The second map information 62 is map information with higher accuracy than the first map information 54. The second map information 62 includes, for example, information on lane centers or lane boundaries. The second map information 62 may also include road information, traffic restriction information, address information (addresses and zip codes), facility information, telephone number information, information on prohibition regions in which mode A or mode B (described later) is prohibited, and the like. The second map information 62 may be updated at any time through communication between the communication device 20 and other devices.
The driver monitor camera 70 is, for example, a digital camera using a solid-state imaging device such as a CCD or CMOS sensor. The driver monitor camera 70 is mounted on an arbitrary portion of the host vehicle M in a position and orientation from which the head of the occupant seated in the driver's seat (hereinafter, the driver) can be imaged from the front (so as to capture the face). For example, the driver monitor camera 70 is mounted on the upper portion of a display device provided in the center of the instrument panel of the host vehicle M.
The driving operation element 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, and other operation elements in addition to a steering wheel 82. A sensor that detects the amount of operation, or the presence or absence of operation, is attached to the driving operation element 80, and its detection result is output to the automatic driving control device 100, or to some or all of the running driving force output device 200, the brake device 210, and the steering device 220. The steering wheel 82 is an example of an "operation element that receives steering operations by the driver". The operation element need not be annular, and may take the form of an irregularly shaped steering member, a joystick, a button, or the like. A steering grip sensor 84 is attached to the steering wheel 82. The steering grip sensor 84 is implemented by a capacitance sensor or the like, and outputs to the automatic driving control device 100 a signal from which it can be detected whether the driver is gripping the steering wheel 82 (i.e., touching it with applied force).
The automatic driving control device 100 includes, for example, a first control unit 120 and a second control unit 160, each realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or by cooperation of software and hardware. The program may be stored in advance in a storage device of the automatic driving control device 100 such as an HDD or flash memory (a storage device including a non-transitory storage medium), or may be stored in a removable storage medium such as a DVD or CD-ROM and installed into the HDD or flash memory of the automatic driving control device 100 when the storage medium (non-transitory storage medium) is mounted in a drive device. The automatic driving control device 100 is an example of the "vehicle control device", and the combination of the action plan generation unit 140 and the second control unit 160 is an example of the "driving control unit".
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130, an action plan generation unit 140, and a mode determination unit 150. The first control unit 120 realizes, for example, a function based on AI (Artificial Intelligence) and a function based on a predetermined model in parallel. For example, the function of "recognizing an intersection" may be realized by performing, in parallel, recognition of the intersection by deep learning or the like and recognition based on predetermined conditions (signals, road signs, and the like that allow pattern matching), scoring both, and evaluating them comprehensively. This ensures the reliability of automated driving.
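The "score both and evaluate comprehensively" step can be as simple as a weighted combination of the two recognizers' confidences. The weights, threshold, and function below are illustrative assumptions, not the patent's actual evaluation rule.

```python
def intersection_detected(dl_score: float, rule_score: float,
                          w_dl: float = 0.6, w_rule: float = 0.4,
                          threshold: float = 0.5) -> bool:
    """Fuse a deep-learning confidence and a rule-based (pattern-matching)
    confidence, both in [0, 1], and compare the fused score against a
    decision threshold."""
    fused = w_dl * dl_score + w_rule * rule_score
    return fused >= threshold
```

A strong signal from either recognizer can then carry a weak signal from the other, while two weak signals are rejected.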
The recognition unit 130 recognizes the position, speed, acceleration, and other states of objects in the vicinity of the host vehicle M based on information input from the camera 10, the radar device 12, and the LIDAR 14 via the object recognition device 16. The position of an object is recognized, for example, as a position on absolute coordinates whose origin is a representative point of the host vehicle M (center of gravity, drive shaft center, etc.), and is used for control. The position of an object may be represented by a representative point such as its center of gravity or a corner, or by a region. The "state" of an object may include its acceleration and jerk, or its "action state" (for example, whether it is making, or about to make, a lane change).
The recognition unit 130 recognizes, for example, the lane (travel lane) in which the host vehicle M is traveling. For example, the recognition unit 130 recognizes the travel lane by comparing the pattern of road dividing lines (for example, the arrangement of solid and broken lines) obtained from the second map information 62 with the pattern of road dividing lines around the host vehicle M recognized from the image captured by the camera 10. The recognition unit 130 is not limited to road dividing lines, and may recognize the travel lane by recognizing travel road boundaries (road boundaries) including road shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the host vehicle M acquired from the navigation device 50 and the processing result of the INS may be taken into account. The recognition unit 130 also recognizes temporary stop lines, obstacles, red lights, toll booths, and other road phenomena.
When recognizing the driving lane, the recognition unit 130 recognizes the position and posture of the host vehicle M with respect to the driving lane. The recognition unit 130 may recognize, for example, a deviation of the reference point of the host vehicle M from the center of the lane and an angle formed by the traveling direction of the host vehicle M with respect to a line connecting the centers of the lanes as a relative position and posture of the host vehicle M with respect to the traveling lane. Instead of this, the recognition unit 130 may recognize the position of the reference point of the host vehicle M with respect to an arbitrary side end portion (road dividing line or road boundary) of the travel lane as the relative position of the host vehicle M with respect to the travel lane.
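The relative position and posture described above can be sketched as a small calculation (an illustration only; the function name, the coordinate convention, and the sign of the deviation are assumptions, not from the patent):

```python
import math

def relative_pose(vehicle_xy, vehicle_heading, center_p1, center_p2):
    """Lateral deviation and heading angle of the vehicle relative to a
    lane-center segment p1 -> p2 (coordinates in meters, headings in
    radians; all names are hypothetical). Positive deviation means the
    vehicle is to the left of the center line."""
    dx = center_p2[0] - center_p1[0]
    dy = center_p2[1] - center_p1[1]
    seg_heading = math.atan2(dy, dx)
    vx = vehicle_xy[0] - center_p1[0]
    vy = vehicle_xy[1] - center_p1[1]
    # Signed lateral distance via the 2-D cross product, normalized by
    # the segment length.
    deviation = (dx * vy - dy * vx) / math.hypot(dx, dy)
    # Heading relative to the lane direction, wrapped to [-pi, pi).
    angle = (vehicle_heading - seg_heading + math.pi) % (2 * math.pi) - math.pi
    return deviation, angle
```

The same routine applies unchanged when the reference line is a road division line or road boundary instead of the lane center.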
The action plan generation unit 140 generates a target track along which the host vehicle M will travel in the future automatically (independently of the driver's operation) so as to cope with the surrounding situation of the host vehicle M while, in principle, traveling in the recommended lane determined by the recommended lane determination unit 61. The target track includes, for example, a speed element. For example, the target track is expressed as a sequence of points (track points) that the host vehicle M should reach. A track point is a point that the host vehicle M should reach at every predetermined travel distance (for example, about several meters) along the road; separately from this, a target speed and a target acceleration for every predetermined sampling time (for example, about several tenths of a second) are generated as part of the target track. A track point may instead be a position that the host vehicle M should reach at each sampling timing, for every predetermined sampling time. In this case, the information on the target speed and the target acceleration is expressed by the interval between track points.
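The second variant above, in which speed is encoded by the spacing of time-sampled points, can be sketched as follows (a minimal illustration; the function names, the fixed ten-point horizon, and the callable inputs are assumptions):

```python
def track_points(path_xy, speed_at, dt=0.1, n=10):
    """Generate target-track points at each sampling time dt; the spacing
    between consecutive points then encodes the target speed, as in the
    variant where speed and acceleration are expressed by point intervals.
    path_xy: arc length s -> (x, y); speed_at: time t -> speed [m/s]."""
    points, s, t = [], 0.0, 0.0
    for _ in range(n):
        points.append(path_xy(s))  # position the vehicle should reach at time t
        s += speed_at(t) * dt      # advance along the path by v * dt
        t += dt
    return points
```

With a constant speed the points are evenly spaced; a decreasing speed profile would bunch them together, implicitly commanding deceleration.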
The action plan generation unit 140 may set an event of automatic driving when generating the target trajectory. The event of automatic driving includes a constant speed driving event, a low speed following driving event, a lane change event, a branching event, a converging event, a take over event, and the like. The action plan generation unit 140 generates a target track corresponding to the started event. The action plan generation unit 140 is an example of a "travel path generation unit".
The mode determination unit 150 determines the driving mode of the host vehicle M as any one of a plurality of driving modes that differ in the task placed on the driver. The mode determination unit 150 includes, for example, a deviation determination unit 152 and a change amount calculation unit 154. The functions of the deviation determination unit 152 and the change amount calculation unit 154 will be described later.
Fig. 3 is a diagram showing an example of the correspondence relationship among the driving modes, the control states of the host vehicle M, and the tasks. The host vehicle M has, for example, five driving modes, mode A to mode E. Regarding the control state, that is, the degree of automation of the driving control of the host vehicle M, mode A is the highest, followed in descending order by mode B, mode C, and mode D, with mode E the lowest. Conversely, regarding the task placed on the driver, mode A is the lightest, followed in ascending order by mode B, mode C, and mode D, with mode E the heaviest. Since modes D and E are control states other than automated driving, the automatic driving control device 100 is responsible for ending the control related to automated driving and shifting to driving support or manual driving. Hereinafter, the content of each driving mode is exemplified.
In mode A, the vehicle is in an automated driving state, and neither the task of monitoring the front nor the task of holding the steering wheel 82 is placed on the driver. However, even in mode A, the driver is required to maintain a body posture from which a quick shift to manual driving is possible in response to a request from the system centered on the automatic driving control device 100. Here, "automated driving" means that both steering and acceleration/deceleration are controlled independently of the driver's operation. The front means the space in the traveling direction of the host vehicle M that is visually recognized through the front windshield. Mode A is a driving mode that can be executed when, for example, the host vehicle M is traveling at a predetermined speed or less (for example, about 50 km/h) on a motorway such as an expressway and a condition such as the presence of a preceding vehicle to follow is satisfied; it is sometimes referred to as TJP (Traffic Jam Pilot). When this condition is no longer satisfied, the mode determination unit 150 changes the driving mode of the host vehicle M to mode B.
In mode B, the driver is given the task of monitoring the front of the host vehicle M (hereinafter referred to as front monitoring), but not the task of holding the steering wheel 82. In mode C, the vehicle is in a driving support state, and the driver is given both the front monitoring task and the task of holding the steering wheel 82. Mode D is a driving mode in which a certain degree of driving operation by the driver is required for at least one of steering and acceleration/deceleration of the host vehicle M. For example, in mode D, driving support such as ACC (Adaptive Cruise Control) and LKAS (Lane Keeping Assist System) is performed. Mode E is a manual driving state in which both steering and acceleration/deceleration require driving operation by the driver. In both modes D and E, the driver is naturally given the task of monitoring the front of the host vehicle M.
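The correspondence of Fig. 3 can be encoded as a small table (a hypothetical encoding following the description above, not the actual figure; the one-step escalation in `heavier_mode` is a simplification, since the escalation described below can also end in mode D or E directly):

```python
# Each driving mode maps to its degree of automation (5 = highest) and
# the tasks placed on the driver.
MODES = {
    "A": {"automation": 5, "monitor_front": False, "grip_wheel": False},
    "B": {"automation": 4, "monitor_front": True,  "grip_wheel": False},
    "C": {"automation": 3, "monitor_front": True,  "grip_wheel": True},
    "D": {"automation": 2, "monitor_front": True,  "grip_wheel": True},
    "E": {"automation": 1, "monitor_front": True,  "grip_wheel": True},
}

def heavier_mode(mode):
    """Next mode with a heavier driver task, used when the driver fails
    to perform the task of the current mode; mode E has no heavier mode."""
    order = "ABCDE"
    return order[min(order.index(mode) + 1, len(order) - 1)]
```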
The automatic driving control device 100 (and a driving support device (not shown)) executes automatic lane changes according to the driving mode. Automatic lane changes include an automatic lane change (1) based on a system request and an automatic lane change (2) based on a driver request. The automatic lane change (1) includes an automatic lane change for overtaking, performed when the speed of the preceding vehicle is lower than the speed of the host vehicle by a reference amount or more, and an automatic lane change for traveling toward the destination (an automatic lane change performed because the recommended lane has changed). The automatic lane change (2) causes the host vehicle M to change lanes in the direction in which the driver has operated the direction indicator, when conditions relating to speed, the positional relationship with surrounding vehicles, and the like are satisfied.
The automatic driving control device 100 executes neither automatic lane change (1) nor (2) in mode A. It executes both automatic lane changes (1) and (2) in modes B and C. The driving support device (not shown) executes the automatic lane change (2), but not the automatic lane change (1), in mode D. In mode E, neither automatic lane change (1) nor (2) is executed.
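The availability of the two kinds of automatic lane change per mode, as stated above, fits in one lookup table (a hypothetical encoding for illustration):

```python
# (system-requested (1), driver-requested (2)) per driving mode.
AUTO_LANE_CHANGE = {
    "A": (False, False),
    "B": (True,  True),
    "C": (True,  True),
    "D": (False, True),   # (2) is executed by the driving support device
    "E": (False, False),
}
```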
The mode determination unit 150 changes the driving mode of the host vehicle M to the driving mode having a heavier task when the driver does not execute the task related to the determined driving mode (hereinafter referred to as the current driving mode).
For example, when the driver is in a state from which a shift to manual driving in response to a request from the system is not possible in mode A (for example, when the driver continues to look outside the allowable area, or when a sign of driving difficulty is detected), the mode determination unit 150 uses the HMI 30 to prompt the driver to shift to manual driving, and, if the driver does not respond, performs control to gradually bring the host vehicle M to a stop on the road shoulder and stop the automated driving. After the automated driving is stopped, the host vehicle is in mode D or E, and the host vehicle M can be started by the driver's manual operation. The same applies hereinafter to "stopping the automated driving". When the driver is not monitoring the front in mode B, the mode determination unit 150 uses the HMI 30 to prompt the driver to monitor the front, and, if the driver does not respond, performs control to gradually bring the host vehicle M to a stop on the road shoulder and stop the automated driving. When the driver is not monitoring the front or is not holding the steering wheel 82 in mode C, the mode determination unit 150 uses the HMI 30 to prompt the driver to monitor the front and/or hold the steering wheel 82, and, if the driver does not respond, performs control to gradually bring the host vehicle M to a stop on the road shoulder and stop the automated driving.
The second control unit 160 controls the running driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes along the target track generated by the action plan generation unit 140 at the scheduled times.
Returning to fig. 2, the second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information of the target track (track point) generated by the action plan generation unit 140, and causes a memory (not shown) to store the information. The speed control unit 164 controls the traveling driving force output device 200 or the brake device 210 based on a speed element attached to the target track stored in the memory. The steering control unit 166 controls the steering device 220 according to the curve of the target track stored in the memory. The processing by the speed control unit 164 and the steering control unit 166 is realized by a combination of feedforward control and feedback control, for example. As an example, the steering control unit 166 performs a combination of feedforward control according to the curvature of the road ahead of the host vehicle M and feedback control based on the deviation from the target track.
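The combination of feedforward control from road curvature and feedback control on the track deviation can be sketched as follows (an illustration only; the kinematic-bicycle feedforward term, the wheelbase, and the gain are assumptions, not values from the patent):

```python
import math

def steering_command(curvature_ahead, lateral_error, wheelbase=2.7, k_fb=0.5):
    """Feedforward steering angle from the curvature of the road ahead of
    the host vehicle, plus a proportional feedback term on the deviation
    from the target track (positive lateral_error = left of track)."""
    feedforward = math.atan(wheelbase * curvature_ahead)  # kinematic bicycle model
    feedback = -k_fb * lateral_error                      # steer against the error
    return feedforward + feedback
```

On a straight road the feedforward term vanishes and only the feedback term acts; in a steady curve the feedforward term carries most of the command, which is the motivation for combining the two.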
The running driving force output device 200 outputs a running driving force (torque) for running the vehicle to the driving wheels. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and ECU (Electronic Control Unit) for controlling these. The ECU controls the above-described configuration in accordance with information input from the second control portion 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the second control portion 160 or information input from the driving operation member 80 so that a braking torque corresponding to a braking operation is output to each wheel. The brake device 210 may be provided with a mechanism for transmitting the hydraulic pressure generated by the operation of the brake pedal included in the drive operation element 80 to the hydraulic cylinder via the master cylinder. The brake device 210 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device that transmits the hydraulic pressure of the master cylinder to the hydraulic cylinder by controlling the actuator in accordance with information input from the second control unit 160.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor applies a force to the rack-and-pinion mechanism to change the direction of the steered wheel, for example. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 to change the direction of the steered wheels.
[ operation of vehicle control device ]
Next, the operation of the vehicle control device according to the embodiment will be described. In the following description, it is assumed that the host vehicle M is traveling in the driving mode of the mode B. Fig. 4 is a diagram showing an example of a scenario in which the operation of the vehicle control device according to the embodiment is performed.
As shown in fig. 4, while the host vehicle M is traveling in the lane L1, the recognition unit 130 recognizes the surrounding situation of the host vehicle M, in particular the road division lines on both sides of the host vehicle M, based on the image captured by the camera 10. Hereinafter, a road division line recognized based on the image captured by the camera 10 is referred to as a "camera road division line CL", and a road division line recognized based on the second map information 62 is referred to as a "map road division line ML".
The deviation determination unit 152 determines whether there is a deviation (mismatch) between the camera road division line CL and the map road division line ML while the host vehicle M is traveling. Here, the deviation means, for example, that the distance Δy between the camera road dividing line CL and the map road dividing line ML is equal to or greater than a predetermined value, or that the angle Δθ between the camera road dividing line CL and the map road dividing line ML is equal to or greater than a predetermined value.
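The deviation test just described reduces to two threshold comparisons (a minimal sketch; the function name and the threshold values are assumptions, not from the patent):

```python
def has_deviation(delta_y, delta_theta, y_thresh=0.5, theta_thresh=0.05):
    """Deviation (mismatch) between the camera road division line CL and
    the map road division line ML: the lateral distance delta_y [m] or the
    angle delta_theta [rad] between them is at or above a predetermined
    value (illustrative thresholds)."""
    return abs(delta_y) >= y_thresh or abs(delta_theta) >= theta_thresh
```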
When the deviation determination unit 152 determines that a deviation exists on only one side between the camera road division line CL and the map road division line ML, the change amount calculation unit 154 calculates the amount of change ΔWcam in the lane width based on the camera road division lines CL and the amount of change ΔWmap in the lane width based on the map road division lines ML.
Fig. 5 is a diagram for explaining the method by which the change amount calculation unit 154 calculates the amount of change in lane width. As shown in fig. 5, the change amount calculation unit 154 calculates, for example, the lane width Wc1 between the camera road division lines CL and the lane width Wm1 between the map road division lines ML at the current position of the host vehicle M, and the lane width Wc2 between the camera road division lines CL and the lane width Wm2 between the map road division lines ML at a position ahead of the host vehicle M (for example, the position 1 sec ahead at the current speed of the host vehicle M). Next, the change amount calculation unit 154 calculates the amount of change ΔWcam = Wc2 − Wc1 and the amount of change ΔWmap = Wm2 − Wm1.
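The width and change-amount calculation above can be written in a few lines (an illustration; the lateral-coordinate convention and function names are assumptions):

```python
def lane_width(left_y, right_y):
    """Lane width as the lateral distance between the left and right
    division lines at one longitudinal position (hypothetical convention:
    lateral coordinate y increases to the left)."""
    return left_y - right_y

def width_change(left_now, right_now, left_ahead, right_ahead):
    """Amount of change DeltaW = W(ahead) - W(now), where 'ahead' is,
    e.g., the position 1 sec ahead at the current speed; applied once to
    the camera lines (DeltaWcam) and once to the map lines (DeltaWmap)."""
    return lane_width(left_ahead, right_ahead) - lane_width(left_now, right_now)
```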
When the change amount calculation unit 154 calculates the change amounts Δwcam and Δwmap, the action plan generation unit 140 generates a center line (reference line) RL of the travel path on which the host vehicle M travels in the driving mode of the mode B, based on the calculated change amounts Δwcam and Δwmap and a table described later with reference to fig. 7.
Fig. 6 is a diagram for explaining the method by which the action plan generation unit 140 generates the center line RL of the travel path. In the case of fig. 6, the action plan generation unit 140 refers to the table of fig. 7 and, determining that the variation in lane width of the map road division line ML is smaller than that of the camera road division line CL, uses the map road division line ML on the side where the camera road division line CL coincides with the map road division line ML (that is, the left side in fig. 6). On the other hand, on the side where the camera road division line CL does not coincide with the map road division line ML (that is, the right side in fig. 6), the map road division line ML, which is the road division line giving the relatively narrow width, is used. The action plan generation unit 140 calculates the center line RL of the travel path by offsetting the road division line ML on the coinciding side by a distance Wm/2, half of the width Wm between the two adopted road division lines. This makes it possible to generate a travel path for the host vehicle M with higher reliability.
Fig. 7 is a diagram showing an example of the table referred to when the action plan generation unit 140 generates the center line RL of the travel path. As described with reference to fig. 6, the table of fig. 7 basically defines that the road division line whose amount of change in lane width (ΔWcam or ΔWmap) is smaller is evaluated as more reliable and is used for generating the center line RL. When the amounts of change in lane width are comparable, the camera road division line CL, which generally tends to be more reliable, is used preferentially.
First, as shown in patterns (a), (b), and (c) of fig. 7, when the amount of change ΔWcam in the lane width of the camera road division lines is smaller than a first threshold Th1, the action plan generation unit 140 uses the camera road division line CL on the side where the camera road division line CL coincides with the map road division line ML. On the side where the camera road division line CL does not coincide with the map road division line ML, it uses whichever of the camera road division line CL and the map road division line ML gives the narrower width. The action plan generation unit 140 calculates the center line RL of the travel path by offsetting the camera road division line CL on the coinciding side by a distance Wm/2, half of the width Wm between the two road division lines.
Next, as shown in pattern (d) of fig. 7, when the amount of change ΔWcam in the lane width of the camera road division lines is equal to or greater than the first threshold Th1 and smaller than a second threshold Th2 (Th2 > Th1), and the amount of change ΔWmap in the lane width of the map road division lines ML is smaller than the first threshold Th1, the action plan generation unit 140 uses the map road division line ML on the coinciding side. On the non-coinciding side, it uses whichever of the camera road division line CL and the map road division line ML gives the narrower width. The action plan generation unit 140 calculates the center line RL of the travel path by offsetting the map road division line ML on the coinciding side by the distance Wm/2, half of the width Wm between the two road division lines.
Next, as shown in patterns (e) and (f) of fig. 7, when the amount of change ΔWcam in the lane width of the camera road division lines is equal to or greater than the first threshold Th1 and smaller than the second threshold Th2, and the amount of change ΔWmap in the lane width of the map road division lines ML is equal to or greater than the first threshold Th1, the action plan generation unit 140 uses the camera road division line CL on the coinciding side. On the non-coinciding side, it uses whichever of the camera road division line CL and the map road division line ML gives the narrower width. The action plan generation unit 140 calculates the center line RL of the travel path by offsetting the camera road division line CL on the coinciding side by the distance Wm/2.
Next, as shown in patterns (g) and (h) of fig. 7, when the amount of change ΔWcam in the lane width of the camera road division lines is equal to or greater than the second threshold Th2 and the amount of change ΔWmap in the lane width of the map road division lines ML is smaller than the second threshold Th2, the action plan generation unit 140 uses the map road division line ML on both the coinciding side and the non-coinciding side. That is, the action plan generation unit 140 calculates the center line RL of the travel path by offsetting the map road division line ML on the coinciding side by a distance Wm/2, half of the width Wm between the map road division lines ML on both sides.
Next, as shown in pattern (i) of fig. 7, when both the amount of change ΔWcam in the lane width of the camera road division lines and the amount of change ΔWmap in the lane width of the map road division lines ML are equal to or greater than the second threshold Th2, the action plan generation unit 140 uses the camera road division line CL on the coinciding side. On the non-coinciding side, it uses whichever of the camera road division line CL and the map road division line ML gives the narrower width. The action plan generation unit 140 calculates the center line RL of the travel path by offsetting the camera road division line CL on the coinciding side by the distance Wm/2.
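The selection rules of patterns (a) through (i) above condense into one function (a sketch; the threshold values Th1 and Th2 and the treatment of the change amounts as magnitudes are assumptions, not values from the patent):

```python
def matched_side_source(d_wcam, d_wmap, th1=0.3, th2=0.6):
    """Which division line to use on the side where CL and ML coincide,
    following patterns (a)-(i) of Fig. 7. Returns 'camera', 'map', or
    'map_both' (use ML on both sides). th1 < th2 are illustrative
    thresholds in meters."""
    d_wcam, d_wmap = abs(d_wcam), abs(d_wmap)
    if d_wcam < th1:
        return "camera"        # patterns (a), (b), (c)
    if d_wcam < th2:
        if d_wmap < th1:
            return "map"       # pattern (d)
        return "camera"        # patterns (e), (f)
    if d_wmap < th2:
        return "map_both"      # patterns (g), (h)
    return "camera"            # pattern (i)
```

The center line RL then follows by offsetting the selected line by half the lane width, as described for each pattern.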
In this way, when the deviation determination unit 152 determines that a deviation has occurred on one side between the camera road division line CL and the map road division line ML, the change amount calculation unit 154 calculates the amount of change ΔWcam in the lane width of the camera road division lines CL and the amount of change ΔWmap in the lane width of the map road division lines ML. The action plan generation unit 140 compares the two calculated amounts of change ΔWcam and ΔWmap with the thresholds Th1 and Th2, evaluates the road division line with the smaller amount of change in lane width as more reliable, and uses it for generating the center line RL. This makes it possible to generate a travel path for the host vehicle M with higher reliability.
Next, the flow of operations executed by the vehicle control device according to the embodiment will be described. Fig. 8 is a flowchart showing an example of this flow. The processing of this flowchart is executed in a predetermined cycle while the host vehicle M is traveling in the driving mode of mode B.
First, the vehicle control device determines whether the center line RL based on the camera road division line CL was generated in the previous cycle because the camera road division line CL coincided with the map road division line ML on one side (step S100). When it determines that the center line RL based on the camera road division line CL was generated in the previous cycle, the vehicle control device determines whether the same one-sided coincidence is also maintained in the present cycle (step S102). When it determines that the same one-sided coincidence is not maintained in the present cycle (that is, that neither side coincides), the vehicle control device changes the driving mode from mode B to mode C (step S104).
On the other hand, when it determines that the center line RL based on the camera road division line CL was not generated in the previous cycle from a one-sided coincidence between the camera road division line CL and the map road division line ML, the vehicle control device determines whether the camera road division line CL coincides with the map road division line ML on one side in the present cycle (step S106). When it determines that the camera road division line CL does not coincide with the map road division line ML on one side in the present cycle (that is, that neither side coincides), the vehicle control device calculates the center line between the map road division lines ML on both sides (that is, the map center line) as the center line RL of the travel path (step S108).
When it determines that the camera road division line CL coincides with the map road division line ML on one side in the present cycle, or that the one-sided coincidence of the previous cycle is maintained in the present cycle, the vehicle control device determines whether the camera road division line CL on the coinciding side is present over a range up to a predetermined distance ahead (step S110). When it determines that the camera road division line CL on the coinciding side is not present over the range up to the predetermined distance ahead, the vehicle control device changes the driving mode from mode B to mode C.
On the other hand, when it determines that the camera road division line CL on the coinciding side is present over the range up to the predetermined distance ahead, the vehicle control device next determines whether the map road division lines ML are present on both sides (step S112). When it determines that the map road division lines ML are not present on both sides, the vehicle control device calculates the center line between the camera road division lines CL on both sides (that is, the camera center line) as the center line RL of the travel path (step S114). On the other hand, when it determines that the map road division lines ML are present on both sides, the vehicle control device calculates the center line RL of the travel path based on the table shown in fig. 7 (step S116). The processing of this flowchart then ends.
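The per-cycle decision of the Fig. 8 flowchart can be condensed to one function (a sketch; the boolean inputs and result labels are hypothetical stand-ins for the recognition results and the flowchart outcomes):

```python
def cycle_decision(prev_rl_from_one_side, one_side_match_now,
                   camera_line_ahead, map_lines_both_sides):
    """One pass of the Fig. 8 flowchart reduced to its outcome: 'mode_C'
    (fall back to mode C), 'map_center' (S108), 'camera_center' (S114),
    or 'table' (S116, the Fig. 7 table)."""
    if prev_rl_from_one_side and not one_side_match_now:
        return "mode_C"          # S102 fails -> S104
    if not prev_rl_from_one_side and not one_side_match_now:
        return "map_center"      # S106 fails -> S108
    # One-sided coincidence holds (this cycle, or continued from the last).
    if not camera_line_ahead:
        return "mode_C"          # S110 fails
    if not map_lines_both_sides:
        return "camera_center"   # S112 fails -> S114
    return "table"               # S116
```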
According to the present embodiment described above, when it is determined that the camera road division line deviates from the map road division line on one side, the amount of change in the lane width of the camera road division lines and the amount of change in the lane width of the map road division lines are calculated, and the center line of the travel path on which the vehicle travels is generated based on the result of comparing the calculated amounts of change with the thresholds. Thus, even when a road division line recognized by the camera differs from the content of the map information mounted on the host vehicle, the driving control of the vehicle can be changed appropriately.
The embodiments described above can be expressed as follows.
A vehicle control device is provided with:
a storage device storing a program; and
a hardware processor,
wherein the hardware processor executes computer-readable instructions stored in the storage device to perform the following processing:
acquiring a camera image obtained by capturing a surrounding situation of a vehicle;
controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information;
determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, some of the plurality of driving modes, including at least the second driving mode, being executed by controlling steering and acceleration/deceleration of the vehicle independently of the operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode with a heavier task when the driver does not perform the task related to the determined driving mode;
determining whether there is a deviation on one side between a road division line shown by the camera image and a road division line shown by the map information;
when it is determined that there is a deviation on one side between the road division line shown in the camera image and the road division line shown in the map information, calculating an amount of change in the lane width of the road division line shown in the camera image and an amount of change in the lane width of the road division line shown in the map information; and
generating a center line of a travel path on which the vehicle travels in the second driving mode, based on the amount of change in the lane width of the road division line shown in the camera image and the amount of change in the lane width of the road division line shown in the map information.
The specific embodiments of the present invention have been described above using the embodiments, but the present invention is not limited to such embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.
Claims (10)
1. A vehicle control apparatus, wherein,
the vehicle control device includes:
an acquisition unit that acquires a camera image obtained by capturing a surrounding situation of a vehicle;
a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of an operation of a driver of the vehicle;
a mode determination unit that determines a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, some of the plurality of driving modes, including at least the second driving mode, being executed under control by the driving control unit, the mode determination unit changing the driving mode of the vehicle to a driving mode with a heavier task when the driver does not perform the task related to the determined driving mode;
a deviation determination unit that determines whether or not there is a deviation on one side between a road division line indicated by the camera image and a road division line indicated by the map information;
a change amount calculation unit that calculates an amount of change in the lane width of the road division line shown in the camera image and an amount of change in the lane width of the road division line shown in the map information when it is determined that there is a deviation on one side between the road division line shown in the camera image and the road division line shown in the map information; and
and a travel path generation unit that generates a center line of a travel path along which the vehicle travels in the second driving mode, based on an amount of change in the lane width of the road division line shown in the camera image and an amount of change in the lane width of the road division line shown in the map information.
2. The vehicle control apparatus according to claim 1, wherein,
the travel path generation unit generates the center line based on the road division line shown in the camera image when the amount of change in the lane width of the road division line shown in the camera image is smaller than a first threshold value.
3. The vehicle control apparatus according to claim 1, wherein,
The travel path generation unit generates the center line based on the road division line shown in the map information when the amount of change in the lane width of the road division line shown in the camera image is equal to or greater than a first threshold and less than a second threshold and the amount of change in the lane width of the road division line shown in the map information is less than the first threshold.
4. The vehicle control apparatus according to claim 1, wherein,
the travel path generation unit generates the center line based on the road division line shown in the camera image when the amount of change in the lane width of the road division line shown in the camera image is equal to or greater than a first threshold and less than a second threshold and the amount of change in the lane width of the road division line shown in the map information is equal to or greater than the first threshold.
5. The vehicle control apparatus according to claim 1, wherein,
the travel path generation unit generates the center line based on the road division line shown in the map information when the amount of change in the lane width of the road division line shown in the camera image is equal to or greater than a second threshold value and the amount of change in the lane width of the road division line shown in the map information is smaller than the second threshold value.
6. The vehicle control apparatus according to claim 1, wherein,
the travel path generation unit generates the center line based on the road division line shown in the camera image when both the amount of change in the lane width of the road division line shown in the camera image and the amount of change in the lane width of the road division line shown in the map information are equal to or greater than a second threshold.
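Claims 2 through 6 together define a decision table over the two change amounts and the two thresholds. A minimal sketch of that table follows (function and variable names are hypothetical, and the first threshold is assumed smaller than the second):

```python
def select_line_source(camera_delta: float, map_delta: float,
                       t1: float, t2: float) -> str:
    """Pick which road-division-line source the center line is built from.

    camera_delta / map_delta: amounts of change in the lane width derived
    from the camera image and the map information, respectively.
    t1 / t2: the first and second thresholds (t1 < t2 assumed).
    """
    if camera_delta < t1:
        # Claim 2: the camera-side change is small, so trust the camera image.
        return "camera"
    if camera_delta < t2:
        # Claims 3 and 4: the camera-side change is moderate; fall back to
        # the map only if its own change amount is below the first threshold.
        return "map" if map_delta < t1 else "camera"
    # Claims 5 and 6: the camera-side change is large; use the map unless
    # its change amount is also at or above the second threshold.
    return "map" if map_delta < t2 else "camera"
```

For example, with t1 = 0.2 m and t2 = 0.4 m, a camera-side change of 0.3 m paired with a map-side change of 0.1 m selects the map-based line (claim 3), while the same camera-side change paired with a map-side change of 0.3 m keeps the camera-based line (claim 4).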
7. The vehicle control apparatus according to claim 1, wherein,
the mode determination unit changes the driving mode from the second driving mode to the first driving mode when the travel path generation unit generates the center line based on the road division line shown in the camera image in a given control cycle and it is then determined in the next control cycle that there is a deviation on both sides between the road division line shown in the camera image and the road division line shown in the map information.
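The fallback in claim 7 amounts to a per-control-cycle check; a sketch under assumed names ("second"/"first" standing for the two driving modes, all identifiers hypothetical):

```python
def next_mode(mode: str, camera_based_last_cycle: bool,
              both_sides_deviate: bool) -> str:
    """Demote from the second to the first driving mode when the camera-based
    line was used last cycle but now deviates from the map on both sides."""
    if mode == "second" and camera_based_last_cycle and both_sides_deviate:
        return "first"  # a heavier task is handed back to the driver
    return mode
```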
8. The vehicle control apparatus according to any one of claims 1 to 7, wherein,
the second driving mode is a driving mode in which the driver is not assigned a task of holding an operation element that accepts a steering operation of the vehicle, and
the first driving mode is a driving mode in which the driver is assigned at least the task of holding the operation element.
9. A vehicle control method, wherein,
the vehicle control method causes a computer to perform the following processing:
acquiring a camera image obtained by imaging the surroundings of a vehicle;
controlling the steering and acceleration/deceleration of the vehicle, independently of an operation by the driver of the vehicle, based on the camera image and map information;
determining the driving mode of the vehicle as one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode imposing a lighter task on the driver than the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being executed by controlling the steering and acceleration/deceleration of the vehicle independently of an operation by the driver, and changing the driving mode of the vehicle to a driving mode with a heavier task when the driver does not execute a task associated with the determined driving mode;
determining whether there is a deviation on one side between the road division line shown in the camera image and the road division line shown in the map information;
when it is determined that there is a deviation on one side between the road division line shown in the camera image and the road division line shown in the map information, calculating an amount of change in the lane width of the road division line shown in the camera image and an amount of change in the lane width of the road division line shown in the map information; and
generating a center line of a travel path along which the vehicle travels in the second driving mode, based on the amount of change in the lane width of the road division line shown in the camera image and the amount of change in the lane width of the road division line shown in the map information.
10. A storage medium storing a program, wherein,
the program causes a computer to perform the following processing:
acquiring a camera image obtained by imaging the surroundings of a vehicle;
controlling the steering and acceleration/deceleration of the vehicle, independently of an operation by the driver of the vehicle, based on the camera image and map information;
determining the driving mode of the vehicle as one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode imposing a lighter task on the driver than the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being executed by controlling the steering and acceleration/deceleration of the vehicle independently of an operation by the driver, and changing the driving mode of the vehicle to a driving mode with a heavier task when the driver does not execute a task associated with the determined driving mode;
determining whether there is a deviation on one side between the road division line shown in the camera image and the road division line shown in the map information;
when it is determined that there is a deviation on one side between the road division line shown in the camera image and the road division line shown in the map information, calculating an amount of change in the lane width of the road division line shown in the camera image and an amount of change in the lane width of the road division line shown in the map information; and
generating a center line of a travel path along which the vehicle travels in the second driving mode, based on the amount of change in the lane width of the road division line shown in the camera image and the amount of change in the lane width of the road division line shown in the map information.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-059646 | 2022-03-31 | ||
JP2022059646A JP2023150506A (en) | 2022-03-31 | 2022-03-31 | Vehicle control device, vehicle control method and program |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116890837A true CN116890837A (en) | 2023-10-17 |
Family
ID=88311234
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310308444.0A Pending CN116890837A (en) | 2022-03-31 | 2023-03-27 | Vehicle control device, vehicle control method, and storage medium |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2023150506A (en) |
CN (1) | CN116890837A (en) |
- 2022-03-31: JP application JP2022059646A filed; published as JP2023150506A (pending)
- 2023-03-27: CN application CN202310308444.0A filed; published as CN116890837A (pending)
Also Published As
Publication number | Publication date |
---|---|
JP2023150506A (en) | 2023-10-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||