CN116890824A - Vehicle control device, vehicle control method, and storage medium - Google Patents
- Publication number: CN116890824A (application CN202310311201.2A)
- Authority: CN (China)
- Prior art keywords: vehicle, driving mode, dividing line, driving, parallelism
- Legal status: Pending (an assumption by Google, not a legal conclusion)
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/005—Handover processes
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/53—Road markings, e.g. lane marker or crosswalk
Abstract
A vehicle control device, a vehicle control method, and a storage medium capable of appropriately changing the driving control of a vehicle even when the road dividing line recognized by a camera differs from the content of map information mounted on the vehicle. The vehicle control device includes: an acquisition unit that acquires a camera image obtained by capturing the surroundings of a vehicle; a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of an operation of a driver of the vehicle; a mode determination unit that determines the driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode; a deviation determination unit that determines whether or not there is a deviation between the road dividing line indicated by the camera image and the road dividing line indicated by the map information; and a parallelism calculating unit that calculates the parallelism between the tracks of one or more other vehicles existing in the vicinity of the vehicle and the road dividing line indicated by the camera image.
Description
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
Conventionally, a technique for controlling the travel of a host vehicle based on a road dividing line recognized by a camera mounted on the vehicle has been known. For example, japanese patent application laid-open No. 2020-050086 describes the following technique: the vehicle is driven based on the identified road dividing line, and when the degree of identification of the road dividing line does not satisfy a predetermined criterion, the vehicle is driven based on the track of the preceding vehicle.
The technique described in Japanese patent application laid-open No. 2020-050086 controls the travel of the host vehicle based on the road dividing line recognized by the camera and the map information mounted on the host vehicle. However, in the conventional technique, when the road dividing line recognized by the camera differs from the content of the map information mounted on the host vehicle, the driving control of the vehicle may not be appropriately changed.
Disclosure of Invention
The present invention has been made in view of such circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a storage medium that can appropriately change the driving control of a vehicle even when the road division line recognized by a camera is different from the content of map information mounted on the vehicle.
The vehicle control device, the vehicle control method, and the storage medium of the present invention adopt the following configurations.
(1): a vehicle control device according to an aspect of the present invention includes: an acquisition unit that acquires a camera image obtained by capturing a surrounding situation of a vehicle; a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of an operation of a driver of the vehicle; a mode determination unit that determines a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which a task to be placed on the driver is lighter than the first driving mode, at least a part of the driving modes including the second driving mode among the plurality of driving modes being controlled by the driving control unit, and the mode determination unit changing the driving mode of the vehicle to a driving mode in which the task is heavier when the determined task related to the driving mode is not executed by the driver; a deviation determination unit that determines whether or not there is a deviation between a road dividing line shown in the camera image and a road dividing line shown in the map information; and a parallelism calculating unit that calculates a parallelism between a track of one or more other vehicles existing in the vicinity of the vehicle and a road dividing line indicated by the camera image when it is determined that there is a deviation between the road dividing line indicated by the camera image and the road dividing line indicated by the map information, wherein the mode determining unit determines to continue the second driving mode when the calculated parallelism is equal to or greater than a first threshold.
(2): in the aspect (1) above, the parallelism calculating unit calculates the parallelism when the number of one or more other vehicles existing in the vicinity of the vehicle is equal to or greater than a second threshold.
(3): in the aspect of (1) above, the parallelism calculating unit calculates the parallelism based on a trajectory of one or more other vehicles existing in the vicinity of the vehicle, a road dividing line indicated by the camera image, and a road dividing line indicated by the map information.
(4): in the aspect of (3) above, the parallelism calculating unit calculates the parallelism based on an average value of angles between the respective trajectories of the one or more other vehicles existing in the vicinity of the vehicle and the road dividing line indicated by the camera image, and an average value of angles between the respective trajectories of the one or more other vehicles existing in the vicinity of the vehicle and the road dividing line indicated by the map information.
(5): in the aspect of (3) above, the parallelism calculating unit calculates the parallelism based on a central value of an angle between a track of each of the one or more other vehicles existing in the vicinity of the vehicle and a road dividing line shown in the camera image, and a central value of an angle between a track of each of the one or more other vehicles existing in the vicinity of the vehicle and the road dividing line shown in the map information.
(6): in any one of the above aspects (1) to (5), the second driving mode is a driving mode in which the task of holding an operation element that accepts a steering operation of the vehicle is not placed on the driver, and the first driving mode is a driving mode in which at least the task of holding the operation element is placed on the driver.
(7): another aspect of the invention provides a vehicle control method that causes a computer to perform: acquiring a camera image obtained by capturing a surrounding situation of a vehicle; controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information; determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, wherein the second driving mode is a driving mode in which a task to be placed on the driver is lighter than the first driving mode, and the second driving mode is performed by controlling steering and acceleration/deceleration of the vehicle independently of an operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode in which the task is heavier when the determined task related to the driving mode is not executed by the driver; determining whether there is a deviation between a road dividing line shown in the camera image and a road dividing line shown in the map information; when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, calculating a parallelism between a track of one or more other vehicles existing in the periphery of the vehicle and the road dividing line shown in the camera image; and determining to continue the second driving mode when the calculated parallelism is equal to or greater than a first threshold.
(8): a storage medium according to still another aspect of the present invention stores a program, wherein the program causes a computer to: acquiring a camera image obtained by capturing a surrounding situation of a vehicle; controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information; determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, wherein the second driving mode is a driving mode in which a task to be placed on the driver is lighter than the first driving mode, and the second driving mode is performed by controlling steering and acceleration/deceleration of the vehicle independently of an operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode in which the task is heavier when the determined task related to the driving mode is not executed by the driver; determining whether there is a deviation between a road dividing line shown in the camera image and a road dividing line shown in the map information; when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, calculating a parallelism between a track of one or more other vehicles existing in the periphery of the vehicle and the road dividing line shown in the camera image; and determining to continue the second driving mode when the calculated parallelism is equal to or greater than a first threshold.
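The mode-continuation decision shared by aspects (1), (7), and (8) can be sketched as follows. This is an illustrative reading only; the function name, the mode labels, and the default threshold value are assumptions, not taken from the text:

```python
def next_driving_mode(current_mode: str,
                      has_deviation: bool,
                      parallelism: float,
                      first_threshold: float = 0.9) -> str:
    """Decide whether the second (lighter-task) driving mode may continue.

    has_deviation: True when the camera dividing line deviates from the map.
    parallelism:   parallelism between other-vehicle tracks and the camera line.
    """
    if current_mode != "second":
        return current_mode          # only the second mode is at stake here
    if not has_deviation:
        return "second"              # camera and map agree; nothing to change
    if parallelism >= first_threshold:
        return "second"              # other vehicles' tracks confirm the camera line
    return "first"                   # fall back to the heavier-task first mode
```

In this sketch the second mode survives a camera/map deviation only when the surrounding traffic corroborates the camera-recognized line.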
According to (1) to (8), even when the road dividing line recognized by the camera is different from the content of the map information mounted on the host vehicle, the driving control of the vehicle can be appropriately changed.
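As a concrete reading of the parallelism computation of aspects (4) and (5): the exact formula is not given in the text, so the sketch below simply compares the mean (or median) track-to-line angle for the camera line against that for the map line, with larger results meaning the tracks agree better with the camera line. All names and the sign convention are illustrative assumptions:

```python
import math
from statistics import mean, median

def track_angle(track, line_heading_rad):
    """Angle (radians, folded into [0, pi/2]) between a vehicle track,
    given as a list of (x, y) points, and a line with the given heading."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    diff = abs(math.atan2(y1 - y0, x1 - x0) - line_heading_rad) % math.pi
    return min(diff, math.pi - diff)

def parallelism(tracks, camera_heading_rad, map_heading_rad, use_median=False):
    """Positive when the other-vehicle tracks run more parallel to the
    camera-recognized line than to the map line (illustrative definition)."""
    agg = median if use_median else mean
    cam = agg(track_angle(t, camera_heading_rad) for t in tracks)
    mp = agg(track_angle(t, map_heading_rad) for t in tracks)
    return mp - cam
```

Aspect (4) corresponds to the mean aggregation, aspect (5) to the median, which is more robust against a single outlier track such as a lane-changing vehicle.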
Drawings
Fig. 1 is a configuration diagram of a vehicle system using a vehicle control device according to an embodiment.
Fig. 2 is a functional configuration diagram of the first control unit and the second control unit.
Fig. 3 is a diagram showing an example of the correspondence relationship between the driving mode, the control state of the host vehicle M, and the task.
Fig. 4 is a diagram showing an example of a scenario in which the operation of the vehicle control device according to the embodiment is performed.
Fig. 5 is a graph for explaining a method of determining a driving mode based on parallelism.
Fig. 6 is a diagram showing an example of another vehicle that is excluded from the calculation targets of the parallelism.
Fig. 7 is a flowchart showing an example of a flow of operations executed by the vehicle control device according to the embodiment.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention are described below with reference to the drawings.
[Overall Structure]
Fig. 1 is a configuration diagram of a vehicle system 1 using a vehicle control device according to an embodiment. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its driving source is an internal combustion engine such as a diesel engine or a gasoline engine, an electric motor, or a combination thereof. The electric motor operates using power generated by a generator connected to the internal combustion engine, or using discharge power of a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a LIDAR (Light Detection and Ranging) 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, a vehicle sensor 40, a navigation device 50, an MPU (Map Positioning Unit) 60, a driver monitor camera 70, a driving operation element 80, an automatic driving control device 100, a running driving force output device 200, a brake device 210, and a steering device 220. These devices and apparatuses are connected to each other via a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in Fig. 1 is merely an example; a part of the configuration may be omitted, or another configuration may be added.
The camera 10 is, for example, a digital camera using a solid-state imaging device such as CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor). The camera 10 is mounted on an arbitrary portion of a vehicle (hereinafter referred to as the host vehicle M) on which the vehicle system 1 is mounted. When photographing the front, the camera 10 is mounted on the upper part of the front windshield, the rear view mirror of the vehicle interior, or the like. The camera 10, for example, periodically and repeatedly photographs the periphery of the host vehicle M. The camera 10 may also be a stereoscopic camera.
The radar device 12 emits radio waves such as millimeter waves to the periphery of the host vehicle M, and detects at least the position (distance and azimuth) of the object by detecting the radio waves (reflected waves) reflected by the object. The radar device 12 is mounted on an arbitrary portion of the host vehicle M. The radar device 12 may also detect the position and velocity of an object by the FM-CW (Frequency Modulated Continuous Wave) method.
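For reference, range measurement in the FM-CW method mentioned above follows from the beat frequency between the transmitted and received chirps. The sketch below assumes a sawtooth frequency sweep; the parameter values in the usage are illustrative, not taken from the text:

```python
C = 299_792_458.0  # speed of light [m/s]

def fmcw_range(beat_hz, bandwidth_hz, sweep_time_s):
    """Target range for a sawtooth FM-CW sweep: R = c * T * f_b / (2 * B),
    where f_b is the beat frequency, B the sweep bandwidth, T the sweep time."""
    return C * sweep_time_s * beat_hz / (2.0 * bandwidth_hz)
```

For example, with a 150 MHz sweep over 1 ms, a 50 m target produces a beat frequency of roughly 50 kHz, and the same formula recovers the range from it.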
The LIDAR 14 irradiates light (or electromagnetic waves having wavelengths close to light) to the periphery of the host vehicle M and measures the scattered light. The LIDAR 14 detects the distance to an object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The LIDAR 14 is mounted on any portion of the host vehicle M.
The object recognition device 16 performs sensor fusion processing on detection results detected by some or all of the camera 10, the radar device 12, and the LIDAR14, and recognizes the position, type, speed, and the like of the object. The object recognition device 16 outputs the recognition result to the automatic driving control device 100. The object recognition device 16 may directly output the detection results of the camera 10, the radar device 12, and the LIDAR14 to the automated driving control device 100. The object recognition device 16 may also be omitted from the vehicle system 1.
The communication device 20 communicates with other vehicles existing in the vicinity of the host vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communication), or the like, or communicates with various server devices via a wireless base station.
The HMI 30 presents various information to the occupant of the host vehicle M and accepts input operations by the occupant. The HMI 30 includes various display devices, speakers, buzzers, a touch panel, switches, keys, and the like.
The vehicle sensor 40 includes a vehicle speed sensor that detects the speed of the host vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects the angular velocity about the vertical axis, an azimuth sensor that detects the direction of the host vehicle M, and the like.
The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53. The navigation device 50 holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or a flash memory. The GNSS receiver 51 determines the position of the host vehicle M based on signals received from GNSS satellites. The position of the host vehicle M may be determined or complemented by an INS (Inertial Navigation System) using the output of the vehicle sensor 40. The navigation HMI 52 includes a display device, speakers, a touch panel, keys, and the like. The navigation HMI 52 may be partially or entirely shared with the HMI 30 described above. The route determination unit 53 determines a route (hereinafter referred to as a route on a map) from the position of the host vehicle M specified by the GNSS receiver 51 (or an arbitrary input position) to a destination input by the occupant using the navigation HMI 52, for example, with reference to the first map information 54. The first map information 54 is, for example, information representing the shapes of roads by links representing road segments and nodes connected by the links. The first map information 54 may include road curvature, POI (Point Of Interest) information, and the like. The route on the map is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI 52 based on the route on the map. The navigation device 50 may be realized by the functions of a terminal device such as a smartphone or a tablet terminal held by the occupant. The navigation device 50 may transmit the current position and the destination to a navigation server via the communication device 20 and acquire a route equivalent to the route on the map from the navigation server.
The MPU 60 includes, for example, a recommended lane determination unit 61, and holds second map information 62 in a storage device such as an HDD or a flash memory. The recommended lane determination unit 61 divides the route on the map supplied from the navigation device 50 into a plurality of blocks (for example, by dividing it every 100 m in the vehicle traveling direction), and determines a recommended lane for each block by referring to the second map information 62. The recommended lane determination unit 61 determines which lane, counted from the left, to travel in. When a branching point exists on the route on the map, the recommended lane determination unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for proceeding to the branch destination.
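The block division described above (splitting the on-map route every 100 m in the traveling direction) can be sketched as follows; the function name and tuple representation are illustrative:

```python
def split_into_blocks(route_length_m, block_m=100.0):
    """Divide a route of the given length into consecutive (start, end)
    blocks of block_m metres each; the final block may be shorter."""
    blocks, start = [], 0.0
    while start < route_length_m:
        end = min(start + block_m, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks
```

A recommended lane would then be determined per block, e.g. by looking up the second map information 62 for each (start, end) interval.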
The second map information 62 is map information having higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the centers of lanes or information on the boundaries of lanes. The second map information 62 may include road information, traffic restriction information, address information (address, zip code), facility information, telephone number information, information on prohibition regions where mode A or mode B described later is prohibited, and the like. The second map information 62 may be updated at any time by the communication device 20 communicating with other devices.
The driver monitor camera 70 is, for example, a digital camera using a solid-state imaging device such as a CCD or CMOS. The driver monitor camera 70 is mounted on an arbitrary portion of the host vehicle M in a position and orientation from which the head of an occupant (hereinafter referred to as the driver) seated in the driver's seat of the host vehicle M can be imaged from the front (so that the face is imaged). For example, the driver monitor camera 70 is mounted on the upper portion of a display device provided in the center portion of the instrument panel of the host vehicle M.
The driving operation element 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, and other operation elements in addition to the steering wheel 82. A sensor that detects the amount of operation or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to the automatic driving control device 100, or to some or all of the running driving force output device 200, the brake device 210, and the steering device 220. The steering wheel 82 is an example of an "operation element that accepts a steering operation by the driver". The operation element need not necessarily be annular, and may take the form of a shaped steering member, a joystick, a button, or the like. A steering wheel grip sensor 84 is attached to the steering wheel 82. The steering wheel grip sensor 84 is implemented by a capacitance sensor or the like, and outputs to the automatic driving control device 100 a signal from which it can be detected whether the driver is gripping the steering wheel 82 (i.e., touching it with force applied).
The automatic driving control device 100 includes, for example, a first control unit 120 and a second control unit 160. The first control unit 120 and the second control unit 160 are each realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or by cooperation of software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD or a flash memory of the automatic driving control device 100, or may be stored in a removable storage medium (non-transitory storage medium) such as a DVD or CD-ROM and installed in the HDD or flash memory of the automatic driving control device 100 by mounting the storage medium in a drive device. The automatic driving control device 100 is an example of a "vehicle control device", and the combination of the action plan generation unit 140 and the second control unit 160 is an example of a "driving control unit".
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130, an action plan generation unit 140, and a mode determination unit 150. The first control unit 120 realizes, for example, a function based on AI (Artificial Intelligence) and a function based on a predetermined model in parallel. For example, the function of "recognizing an intersection" may be realized by performing, in parallel, recognition of an intersection by deep learning or the like and recognition based on predetermined conditions (the presence of a signal, a road sign, or the like that enables pattern matching), scoring both, and evaluating them comprehensively. This ensures the reliability of automatic driving.
The recognition unit 130 recognizes the position, speed, acceleration, and other states of objects located in the vicinity of the host vehicle M based on information input from the camera 10, the radar device 12, and the LIDAR 14 via the object recognition device 16. The position of an object is recognized, for example, as a position on absolute coordinates whose origin is a representative point (center of gravity, drive shaft center, etc.) of the host vehicle M, and is used for control. The position of an object may be represented by a representative point such as the center of gravity or a corner of the object, or may be represented by a region. The "state" of an object may include the acceleration or jerk of the object, or its "behavior state" (for example, whether it is changing lanes or is about to change lanes).
The recognition unit 130 recognizes, for example, the lane (traveling lane) in which the host vehicle M is traveling. For example, the recognition unit 130 recognizes the traveling lane by comparing the pattern of road dividing lines (for example, the arrangement of solid lines and broken lines) obtained from the second map information 62 with the pattern of road dividing lines around the host vehicle M recognized from the image captured by the camera 10. The recognition unit 130 is not limited to road dividing lines, and may recognize the traveling lane by recognizing traveling road boundaries (road boundaries) including road dividing lines, road shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the host vehicle M acquired from the navigation device 50 and the processing result of the INS may be taken into account. The recognition unit 130 also recognizes temporary stop lines, obstacles, red lights, toll booths, and other road phenomena.
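The dividing-line pattern comparison performed by the recognition unit 130 can be illustrated as a simple pattern match. Representing a lane by a (left-line, right-line) tuple of line styles is an assumption made for illustration only:

```python
def match_travel_lane(camera_pattern, map_lane_patterns):
    """Return the index of the map lane whose (left, right) dividing-line
    pattern equals the camera-observed pattern, or None if none matches."""
    for idx, pattern in enumerate(map_lane_patterns):
        if pattern == camera_pattern:
            return idx
    return None
```

For example, observing a dashed line on the left and a solid line on the right would identify the rightmost lane of a two-lane road whose map patterns are [("solid", "dashed"), ("dashed", "solid")].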
When recognizing the driving lane, the recognition unit 130 recognizes the position and posture of the host vehicle M with respect to the driving lane. The recognition unit 130 may recognize, for example, a deviation of the reference point of the host vehicle M from the center of the lane and an angle formed by the traveling direction of the host vehicle M with respect to a line connecting the centers of the lanes as a relative position and posture of the host vehicle M with respect to the traveling lane. Instead of this, the recognition unit 130 may recognize the position of the reference point of the host vehicle M with respect to an arbitrary side end portion (road dividing line or road boundary) of the travel lane as the relative position of the host vehicle M with respect to the travel lane.
The action plan generation unit 140 generates a target track along which the host vehicle M will automatically travel in the future (independently of the driver's operation), in principle traveling in the recommended lane determined by the recommended lane determination unit 61, so as to be able to cope with the surrounding situation of the host vehicle M. The target track includes, for example, a speed element. For example, the target track is expressed as a sequence of points (track points) that the host vehicle M should reach. A track point is a point that the host vehicle M should reach at every predetermined travel distance (for example, every several meters) along the road; separately from this, a target speed and a target acceleration for each predetermined sampling time (for example, several tenths of a second) are generated as part of the target track. Alternatively, a track point may be a position that the host vehicle M should reach at each sampling time; in that case, the information of the target speed and the target acceleration is expressed by the interval between track points.
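The sampling-time variant of track-point generation described above can be sketched as follows, where the spacing between consecutive points encodes the target speed (one-dimensional for simplicity; all names are illustrative):

```python
def track_points_by_time(target_speeds_mps, dt_s=0.1, start_m=0.0):
    """Positions the vehicle should reach at each sampling time; a larger
    gap between consecutive points implies a higher target speed."""
    points, x = [start_m], start_m
    for v in target_speeds_mps:
        x += v * dt_s
        points.append(x)
    return points
```

Here a jump from 5 m gaps to 10 m gaps at a 0.5 s sampling time would express an acceleration from 10 m/s to 20 m/s without storing speeds explicitly.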
The action plan generation unit 140 may set an event of automatic driving when generating the target trajectory. The event of automatic driving includes a constant speed driving event, a low speed following driving event, a lane change event, a branching event, a converging event, a take over event, and the like. The action plan generation unit 140 generates a target track corresponding to the started event.
The mode determination unit 150 determines the driving mode of the host vehicle M as any one of a plurality of driving modes that differ in the tasks placed on the driver. The mode determination unit 150 includes, for example, a deviation determination unit 152 and a parallelism calculating unit 154. The functions of the deviation determination unit 152 and the parallelism calculating unit 154 will be described later.
Fig. 3 is a diagram showing an example of the correspondence relationship between the driving mode, the control state of the host vehicle M, and the task. The driving modes of the host vehicle M include, for example, five modes, mode A to mode E. Regarding the control state, that is, the degree of automation of the driving control of the host vehicle M, mode A is the highest; mode B, mode C, and mode D are lower in this order; and mode E is the lowest. Conversely, regarding the task placed on the driver, mode A is the lightest; mode B, mode C, and mode D become heavier in this order; and mode E is the heaviest. Since modes D and E are control states other than automatic driving, the automatic driving control device 100 is responsible for ending the control related to automatic driving and shifting to driving support or manual driving. Hereinafter, the content of each driving mode is exemplified.
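The mode-to-task correspondence of Fig. 3, together with the demotion rule (changing to a heavier-task mode when the driver does not execute the required task), might be encoded as follows; the task names are illustrative assumptions:

```python
# Modes ordered from lightest driver task (A) to heaviest (E), per Fig. 3.
MODE_ORDER = ["A", "B", "C", "D", "E"]

TASKS = {
    "A": set(),                                              # no task placed
    "B": {"forward_monitoring"},
    "C": {"forward_monitoring", "hold_steering_wheel"},
    "D": {"forward_monitoring", "partial_driving_operation"},
    "E": {"forward_monitoring", "full_driving_operation"},
}

def demote(mode):
    """Move to the next heavier-task mode (mode E has no heavier mode)."""
    i = MODE_ORDER.index(mode)
    return MODE_ORDER[min(i + 1, len(MODE_ORDER) - 1)]
```

For instance, a driver in mode C who releases the steering wheel would fail the hold_steering_wheel task and be demoted toward mode D.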
In mode A, the vehicle is in an automatic driving state, and neither forward monitoring nor holding the steering wheel 82 (steering in the figures) is imposed on the driver. Even in mode A, however, the driver is required to maintain a body posture that allows a quick transition to manual driving in response to a request from the system centered on the automatic driving control device 100. Here, "automatic driving" means that both steering and acceleration/deceleration are controlled independently of the driver's operation. "Forward" refers to the space in the traveling direction of the host vehicle M that is visually recognized through the front windshield. Mode A is a driving mode that can be executed when, for example, the host vehicle M is traveling at or below a predetermined speed (for example, about 50 km/h) on a motor-vehicle-only road such as an expressway and a condition such as the presence of a following target such as a preceding vehicle is satisfied; it is sometimes referred to as TJP (Traffic Jam Pilot). When this condition is no longer satisfied, the mode determination unit 150 changes the driving mode of the host vehicle M to mode B.
In mode B, the task of monitoring the area ahead of the host vehicle M (hereinafter referred to as forward monitoring) is placed on the driver, but the task of holding the steering wheel 82 is not. Mode C is a driving support state in which the tasks of forward monitoring and of holding the steering wheel 82 are placed on the driver. Mode D is a driving mode that requires a certain degree of driving operation by the driver with respect to at least one of steering and acceleration/deceleration of the host vehicle M; for example, driving support such as ACC (Adaptive Cruise Control) or LKAS (Lane Keeping Assist System) is performed in mode D. Mode E is a manual driving state in which both steering and acceleration/deceleration require a driving operation by the driver. In both modes D and E, the task of monitoring the area ahead of the host vehicle M is, of course, placed on the driver.
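The mode/task correspondence of Fig. 3 can be captured in a small lookup structure. The mode names and task assignments below come from the description above; the data structure itself and the task-weight scoring are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DrivingMode:
    name: str
    forward_monitoring: bool  # task of monitoring ahead of the vehicle
    hands_on_wheel: bool      # task of holding the steering wheel 82
    automated: bool           # steering/accel controlled by the system

# Correspondence of Fig. 3: tasks get heavier from mode A to mode E.
MODES = {
    "A": DrivingMode("A", forward_monitoring=False, hands_on_wheel=False, automated=True),
    "B": DrivingMode("B", forward_monitoring=True,  hands_on_wheel=False, automated=True),
    "C": DrivingMode("C", forward_monitoring=True,  hands_on_wheel=True,  automated=True),
    "D": DrivingMode("D", forward_monitoring=True,  hands_on_wheel=True,  automated=False),
    "E": DrivingMode("E", forward_monitoring=True,  hands_on_wheel=True,  automated=False),
}

def task_weight(mode):
    """Count the obligations placed on the driver (heavier = more)."""
    m = MODES[mode]
    return m.forward_monitoring + m.hands_on_wheel + (not m.automated)
```

Under this scoring, moving to "a driving mode with a heavier task" (as the mode determination unit 150 does when the driver neglects a task) corresponds to moving to a mode with a strictly larger `task_weight`.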
The automatic driving control device 100 (and a driving support device (not shown)) executes automatic lane changes according to the driving mode. Automatic lane changes include an automatic lane change (1) based on a system request and an automatic lane change (2) based on a driver request. The automatic lane change (1) includes an automatic lane change for overtaking, performed when the speed of the preceding vehicle is lower than the speed of the host vehicle by a reference amount or more, and an automatic lane change for heading toward the destination (an automatic lane change performed when the recommended lane changes). The automatic lane change (2) causes the host vehicle M to change lanes in the direction of operation when the driver operates the direction indicator and conditions relating to speed, positional relationship with surrounding vehicles, and the like are satisfied.
The automatic driving control device 100 executes neither automatic lane change (1) nor (2) in mode A. It executes both automatic lane changes (1) and (2) in modes B and C. The driving support device (not shown) executes the automatic lane change (2), but not the automatic lane change (1), in mode D. In mode E, neither automatic lane change (1) nor (2) is executed.
When the driver does not perform the task associated with the determined driving mode (hereinafter referred to as the current driving mode), the mode determination unit 150 changes the driving mode of the host vehicle M to a driving mode with a heavier task.
For example, when in mode A the driver cannot transition to manual driving in response to a request from the system (for example, when the driver continues to look outside the allowable area, or when a sign of difficulty in driving is detected), the mode determination unit 150 uses the HMI 30 to prompt the driver to transition to manual driving and, if the driver does not respond, performs control to gradually bring the host vehicle M to a stop on the road shoulder and then stop automatic driving. After automatic driving is stopped, the host vehicle is in mode D or E, and the host vehicle M can be started by a manual operation of the driver; the same applies hereinafter to "stopping automatic driving". When the driver is not performing forward monitoring in mode B, the mode determination unit 150 uses the HMI 30 to prompt the driver to perform forward monitoring and, if the driver does not respond, performs control to gradually bring the host vehicle M to a stop on the road shoulder and stop automatic driving. When in mode C the driver is not performing forward monitoring or is not holding the steering wheel 82, the mode determination unit 150 uses the HMI 30 to prompt the driver to perform forward monitoring and/or hold the steering wheel 82 and, if the driver does not respond, performs control to gradually bring the host vehicle M to a stop on the road shoulder and stop automatic driving.
The second control unit 160 controls the running driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes along the target track generated by the action plan generation unit 140 at the scheduled times.
Returning to Fig. 2, the second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information on the target track (track points) generated by the action plan generation unit 140 and stores it in a memory (not shown). The speed control unit 164 controls the running driving force output device 200 or the brake device 210 based on the speed element attached to the target track stored in the memory. The steering control unit 166 controls the steering device 220 according to the degree of curvature of the target track stored in the memory. The processing by the speed control unit 164 and the steering control unit 166 is realized by, for example, a combination of feedforward control and feedback control. As an example, the steering control unit 166 executes a combination of feedforward control according to the curvature of the road ahead of the host vehicle M and feedback control based on the deviation from the target track.
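The feedforward/feedback combination used by the steering control unit 166 can be illustrated with a minimal linear sketch. The gains, the linear form, and the signal names are assumptions for illustration; the patent does not specify a concrete control law.

```python
def steering_command(road_curvature, lateral_error, heading_error,
                     k_ff=1.0, k_lat=0.5, k_head=1.2):
    """Steering angle command as a sum of two parts.

    feedforward: anticipates the curvature of the road ahead, so the
        wheel is already turning before any tracking error builds up.
    feedback: corrects the measured deviation from the target track
        (lateral offset and heading mismatch).
    Gains k_ff, k_lat, k_head are illustrative, not from the patent.
    """
    feedforward = k_ff * road_curvature
    feedback = k_lat * lateral_error + k_head * heading_error
    return feedforward + feedback

# On a straight road with no tracking error, no steering is commanded.
print(steering_command(0.0, 0.0, 0.0))
```

The design point is that feedforward alone cannot reject disturbances (crosswind, model error), while feedback alone lags behind curves; combining them, as the text describes, gives both anticipation and correction.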
The running driving force output device 200 outputs a running driving force (torque) for running the vehicle to the drive wheels. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU (Electronic Control Unit) that controls them. The ECU controls the above configuration in accordance with information input from the second control unit 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a brake caliper, a cylinder that transmits hydraulic pressure to the brake caliper, an electric motor that generates hydraulic pressure in the cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 so that a braking torque corresponding to the braking operation is output to each wheel. The brake device 210 may include a mechanism that transmits the hydraulic pressure generated by operation of the brake pedal included in the driving operation element 80 to the cylinder via a master cylinder. The brake device 210 is not limited to the above configuration and may be an electronically controlled hydraulic brake device that controls an actuator in accordance with information input from the second control unit 160 and transmits the hydraulic pressure of the master cylinder to the cylinder.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor, for example, applies force to a rack-and-pinion mechanism to change the direction of the steered wheels. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 to change the direction of the steered wheels.
[Operation of the Vehicle Control Device]
Next, the operation of the vehicle control device according to the embodiment will be described. In the following description, the host vehicle M is assumed to be traveling in driving mode B. Fig. 4 is a diagram showing an example of a scenario in which the vehicle control device according to the embodiment operates. Fig. 4 shows a situation in which the host vehicle M is traveling in lane L1 and three other vehicles M1, M2, M3 are traveling ahead of the host vehicle M.
While the host vehicle M is traveling in lane L1, the recognition unit 130 recognizes the surrounding situation of the host vehicle M, in particular the road dividing lines on both sides of the host vehicle M, based on the image captured by the camera 10. Hereinafter, a road dividing line recognized from the image captured by the camera 10 is denoted CL (the "camera road dividing line CL"), and a road dividing line recognized from the second map information 62 is denoted ML (the "map road dividing line ML").
The deviation determination unit 152 determines whether there is a deviation (mismatch) between the camera road dividing line CL and the map road dividing line ML while the host vehicle M is traveling. Here, a deviation means, for example, that the distance between the camera road dividing line CL and the map road dividing line ML is equal to or greater than a predetermined value, or that the angle between them is equal to or greater than a predetermined value.
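A minimal sketch of this two-criterion deviation test, assuming each dividing line is summarized by two (x, y) endpoints in a common vehicle frame. The line representation and the threshold values are illustrative assumptions; the patent does not specify concrete values.

```python
import math

def dividing_line_deviation(cl_points, ml_points,
                            dist_thresh=0.5, angle_thresh_deg=2.0):
    """Return True when camera line CL and map line ML deviate.

    Deviation is declared when either the offset between the lines or
    the angle between them is at or above its threshold, mirroring the
    two criteria in the text. Each line is two (x, y) endpoints.
    """
    (cx0, cy0), (cx1, cy1) = cl_points
    (mx0, my0), (mx1, my1) = ml_points

    # Angle criterion: difference of the lines' headings.
    angle_cl = math.atan2(cy1 - cy0, cx1 - cx0)
    angle_ml = math.atan2(my1 - my0, mx1 - mx0)
    angle_diff = abs(math.degrees(angle_cl - angle_ml))

    # Distance criterion: offset measured between the starting points.
    offset = math.dist((cx0, cy0), (mx0, my0))

    return offset >= dist_thresh or angle_diff >= angle_thresh_deg
```

Measuring the offset at a single point is a simplification; a fuller implementation would compare the lines at several longitudinal stations, but the either-or structure of the test is the same.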
When it is determined that there is a deviation between the camera road dividing line CL and the map road dividing line ML, the parallelism calculating unit 154 determines whether or not the number of other vehicles present within a predetermined distance of the host vehicle M is equal to or greater than a threshold. When it is determined that the number of such other vehicles is equal to or greater than the threshold, the parallelism calculating unit 154 calculates the parallelism between the travel tracks of the other vehicles and the camera road dividing line CL by the method described below.
For example, in the case of Fig. 4, the parallelism calculating unit 154 first calculates the travel tracks T1, T2, and T3 of the other vehicles M1, M2, and M3, respectively. The parallelism calculating unit 154 can calculate the travel tracks T1, T2, and T3 by, for example, measuring the displacement of the positions of the other vehicles M1, M2, and M3 over a predetermined period based on the camera image.
Next, the parallelism calculating unit 154 calculates the angles between the calculated travel tracks T1, T2, T3 and the camera road dividing line CL and the map road dividing line ML. For example, in the case of Fig. 4, the parallelism calculating unit 154 calculates, for the other vehicle M1, the angle θc1 between the travel track T1 and the camera road dividing line CL1 (the camera road dividing line CL1 is the camera road dividing line CL translated in parallel for convenience of describing the calculation method; the same applies hereinafter to CL2 and CL3) and the angle θm1 between the travel track T1 and the map road dividing line ML1 (the map road dividing line ML1 is the map road dividing line ML translated in parallel for convenience of describing the calculation method; the same applies hereinafter to ML2 and ML3). Similarly, the parallelism calculating unit 154 calculates, for the other vehicle M2, the angle θc2 between the travel track T2 and the camera road dividing line CL2 and the angle θm2 between the travel track T2 and the map road dividing line ML2, and, for the other vehicle M3, the angle θc3 between the travel track T3 and the camera road dividing line CL3 and the angle θm3 between the travel track T3 and the map road dividing line ML3. In doing so, the parallelism calculating unit 154 treats, for example, the clockwise direction with respect to the travel track of the other vehicle as a positive angle and the counterclockwise direction as a negative angle (this convention may be reversed).
Next, the parallelism calculating unit 154 calculates, for the detected other vehicles, the average of the angles between the travel tracks and the camera road dividing lines and the average of the angles between the travel tracks and the map road dividing lines. More specifically, in the case of Fig. 4, the parallelism calculating unit 154 calculates the average angle with respect to the camera road dividing lines as θc_av = (|θc1| + |θc2| + |θc3|) / 3 and the average angle with respect to the map road dividing lines as θm_av = (|θm1| + |θm2| + |θm3|) / 3.
Next, the parallelism calculating unit 154 calculates the parallelism between the detected travel tracks of the other vehicles and the camera road dividing line CL as θm_av - θc_av. That is, the parallelism θm_av - θc_av is an index value indicating which of the camera road dividing line CL and the map road dividing line ML the other vehicles are traveling parallel to. The larger the value of the parallelism θm_av - θc_av, the more nearly parallel the other vehicles are traveling to the camera road dividing line CL; the smaller the value, the more nearly parallel they are traveling to the map road dividing line ML.
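The angle-averaging and parallelism computation above can be sketched as follows, assuming each travel track and each (translated) dividing line is summarized by two (x, y) points. The representation and helper names are assumptions for the sketch.

```python
import math

def angle_between_deg(track, line):
    """Unsigned angle in degrees between a travel track and a dividing
    line, each given as two (x, y) points (matching the |theta| terms)."""
    (tx0, ty0), (tx1, ty1) = track
    (lx0, ly0), (lx1, ly1) = line
    diff = (math.atan2(ty1 - ty0, tx1 - tx0)
            - math.atan2(ly1 - ly0, lx1 - lx0))
    return abs(math.degrees(diff))

def parallelism(tracks, camera_lines, map_lines):
    """theta_m_av - theta_c_av: positive when the other vehicles run
    more nearly parallel to the camera dividing line than to the map
    dividing line."""
    n = len(tracks)
    theta_c_av = sum(angle_between_deg(t, c)
                     for t, c in zip(tracks, camera_lines)) / n
    theta_m_av = sum(angle_between_deg(t, m)
                     for t, m in zip(tracks, map_lines)) / n
    return theta_m_av - theta_c_av
```

With three tracks running along the camera line and a tilted map line, the result is positive, matching the interpretation in the text: the larger the value, the more the traffic agrees with the camera road dividing line.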
In the above example, the parallelism calculating unit 154 calculates the averages of the angles between the travel tracks and the camera road dividing lines and between the travel tracks and the map road dividing lines. However, the configuration is not limited to this; the parallelism calculating unit 154 may instead calculate the median of the angles between the travel tracks and the camera road dividing lines and the median of the angles between the travel tracks and the map road dividing lines. This can prevent the accuracy of the calculation result from being degraded by a particular vehicle among the other vehicles following an anomalous travel track.
Fig. 5 is a graph for explaining the method of determining the driving mode based on the parallelism. As shown in Fig. 5, when the calculated parallelism θm_av - θc_av is equal to or greater than the threshold Th (the hatched region R), the mode determination unit 150 determines that the reliability of the camera road dividing line CL is higher than that of the map road dividing line ML and continues driving in mode B using the camera road dividing line CL as the reference line. The threshold Th is set to a value larger than zero in consideration of a safety margin. On the other hand, when the calculated parallelism θm_av - θc_av is smaller than the threshold Th, the mode determination unit 150 changes the driving mode to a driving mode with a heavier task than mode B (mode C, mode D, or mode E). Thus, even when the road dividing line recognized by the camera differs from the content of the map information mounted on the host vehicle, the driving control of the vehicle can be changed appropriately.
In the present embodiment, as described with reference to Fig. 5, the mode determination unit 150 continues driving in mode B using the camera road dividing line CL as the reference line when the calculated parallelism θm_av - θc_av is equal to or greater than the threshold Th. However, the configuration is not limited to this; even when the calculated parallelism θm_av - θc_av is equal to or greater than the threshold Th, the mode determination unit 150 may change the driving mode from mode B to mode C after reporting to the occupant of the host vehicle M. For example, when the deviation determination unit 152 determines that there is a deviation and the parallelism θm_av - θc_av is equal to or greater than the threshold Th, the mode determination unit 150 may report information indicating the deviation to the occupant of the host vehicle M and recommend a change to mode C, or may change to mode C after a certain period has elapsed from the determination of the deviation.
In the above description, the parallelism calculating unit 154 calculates the parallelism between the travel tracks of the other vehicles and the camera road dividing line CL when it is determined that the number of other vehicles within the predetermined distance of the host vehicle M is equal to or greater than the threshold. In doing so, the parallelism calculating unit 154 may exclude from the calculation other vehicles that are unsuitable for calculating the parallelism, even if they are present within the predetermined distance of the host vehicle M.
Fig. 6 is a diagram showing examples of other vehicles excluded from the parallelism calculation. Fig. 6 shows a situation in which, in addition to the host vehicle M, there are another vehicle M1 traveling in lane L1 and another vehicle M2 traveling in lane L2, which is adjacent to lane L1. As shown in Fig. 6, for example, when the calculated travel track of the other vehicle M1 cannot be approximated by a straight line (more specifically, when the error between the travel track and its straight-line approximation is equal to or greater than a threshold), the parallelism calculating unit 154 excludes the other vehicle M1 from the parallelism calculation. Likewise, the other vehicle M2, which travels in lane L2, a different lane from lane L1 in which the host vehicle M travels, is excluded from the parallelism calculation. This can improve the accuracy of the parallelism calculation.
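The two exclusion criteria can be sketched as a filter over candidate vehicles, assuming each travel track is a list of (x, y) samples and a lane label is known for each vehicle. The least-squares fit, the residual threshold, and the function names are illustrative assumptions.

```python
def fits_straight_line(points, max_residual=0.3):
    """Least-squares line fit; reject tracks whose worst residual is at
    or above the threshold (e.g. a curving or lane-changing track).
    Degenerate tracks with no longitudinal spread are also rejected."""
    n = len(points)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    sxx = sum((x - x_mean) ** 2 for x in xs)
    if sxx == 0:
        return False
    slope = sum((x - x_mean) * (y - y_mean) for x, y in points) / sxx
    intercept = y_mean - slope * x_mean
    worst = max(abs(y - (slope * x + intercept)) for x, y in points)
    return worst < max_residual

def eligible_tracks(tracks, lanes, own_lane):
    """Keep only tracks that are straight enough and in the host's lane."""
    return [
        t for t, lane in zip(tracks, lanes)
        if lane == own_lane and fits_straight_line(t)
    ]
```

Filtering before averaging keeps a single swerving vehicle, or a vehicle following a differently shaped adjacent lane, from skewing θc_av and θm_av.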
Next, the flow of operations executed by the vehicle control device according to the embodiment will be described with reference to Fig. 7. Fig. 7 is a flowchart showing an example of the flow of operations executed by the vehicle control device according to the embodiment. The processing of this flowchart is executed in a predetermined cycle while the host vehicle M is traveling in driving mode B using the camera road dividing line CL.
First, the mode determination unit 150 acquires the camera road dividing line CL and the map road dividing line ML via the recognition unit 130 (step S100). Next, the deviation determination unit 152 determines whether there is a deviation between the acquired camera road dividing line CL and map road dividing line ML (step S102).
When determining that there is a deviation between the acquired camera road dividing line CL and map road dividing line ML, the deviation determination unit 152 next determines whether or not the number of other vehicles within a predetermined distance of the host vehicle M is equal to or greater than a threshold (step S104). When it is determined that the number of other vehicles within the predetermined distance is less than the threshold, the mode determination unit 150 changes the driving mode from mode B to mode C (step S106).
On the other hand, when it is determined that the number of other vehicles within the predetermined distance of the host vehicle M is equal to or greater than the threshold, the parallelism calculating unit 154 calculates the parallelism based on the travel tracks of the other vehicles, the camera road dividing line CL, and the map road dividing line ML (step S108). Next, the mode determination unit 150 determines whether or not the calculated parallelism is equal to or greater than a threshold (step S110). When it is determined that the calculated parallelism is smaller than the threshold, the mode determination unit 150 changes the driving mode from mode B to mode C in step S106. On the other hand, when it is determined that the calculated parallelism is equal to or greater than the threshold, the mode determination unit 150 determines to continue mode B using the camera road dividing line CL (step S112). The processing of this flowchart then ends.
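The flow of Fig. 7 reduces to a short decision function. This sketch assumes the deviation flag, the nearby-vehicle count, and the parallelism value have already been computed upstream (steps S100 to S108); the threshold values and the function name are illustrative assumptions.

```python
def decide_driving_mode(deviation, num_nearby, parallelism_value,
                        vehicle_count_thresh=3, parallel_thresh=1.0):
    """One cycle of the Fig. 7 flow while driving in mode B.

    Returns the driving mode to use for the next cycle. Threshold
    values are illustrative; the patent leaves them unspecified.
    """
    if not deviation:
        return "B"                      # S102: lines agree, stay in mode B
    if num_nearby < vehicle_count_thresh:
        return "C"                      # S104 -> S106: too few vehicles
    if parallelism_value >= parallel_thresh:
        return "B"                      # S110 -> S112: trust the camera line
    return "C"                          # S110 -> S106: heavier-task mode
```

Note that a deviation with too few surrounding vehicles falls back to mode C directly: without enough traffic to arbitrate between the two dividing lines, the system defaults to placing a heavier task on the driver.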
According to the present embodiment described above, when a deviation occurs between the camera road dividing line and the map road dividing line and a plurality of other vehicles are present around the host vehicle, the parallelism is calculated based on the travel tracks of the plurality of other vehicles, the camera road dividing line, and the map road dividing line, and the driving mode of the host vehicle is controlled based on the calculated parallelism. Thus, even when the road dividing line recognized by the camera differs from the content of the map information mounted on the host vehicle, the driving control of the vehicle can be changed appropriately.
The embodiments described above can be expressed as follows.
A vehicle control device is provided with:
a storage device storing a program; and
a hardware processor,
the hardware processor executing computer-readable instructions stored in the storage device to perform the following:
Acquiring a camera image obtained by capturing a surrounding situation of a vehicle;
controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information;
determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode and in which steering and acceleration/deceleration of the vehicle are controlled independently of the operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode with a heavier task when the task related to the determined driving mode is not executed by the driver;
determining whether there is a deviation between a road dividing line shown in the camera image and a road dividing line shown in the map information;
when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, calculating a parallelism between a track of one or more other vehicles existing in the periphery of the vehicle and the road dividing line shown in the camera image;
and determining to continue the second driving mode when the calculated parallelism is equal to or greater than a first threshold.
Specific embodiments of the present invention have been described above, but the present invention is not limited to these embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.
Claims (8)
1. A vehicle control apparatus, wherein,
the vehicle control device includes:
an acquisition unit that acquires a camera image obtained by capturing a surrounding situation of a vehicle;
a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of an operation of a driver of the vehicle;
a mode determination unit that determines a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, at least the driving modes including the second driving mode among the plurality of driving modes being controlled by the driving control unit, the mode determination unit changing the driving mode of the vehicle to a driving mode with a heavier task when the task related to the determined driving mode is not executed by the driver;
A deviation determination unit that determines whether or not there is a deviation between a road division line shown in the camera image and a road division line shown in the map information; and
a parallelism calculating unit that calculates parallelism between a track of one or more other vehicles existing in the periphery of the vehicle and a road division line shown in the camera image when it is determined that there is a deviation between the road division line shown in the camera image and the road division line shown in the map information,
the mode determination unit determines to continue the second driving mode when the calculated parallelism is equal to or greater than a first threshold.
2. The vehicle control apparatus according to claim 1, wherein,
the parallelism calculating unit calculates the parallelism when the number of the other vehicles existing in the periphery of the vehicle is equal to or greater than a second threshold.
3. The vehicle control apparatus according to claim 1, wherein,
the parallelism calculating unit calculates the parallelism based on a track of one or more other vehicles existing in the vicinity of the vehicle, a road dividing line indicated by the camera image, and a road dividing line indicated by the map information.
4. The vehicle control apparatus according to claim 3, wherein,
the parallelism calculating unit calculates the parallelism based on an average value of angles between a track of each of the one or more other vehicles existing in the vicinity of the vehicle and a road dividing line shown in the camera image, and an average value of angles between a track of each of the one or more other vehicles existing in the vicinity of the vehicle and the road dividing line shown in the map information.
5. The vehicle control apparatus according to claim 3, wherein,
the parallelism calculating unit calculates the parallelism based on a median of the angles between the track of each of the one or more other vehicles existing in the vicinity of the vehicle and the road dividing line shown in the camera image, and a median of the angles between the track of each of the one or more other vehicles existing in the vicinity of the vehicle and the road dividing line shown in the map information.
6. The vehicle control apparatus according to any one of claims 1 to 5, wherein,
the second driving mode is a driving mode in which the task of holding an operation element that accepts steering operations of the vehicle is not placed on the driver,
the first driving mode is a driving mode in which at least the task of holding the operation element is placed on the driver.
7. A vehicle control method, wherein,
the vehicle control method causes a computer to perform the following processing:
acquiring a camera image obtained by capturing a surrounding situation of a vehicle;
controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information;
determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode and in which steering and acceleration/deceleration of the vehicle are controlled independently of the operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode with a heavier task when the task related to the determined driving mode is not executed by the driver;
determining whether there is a deviation between a road dividing line shown in the camera image and a road dividing line shown in the map information;
When it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, calculating a parallelism between a track of one or more other vehicles existing in the periphery of the vehicle and the road dividing line shown in the camera image;
and determining to continue the second driving mode when the calculated parallelism is equal to or greater than a first threshold.
8. A storage medium storing a program, wherein,
the program causes a computer to perform the following processing:
acquiring a camera image obtained by capturing a surrounding situation of a vehicle;
controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information;
determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode and in which steering and acceleration/deceleration of the vehicle are controlled independently of the operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode with a heavier task when the task related to the determined driving mode is not executed by the driver;
Determining whether there is a deviation between a road dividing line shown in the camera image and a road dividing line shown in the map information;
when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, calculating a parallelism between a track of one or more other vehicles existing in the periphery of the vehicle and the road dividing line shown in the camera image;
and determining to continue the second driving mode when the calculated parallelism is equal to or greater than a first threshold.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| JP2022-056398 | 2022-03-30 | | |
| JP2022056398A (published as JP2023148405A) | 2022-03-30 | 2022-03-30 | Vehicle control device, vehicle control method, and program |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN116890824A | 2023-10-17 |
Family
ID=88288269

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202310311201.2A (CN116890824A, pending) | Vehicle control device, vehicle control method, and storage medium | 2022-03-30 | 2023-03-27 |

Country Status (2)

| Country | Link |
| --- | --- |
| JP (1) | JP2023148405A |
| CN (1) | CN116890824A |
Also Published As

| Publication Number | Publication Date |
| --- | --- |
| JP2023148405A | 2023-10-13 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||