EP3862240A1 - Vehicle control system - Google Patents
Vehicle control system
- Publication number: EP3862240A1
- Application number: EP21153702.2A
- Authority
- EP
- European Patent Office
- Prior art keywords
- unit
- vehicle
- cameras
- recognition
- output
- Prior art date
- Legal status: Granted
Classifications
- B60W30/16—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
- B60W50/0205—Diagnosing or detecting failures; Failure detection models
- B60W50/023—Avoiding failures by using redundant parts
- B60W50/029—Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- H04N7/181—Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
- B60W2050/0006—Digital architecture hierarchy
- B60W2050/0215—Sensor drifts or sensor failures
- B60W2050/0297—Control giving priority to different actuators or systems
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2420/408—Radar; Laser, e.g. lidar
- B60W2510/182—Brake pressure, e.g. of fluid or between pad and disc
- B60W2520/105—Longitudinal acceleration
- B60W2540/10—Accelerator pedal position
- B60W2540/18—Steering angle
- B60W2552/50—Barriers
Definitions
- the technique disclosed herein relates to a vehicle control system.
- the Japanese Unexamined Patent Publication No. H11-16099 discloses a vehicle traveling assist device.
- This vehicle traveling assist device includes: an infrared sensor, a radar, a superimposing unit, a determination unit, and an obstacle determination unit.
- the infrared sensor images an area outside the vehicle.
- the radar emits a radio wave toward the area imaged by the infrared sensor and receives a reflection wave from an object to detect a distance from the object and the direction toward the object.
- the superimposing unit superimposes information obtained by the infrared sensor and information obtained by the radar within an electronic circuit.
- the determination unit determines whether or not the object detected by the radar is a hot spot on the image obtained by the infrared sensor.
- the obstacle determination unit determines whether or not the hot spot is an obstacle on the traveling of a subject vehicle.
- the device such as that in Japanese Unexamined Patent Publication No. H11-16099 is provided with a plurality of cameras (the infrared sensor in Japanese Unexamined Patent Publication No. H11-16099) in order to monitor an environment spread around the vehicle (the external environment of the vehicle).
- When a signal system including the cameras has an abnormality, it becomes difficult to continuously perform cruise control of the vehicle based on the output from the cameras.
- the technique disclosed herein has been made in view of this point, and an object thereof is to improve continuity of cruise control of the vehicle.
- the technique disclosed herein relates to a vehicle control system for controlling a vehicle.
- This vehicle control system includes: a plurality of first cameras disposed in the vehicle so as to surround the vehicle; a plurality of second cameras disposed in the vehicle so as to surround the vehicle; and a control unit that performs a first operation of outputting a control signal for cruise control of the vehicle based on both outputs from the plurality of the first cameras and outputs from the plurality of the second cameras, a second operation of outputting the control signal based on the outputs from the plurality of the first cameras, and a third operation of outputting the control signal based on the outputs from the plurality of the second cameras.
- This configuration includes two combinations of cameras provided in the vehicle so as to surround the vehicle. Provision of the first cameras so as to surround the vehicle allows an environment spread around the vehicle (the external environment of the vehicle) to be monitored based on the output from the first cameras. Similarly, provision of the second cameras so as to surround the vehicle allows the external environment of the vehicle to be monitored based on the output from the second cameras. As described above, the external environment of the vehicle may be monitored based on at least either one of the output from the first cameras or the output from the second cameras. Accordingly, even when either one of the first signal system including the first cameras or the second signal system including the second cameras has an abnormality, it is possible to continue the cruise control of the vehicle based on the output from the cameras in the other signal system by performing the second or third operation. This allows improvement in continuity of cruise control of the vehicle.
- the control unit may be configured to perform the first operation when neither the first signal system including the plurality of first cameras nor the second signal system including the plurality of second cameras has an abnormality, the second operation when, of the first and second signal systems, the second signal system has an abnormality, and the third operation when, of the first and second signal systems, the first signal system has an abnormality.
- This configuration allows automatic switching among the first, second, and third operations according to the presence or absence of abnormality in the first and second signal systems. Accordingly, when either one of the first or second signal system has an abnormality, the second or third operation may be automatically performed. This allows automatic continuation of the cruise control of the vehicle.
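- As a hedged illustration of this switching logic (not the patented implementation itself; names such as `Operation` and `select_operation` are assumptions introduced for this sketch), the choice among the first, second, and third operations can be expressed as follows:

```python
from enum import Enum, auto

class Operation(Enum):
    FIRST = auto()   # use outputs from both the first and second cameras
    SECOND = auto()  # use outputs from the first cameras only
    THIRD = auto()   # use outputs from the second cameras only

def select_operation(first_system_abnormal: bool, second_system_abnormal: bool) -> Operation:
    """Pick the operation according to which signal system reports an abnormality."""
    if not first_system_abnormal and not second_system_abnormal:
        return Operation.FIRST
    if second_system_abnormal and not first_system_abnormal:
        return Operation.SECOND
    if first_system_abnormal and not second_system_abnormal:
        return Operation.THIRD
    # Both systems abnormal: the cited passage does not define this case;
    # a real system would fall back to a degraded or minimal-risk manoeuvre.
    raise RuntimeError("both signal systems abnormal")

# Example: only the second signal system has an abnormality -> second operation
assert select_operation(False, True) is Operation.SECOND
```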
- the plurality of first cameras may include a first front camera that images an area in front of the vehicle, a first diagonally backward right camera that images an area diagonally backward right of the vehicle, and a first diagonally backward left camera that images an area diagonally backward left of the vehicle.
- the plurality of second cameras may include a second front camera that images an area in front of the vehicle, a second diagonally backward right camera that images an area diagonally backward right of the vehicle, and a second diagonally backward left camera that images an area diagonally backward left of the vehicle.
- This configuration allows continuous cruise control of the vehicle based on at least an area in front of the vehicle, an area diagonally backward right of the vehicle, and an area diagonally backward left of the vehicle in the environment spread around the vehicle (external environment of the vehicle). Accordingly, even when either one of the first or second signal system has an abnormality, this allows continuous cruise control based on the area in front of the vehicle (e.g., control for maintaining an appropriate distance from other vehicles traveling in front of the subject vehicle) and a control based on the area diagonally backward right of the vehicle and the area diagonally backward left of the vehicle (e.g., control for sensing critical situations when the subject vehicle performs lane changing).
- the control unit may include: a first recognition processing IC unit that performs recognition processing for recognizing an external environment of the vehicle based on an output from the plurality of first cameras; and a second recognition processing IC unit that performs the recognition processing based on the output of the plurality of second cameras.
- the control unit may be configured to output, during the first operation, the control signal based on both of a result of the recognition processing by the first recognition processing IC unit and a result of the recognition processing by the second recognition processing IC unit, to output, during the second operation, the control signal based on the result of the recognition processing by the first recognition processing IC unit, and to output, during the third operation, the control signal based on the result of the recognition processing by the second recognition processing IC unit.
- This configuration allows improvement of recognition accuracy of the recognition processing by both of the first and second recognition processing IC units, compared with the recognition accuracy of the recognition processing by either one of the first or second recognition processing IC unit.
- FIG. 1 illustrates an appearance of a vehicle 100 including a vehicle control system 1 according to an embodiment.
- the vehicle control system 1 is provided in the vehicle 100 (specifically, a four-wheeled vehicle).
- the vehicle 100 can switch among manual driving, assisted driving, and self-driving.
- the manual driving is driving to travel in accordance with driver's operation (e.g., an accelerator operation and the like).
- the assisted driving is driving to travel with assistance of the driver's operation.
- the self-driving is driving to travel without the driver's operation.
- the vehicle control system 1 controls the vehicle 100 during the assisted driving and the self-driving. Specifically, the vehicle control system 1 controls an actuator (not shown) provided in the vehicle 100 to control the motion (specifically, traveling) of the vehicle 100.
- the vehicle 100 provided with the vehicle control system 1 is referred to as "the subject vehicle," whereas another vehicle present around the subject vehicle is referred to as "another vehicle (other vehicles)."
- the actuator provided in the vehicle 100 includes a drive actuator, a steering actuator, a braking actuator, and the like.
- Examples of the drive actuator include an engine, a motor, and a transmission.
- Examples of the steering actuator include steering.
- Examples of the braking actuator include a brake.
- the vehicle control system 1 includes an information acquisition unit 10 and a control unit 20.
- the control unit 20 is housed in a single housing installed in a specific position within the vehicle 100 such as a lower part of a passenger's seat or a trunk, for example.
- the information acquisition unit 10 acquires various kinds of information for use in control (specifically, cruise control) of the vehicle 100. As illustrated in FIGS. 1 , 5 , and 6 , the information acquisition unit 10 includes a plurality of cameras 11, a plurality of radars 12, a position sensor 13, an external input unit 14, mechanical sensors 15, and a driver input unit 16. FIGS. 1 and 5 omit illustration of the position sensor 13, the external input unit 14, the mechanical sensors 15, and the driver input unit 16.
- the cameras 11 have the same configuration.
- the cameras 11 are provided in the vehicle 100 so as to surround the vehicle 100.
- Each of the cameras 11 images part of an environment spread around the vehicle 100 (an external environment of the vehicle 100) to acquire image data indicating part of the external environment of the vehicle 100.
- the image data obtained by each of the cameras 11 is transmitted to the control unit 20.
- the cameras 11 are each a monocular camera having a wide-angle lens.
- the cameras 11 are each constituted using a solid imaging element such as a charge coupled device (CCD) and a complementary metal-oxide-semiconductor (CMOS), for example.
- the cameras 11 may each be a monocular camera having a narrow-angle lens or a stereo camera having wide-angle lenses or narrow-angle lenses.
- the cameras 11 include a plurality of first cameras 11a and a plurality of second cameras 11b.
- This vehicle 100 has two combinations of the cameras 11 provided in the vehicle 100 so as to surround the vehicle 100.
- the first cameras 11a are provided in the vehicle 100 so as to surround the vehicle 100. Specifically, the first cameras 11a are provided in the vehicle such that imaging areas of the first cameras 11a surround the vehicle 100.
- the first cameras 11a include a first front camera 111a, a first diagonally backward right camera 112a, a first diagonally backward left camera 113a, and a first back camera 114a.
- the first front camera 111a images an area in front of the vehicle 100.
- the first diagonally backward right camera 112a images an area diagonally backward right of the vehicle 100.
- the first diagonally backward left camera 113a images an area diagonally backward left of the vehicle 100.
- the first back camera 114a images an area behind the vehicle 100.
- the second cameras 11b are provided in the vehicle 100 so as to surround the vehicle 100. Specifically, the second cameras 11b are provided in the vehicle such that imaging areas of the second cameras 11b surround the vehicle 100.
- the second cameras 11b include a second front camera 111b, a second diagonally backward right camera 112b, a second diagonally backward left camera 113b, and a second back camera 114b.
- the second front camera 111b images an area in front of the vehicle 100.
- the second diagonally backward right camera 112b images an area diagonally backward right of the vehicle 100.
- the second diagonally backward left camera 113b images an area diagonally backward left of the vehicle 100.
- the second back camera 114b images an area behind the vehicle 100.
- the radars 12 have the same configuration.
- the radars 12 are provided in the vehicle 100 so as to surround the vehicle 100.
- Each of the radars 12 detects part of the external environment of the vehicle 100.
- the radars 12 each transmit radio waves toward part of the external environment of the vehicle 100 and receive reflected waves from the part of the external environment of the vehicle 100 to detect the part of the external environment of the vehicle 100. Detection results of the radars 12 are transmitted to the control unit 20.
- the radars 12 may each be a millimeter-wave radar that transmits millimeter waves, a lidar (light detection and ranging) that transmits laser light, an infrared radar that transmits infrared rays, or an ultrasonic radar that transmits ultrasonic waves, for example.
- the radars 12 include a plurality of first radars 12a and a plurality of second radars 12b. This vehicle 100 has two combinations of the radars 12 provided in the vehicle 100 so as to surround the vehicle 100.
- the first radars 12a are provided in the vehicle 100 so as to surround the vehicle 100. Specifically, the first radars 12a are provided in the vehicle such that detection areas of the first radars 12a surround the vehicle 100.
- the first radars 12a include a first front radar 121a, a first diagonally backward right radar 122a, and a first diagonally backward left radar 123a.
- the first front radar 121a detects the external environment in front of the vehicle 100.
- the first diagonally backward right radar 122a detects the external environment diagonally backward right of the vehicle 100.
- the first diagonally backward left radar 123a detects the external environment diagonally backward left of the vehicle 100.
- the second radars 12b are provided in the vehicle 100 so as to surround the vehicle 100. Specifically, the second radars 12b are provided in the vehicle 100 such that detection areas of the second radars 12b surround the vehicle 100.
- the second radars 12b include a second front radar 121b, a second diagonally backward right radar 122b, a second diagonally backward left radar 123b, and a second back radar 124b.
- the second front radar 121b detects the external environment in front of the vehicle 100.
- the second diagonally backward right radar 122b detects the external environment diagonally backward right of the vehicle 100.
- the second diagonally backward left radar 123b detects the external environment diagonally backward left of the vehicle 100.
- the second back radar 124b detects the external environment behind the vehicle 100.
- FIG. 2 illustrates imaging areas (monitoring areas) of the cameras 11 and detection areas (monitoring areas) of the radars 12.
- FIG. 3 illustrates imaging areas of the first cameras 11a and detection areas of the first radars 12a.
- FIG. 4 illustrates imaging areas of second cameras 11b and detection areas of the second radars 12b.
- each thicker broken line indicates an imaging area of each of the first cameras 11a
- each thicker dot-and-dash line indicates a detection area of each of the first radars 12a.
- Each thinner broken line indicates an imaging area of each of the second cameras 11b
- each thinner dot-and-dash line indicates a detection area of each of the second radars 12b.
- each monitoring area and each arrangement of the first cameras 11a and the first radars 12a are set such that a combination of their monitoring areas surrounds the entire circumference of the vehicle 100.
- each monitoring area and each arrangement of the second cameras 11b and the second radars 12b are set such that a combination of their monitoring areas surrounds the entire circumference of the vehicle 100.
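- A minimal sketch of how such full-circumference coverage could be checked, assuming each monitoring area is approximated as an angular sector around the vehicle (the sector values below are illustrative, not taken from the patent):

```python
def covers_full_circle(sectors, step_deg=1.0):
    """Return True if the union of (start_deg, end_deg) sectors covers all 360 degrees.

    Sectors are given clockwise from the vehicle's forward direction and may wrap
    past 360 degrees (e.g. a front camera spanning (-60, 60) is written as (300, 420)).
    """
    angle = 0.0
    while angle < 360.0:
        if not any(start <= angle < end or start <= angle + 360.0 < end
                   for start, end in sectors):
            return False
        angle += step_deg
    return True

# Illustrative monitoring areas for one camera/radar combination
first_system_sectors = [
    (300, 420),  # front camera
    (60, 170),   # diagonally backward right camera
    (190, 300),  # diagonally backward left camera
    (150, 210),  # back camera
]
print(covers_full_circle(first_system_sectors))  # True for this illustrative layout
```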
- the position sensor 13 detects the position (e.g., the latitude and the longitude) of the vehicle 100.
- the position sensor 13 receives GPS information from the Global Positioning System and detects the position of the vehicle 100 based on the GPS information, for example.
- the information (the position of the vehicle 100) obtained by the position sensor 13 is transmitted to the control unit 20.
- the external input unit 14 receives information through an extra-vehicle network (e.g., the Internet and the like) provided outside the vehicle 100.
- the external input unit 14 receives communication information from another vehicle (not shown) positioned around the vehicle 100, car navigation data from a navigation system (not shown), traffic information, high-precision map information, and the like, for example.
- the information obtained by the external input unit 14 is transmitted to the control unit 20.
- the mechanical sensors 15 detect the status (e.g., the speed, the acceleration, the yaw rate, and the like) of the vehicle 100.
- the mechanical sensors 15 include a vehicle speed sensor that detects the speed of the vehicle 100, an acceleration sensor that detects the acceleration of the vehicle 100, a yaw rate sensor that detects the yaw rate of the vehicle 100, and the like, for example.
- the information (the status of the vehicle 100) obtained by the mechanical sensors 15 is transmitted to the control unit 20.
- the driver input unit 16 detects driving operations applied to the vehicle 100.
- the driver input unit 16 includes an accelerator position sensor, a steering angle sensor, a brake hydraulic pressure sensor, and the like, for example.
- the accelerator position sensor detects an accelerator operation amount of the vehicle 100.
- the steering angle sensor detects a steering angle of a steering wheel of the vehicle 100.
- the brake hydraulic pressure sensor detects a brake operation amount of the vehicle 100.
- the information (the driving operation of the vehicle 100) obtained by the driver input unit 16 is transmitted to the control unit 20.
- FIG. 5 illustrates a configuration of the control unit 20.
- the control unit 20 includes a first signal processing IC unit 21a, a second signal processing IC unit 21b, a first recognition processing IC unit 22a, a second recognition processing IC unit 22b, a first control IC unit 23a, and a second control IC unit 23b.
- Each of these IC units may include a single integrated circuit (IC) or a plurality of ICs.
- the IC may house a single core or die or house a plurality of cores or dies cooperating with each other.
- the core or die may include a CPU (processor) and a memory storing therein a program for operating the CPU and information such as processing results by the CPU, for example.
- the first signal processing IC unit 21a and the second signal processing IC unit 21b constitute a signal processing unit 201.
- the first recognition processing IC unit 22a and the second recognition processing IC unit 22b constitute a recognition processing unit 202.
- the first control IC unit 23a constitutes a determination processing unit 203.
- the second control IC unit 23b constitutes a backup processing unit 204.
- the signal processing unit 201 performs image processing with respect to the output from the cameras 11.
- the signal processing unit 201 outputs image data obtained by the image processing.
- the first signal processing IC unit 21a performs image processing with respect to the output from the first cameras 11a.
- the second signal processing IC unit 21b performs image processing with respect to the output from the second cameras 11b.
- the recognition processing unit 202 performs recognition processing for recognizing the external environment of the vehicle 100 based on the output (the image data) from the signal processing unit 201.
- the recognition processing unit 202 outputs external environment data obtained by the recognition processing.
- the first recognition processing IC unit 22a performs recognition processing based on the output which has been processed by the first signal processing IC unit 21a and output from the first cameras 11a.
- the second recognition processing IC unit 22b performs recognition processing based on the output which has been processed by the second signal processing IC unit 21b and output from the second cameras 11b.
- the determination processing unit 203 performs determination processing for cruise control of the vehicle 100 based on the output (the external environment data) from the recognition processing unit 202. Specifically, the first control IC unit 23a performs determination processing based on the output from the first recognition processing IC unit 22a and/or the output from the second recognition processing IC unit 22b. The determination processing unit 203 then outputs a control signal for cruise control of the vehicle 100 based on a result of the determination processing.
- the backup processing unit 204 performs recognition processing for recognizing the external environment of the vehicle 100 based on the output (the image data) from the signal processing unit 201. Specifically, the second control IC unit 23b performs recognition processing based on the output from the first signal processing IC unit 21a and/or the output from the second signal processing IC unit 21b. The backup processing unit 204 performs determination processing for cruise control of the vehicle 100 based on a result of the recognition processing. The backup processing unit 204 then outputs a control signal for cruise control of the vehicle 100 based on the result of the determination processing.
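- The division of labour among the signal processing unit 201, the recognition processing unit 202, the determination processing unit 203, and the backup processing unit 204 can be illustrated with the following hedged sketch; the function names and the simple dictionaries standing in for image data, external environment data, and control signals are assumptions made for readability:

```python
def signal_processing(raw_camera_frames):
    # First/second signal processing IC units: image processing of the camera output
    return {"image_data": raw_camera_frames}

def recognition_processing(image_data):
    # First/second recognition processing IC units: recognize the external environment
    return {"external_environment": f"objects recognized from {image_data['image_data']}"}

def determination_processing(external_environment_data):
    # Determination processing unit (first control IC unit): derive a control signal
    return {"control_signal": f"cruise control based on {external_environment_data}"}

def backup_processing(image_data):
    # Backup processing unit (second control IC unit): its own recognition and
    # determination chain, producing a control signal from the same image data
    environment = f"backup recognition of {image_data['image_data']}"
    return {"control_signal": f"backup cruise control based on {environment}"}

frames = ["front", "rear-right", "rear-left", "rear"]
image_data = signal_processing(frames)
main_signal = determination_processing(recognition_processing(image_data))
fallback_signal = backup_processing(image_data)
```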
- the function of the vehicle control system 1 is broadly divided into a recognition block B1, a determination block B2, and an operation block B3.
- the recognition block B1 recognizes the external environment of the vehicle 100 based on the various kinds of information acquired by the information acquisition unit 10.
- the recognition block B1 may be configured to recognize an internal environment of the vehicle 100.
- the determination block B2 determines a status and condition of the vehicle 100 based on a recognition result of the recognition block B1 and determines a target operation of the vehicle 100 based on a result of the determination.
- the operation block B3 generates a signal for controlling the actuator AC provided in the vehicle 100 based on the target operation of the vehicle 100 determined by the determination block B2 and outputs the signal to the actuator AC.
- the vehicle control system 1 includes a main arithmetic unit F1, a safety functional unit F2, and a backup functional unit F3.
- the main arithmetic unit F1 recognizes the external environment of the vehicle 100 based on the output from the information acquisition unit 10 and determines a target route of the vehicle 100 based on the external environment of the vehicle 100.
- the main arithmetic unit F1 determines a target motion of the vehicle 100 based on the target route of the vehicle 100 and outputs a control signal based on the target motion of the vehicle 100.
- For the processing by the main arithmetic unit F1, a learning model generated by deep learning is used.
- This learning model includes a multilayered neural network (a deep neural network).
- Examples of the multilayered neural network include Convolutional Neural Network (CNN).
- the main arithmetic unit F1 includes a vehicle status detection unit F001, a driver operation recognition unit F002, an object recognition unit F101 (an image system), an object recognition unit F102 (a radar system), a map generation unit F103, an external environment estimation unit F104, an external environment model F105, a route search unit F106, a route generation unit F107, a critical status determination unit F108, a first vehicle model F109, a second vehicle model F110, a route determination unit F111, a target motion determination unit F112, a vehicle motion energy setting unit F113, an energy management unit F114, a selector F115, and a selector F116.
- the vehicle status detection unit F001, the driver operation recognition unit F002, the object recognition unit F101, the object recognition unit F102, the map generation unit F103, the external environment estimation unit F104, and the external environment model F105 belong to the recognition block B1.
- the route search unit F106, the route generation unit F107, the critical status determination unit F108, the first vehicle model F109, the route determination unit F111, and the target motion determination unit F112 belong to the determination block B2.
- the second vehicle model F110, the vehicle motion energy setting unit F113, the energy management unit F114, the selector F115, and the selector F116 belong to the operation block B3.
- the signal processing unit 201 includes part of the object recognition unit F101 (the image system), while the recognition processing unit 202 includes the rest thereof.
- the recognition processing unit 202 includes the object recognition unit F102 (the radar system) and the map generation unit F103.
- the recognition processing unit 202 (specifically, the first recognition processing IC unit 22a) includes the external environment estimation unit F104, the external environment model F105, the route search unit F106, the route generation unit F107, the first vehicle model F109, and the second vehicle model F110.
- the determination processing unit 203 (specifically, the first control IC unit 23a) includes the vehicle status detection unit F001, the driver operation recognition unit F002, the critical status determination unit F108, the route determination unit F111, the target motion determination unit F112, the vehicle motion energy setting unit F113, the energy management unit F114, the selector F115, and the selector F116.
- the vehicle status detection unit F001 recognizes the status of the vehicle 100 (e.g., speed, acceleration, yaw rate, and the like) based on the output from the mechanical sensors 15.
- the driver operation recognition unit F002 recognizes the driving operations applied to the vehicle 100 based on the output from the driver input unit 16.
- the object recognition unit F101 recognizes an object included in the external environment of the vehicle 100 based on the output from the cameras 11. Thus, information on the object (object information) is obtained.
- the object information indicates the type of the object, the shape of the object, and the like, for example.
- Examples of the object include a dynamic object that moves with the lapse of time and a stationary object that does not move with the lapse of time.
- Examples of the dynamic object include four-wheeled vehicles, motorcycles, bicycles, pedestrians, and the like.
- Examples of the stationary object include signs, roadside trees, median strips, center poles, buildings, and the like.
- the object recognition unit F101 includes an image processing unit and an image recognition unit.
- the image processing unit performs image processing with respect to the image data which is the output from the cameras 11. This image processing includes distortion correction processing for correcting the distortion of an image presented in the image data, white balance adjustment processing for adjusting the brightness of the image presented in the image data, and the like.
- the image recognition unit recognizes the object included in the external environment of the vehicle 100 based on the image data processed by the image processing unit.
- For the object recognition processing by the image recognition unit, a known object recognition technique (an image data-based object recognition technique) may be used, for example.
- the image recognition unit of the object recognition unit F101 may be configured to perform the object recognition processing using a learning model generated by deep learning.
- the image processing unit of the object recognition unit F101 includes a first image processing unit that performs processing based on an output from the first cameras 11a and a second image processing unit that performs processing based on an output from the second cameras 11b.
- the image recognition unit of the object recognition unit F101 includes a first image recognition unit that performs processing based on an output from the first image processing unit and a second image recognition unit that performs processing based on an output from the second image processing unit.
- the signal processing unit 201 includes the image processing unit of the object recognition unit F101
- the recognition processing unit 202 includes the image recognition unit of the object recognition unit F101.
- the first signal processing IC unit 21a includes the first image processing unit
- the second signal processing IC unit 21b includes the second image processing unit
- the first recognition processing IC unit 22a includes the first image recognition unit
- the second recognition processing IC unit 22b includes the second image recognition unit.
- the object recognition unit F102 recognizes the object included in the external environment of the vehicle 100 based on a detection result which is the output from the radars 12 (e.g., a peak list of the reflected waves). Thus, the object information is obtained. Specifically, the object recognition unit F102 performs analysis processing (processing for obtaining the object information) on the detection result of the radars 12. For the object recognition processing by the object recognition unit F102, a known object recognition technique (an object recognition technique based on the detection result of the radars 12) may be used, for example.
- the object recognition unit F102 may be configured to perform the object recognition processing using a learning model generated by deep learning.
- the object recognition unit F102 includes a first radar recognition unit that performs processing based on an output from the first radars 12a and a second radar recognition unit that performs processing based on an output from the second radars 12b.
- the first recognition processing IC unit 22a includes the first radar recognition unit
- the second recognition processing IC unit 22b includes the second radar recognition unit.
- the map generation unit F103 generates map data (e.g., three-dimensional map data) indicating the external environment of the vehicle 100 based on an output from the object recognition unit F101 (image system) and an output from the object recognition unit F102 (radar system).
- the map generation unit F103 generates the map data for each of a plurality of areas (e.g., four areas of front, rear, right, and left) obtained by dividing a surrounding area surrounding the vehicle 100, for example.
- In response to the input of the object information obtained by each of the object recognition unit F101 (the image system) and the object recognition unit F102 (the radar system), the map generation unit F103 fuses the pieces of object information and reflects the fused object information in the map data.
- the map generation unit F103 includes a first map generation unit that performs processing based on an output from the first image recognition unit of the object recognition unit F101 and an output from the first radar recognition unit of the object recognition unit F102 and a second map generation unit that performs processing based on an output from the second image recognition unit of the object recognition unit F101 and an output from the second radar recognition unit of the object recognition unit F102.
- the first recognition processing IC unit 22a includes the first map generation unit
- the second recognition processing IC unit 22b includes the second map generation unit.
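- A hedged sketch of the kind of camera/radar fusion described here, assuming each recognition path reports objects as simple records with a position and that objects closer than a small matching radius are the same physical object (the matching radius and data layout are illustrative assumptions):

```python
import math

def fuse_objects(image_objects, radar_objects, match_radius_m=1.5):
    """Fuse object lists from the image system (F101) and the radar system (F102).

    Matched objects keep the type from the camera path and take the range and
    relative speed from the radar path.
    """
    fused, used_radar = [], set()
    for cam in image_objects:
        best, best_d = None, match_radius_m
        for i, rad in enumerate(radar_objects):
            d = math.dist((cam["x"], cam["y"]), (rad["x"], rad["y"]))
            if i not in used_radar and d <= best_d:
                best, best_d = i, d
        if best is not None:
            used_radar.add(best)
            rad = radar_objects[best]
            fused.append({**cam, "range_m": rad["range_m"], "rel_speed_mps": rad["rel_speed_mps"]})
        else:
            fused.append(dict(cam))
    # Radar-only detections are kept so the map does not lose objects the camera missed
    fused += [dict(r) for i, r in enumerate(radar_objects) if i not in used_radar]
    return fused

front_area_map = fuse_objects(
    [{"type": "vehicle", "x": 20.0, "y": 0.5}],
    [{"x": 20.4, "y": 0.3, "range_m": 20.4, "rel_speed_mps": -2.0}],
)
```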
- the external environment estimation unit F104 estimates the external environment of the vehicle 100 based on an output from the vehicle status detection unit F001, an output from the map generation unit F103, an output from the position sensor 13, and an output from the external input unit 14 (e.g., high-precision map information). Specifically, the external environment estimation unit F104 generates the three-dimensional map data indicating the external environment of the vehicle 100 by image recognition processing based on the external environment model F105.
- the external environment estimation unit F104 performs the following operation. First, the external environment estimation unit F104 fuses map data for each of a plurality of areas (e.g., four areas of front, rear, right, and left) to generate fused map data indicating the surroundings (the external environment) of the vehicle 100. Next, for each of dynamic objects included in the fused map data, the external environment estimation unit F104 predicts changes in the distance, direction, and relative speed between the dynamic object and the subject vehicle. The external environment estimation unit F104 then incorporates a result of the prediction into the external environment model F105.
- the external environment estimation unit F104 estimates the position of the subject vehicle in the fused map data and calculates a route cost based on the output from the position sensor 13 (the position of the vehicle 100), the output from the external input unit 14 (the high-precision map information), and the output from the vehicle status detection unit F001 (e.g., vehicle speed information, six degrees of freedom (6DoF) information, and the like).
- the external environment estimation unit F104 incorporates a result of the estimation and a result of the calculation together with information on the subject vehicle acquired by various kinds of sensors into the external environment model F105. With the foregoing processing, the external environment model F105 is updated at any time.
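- The prediction step for dynamic objects can be sketched under the assumption of a simple constant-relative-velocity model; the patent does not fix the prediction model, so the function below is only an illustration:

```python
def predict_dynamic_object(distance_m, bearing_deg, rel_speed_mps, horizon_s=2.0, dt=0.5):
    """Predict how the distance to a dynamic object evolves over a short horizon.

    A constant relative speed along the current bearing is assumed; each entry
    of the returned list is (time_s, predicted_distance_m, bearing_deg).
    """
    steps, prediction, t = int(horizon_s / dt), [], 0.0
    for _ in range(steps):
        t += dt
        prediction.append((t, max(0.0, distance_m + rel_speed_mps * t), bearing_deg))
    return prediction

# A preceding vehicle 30 m ahead, closing at 3 m/s
for time_s, dist_m, bearing in predict_dynamic_object(30.0, 0.0, -3.0):
    print(f"t={time_s:.1f}s distance={dist_m:.1f}m bearing={bearing:.0f}deg")
```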
- the external environment model F105 indicates the external environment of the vehicle 100.
- the external environment model F105 is a learning model generated by deep learning.
- the route search unit F106 searches for a wide-area route of the vehicle 100 based on the output from the position sensor 13 and the output from the external input unit 14 (e.g., car navigation data).
- the route generation unit F107 generates a travel route of the vehicle 100 based on an output from the external environment model F105 and an output from the route search unit F106. To the travel route generated by the route generation unit F107, a score of the safety, the fuel consumption, or the like of the vehicle 100 in the travel route is added, for example. Higher safety of the vehicle 100 in the travel route gives a lower score of the travel route. Lower fuel consumption of the vehicle 100 in the travel route gives a lower score of the travel route. The route generation unit F107 generates at least one travel route giving a relatively low (e.g., the lowest) score.
- the route generation unit F107 may generate a plurality of travel routes based on a plurality of viewpoints.
- the route generation unit F107 may be configured to receive the output from the driver input unit 16 and adjust the travel route in accordance with the output from the driver input unit 16, for example.
- a travel route with a relatively low score and a travel route adjusted in accordance with the output from the driver input unit 16 are generated, for example.
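- Under the scoring convention described above (lower is better, with both safety and fuel consumption contributing), route selection can be sketched as follows; the weights and the candidate data are illustrative assumptions, not values from the patent:

```python
def route_score(safety_margin_m, fuel_consumption_l_per_100km, w_safety=1.0, w_fuel=0.5):
    """Lower score is better: a larger safety margin and lower fuel use reduce the score."""
    return w_safety / max(safety_margin_m, 0.1) + w_fuel * fuel_consumption_l_per_100km

candidate_routes = [
    {"name": "keep lane",   "safety_margin_m": 2.5, "fuel_consumption_l_per_100km": 6.0},
    {"name": "overtake",    "safety_margin_m": 1.2, "fuel_consumption_l_per_100km": 7.5},
    {"name": "slow follow", "safety_margin_m": 3.0, "fuel_consumption_l_per_100km": 5.5},
]
best = min(candidate_routes,
           key=lambda r: route_score(r["safety_margin_m"], r["fuel_consumption_l_per_100km"]))
print(best["name"])  # the route generation unit would output route(s) with relatively low scores
```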
- the critical status determination unit F108 determines whether the vehicle 100 is in a critical status based on an output from a preprocessing unit F204 of the safety functional unit F2 (the position of the subject vehicle relative to the object included in the external environment of the vehicle 100). Examples of the critical status of the vehicle 100 include a status in which the vehicle 100 may collide with the object, a status in which the vehicle 100 may go out of a lane, and the like.
- the critical status determination unit F108 may determine whether the vehicle 100 is in the critical status based on the external environment model F105. When determining that the vehicle 100 is in the critical status, the critical status determination unit F108 generates a target route for avoiding the critical situations.
- the first vehicle model F109 is a 6DoF vehicle model indicating the motion on six axes of the vehicle 100.
- the 6DoF vehicle model is obtained by modeling the acceleration along three axes, namely, in the "forward/backward (surge)," "left/right (sway)," and "up/down (heave)" directions of the traveling vehicle 100, and the angular velocity about three axes, namely, "pitch," "roll," and "yaw." That is, unlike classical vehicle motion engineering, which captures only the planar motion of the vehicle 100 (the forward/backward and left/right movement along the X-Y plane and the yawing about the Z-axis), the first vehicle model F109 is a numerical model that reproduces the behavior of the vehicle 100 on six axes in total.
- the six axes further include the pitching (about the Y-axis) and rolling (about the X-axis) of the vehicle body and its movement along the Z-axis (i.e., the up/down motion), the vehicle body being mounted on the four wheels with the suspension interposed therebetween.
- the first vehicle model F109 is generated based on the basic motion function of the vehicle 100 set in advance, the external environment of the vehicle 100, and the like. The first vehicle model F109 is updated as appropriate in accordance with changes in the external environment of the vehicle 100 and the like.
- the first vehicle model F109 is a learning model generated by deep learning, for example.
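- A hedged sketch of a 6DoF vehicle state and a single explicit integration step; this only represents the six quantities named above (surge, sway, heave, roll, pitch, yaw) and is not the learning model used for the first vehicle model F109:

```python
from dataclasses import dataclass

@dataclass
class SixDofState:
    # Translational velocities of the vehicle body [m/s]
    surge: float = 0.0   # forward/backward
    sway: float = 0.0    # left/right
    heave: float = 0.0   # up/down
    # Angular velocities of the vehicle body [rad/s]
    roll: float = 0.0    # about the X-axis
    pitch: float = 0.0   # about the Y-axis
    yaw: float = 0.0     # about the Z-axis

def integrate(state: SixDofState, accel, ang_accel, dt: float) -> SixDofState:
    """One explicit Euler step given translational and angular accelerations."""
    return SixDofState(
        surge=state.surge + accel[0] * dt,
        sway=state.sway + accel[1] * dt,
        heave=state.heave + accel[2] * dt,
        roll=state.roll + ang_accel[0] * dt,
        pitch=state.pitch + ang_accel[1] * dt,
        yaw=state.yaw + ang_accel[2] * dt,
    )

state = integrate(SixDofState(surge=20.0), accel=(1.0, 0.0, 0.0), ang_accel=(0.0, 0.0, 0.05), dt=0.1)
```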
- the second vehicle model F110 indicates the energy consumption of the vehicle. Specifically, the second vehicle model F110 indicates cost (fuel consumption or electricity consumption) for the operation of the actuator AC of the vehicle 100.
- the second vehicle model F110 is obtained by modeling the opening/closing timing of intake/exhaust valves (not shown), the timing of injectors (not shown) injecting the fuel, the opening/closing timing of the valves for the exhaust gas recirculation system, and the like, for example, at the most improved fuel consumption in outputting a predetermined amount of the engine torque.
- the second vehicle model F110 is generated during the travel of the vehicle, and is updated as appropriate.
- the second vehicle model F110 is a learning model generated by deep learning, for example.
- the route determination unit F111 determines the target route of the vehicle 100 based on an output from the driver operation recognition unit F002, an output from the route generation unit F107, and an output from a route generation unit F206 of the safety functional unit F2. Specifically, the route determination unit F111 selects either the travel route generated by the route generation unit F107 or a travel route generated by the route generation unit F206 of the safety functional unit F2 as the target route. The route determination unit F111 may adjust the selected target route in accordance with the output from the driver operation recognition unit F002.
- the route determination unit F111 may preferentially select the travel route generated by the route generation unit F107 during normal traveling as the target route, for example.
- the route determination unit F111 may select the travel route generated by the route generation unit F206 of the safety functional unit F2 as the target route when the travel route generated by the route generation unit F107 does not pass through free space searched for by a free space search unit F205 of the safety functional unit F2.
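- The selection rule described in the last two items can be sketched as follows, assuming free space is given as a set of drivable cells and each travel route as a list of cells (both representations are assumptions introduced for this illustration):

```python
def route_stays_in_free_space(route_cells, free_space_cells):
    """True if every cell of the travel route lies inside the searched free space."""
    return all(cell in free_space_cells for cell in route_cells)

def determine_target_route(main_route, safety_route, free_space_cells):
    """Prefer the main route (route generation unit F107); fall back to the rule-based
    safety route (route generation unit F206) when the main route leaves the free
    space found by the free space search unit F205."""
    if route_stays_in_free_space(main_route, free_space_cells):
        return main_route
    return safety_route

free_space = {(0, 0), (0, 1), (0, 2), (1, 2)}
main = [(0, 0), (0, 1), (1, 1)]      # leaves free space at (1, 1)
safety = [(0, 0), (0, 1), (0, 2)]
assert determine_target_route(main, safety, free_space) == safety
```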
- the target motion determination unit F112 determines the target motion of the vehicle 100 based on an output from the critical status determination unit F108, the first vehicle model F109, and an output from the route determination unit F111.
- the target motion determination unit F112 may, upon input of the target route generated by the critical status determination unit F108 (the target route for avoiding critical situations), determine the target motion of the vehicle 100 based on the target route generated by the critical status determination unit F108 and the first vehicle model F109, for example.
- the target motion determination unit F112 may, when the target route generated by the critical status determination unit F108 is not input (the vehicle 100 is not in a critical status), determine the target motion of the vehicle 100 based on the target route generated by the route determination unit F111 and the first vehicle model F109.
- the vehicle motion energy setting unit F113 calculates driving torque required for the drive actuator, steering torque required for the steering actuator, and braking torque required for the braking actuator based on an output from the target motion determination unit F112. Specifically, the vehicle motion energy setting unit F113 calculates the driving torque, the steering torque, and the braking torque such that the motion of the vehicle 100 becomes the target motion determined by the target motion determination unit F112.
- the energy management unit F114 calculates a control amount of the actuator AC based on the second vehicle model F110 and an output from the vehicle motion energy setting unit F113. Specifically, the energy management unit F114 calculates the control amount of the actuator AC based on the second vehicle model F110 at the highest energy efficiency to achieve the target motion determined by the target motion determination unit F112. The energy management unit F114 calculates the opening/closing timing of intake/exhaust valves (not shown), the timing of injectors (not shown) injecting the fuel, and the like at the most improved fuel consumption to achieve the engine torque determined by the vehicle motion energy setting unit F113.
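- A hedged sketch of this two-stage idea: first translate the target motion into required torques, then choose, among actuator settings that deliver the requested torque, the one with the lowest modelled energy cost. The torque computation and the candidate settings are illustrative assumptions, not values from the patent:

```python
def required_torques(target_accel_mps2, vehicle_mass_kg=1500.0, wheel_radius_m=0.3):
    """Translate a target longitudinal acceleration into drive or braking wheel torque."""
    force_n = vehicle_mass_kg * target_accel_mps2
    torque_nm = force_n * wheel_radius_m
    if torque_nm >= 0.0:
        return {"drive_torque_nm": torque_nm, "braking_torque_nm": 0.0}
    return {"drive_torque_nm": 0.0, "braking_torque_nm": -torque_nm}

def energy_management(drive_torque_nm, candidate_settings):
    """Pick, among settings that deliver the requested torque, the one with the lowest
    modelled fuel (or electricity) cost - the role played by the second vehicle model."""
    feasible = [s for s in candidate_settings if s["torque_nm"] >= drive_torque_nm]
    return min(feasible, key=lambda s: s["fuel_cost"]) if feasible else None

torques = required_torques(target_accel_mps2=1.2)
setting = energy_management(torques["drive_torque_nm"], [
    {"name": "early intake valve closing", "torque_nm": 600.0, "fuel_cost": 0.8},
    {"name": "late intake valve closing",  "torque_nm": 650.0, "fuel_cost": 1.0},
])
```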
- the selector F115 outputs either the output from the vehicle motion energy setting unit F113 or an output from a vehicle motion energy setting unit F310 of the backup functional unit F3.
- the selector F116 outputs either an output from the energy management unit F114 or an output from an energy management unit F311 of the backup functional unit F3.
- the output from the selector F115 and the output from the selector F116 are each a control signal for cruise control of the vehicle 100.
- the selector F115 selects the output from the vehicle motion energy setting unit F113 when no abnormality (e.g., a fault) occurs in the main arithmetic unit F1, and selects the output from the vehicle motion energy setting unit F310 of the backup functional unit F3 when an abnormality occurs in the main arithmetic unit F1.
- the selector F116 selects the output from the energy management unit F114 when no abnormality occurs in the main arithmetic unit F1, and selects the output from the energy management unit F311 of the backup functional unit F3 when an abnormality occurs in the main arithmetic unit F1.
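- The behaviour of the selectors F115 and F116 can be sketched as a simple preference for the main arithmetic unit with a fall-back to the backup functional unit; the abnormality flag is assumed to be supplied by whatever monitoring detects the fault:

```python
def selector(main_output, backup_output, main_unit_abnormal: bool):
    """Selectors F115/F116: pass through the main arithmetic unit's output unless an
    abnormality (e.g., a fault) has occurred in the main arithmetic unit."""
    return backup_output if main_unit_abnormal else main_output

control_signal = selector(main_output={"torque_nm": 540.0},
                          backup_output={"torque_nm": 0.0, "mode": "minimal risk"},
                          main_unit_abnormal=False)
```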
- the safety functional unit F2 recognizes the external environment of the vehicle 100 based on the output from the information acquisition unit 10 and searches the external environment of the vehicle 100 for free space. The safety functional unit F2 then generates a travel route passing through the free space. The travel route (the travel route passing through the free space) obtained by the safety functional unit F2 is used in the processing to determine the target route by the main arithmetic unit F1. For the processing by the safety functional unit F2, an algorithm based on a rule set in advance is used in place of the learning model generated by deep learning. In the safety functional unit F2, rule-based processing is performed.
- the safety functional unit F2 includes an object recognition unit F201 (image system), an object recognition unit F202 (radar system), a classification unit F203, the preprocessing unit F204, the free space search unit F205, and the route generation unit F206.
- the object recognition unit F201, the object recognition unit F202, the classification unit F203, and the preprocessing unit F204 belong to the recognition block B1.
- the free space search unit F205 and the route generation unit F206 belong to the determination block B2.
- the signal processing unit 201 includes part of the object recognition unit F201 (image system), while the recognition processing unit 202 includes the rest thereof.
- the determination processing unit 203 (specifically, the first control IC unit 23a) includes the object recognition unit F202 (radar system), the classification unit F203, the preprocessing unit F204, the free space search unit F205, and the route generation unit F206.
- the object recognition unit F201 recognizes the object included in the external environment of the vehicle 100 based on the output from the cameras 11. Thus, the object information is obtained.
- the object recognition unit F201 includes an image processing unit and an image recognition unit.
- the image processing unit performs image processing with respect to the image data which is the output from the cameras 11.
- the image recognition unit recognizes the object included in the external environment of the vehicle 100 based on the image data processed by the image processing unit.
- the image recognition unit of the object recognition unit F201 performs object recognition processing using a known pattern recognition technique or another known image data-based object recognition technique, without using any learning model generated by deep learning, for example.
- the image processing unit of the object recognition unit F201 includes a first image processing unit that performs processing based on an output from the first cameras 11a and a second image processing unit that performs processing based on an output from the second cameras 11b.
- the image recognition unit of the object recognition unit F201 includes a first image recognition unit that performs processing based on an output from the first image processing unit and a second image recognition unit that performs processing based on an output from the second image processing unit.
- the signal processing unit 201 includes the image processing unit of the object recognition unit F201, while the recognition processing unit 202 includes the image recognition unit of the object recognition unit F201.
- specifically, the first signal processing IC unit 21a includes the first image processing unit, and the second signal processing IC unit 21b includes the second image processing unit. The first recognition processing IC unit 22a includes the first image recognition unit, and the second recognition processing IC unit 22b includes the second image recognition unit.
- the object recognition unit F202 recognizes the object included in the external environment of the vehicle 100 based on a detection result which is the output from the radars 12. Thus, the object information is obtained. Specifically, the object recognition unit F202 performs analysis processing with respect to the detection result of the radars 12. The object recognition unit F202 performs object recognition processing using a known object recognition technique (object recognition technique based on the detection result of the radars 12) without using any learning model generated by deep learning, for example.
- the object recognition unit F202 includes a first radar recognition unit that performs processing based on an output from the first radars 12a and a second radar recognition unit that performs processing based on an output from the second radars 12b.
- the first control IC unit 23a includes the first radar recognition unit and the second radar recognition unit.
- the classification unit F203 recognizes the external environment of the vehicle 100 based on an output from the object recognition unit F201 (image system) and an output from the object recognition unit F202 (radar system).
- the classification unit F203 performs recognition processing (rule-based recognition processing) using an algorithm based on a rule set in advance without using any learning model generated by deep learning.
- for the rule-based recognition processing, a known recognition processing technique may be used.
- the classification unit F203 classifies each object recognized by the object recognition unit F201 and the object recognition unit F202 as either a dynamic object or a stationary object.
- the classification unit F203 fuses the object information obtained by the object recognition unit F201 (image system) and the object information obtained by the object recognition unit F202 (radar system) for each of a plurality of areas (e.g., four areas of front, rear, right, and left) obtained by dividing a surrounding area surrounding the vehicle 100, for example.
- the classification unit F203 generates classification information of the object included in each of the areas.
- the classification information indicates whether the object is a dynamic object or a stationary object.
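- As a rough, hypothetical illustration of the rule-based classification and per-area fusion described above (the object types, rules, and dictionary layout are assumptions made only for this example, not the embodiment's actual rules), such processing could be sketched as follows.

```python
# Hypothetical rule-based classification in the spirit of the classification unit:
# objects recognized by the image system and the radar system are merged per area
# and labelled as dynamic or stationary by fixed, preset rules (no learning model).
DYNAMIC_TYPES = {"vehicle", "motorcycle", "bicycle", "pedestrian"}

def classify(obj: dict) -> str:
    """Return 'dynamic' or 'stationary' using a preset rule."""
    if obj.get("type") in DYNAMIC_TYPES or abs(obj.get("speed", 0.0)) > 0.5:
        return "dynamic"
    return "stationary"

def fuse_per_area(image_objs, radar_objs, areas=("front", "rear", "right", "left")):
    """Fuse image-based and radar-based object information for each surrounding area."""
    fused = {a: [] for a in areas}
    for obj in image_objs + radar_objs:
        area = obj.get("area", "front")
        fused[area].append({**obj, "class": classify(obj)})
    return fused

example = fuse_per_area(
    image_objs=[{"type": "pedestrian", "area": "front", "speed": 1.2}],
    radar_objs=[{"type": "pole", "area": "right", "speed": 0.0}],
)
print(example["front"][0]["class"])  # 'dynamic'
```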
- the preprocessing unit F204 performs preprocessing based on an output from the classification unit F203, the output from the vehicle status detection unit F001 of the main arithmetic unit F1, the output from the position sensor 13, and the output from the external input unit 14. In the preprocessing, classified-information fusion, object behavior prediction, and self-position estimation are performed.
- the preprocessing unit F204 fuses the classification information generated for each of a plurality of areas (e.g., four areas of front, rear, right, and left).
- the fused classification information is managed on a grid map (not shown) as the classification information on the surrounding area of the vehicle 100.
- the preprocessing unit F204 detects the dynamic object included in the external environment of the vehicle 100 based on the fused classification information.
- the preprocessing unit F204 predicts changes in the distance between the dynamic object and the vehicle, the direction of the dynamic object with respect to the subject vehicle, and the relative speed of the dynamic object with respect to the vehicle.
- a result of the prediction by the preprocessing unit F204 is managed as additional information of the dynamic object.
- the preprocessing unit F204 estimates the position of the subject vehicle with respect to the object (the dynamic object and the stationary object) included in the external environment of the vehicle 100 based on the position of the vehicle 100 as the output from the position sensor 13, the high-precision map information as an example output from the external input unit 14, and the status of the vehicle 100 (the vehicle speed information, the 6DoF information, and the like) as the output from the vehicle status detection unit F001.
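- The behavior prediction performed in the preprocessing could, purely as a hypothetical sketch, look like the following; the extrapolation horizon and the dictionary layout are assumptions made only for illustration.

```python
import math

# Hypothetical sketch of behavior prediction for a dynamic object: extrapolate the
# object's and the subject vehicle's positions, then derive the distance, direction,
# and relative speed of the dynamic object with respect to the subject vehicle.
def predict_relative_state(obj_pos, obj_vel, ego_pos, ego_vel, horizon_s=1.0):
    ox = obj_pos[0] + obj_vel[0] * horizon_s
    oy = obj_pos[1] + obj_vel[1] * horizon_s
    ex = ego_pos[0] + ego_vel[0] * horizon_s
    ey = ego_pos[1] + ego_vel[1] * horizon_s
    dx, dy = ox - ex, oy - ey
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx))       # direction seen from the subject vehicle
    rel_speed = math.hypot(obj_vel[0] - ego_vel[0],
                           obj_vel[1] - ego_vel[1])  # relative speed magnitude
    return {"distance_m": distance, "bearing_deg": bearing, "relative_speed_mps": rel_speed}

print(predict_relative_state(obj_pos=(20.0, 3.0), obj_vel=(-5.0, 0.0),
                             ego_pos=(0.0, 0.0), ego_vel=(10.0, 0.0)))
```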
- the free space search unit F205 searches the external environment of the vehicle 100 for free space based on the output from the preprocessing unit F204.
- the free space is an area in which no obstacles are present out of roads included in the external environment of the vehicle 100.
- the obstacles include a dynamic obstacle and a static obstacle. Examples of the dynamic obstacle include other vehicles and pedestrians. Examples of the static obstacle include median strips, center poles, and the like.
- the free space may include a space on a road shoulder allowing emergency parking and the like, for example.
- the free space search unit F205 searches for the free space that can avoid a collision with the object whose position has been estimated by the preprocessing unit F204.
- the free space search unit F205 searches for the free space based on a search rule set in advance, for example.
- the search rule may include a rule that a predetermined range around the object (e.g., a range of a few meters) is set to be an unavoidable range.
- the free space search unit F205 may, when the object is the dynamic object, search for the free space in consideration of the moving speed of the dynamic object.
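- A minimal sketch of such a rule-based free-space check, assuming a fixed base margin that is enlarged with the moving speed of a dynamic object (the margin values and data layout are illustrative assumptions, not the embodiment's actual rule), might look as follows.

```python
import math

# Hypothetical free-space check following the search rule described above:
# a preset margin around every object is treated as an unavoidable range, and the
# margin is enlarged for dynamic objects according to their moving speed.
def is_cell_free(cell_xy, objects, base_margin_m=2.0, speed_gain_s=1.0):
    """Return True if a grid cell lies outside every object's unavoidable range."""
    for obj in objects:
        margin = base_margin_m
        if obj.get("dynamic", False):
            margin += speed_gain_s * obj.get("speed_mps", 0.0)
        if math.dist(cell_xy, obj["xy"]) <= margin:
            return False
    return True

objects = [{"xy": (10.0, 0.0), "dynamic": True, "speed_mps": 3.0},
           {"xy": (5.0, -2.0), "dynamic": False}]
print(is_cell_free((10.0, 6.0), objects))  # True: outside both margins
print(is_cell_free((8.0, 0.0), objects))   # False: inside the dynamic object's margin
```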
- the route generation unit F206 generates the travel route of the vehicle 100 based on an output from the free space search unit F205 and the output from the route search unit F106 of the main arithmetic unit F1 (wide-area route of the vehicle 100). Specifically, the route generation unit F206 generates a travel route passing through the free space obtained by the free space search unit F205.
- the route generation unit F206 may be configured to generate a plurality of travel routes passing through the free space and select the one with the lowest route cost out of the travel routes, for example.
- the travel route (travel route passing through the free space) generated by the route generation unit F206 is output to the route determination unit F111 of the main arithmetic unit F1.
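- For illustration, the lowest-route-cost selection mentioned above could be sketched as below; the cost terms are hypothetical placeholders and do not reflect the actual route costs used by the route generation unit.

```python
# Hypothetical route-cost selection: each candidate travel route through the
# free space carries a cost, and the lowest-cost candidate becomes the output.
def route_cost(route):
    """Illustrative cost: longer routes and routes closer to obstacles cost more."""
    return route["length_m"] + 50.0 * route["proximity_penalty"]

def select_travel_route(candidates):
    return min(candidates, key=route_cost)

candidates = [
    {"name": "keep lane", "length_m": 120.0, "proximity_penalty": 0.1},
    {"name": "shoulder", "length_m": 135.0, "proximity_penalty": 0.0},
]
print(select_travel_route(candidates)["name"])  # 'keep lane' (cost 125 < 135)
```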
- the backup functional unit F3 recognizes the external environment of the vehicle 100 based on the output from the information acquisition unit 10, searches the external environment of the vehicle 100 for free space, and determines the target route of the vehicle 100 passing through the free space. The backup functional unit F3 then determines the target motion of the vehicle 100 based on the target route of the vehicle 100 and outputs a control signal based on the target motion of the vehicle 100.
- the control signal obtained by the backup functional unit F3 is supplied to the main arithmetic unit F1.
- for the processing by the backup functional unit F3, an algorithm based on a rule set in advance is used. In the backup functional unit F3, rule-based processing is performed.
- the backup functional unit F3 includes a vehicle status detection unit F301, a driver operation recognition unit F302, a classification unit F303, a preprocessing unit F304, a free space search unit F305, a route generation unit F306, a critical status determination unit F307, a route determination unit F308, a target motion determination unit F309, a vehicle motion energy setting unit F310, and an energy management unit F311.
- the vehicle status detection unit F301, the driver operation recognition unit F302, the classification unit F303, and the preprocessing unit F304 belong to the recognition block B1.
- the free space search unit F305, the route generation unit F306, the critical status determination unit F307, the route determination unit F308, and the target motion determination unit F309 belong to the determination block B2.
- the vehicle motion energy setting unit F310 and the energy management unit F311 belong to the operation block B3.
- the backup processing unit 204 (specifically, the second control IC unit 23b) includes the vehicle status detection unit F301, the driver operation recognition unit F302, the classification unit F303, the preprocessing unit F304, the free space search unit F305, the route generation unit F306, the critical status determination unit F307, the route determination unit F308, the target motion determination unit F309, the vehicle motion energy setting unit F310, and the energy management unit F311.
- the functions of the vehicle status detection unit F301 and the driver operation recognition unit F302 are the same as the respective functions of the vehicle status detection unit F001 and the driver operation recognition unit F002 of the main arithmetic unit F1.
- the functions of the classification unit F303, the preprocessing unit F304, the free space search unit F305, and the route generation unit F306 are the same as the respective functions of the classification unit F203, the preprocessing unit F204, the free space search unit F205, and the route generation unit F206 of the safety functional unit F2.
- the classification unit F303 performs processing based on the output from the object recognition unit F201 (image system) and the output from the object recognition unit F202 (radar system) of the safety functional unit F2.
- the backup functional unit F3 may include an object recognition unit (image system) and an object recognition unit (radar system) which are the same as the object recognition unit F201 (image system) and the object recognition unit F202 (radar system) of the safety functional unit F2, respectively.
- the classification unit F303 may perform the processing based on an output from the object recognition unit (the image system) and an output from the object recognition unit (the radar system) of the backup functional unit F3.
- the route determination unit F308 determines the target route of the vehicle 100 based on an output from the driver operation recognition unit F302 and an output from the route generation unit F306 (travel route passing through the free space). The route determination unit F308 selects any one out of a plurality of travel routes generated by the route generation unit F306 as the target route, for example. The route determination unit F308 may adjust the selected target route in accordance with the output from the driver operation recognition unit F302.
- the target motion determination unit F309 determines the target motion of the vehicle 100 based on an output from the critical status determination unit F307 (target route) and an output from the route determination unit F308. Unlike the target motion determination unit F112 of the main arithmetic unit F1, the target motion determination unit F309 determines the target motion of the vehicle 100 using an algorithm based on a rule set in advance without using any learning model generated by deep learning.
- the target motion determination unit F309 may, upon input of the target route generated by the critical status determination unit F307 (the target route for avoiding critical situations), determine the target motion of the vehicle 100 based on the target route generated by the critical status determination unit F307, for example.
- the target motion determination unit F309 may, when the target route generated by the critical status determination unit F307 is not input (the vehicle 100 is not in a critical status), determine the target motion of the vehicle 100 based on the target route generated by the route determination unit F308.
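- Purely as a hypothetical sketch of the rule-based decision described above (the waypoint format and the lateral-acceleration cap are illustrative assumptions), the priority given to the critical-status avoidance route could look like this.

```python
# Hypothetical sketch of the rule-based decision in the backup functional unit:
# if the critical status determination unit supplies an avoidance route, it takes
# priority; otherwise the route selected by the route determination unit is used.
def choose_target_route(critical_route, normal_route):
    return critical_route if critical_route is not None else normal_route

def target_motion_from_route(route, max_lateral_accel=2.0):
    """Very coarse rule-based target motion: follow the route's speed profile,
    capping curvature-induced lateral acceleration (no learning model involved)."""
    motion = []
    for waypoint in route:
        v = waypoint["speed_mps"]
        if waypoint["curvature"] > 0:
            v = min(v, (max_lateral_accel / waypoint["curvature"]) ** 0.5)
        motion.append({"x": waypoint["x"], "y": waypoint["y"], "speed_mps": v})
    return motion

normal = [{"x": 0, "y": 0, "speed_mps": 15.0, "curvature": 0.02}]
print(target_motion_from_route(choose_target_route(None, normal)))
```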
- the vehicle motion energy setting unit F310 calculates the driving torque required for the drive actuator, the steering torque required for the steering actuator, and the braking torque required for the braking actuator based on an output from the target motion determination unit F309. Each torque calculated by the vehicle motion energy setting unit F310 is output to the selector F115 of the main arithmetic unit F1.
- the energy management unit F311 calculates the control amount of the actuator AC based on the output from the vehicle motion energy setting unit F310. Specifically, the energy management unit F311 calculates the control amount of the actuator AC at the highest energy efficiency to achieve the target motion determined by the target motion determination unit F309. Unlike the energy management unit F114 of the main arithmetic unit F1, the energy management unit F311 calculates the control amount of the actuator AC using an algorithm based on a rule set in advance without using any learning model generated by deep learning. The control amount calculated by the energy management unit F311 is output to the selector F116 of the main arithmetic unit F1.
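- As a hedged illustration of what "calculating the control amount at the highest energy efficiency" could mean in practice, the sketch below picks, from a set of hypothetical actuator operating points that all achieve the required torque, the one with the lowest energy cost; the operating-point data are invented for the example and are not taken from the embodiment.

```python
# Hypothetical energy management: among candidate actuator operating points that
# deliver at least the required torque, choose the one with the lowest energy cost.
def pick_operating_point(required_torque_nm, candidates):
    feasible = [c for c in candidates if c["torque_nm"] >= required_torque_nm]
    if not feasible:
        raise ValueError("no operating point achieves the required torque")
    return min(feasible, key=lambda c: c["energy_cost"])

candidates = [
    {"id": "A", "torque_nm": 90.0, "energy_cost": 4.1},
    {"id": "B", "torque_nm": 120.0, "energy_cost": 5.0},
    {"id": "C", "torque_nm": 125.0, "energy_cost": 4.6},
]
print(pick_operating_point(100.0, candidates)["id"])  # 'C'
```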
- the control unit 20 performs a first operation, a second operation, and a third operation.
- the vehicle control system 1 is provided with a first signal system including a plurality of first cameras 11a, and a second signal system including a plurality of second cameras 11b.
- the control unit 20 performs the first operation when both of the first and second signal systems have no abnormality.
- the control unit 20 performs the second operation when the second signal system between the first and second signal systems has an abnormality.
- the control unit 20 performs the third operation when the first signal system between the first and second signal systems has an abnormality.
- the control unit 20 outputs a control signal (signal for cruise control of the vehicle 100) based on both of the output from the first cameras 11a and the output from the second cameras 11b.
- the control unit 20 outputs, during the first operation, the control signal based on both of the result of the recognition processing by the first recognition processing IC unit 22a and the result of the recognition processing by the second recognition processing IC unit 22b.
- specifically, the first signal processing IC unit 21a processes the output from the first cameras 11a, and the first recognition processing IC unit 22a processes the output from the first signal processing IC unit 21a. Likewise, the second signal processing IC unit 21b processes the output from the second cameras 11b, and the second recognition processing IC unit 22b processes the output from the second signal processing IC unit 21b.
- the second recognition processing IC unit 22b outputs the result of the recognition processing to the first recognition processing IC unit 22a.
- the first recognition processing IC unit 22a fuses the result of the recognition processing by the first recognition processing IC unit 22a and the result of the recognition processing by the second recognition processing IC unit 22b, and outputs a recognition result obtained by this fusing to the first control IC unit 23a.
- the first control IC unit 23a outputs the control signal based on the output from the first recognition processing IC unit 22a.
- the control unit 20 outputs the control signal based on the output from the first cameras 11a.
- the control unit 20 outputs, during the second operation, the control signal based on the result of the recognition processing by the first recognition processing IC unit 22a.
- specifically, the first signal processing IC unit 21a processes the output from the first cameras 11a, and the first recognition processing IC unit 22a processes the output from the first signal processing IC unit 21a.
- the first recognition processing IC unit 22a outputs the result of the recognition processing by the first recognition processing IC unit 22a to the first control IC unit 23a.
- the first control IC unit 23a outputs the control signal based on the result of the recognition processing by the first recognition processing IC unit 22a.
- the control unit 20 outputs the control signal based on the output from the second cameras 11b.
- the control unit 20 outputs, during the third operation, the control signal based on the result of the recognition processing by the second recognition processing IC unit 22b.
- specifically, the second signal processing IC unit 21b processes the output from the second cameras 11b, and the second recognition processing IC unit 22b processes the output from the second signal processing IC unit 21b.
- the second recognition processing IC unit 22b outputs the result of the recognition processing to the first recognition processing IC unit 22a.
- the first recognition processing IC unit 22a outputs the result of the recognition processing by the second recognition processing IC unit 22b to the first control IC unit 23a.
- the first control IC unit 23a outputs the control signal based on the output from the first recognition processing IC unit 22a (i.e., the result of the recognition processing by the second recognition processing IC unit 22b).
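- For illustration, the selection among the first, second, and third operations and the corresponding routing of recognition results could be sketched as follows; the function names are placeholders and the recognition step is reduced to a stub, so this is only an assumed outline of the switching logic, not the embodiment's implementation.

```python
# Hypothetical illustration of the three operations: the control signal is produced
# from both camera systems when both are healthy, and from the remaining healthy
# system when the other one has an abnormality.
def recognize(camera_frames):
    """Stand-in for recognition processing on one signal system."""
    return {"objects": len(camera_frames)}

def control_signal(first_ok: bool, second_ok: bool, first_frames, second_frames):
    if first_ok and second_ok:            # first operation
        fused = {"objects": recognize(first_frames)["objects"]
                            + recognize(second_frames)["objects"]}
        return {"mode": "first", "recognition": fused}
    if first_ok:                          # second operation
        return {"mode": "second", "recognition": recognize(first_frames)}
    if second_ok:                         # third operation
        return {"mode": "third", "recognition": recognize(second_frames)}
    raise RuntimeError("both signal systems abnormal")

print(control_signal(True, False, first_frames=["f1", "f2"], second_frames=[])["mode"])
```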
- the vehicle 100 has two combinations of the cameras 11 provided in the vehicle 100 so as to surround the vehicle 100. Provision of the first cameras 11a so as to surround the vehicle 100 allows an environment spread around the vehicle 100 (the external environment of the vehicle 100) to be monitored based on the output from the first cameras 11a. Similarly, provision of the second cameras 11b so as to surround the vehicle 100 allows the external environment of the vehicle 100 to be monitored based on the output from the second cameras 11b. As described above, the external environment of the vehicle 100 may be monitored based on at least either one of the output from the first cameras 11a or the output from the second cameras 11b.
- This configuration allows automatic switching among the first, second, and third operations according to the presence or absence of abnormality in the first and second signal systems. Accordingly, when either one of the first or second signal system has an abnormality, the second or third operation may be automatically performed. This allows automatic continuation of the cruise control of the vehicle 100.
- the first cameras 11a include a first front camera 111a, a first diagonally backward right camera 112a, and a first diagonally backward left camera 113a.
- the second cameras 11b include a second front camera 111b, a second diagonally backward right camera 112b, and a second diagonally backward left camera 113b. This configuration allows the cruise control of the vehicle 100 to be continued based on at least an area in front of the vehicle 100, an area diagonally backward right of the vehicle 100, and an area diagonally backward left of the vehicle 100 in the environment spread around the vehicle 100 (external environment of the vehicle 100).
- this allows continuous cruise control based on the area in front of the vehicle 100 (e.g., control for maintaining an appropriate distance from other vehicles traveling in front of the subject vehicle) and control based on the area diagonally backward right of the vehicle 100 and the area diagonally backward left of the vehicle 100 (e.g., control for sensing critical situations when the subject vehicle performs lane changing).
- the recognition processing by both of the first recognition processing IC unit 22a and the second recognition processing IC unit 22b allows improvement of recognition accuracy, compared with recognition processing by either one of the first recognition processing IC unit 22a or the second recognition processing IC unit 22b.
- the recognition processing unit 202 may be constituted as a single recognition processing IC unit (the first recognition processing IC unit 22a in the example of FIG. 8).
- the output from the second signal processing IC unit 21b is supplied to the first recognition processing IC unit 22a.
- the first recognition processing IC unit 22a performs recognition processing based on the output from the first signal processing IC unit 21a and the output from the second signal processing IC unit 21b.
- the signal processing unit 201 may be constituted as a single signal processing IC unit (e.g., the first signal processing IC unit 21a).
- the above description uses, as an example, the case in which two combinations of cameras 11 are provided in the vehicle 100 so as to surround the vehicle 100, but the configuration is not limited to this example.
- three combinations of cameras 11 may be provided in the vehicle 100 so as to surround the vehicle 100.
- the above description likewise uses, as an example, the case in which two combinations of the radars 12 are provided in the vehicle 100 so as to surround the vehicle 100, but the configuration is not limited to this example.
- three combinations of radars 12 may be provided in the vehicle 100 so as to surround the vehicle 100.
Description
- The technique disclosed herein relates to a vehicle control system.
- A device of this kind is disclosed in Japanese Unexamined Patent Publication No. H11-16099.
- The technique disclosed herein has been made in view of the device such as in Japanese Unexamined Patent Publication No. H11-16099, and an object thereof is to improve continuity of cruise control of the vehicle.
- The technique disclosed herein relates to a vehicle control system for controlling a vehicle. This vehicle control system includes: a plurality of first cameras disposed in the vehicle so as to surround the vehicle; a plurality of second cameras disposed in the vehicle so as to surround the vehicle; and a control unit that performs a first operation of outputting a control signal for cruise control of the vehicle based on both outputs from the plurality of the first cameras and outputs from the plurality of the second cameras, a second operation of outputting the control signal based on the outputs from the plurality of the first cameras, and a third operation of outputting the control signal based on the outputs from the plurality of the second cameras.
- This configuration includes two combinations of cameras provided in the vehicle so as to surround the vehicle. Provision of the first cameras so as to surround the vehicle allows an environment spread around the vehicle (the external environment of the vehicle) to be monitored based on the output from the first cameras. Similarly, provision of the second cameras so as to surround the vehicle allows the external environment of the vehicle to be monitored based on the output from the second cameras. As described above, the external environment of the vehicle may be monitored based on at least either one of the output from the first cameras or the output from the second cameras. Accordingly, even when either one of the first signal system including the first cameras or the second signal system including the second cameras has an abnormality, it is possible to continue the cruise control of the vehicle based on the output from the cameras in the other signal system by performing the second or third operation. This allows improvement in continuity of cruise control of the vehicle.
- In the vehicle control system, the control unit may be configured to perform the first operation when a first signal system including the plurality of first cameras and a second signal system including the plurality of second cameras both do not have an abnormality, the second operation when the second signal system between the first and second signal systems has an abnormality, and the third operation when the first signal system between the first and second signal systems has an abnormality.
- This configuration allows automatic switching among the first, second, and third operations according to the presence or absence of abnormality in the first and second signal systems. Accordingly, when either one of the first or second signal system has an abnormality, the second or third operation may be automatically performed. This allows automatic continuation of the cruise control of the vehicle.
- In the vehicle control system, the plurality of first cameras may include a first front camera that images an area in front of the vehicle, a first diagonally backward right camera that images an area diagonally backward right of the vehicle, and a first diagonally backward left camera that images an area diagonally backward left of the vehicle. The plurality of second cameras may include a second front camera that images an area in front of the vehicle, a second diagonally backward right camera that images an area diagonally backward right of the vehicle, and a second diagonally backward left camera that images an area diagonally backward left of the vehicle.
- This configuration allows continuous cruise control of the vehicle based on at least an area in front of the vehicle, an area diagonally backward right of the vehicle, and an area diagonally backward left of the vehicle in the environment spread around the vehicle (external environment of the vehicle). Accordingly, even when either one of the first or second signal system has an abnormality, this allows continuous cruise control based on the area in front of the vehicle (e.g., control for maintaining an appropriate distance from other vehicles traveling in front of the subject vehicle) and a control based on the area diagonally backward right of the vehicle and the area diagonally backward left of the vehicle (e.g., control for sensing critical situations when the subject vehicle performs lane changing).
- In the vehicle control system, the control unit may include: a first recognition processing IC unit that performs recognition processing for recognizing an external environment of the vehicle based on an output from the plurality of first cameras; and a second recognition processing IC unit that performs the recognition processing based on the output of the plurality of second cameras. The control unit may be configured to output, during the first operation, the control signal based on both of a result of the recognition processing by the first recognition processing IC unit and a result of the recognition processing by the second recognition processing IC unit, to output, during the second operation, the control signal based on the result of the recognition processing by the first recognition processing IC unit, and to output, during the third operation, the control signal based on the result of the recognition processing by the second recognition processing IC unit.
- This configuration allows improvement of recognition accuracy of the recognition processing by both of the first and second recognition processing IC units, compared with the recognition accuracy of the recognition processing by either one of the first or second recognition processing IC unit.
FIG. 1 is a plan view illustrating a vehicle including a vehicle control system according to an embodiment. -
FIG. 2 is a plan view illustrating monitoring areas of cameras and radars. -
FIG. 3 is a plan view illustrating monitoring areas of first cameras and first radars. -
FIG. 4 is a plan view illustrating monitoring areas of second camera and second radars. -
FIG. 5 is a block diagram illustrating a configuration of the vehicle control system according to an embodiment. -
FIG. 6 is a block diagram illustrating a functional configuration of the vehicle control system. -
FIG. 7 is a block diagram illustrating a functional configuration of the vehicle control system. -
FIG. 8 is a block diagram illustrating the configuration of the vehicle control system according to a variation of the embodiment. - Embodiments will be described in detail below with reference to the drawings. In the drawings, the same or equivalent parts are denoted by the same reference numerals, and a description thereof is not repeated.
-
FIG. 1 illustrates an appearance of avehicle 100 including avehicle control system 1 according to an embodiment. Thevehicle control system 1 is provided in the vehicle 100 (specifically, a four-wheeled vehicle). Thevehicle 100 can switch among manual driving, assisted driving, and self-driving. The manual driving is driving to travel in accordance with driver's operation (e.g., an accelerator operation and the like). The assisted driving is driving to travel with assistance of the driver's operation. The self-driving is driving to travel without the driver's operation. Thevehicle control system 1 controls thevehicle 100 during the assisted driving and the self-driving. Specifically, thevehicle control system 1 controls an actuator (not shown) provided in thevehicle 100 to control the motion (specifically, traveling) of thevehicle 100. In the following description, thevehicle 100 provided with thevehicle control system 1 is referred to as "the subject vehicle," whereas another vehicle present around the subject vehicle is referred to as "another vehicle (other vehicles)." - The actuator provided in the
vehicle 100 includes a drive actuator, a steering actuator, a braking actuator, and the like. Examples of the drive actuator include an engine, a motor, and a transmission. Examples of the steering actuator include steering. Examples of the braking actuator include a brake. - The
vehicle control system 1 includes aninformation acquisition unit 10 and acontrol unit 20. Thecontrol unit 20 is housed in a single housing installed in a specific position within thevehicle 100 such as a lower part of a passenger's seat or a trunk, for example. - The
information acquisition unit 10 acquires various kinds of information for use in control (specifically, cruise control) of thevehicle 100. As illustrated inFIGS. 1 ,5 , and6 , theinformation acquisition unit 10 includes a plurality ofcameras 11, a plurality ofradars 12, aposition sensor 13, anexternal input unit 14,mechanical sensors 15, and adriver input unit 16.FIGS. 1 and5 omit illustration of theposition sensor 13, theexternal input unit 14, themechanical sensors 15, and thedriver input unit 16. - The
cameras 11 have the same configuration. Thecameras 11 are provided in thevehicle 100 so as to surround thevehicle 100. Each of thecameras 11 images part of an environment spread around the vehicle 100 (an external environment of the vehicle 100) to acquire image data indicating part of the external environment of thevehicle 100. The image data obtained by each of thecameras 11 is transmitted to thecontrol unit 20. - In this example, the
cameras 11 are each a monocular camera having a wide-angle lens. Thecameras 11 are each constituted using a solid imaging element such as a charge coupled device (CCD) and a complementary metal-oxide-semiconductor (CMOS), for example. Thecameras 11 may each be a monocular camera having a narrow-angle lens or a stereo camera having wide-angle lenses or narrow-angle lenses. - The
cameras 11 include a plurality offirst cameras 11a and a plurality ofsecond cameras 11b. Thisvehicle 100 has two combinations of thecameras 11 provided in thevehicle 100 so as to surround thevehicle 100. - The
first cameras 11a are provided in thevehicle 100 so as to surround thevehicle 100. Specifically, thefirst cameras 11a are provided in the vehicle such that imaging areas of thefirst cameras 11a surround thevehicle 100. In this example, thefirst cameras 11a include a firstfront camera 111a, a first diagonally backwardright camera 112a, a first diagonally backward leftcamera 113a, and afirst back camera 114a. The firstfront camera 111a images an area in front of thevehicle 100. The first diagonally backwardright camera 112a images an area diagonally backward right of thevehicle 100. The first diagonally backward leftcamera 113a images an area diagonally backward left of thevehicle 100. Thefirst back camera 114a images an area behind thevehicle 100. - The
second cameras 11b are provided in thevehicle 100 so as to surround thevehicle 100. Specifically, thesecond cameras 11b are provided in the vehicle such that imaging areas of thesecond cameras 11b surround thevehicle 100. In this example, thesecond cameras 11b include a secondfront camera 111b, a second diagonally backwardright camera 112b, a second diagonally backward leftcamera 113b, and asecond back camera 114b. The secondfront camera 111b images an area in front of thevehicle 100. The second diagonally backwardright camera 112b images an area diagonally backward right of thevehicle 100. The second diagonally backward leftcamera 113b images an area diagonally backward right of thevehicle 100. Thesecond back camera 114b images an area behind thevehicle 100. - The
radars 12 have the same configuration. Theradars 12 are provided in thevehicle 100 so as to surround thevehicle 100. Each of theradars 12 detects part of the external environment of thevehicle 100. Specifically, theradars 12 each transmit radio waves toward part of the external environment of thevehicle 100 and receive reflected waves from the part of the external environment of thevehicle 100 to detect the part of the external environment of thevehicle 100. Detection results of theradars 12 are transmitted to thecontrol unit 20. - The
radars 12 may each be a millimeter-wave radar that transmits millimeter waves, a lidar (light detection and ranging) that transmits laser light, an infrared radar that transmits infrared rays, or an ultrasonic radar that transmits ultrasonic waves, for example. - The
radars 12 include a plurality offirst radars 12a and a plurality ofsecond radars 12b. Thisvehicle 100 has two combinations of theradars 12 provided in thevehicle 100 so as to surround thevehicle 100. - The
first radars 12a are provided in thevehicle 100 so as to surround thevehicle 100. Specifically, thefirst radars 12a are provided in the vehicle such that detection areas of thefirst radars 12a surround thevehicle 100. In this example, thefirst radars 12a include afirst front radar 121a, a first diagonally backwardright radar 122a, and a first diagonally backward leftradar 123a. Thefirst front radar 121a detects the external environment in front of thevehicle 100. The first diagonally backwardright radar 122a detects the external environment diagonally backward right of thevehicle 100. The first diagonally backward leftradar 123a detects the external environment diagonally backward left of thevehicle 100. - The
second radars 12b are provided in thevehicle 100 so as to surround thevehicle 100. Specifically, thesecond radars 12b are provided in thevehicle 100 such that detection areas of thesecond radars 12b surround thevehicle 100. In this example, thesecond radars 12b include asecond front radar 121b, a second diagonally backwardright radar 122b, a second diagonally backward leftradar 123b, and asecond back radar 124b. Thesecond front radar 121b detects the external environment in front of thevehicle 100. The second diagonally backwardright radar 122b detects the external environment diagonally backward right of thevehicle 100. The second diagonally backward leftradar 123b detects the external environment diagonally backward left of thevehicle 100. Thesecond back radar 124b detects the external environment behind thevehicle 100. -
FIG. 2 illustrates imaging areas (monitoring areas) of thecameras 11 and detection areas (monitoring areas) of theradars 12.FIG. 3 illustrates imaging areas of thefirst cameras 11a and detection areas of thefirst radars 12a.FIG. 4 illustrates imaging areas ofsecond cameras 11b and detection areas of thesecond radars 12b. InFIGS. 2 to 4 , each thicker broken line indicates an imaging area of each of thefirst cameras 11a, and each thicker dot-and-dash line indicates a detection area of each of thefirst radars 12a. Each thinner broken line indicates an imaging area of each of thesecond cameras 11b, and a thinner dot-and-dash line indicates a detection area of each of thesecond radars 12b. - As illustrated in
FIG. 3 , in this example, each monitoring area and each arrangement of thefirst cameras 11a and thefirst radars 12a are set such that a combination of their monitoring areas surrounds the entire circumference of thevehicle 100. Similarly, as illustrated inFIG. 4 , each monitoring area and each arrangement of thesecond cameras 11b and thesecond radars 12b are set such that a combination of their monitoring areas surrounds the entire circumference of thevehicle 100. - The
position sensor 13 detects the position (e.g., the latitude and the longitude) of thevehicle 100. Theposition sensor 13 receives GPS information from the Global Positioning System and detects the position of thevehicle 100 based on the GPS information, for example. The information (the position of the vehicle 100) obtained by theposition sensor 13 is transmitted to thecontrol unit 20. - The
external input unit 14 receives information through an extra-vehicle network (e.g., the Internet and the like) provided outside thevehicle 100. Theexternal input unit 14 receives communication information from another vehicle (not shown) positioned around thevehicle 100, car navigation data from a navigation system (not shown), traffic information, high-precision map information, and the like, for example. The information obtained by theexternal input unit 14 is transmitted to thecontrol unit 20. - The
mechanical sensors 15 detect the status (e.g., the speed, the acceleration, the yaw rate, and the like) of thevehicle 100. Themechanical sensors 15 include a vehicle speed sensor that detects the speed of thevehicle 100, an acceleration sensor that detects the acceleration of thevehicle 100, a yaw rate sensor that detects the yaw rate of thevehicle 100, and the like, for example. The information (the status of the vehicle 100) obtained by themechanical sensors 15 is transmitted to thecontrol unit 20. - The
driver input unit 16 detects driving operations applied to thevehicle 100. Thedriver input unit 16 includes an accelerator position sensor, a steering angle sensor, a brake hydraulic pressure sensor, and the like, for example. The accelerator position sensor detects an accelerator operation amount of thevehicle 100. The steering angle sensor detects a steering angle of a steering wheel of thevehicle 100. The brake hydraulic pressure sensor detects a brake operation amount of thevehicle 100. The information (the driving operation of the vehicle 100) obtained by thedriver input unit 16 is transmitted to thecontrol unit 20. -
FIG. 5 illustrates a configuration of thecontrol unit 20. In this example, thecontrol unit 20 includes a first signalprocessing IC unit 21a, a second signalprocessing IC unit 21b, a first recognitionprocessing IC unit 22a, a second recognitionprocessing IC unit 22b, a firstcontrol IC unit 23a, and a secondcontrol IC unit 23b. Each of these IC units may include a single integrated circuit (IC) or a plurality of ICs. The IC may house a single core or die or house a plurality of cores or dies cooperating with each other. The core or die may include a CPU (processor) and a memory storing therein a program for operating the CPU and information such as processing results by the CPU, for example. - In this example, the first signal
processing IC unit 21a and the second signalprocessing IC unit 21b constitute asignal processing unit 201. The first recognitionprocessing IC unit 22a and the second recognitionprocessing IC unit 22b constitute arecognition processing unit 202. The firstcontrol IC unit 23a constitutes adetermination processing unit 203. The secondcontrol IC unit 23b constitutes abackup processing unit 204. - The
signal processing unit 201 performs image processing with respect to the output from thecameras 11. Thesignal processing unit 201 outputs image data obtained by the image processing. Specifically, the first signalprocessing IC unit 21a performs image processing with respect to the output from thefirst cameras 11a. The second signalprocessing IC unit 21b performs image processing with respect to the output from thesecond cameras 11b. - The
recognition processing unit 202 performs recognition processing for recognizing the external environment of thevehicle 100 based on the output (the image data) from thesignal processing unit 201. Therecognition processing unit 202 outputs external environment data obtained by the recognition processing. Specifically, the first recognitionprocessing IC unit 22a performs recognition processing based on the output which has been processed by the first signalprocessing IC unit 21a and output from thefirst cameras 11a. The second recognitionprocessing IC unit 22b performs recognition processing based on the output which has been processed by the second signalprocessing IC unit 21b and output from thesecond cameras 11b. - The
determination processing unit 203 performs determination processing for cruise control of thevehicle 100 based on the output (the external environment data) from therecognition processing unit 202. Specifically, the firstcontrol IC unit 23a performs determination processing based on the output from the first recognitionprocessing IC unit 22a and/or the output from the second recognitionprocessing IC unit 22b. Thedetermination processing unit 203 then outputs a control signal for cruise control of thevehicle 100 based on a result of the determination processing. - The
backup processing unit 204 performs recognition processing for recognizing the external environment of thevehicle 100 based on the output (the image data) from thesignal processing unit 201. Specifically, the secondcontrol IC unit 23b performs recognition processing based on the output from the first signalprocessing IC unit 21a and/or the output from the second signalprocessing IC unit 21b. Thebackup processing unit 204 performs determination processing for cruise control of thevehicle 100 based on a result of the recognition processing. Thebackup processing unit 204 then outputs a control signal for cruise control of thevehicle 100 based on the result of the determination processing. - The following describes a functional configuration of the
vehicle control system 1 with reference toFIGS. 6 and7 . The function of thevehicle control system 1 is broadly divided into a recognition block B1, a determination block B2, and an operation block B3. The recognition block B1 recognizes the external environment of thevehicle 100 based on the various kinds of information acquired by theinformation acquisition unit 10. The recognition block B1 may be configured to recognize an internal environment of thevehicle 100. The determination block B2 determines a status and condition of thevehicle 100 based on a recognition result of the recognition block B1 and determines a target operation of thevehicle 100 based on a result of the determination. The operation block B3 generates a signal for controlling the actuator AC provided in thevehicle 100 based on the target operation of thevehicle 100 determined by the determination block B2 and outputs the signal to the actuator AC. - In this example, the
vehicle control system 1 includes a main arithmetic unit F1, a safety functional unit F2, and a backup functional unit F3. - The main arithmetic unit F1 recognizes the external environment of the
vehicle 100 based on the output from theinformation acquisition unit 10 and determines a target route of thevehicle 100 based on the external environment of thevehicle 100. The main arithmetic unit F1 determines a target motion of thevehicle 100 based on the target route of thevehicle 100 and outputs a control signal based on the target motion of thevehicle 100. For the processing by the main arithmetic unit F1, a learning model generated by deep learning is used. In the deep learning, a multilayered neural network (deep neural network) is used. Examples of the multilayered neural network include Convolutional Neural Network (CNN). - In this example, the main arithmetic unit F1 includes a vehicle status detection unit F001, a driver operation recognition unit F002, an object recognition unit F101 (an image system), an object recognition unit F102 (a radar system), a map generation unit F103, an external environment estimation unit F104, an external environment model F105, a route search unit F106, a route generation unit F107, a critical status determination unit F108, a first vehicle model F109, a second vehicle model F110, a route determination unit F111, a target motion determination unit F112, a vehicle motion energy setting unit F113, an energy management unit F114, a selector F115, and a selector F116.
- The vehicle status detection unit F001, the driver operation recognition unit F002, the object recognition unit F101, the object recognition unit F102, the map generation unit F103, the external environment estimation unit F104, and the external environment model F105 belong to the recognition block B1. The route search unit F106, the route generation unit F107, the critical status determination unit F108, the first vehicle model F109, the route determination unit F111, and the target motion determination unit F112 belong to the determination block B2. The second vehicle model F110, the vehicle motion energy setting unit F113, the energy management unit F114, the selector F115, and the selector F116 belong to the operation block B3.
- In this example, the
signal processing unit 201 includes part of the object recognition unit F101 (the image system), while therecognition processing unit 202 includes the rest thereof. Therecognition processing unit 202 includes the object recognition unit F102 (the radar system) and the map generation unit F103. The recognition processing unit 202 (specifically, the first recognitionprocessing IC unit 22a) includes the external environment estimation unit F104, the external environment model F105, the route search unit F106, the route generation unit F107, the first vehicle model F109, and the second vehicle model F110. The determination processing unit 203 (specifically, the firstcontrol IC unit 23a) includes the vehicle status detection unit F001, the driver operation recognition unit F002, the critical status determination unit F108, the route determination unit F111, the target motion determination unit F112, the vehicle motion energy setting unit F113, the energy management unit F114, the selector F115, and the selector F116. - The vehicle status detection unit F001 recognizes the status of the vehicle 100 (e.g., speed, acceleration, yaw rate, and the like) based on the output from the
mechanical sensors 15. - The driver operation recognition unit F002 recognizes the driving operations applied to the
vehicle 100 based on the output from thedriver input unit 16. - The object recognition unit F101 recognizes an object included in the external environment of the
vehicle 100 based on the output from thecameras 11. Thus, information on the object (object information) is obtained. The object information indicates the type of the object, the shape of the object, and the like, for example. Examples of the object include a dynamic object that moves with the lapse of time and a stationary object that does not move with the lapse of time. Examples of the dynamic object include four-wheeled vehicles, motorcycles, bicycles, pedestrians, and the like. Examples of the stationary object include signs, roadside trees, median strips, center poles, buildings, and the like. - Specifically, the object recognition unit F101 includes an image processing unit and an image recognition unit. The image processing unit performs image processing with respect to the image data which is the output from the
cameras 11. This image processing includes distortion correction processing for correcting the distortion of an image presented in the image data, white balance adjustment processing for adjusting the brightness of the image presented in the image data, and the like. The image recognition unit recognizes the object included in the external environment of thevehicle 100 based on the image data processed by the image processing unit. For object recognition processing by the image recognition unit of the object recognition unit F101, a known object recognition technique (an image data-based object recognition technique) may be used, for example. The image recognition unit of the object recognition unit F101 may be configured to perform the object recognition processing using a learning model generated by deep learning. - In this example, the image processing unit of the object recognition unit F101 includes a first image processing unit that performs processing based on an output from the
first cameras 11a and a second image processing unit that performs processing based on an output from thesecond cameras 11b. The image recognition unit of the object recognition unit F101 includes a first image recognition unit that performs processing based on an output from the first image processing unit and a second image recognition unit that performs processing based on an output from the second image processing unit. In this example, thesignal processing unit 201 includes the image processing unit of the object recognition unit F101, while therecognition processing unit 202 includes the image recognition unit of the object recognition unit F101. Specifically, the first signalprocessing IC unit 21a includes the first image processing unit, and the second signalprocessing IC unit 21b includes the second image processing unit. The first recognitionprocessing IC unit 22a includes the first image recognition unit, and the second recognitionprocessing IC unit 22b includes the second image recognition unit. - The object recognition unit F102 recognizes the object included in the external environment of the
vehicle 100 based on a detection result which is the output from the radars 12 (e.g., a peak list of the reflected waves). Thus, the object information is obtained. Specifically, the object recognition unit F102 performs analysis processing (processing for obtaining the object information) on the detection result of theradars 12. For the object recognition processing by the object recognition unit F102, a known object recognition technique (an object recognition technique based on the detection result of the radars 12) may be used, for example. The obj ect recognition unit F102 may be configured to perform the obj ect recognition processing using a learning model generated by deep learning. - In this example, the object recognition unit F102 includes a first radar recognition unit that performs processing based on an output from the
first radars 12a and a second radar recognition unit that performs processing based on an output from thesecond radars 12b. In this example, the first recognitionprocessing IC unit 22a includes the first radar recognition unit, and the second recognitionprocessing IC unit 22b includes the second radar recognition unit. - The map generation unit F103 generates map data (e.g., three-dimensional map data) indicating the external environment of the
vehicle 100 based on an output from the object recognition unit F101 (image system) and an output from the obj ect recognition unit F 102 (radar system). The map generation unit F103 generates the map data for each of a plurality of areas (e.g., four areas of front, rear, right, and left) obtained by dividing a surrounding area surrounding thevehicle 100, for example. In response to the input of the object information obtained by each of the object recognition unit F101 (the image system) and the object recognition unit F102 (the radar system) to the map generation unit F103, the map generation unit F103 fuses the pieces of object information, and reflects the object information obtained by the fusion in the map data. - In this example, the map generation unit F103 includes a first map generation unit that performs processing based on an output from the first image recognition unit of the object recognition unit F101 and an output from the first radar recognition unit of the object recognition unit F102 and a second map generation unit that performs processing based on an output from the second image recognition unit of the object recognition unit F101 and an output from the second radar recognition unit of the object recognition unit F102. In this example, the first recognition
processing IC unit 22a includes the first map generation unit, and the second recognitionprocessing IC unit 22b includes the second map generation unit. - The external environment estimation unit F 104 estimates the external environment of the
vehicle 100 based on an output from the vehicle status detection unit F001, an output from the map generation unit F103, an output from theposition sensor 13, and an output from the external input unit 14 (e.g., high-precision map information). Specifically, the external environment estimation unit F104 generates the three-dimensional map data indicating the external environment of thevehicle 100 by image recognition processing based on the external environment model F105. - In this example, the external environment estimation unit F104 performs the following operation. First, the external environment estimation unit F104 fuses map data for each of a plurality of areas (e.g., four areas of front, rear, right, and left) to generate fused map data indicating the surroundings (the external environment) of the
vehicle 100. Next, for each of dynamic objects included in the fused map data, the external environment estimation unit F104 predicts changes in the distance, direction, and relative speed between the dynamic object and the subject vehicle. The external environment estimation unit F104 then incorporates a result of the prediction into the external environment model F105. Further, the external environment estimation unit F104 estimates the position of the subject vehicle in the fused map data and calculates a route cost based on the output from the position sensor 13 (the position of the vehicle 100), the output from the external input unit 14 (the high-precision map information), and the output from the vehicle status detection unit F001 (e.g., vehicle speed information, six degrees of freedom (6DoF) information, and the like). The external environment estimation unit F104 incorporates a result of the estimation and a result of the calculation together with information on the subject vehicle acquired by various kinds of sensors into the external environment model F105. With the foregoing processing, the external environment model F105 is updated at any time. - The external environment model F105 indicates the external environment of the
vehicle 100. The external environment model F105 is a learning model generated by deep learning. - The route search unit F106 searches for a wide-area route of the
vehicle 100 based on the output from the position sensor 13 and the output from the external input unit 14 (e.g., car navigation data). - The route generation unit F107 generates a travel route of the
vehicle 100 based on an output from the external environment model F105 and an output from the route search unit F106. To the travel route generated by the route generation unit F107, a score of the safety, the fuel consumption, or the like of the vehicle 100 in the travel route is added, for example. Higher safety of the vehicle 100 in the travel route gives a lower score of the travel route. Lower fuel consumption of the vehicle 100 in the travel route gives a lower score of the travel route. The route generation unit F107 generates at least one travel route giving a relatively low (e.g., the lowest) score. - The route generation unit F107 may generate a plurality of travel routes based on a plurality of viewpoints. The route generation unit F107 may be configured to receive the output from the
driver input unit 16 and adjust the travel route in accordance with the output from the driver input unit 16, for example. Thus, a travel route with a relatively low score and a travel route adjusted in accordance with the output from the driver input unit 16 are generated, for example. - The critical status determination unit F108 determines whether the
vehicle 100 is in a critical status based on an output from a preprocessing unit F204 of the safety functional unit F2 (the position of the subject vehicle relative to the object included in the external environment of the vehicle 100). Examples of the critical status of the vehicle 100 include a status in which the vehicle 100 may collide with the object, a status in which the vehicle 100 may go out of a lane, and the like. The critical status determination unit F108 may determine whether the vehicle 100 is in the critical status based on the external environment model F105. When determining that the vehicle 100 is in the critical status, the critical status determination unit F108 generates a target route for avoiding the critical situations. - The first vehicle model F109 is a 6DoF vehicle model indicating the motion on six axes of the
vehicle 100. The 6DoF vehicle model is obtained by modeling the acceleration along three axes, namely, in the "forward/backward (surge)", "left/right (sway)", and "up/down (heave)" directions of the traveling vehicle 100, and the angular velocity about three axes, namely, "pitch", "roll", and "yaw." That is, the first vehicle model F109 is a numerical model that does not capture the motion of the vehicle 100 only in the plane (the forward/backward and left/right directions, i.e., the movement along the X-Y plane, and the yawing about the Z-axis) as in classical vehicle motion engineering, but reproduces the behavior of the vehicle 100 using six axes in total. The six axes further include the pitching (about the Y-axis), the rolling (about the X-axis), and the movement along the Z-axis (i.e., the up/down motion) of the vehicle body mounted on the four wheels with the suspension interposed therebetween. The first vehicle model F109 is generated based on the basic motion function of the vehicle 100 set in advance, the external environment of the vehicle 100, and the like. The first vehicle model F109 is updated as appropriate in accordance with changes in the external environment of the vehicle 100 and the like. The first vehicle model F109 is a learning model generated by deep learning, for example. - The second vehicle model F110 indicates the energy consumption of the vehicle. Specifically, the second vehicle model F110 indicates the cost (fuel consumption or electricity consumption) for the operation of the actuator AC of the
vehicle 100. The second vehicle model F110 is obtained by modeling, for example, the opening/closing timing of the intake/exhaust valves (not shown), the timing at which the injectors (not shown) inject the fuel, the opening/closing timing of the valves of the exhaust gas recirculation system, and the like such that fuel consumption is minimized when a predetermined amount of engine torque is output. The second vehicle model F110 is generated during the travel of the vehicle, and is updated as appropriate. The second vehicle model F110 is a learning model generated by deep learning, for example. - The route determination unit F111 determines the target route of the
vehicle 100 based on an output from the driver operation recognition unit F002, an output from the route generation unit F107, and an output from a route generation unit F206 of the safety functional unit F2. Specifically, the route determination unit F111 selects either the travel route generated by the route generation unit F107 or a travel route generated by the route generation unit F206 of the safety functional unit F2 as the target route. The route determination unit F111 may adjust the selected target route in accordance with the output from the driver operation recognition unit F002. - The route determination unit F111 may preferentially select the travel route generated by the route generation unit F107 during normal traveling as the target route, for example. The route determination unit F111 may select the travel route generated by the route generation unit F206 of the safety functional unit F2 as the target route when the travel route generated by the route generation unit F107 does not pass through free space searched for by a free space search unit F205 of the safety functional unit F2.
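- The selection logic of the route determination unit F111 described above can be pictured with a short sketch. The following Python fragment is only an illustrative sketch, not the patented implementation; the function and variable names (select_target_route, passes_through_free_space) and the grid-cell route representation are hypothetical and chosen for readability.

```python
from typing import List, Optional, Set, Tuple

Point = Tuple[int, int]          # a coarse grid cell (x, y) along a candidate route
Route = List[Point]              # a travel route as a sequence of grid cells


def passes_through_free_space(route: Route, free_space: Set[Point]) -> bool:
    """True if every cell of the route lies inside the searched free space."""
    return all(cell in free_space for cell in route)


def select_target_route(main_route: Optional[Route],
                        safety_route: Route,
                        free_space: Set[Point]) -> Route:
    """Prefer the main (learning-based) route during normal traveling, but fall
    back to the rule-based safety route when the main route leaves free space."""
    if main_route is not None and passes_through_free_space(main_route, free_space):
        return main_route
    return safety_route


if __name__ == "__main__":
    free = {(x, 0) for x in range(10)} | {(x, 1) for x in range(10)}
    main = [(0, 0), (1, 0), (2, 2)]        # strays outside the free space
    safety = [(0, 0), (1, 0), (2, 0)]      # stays inside the free space
    print(select_target_route(main, safety, free))   # -> the safety route
```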
- The target motion determination unit F112 determines the target motion of the
vehicle 100 based on an output from the critical status determination unit F108, the first vehicle model F109, and an output from the route determination unit F111. The target motion determination unit F112 may, upon input of the target route generated by the critical status determination unit F108 (the target route for avoiding critical situations), determine the target motion of the vehicle 100 based on the target route generated by the critical status determination unit F108 and the first vehicle model F109, for example. The target motion determination unit F112 may, when the target route generated by the critical status determination unit F108 is not input (the vehicle 100 is not in a critical status), determine the target motion of the vehicle 100 based on the target route generated by the route determination unit F111 and the first vehicle model F109. - The vehicle motion energy setting unit F113 calculates driving torque required for the drive actuator, steering torque required for the steering actuator, and braking torque required for the braking actuator based on an output from the target motion determination unit F112. Specifically, the vehicle motion energy setting unit F113 calculates the driving torque, the steering torque, and the braking torque such that the motion of the
vehicle 100 becomes the target motion determined by the target motion determination unit F112. - The energy management unit F114 calculates a control amount of the actuator AC based on the second vehicle model F110 and an output from the vehicle motion energy setting unit F113. Specifically, the energy management unit F114 calculates, based on the second vehicle model F110, the control amount of the actuator AC that achieves the target motion determined by the target motion determination unit F112 with the highest energy efficiency. For example, the energy management unit F114 calculates the opening/closing timing of the intake/exhaust valves (not shown), the timing at which the injectors (not shown) inject the fuel, and the like such that fuel consumption is minimized while the engine torque determined by the vehicle motion energy setting unit F113 is achieved.
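- As a rough illustration of the idea of achieving a requested torque at the lowest energy cost, the sketch below picks the cheapest operating point from a small candidate table. It is a toy model under assumed names and numbers (OperatingPoint, CANDIDATES); the actual second vehicle model F110 is described as a learned model, not a lookup table.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class OperatingPoint:
    """A candidate actuator setting with the torque and fuel rate it yields."""
    valve_timing_deg: float
    injection_timing_deg: float
    torque_nm: float          # engine torque this point produces
    fuel_g_per_s: float       # fuel consumption at this point


# Hypothetical candidate table standing in for the learned energy model.
CANDIDATES = [
    OperatingPoint(10.0, -5.0, 120.0, 1.9),
    OperatingPoint(12.0, -3.0, 122.0, 1.7),
    OperatingPoint(14.0, -2.0, 118.0, 1.6),
]


def cheapest_point(required_torque_nm: float, tolerance_nm: float = 5.0) -> OperatingPoint:
    """Return the candidate that meets the torque request within tolerance
    while consuming the least fuel."""
    feasible = [p for p in CANDIDATES
                if abs(p.torque_nm - required_torque_nm) <= tolerance_nm]
    if not feasible:
        raise ValueError("no operating point satisfies the torque request")
    return min(feasible, key=lambda p: p.fuel_g_per_s)


if __name__ == "__main__":
    print(cheapest_point(120.0))   # picks the 118 Nm point at 1.6 g/s
```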
- The selector F115 outputs either the output from the vehicle motion energy setting unit F113 or an output from a vehicle motion energy setting unit F310 of the backup functional unit F3. The selector F116 outputs either an output from the energy management unit F114 or an output from an energy management unit F311 of the backup functional unit F3. The output from the selector F115 and the output from the selector F116 are each a control signal for cruise control of the
vehicle 100. - Specifically, the selector F115 selects the output from the vehicle motion energy setting unit F113 when no abnormality (e.g., a fault) occurs in the main arithmetic unit F1, and selects the output from the vehicle motion energy setting unit F310 of the backup functional unit F3 when an abnormality occurs in the main arithmetic unit F1. Similarly, the selector F116 selects the output from the energy management unit F114 when no abnormality occurs in the main arithmetic unit F1, and selects the output from the energy management unit F311 of the backup functional unit F3 when an abnormality occurs in the main arithmetic unit F1.
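- A compact way to picture the selectors F115 and F116 is a pair of multiplexers keyed on the health of the main arithmetic unit. The sketch below is illustrative only; the names (Selector, main_ok) and the boolean health flag are assumptions, not interfaces defined by the embodiment.

```python
class Selector:
    """Chooses the main unit's output while it is healthy, otherwise the backup's."""

    def __init__(self, name: str):
        self.name = name

    def select(self, main_output, backup_output, main_ok: bool):
        chosen = main_output if main_ok else backup_output
        print(f"{self.name}: using {'main' if main_ok else 'backup'} output")
        return chosen


if __name__ == "__main__":
    f115 = Selector("F115 (motion energy)")
    f116 = Selector("F116 (energy management)")

    # Normal case: no abnormality in the main arithmetic unit.
    f115.select({"drive_torque": 80.0}, {"drive_torque": 60.0}, main_ok=True)

    # Fault case: an abnormality was detected, so the backup output is used.
    f116.select({"actuator_cmd": 0.42}, {"actuator_cmd": 0.30}, main_ok=False)
```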
- The safety functional unit F2 recognizes the external environment of the
vehicle 100 based on the output from the information acquisition unit 10 and searches the external environment of the vehicle 100 for free space. The safety functional unit F2 then generates a travel route passing through the free space. The travel route (the travel route passing through the free space) obtained by the safety functional unit F2 is used in the processing to determine the target route by the main arithmetic unit F1. For the processing by the safety functional unit F2, an algorithm based on a rule set in advance is used in place of the learning model generated by deep learning. In the safety functional unit F2, rule-based processing is performed. - In this example, the safety functional unit F2 includes an object recognition unit F201 (image system), an object recognition unit F202 (radar system), a classification unit F203, the preprocessing unit F204, the free space search unit F205, and the route generation unit F206.
- The object recognition unit F201, the object recognition unit F202, the classification unit F203, and the preprocessing unit F204 belong to the recognition block B1. The free space search unit F205 and the route generation unit F206 belong to the determination block B2.
- The
signal processing unit 201 includes part of the object recognition unit F201 (image system), while the recognition processing unit 202 includes the rest thereof. The determination processing unit 203 (specifically, the first control IC unit 23a) includes the object recognition unit F202 (radar system), the classification unit F203, the preprocessing unit F204, the free space search unit F205, and the route generation unit F206. - The object recognition unit F201 recognizes the object included in the external environment of the
vehicle 100 based on the output from the cameras 11. Thus, the object information is obtained. Specifically, the object recognition unit F201 includes an image processing unit and an image recognition unit. The image processing unit performs image processing with respect to the image data which is the output from the cameras 11. The image recognition unit recognizes the object included in the external environment of the vehicle 100 based on the image data processed by the image processing unit. The image recognition unit of the object recognition unit F201 performs object recognition processing using a known pattern recognition technique without using any learning model generated by deep learning, for example. For the object recognition processing by the image recognition unit of the object recognition unit F201, another known object recognition technique (image data-based object recognition technique) may be used. - In this example, the image processing unit of the object recognition unit F201 includes a first image processing unit that performs processing based on an output from the
first cameras 11a and a second image processing unit that performs processing based on an output from the second cameras 11b. The image recognition unit of the object recognition unit F201 includes a first image recognition unit that performs processing based on an output from the first image processing unit and a second image recognition unit that performs processing based on an output from the second image processing unit. In this example, the signal processing unit 201 includes the image processing unit of the object recognition unit F201, and the recognition processing unit 202 includes the image recognition unit of the object recognition unit F201. Specifically, the first signal processing IC unit 21a includes the first image processing unit, and the second signal processing IC unit 21b includes the second image processing unit. The first recognition processing IC unit 22a includes the first image recognition unit, and the second recognition processing IC unit 22b includes the second image recognition unit. - The object recognition unit F202 recognizes the object included in the external environment of the
vehicle 100 based on a detection result which is the output from the radars 12. Thus, the object information is obtained. Specifically, the object recognition unit F202 performs analysis processing with respect to the detection result of the radars 12. The object recognition unit F202 performs object recognition processing using a known object recognition technique (object recognition technique based on the detection result of the radars 12) without using any learning model generated by deep learning, for example. - In this example, the object recognition unit F202 includes a first radar recognition unit that performs processing based on an output from the
first radars 12a and a second radar recognition unit that performs processing based on an output from the second radars 12b. In this example, the first control IC unit 23a includes the first radar recognition unit and the second radar recognition unit. - The classification unit F203 recognizes the external environment of the
vehicle 100 based on an output from the object recognition unit F201 (image system) and an output from the object recognition unit F202 (radar system). The classification unit F203 performs recognition processing (rule-based recognition processing) using an algorithm based on a rule set in advance without using any learning model generated by deep learning. For the rule-based recognition processing, a known recognition processing technique may be used. Specifically, the classification unit F203 classifies each object recognized by the object recognition unit F201 and the object recognition unit F202 as either a dynamic object or a stationary object. The classification unit F203 fuses the object information obtained by the object recognition unit F201 (image system) and the object information obtained by the object recognition unit F202 (radar system) for each of a plurality of areas (e.g., four areas of front, rear, right, and left) obtained by dividing a surrounding area surrounding the vehicle 100, for example. The classification unit F203 generates classification information of the object included in each of the areas. The classification information indicates whether the object is a dynamic object or a stationary object. - The preprocessing unit F204 performs preprocessing based on an output from the classification unit F203, the output from the vehicle status detection unit F001 of the main arithmetic unit F1, the output from the
position sensor 13, and the output from the external input unit 14. In the preprocessing, classified-information fusion, object behavior prediction, and self-position estimation are performed. - In the classified-information fusion, the preprocessing unit F204 fuses the classification information generated for each of a plurality of areas (e.g., four areas of front, rear, right, and left). The fused classification information is managed on a grid map (not shown) as the classification information on the surrounding area of the
vehicle 100. - In the object behavior prediction, the preprocessing unit F204 detects the dynamic object included in the external environment of the
vehicle 100 based on the fused classification information. The preprocessing unit F204 predicts changes in the distance between the dynamic object and the subject vehicle, the direction of the dynamic object with respect to the subject vehicle, and the relative speed of the dynamic object with respect to the subject vehicle. A result of the prediction by the preprocessing unit F204 is managed as additional information of the dynamic object. - In the self-position estimation, the preprocessing unit F204 estimates the position of the subject vehicle with respect to the object (the dynamic object and the stationary object) included in the external environment of the
vehicle 100 based on the position of the vehicle 100 as the output from the position sensor 13, the high-precision map information as an example output from the external input unit 14, and the status of the vehicle 100 (the vehicle speed information, the 6DoF information, and the like) as the output from the vehicle status detection unit F001. - The free space search unit F205 searches the external environment of the
vehicle 100 for free space based on the output from the preprocessing unit F204. The free space is an area in which no obstacles are present out of the roads included in the external environment of the vehicle 100. The obstacles include dynamic obstacles and static obstacles. Examples of the dynamic obstacle include other vehicles and pedestrians. Examples of the static obstacle include median strips, center poles, and the like. The free space may include a space on a road shoulder allowing emergency parking and the like, for example. - Specifically, the free space search unit F205 searches for free space in which a collision with the object whose position has been estimated by the preprocessing unit F204 can be avoided. The free space search unit F205 searches for the free space based on a search rule set in advance, for example. The search rule may include a rule that a predetermined range around the object (e.g., a range of a few meters) is set to be an unavoidable range. The free space search unit F205 may, when the object is a dynamic object, search for the free space in consideration of the moving speed of the dynamic object.
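- To make the rule-based search concrete, the following sketch marks an "unavoidable range" of cells around each estimated obstacle on a small occupancy grid and treats everything else on the road as free space. The grid size, the margin of one cell, and the function names are illustrative assumptions, not values taken from the embodiment.

```python
from typing import Iterable, Set, Tuple

Cell = Tuple[int, int]


def free_space(road_cells: Set[Cell],
               obstacles: Iterable[Cell],
               margin: int = 1) -> Set[Cell]:
    """Return road cells that are neither occupied by an obstacle nor inside
    the unavoidable range (a square margin of cells) around one."""
    blocked: Set[Cell] = set()
    for (ox, oy) in obstacles:
        for dx in range(-margin, margin + 1):
            for dy in range(-margin, margin + 1):
                blocked.add((ox + dx, oy + dy))
    return road_cells - blocked


if __name__ == "__main__":
    road = {(x, y) for x in range(6) for y in range(3)}   # a 6 x 3 patch of road
    others = [(2, 1)]                                     # one detected obstacle
    print(sorted(free_space(road, others)))               # cells the vehicle may use
```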
- The route generation unit F206 generates the travel route of the
vehicle 100 based on an output from the free space search unit F205 and the output from the route search unit F106 of the main arithmetic unit F1 (the wide-area route of the vehicle 100). Specifically, the route generation unit F206 generates a travel route passing through the free space obtained by the free space search unit F205. The route generation unit F206 may be configured to generate a plurality of travel routes passing through the free space, and select the one with the lowest route cost out of the travel routes, for example. The travel route (the travel route passing through the free space) generated by the route generation unit F206 is output to the route determination unit F111 of the main arithmetic unit F1. - The backup functional unit F3 recognizes the external environment of the
vehicle 100 based on the output from the information acquisition unit 10, searches the external environment of the vehicle 100 for free space, and determines the target route of the vehicle 100 passing through the free space. The backup functional unit F3 then determines the target motion of the vehicle 100 based on the target route of the vehicle 100 and outputs a control signal based on the target motion of the vehicle 100. The control signal obtained by the backup functional unit F3 is supplied to the main arithmetic unit F1. For the processing by the backup functional unit F3, an algorithm based on a rule set in advance is used. In the backup functional unit F3, rule-based processing is performed. - In this example, the backup functional unit F3 includes a vehicle status detection unit F301, a driver operation recognition unit F302, a classification unit F303, a preprocessing unit F304, a free space search unit F305, a route generation unit F306, a critical status determination unit F307, a route determination unit F308, a target motion determination unit F309, a vehicle motion energy setting unit F310, and an energy management unit F311.
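- The chain of steps performed by the backup functional unit F3 (recognize the environment, search for free space, determine a route, and produce a control signal) can be pictured as a short rule-based pipeline. The sketch below is purely illustrative; every function body is a stub and the names, lane labels, and speed values are assumptions, not interfaces defined by the embodiment.

```python
from typing import Any, Dict, List


def recognize_environment(sensor_output: Dict[str, Any]) -> List[Dict[str, Any]]:
    """Rule-based recognition stub: pass through pre-detected objects."""
    return sensor_output.get("objects", [])


def search_free_space(objects: List[Dict[str, Any]]) -> List[str]:
    """Stub: any named lane not blocked by an object is treated as free."""
    blocked = {o["lane"] for o in objects}
    return [lane for lane in ("ego_lane", "shoulder") if lane not in blocked]


def determine_target_route(free: List[str]) -> str:
    """Prefer staying in the ego lane; otherwise use whatever free space remains."""
    if not free:
        raise RuntimeError("no free space found")
    return "ego_lane" if "ego_lane" in free else free[0]


def backup_control_signal(sensor_output: Dict[str, Any]) -> Dict[str, Any]:
    """End-to-end rule-based pass producing a control signal for the selectors."""
    objects = recognize_environment(sensor_output)
    free = search_free_space(objects)
    route = determine_target_route(free)
    # Target motion and energy setting are collapsed into fixed values here.
    return {"route": route, "target_speed_mps": 20.0 if route == "ego_lane" else 5.0}


if __name__ == "__main__":
    print(backup_control_signal({"objects": [{"lane": "ego_lane"}]}))
```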
- The vehicle status detection unit F301, the driver operation recognition unit F302, the classification unit F303, and the preprocessing unit F304 belong to the recognition block B1. The free space search unit F305, the route generation unit F306, the critical status determination unit F307, the route determination unit F308, and the target motion determination unit F309 belong to the determination block B2. The vehicle motion energy setting unit F310 and the energy management unit F311 belong to the operation block B3.
- In this example, the backup processing unit 204 (specifically, the second
control IC unit 23b) includes a vehicle status detection unit F301, a driver operation recognition unit F302, a classification unit F303, a preprocessing unit F304, a free space search unit F305, a route generation unit F306, a critical status determination unit F307, a route determination unit F308, a target motion determination unit F309, a vehicle motion energy setting unit F310, and an energy management unit F311. - The functions of the vehicle status detection unit F301 and the driver operation recognition unit F302 are the same as the respective functions of the vehicle status detection unit F001 and the driver operation recognition unit F002 of the main arithmetic unit F1.
- The functions of the classification unit F303, the preprocessing unit F304, the free space search unit F305, and the route generation unit F306 are the same as the respective functions of the classification unit F203, the preprocessing unit F204, the free space search unit F205, and the route generation unit F206 of the safety functional unit F2.
- In the example in
FIG. 6, the classification unit F303 performs processing based on the output from the object recognition unit F201 (image system) and the output from the object recognition unit F202 (radar system) of the safety functional unit F2. The backup functional unit F3 may include an object recognition unit (image system) and an object recognition unit (radar system) which are the same as the object recognition unit F201 (image system) and the object recognition unit F202 (radar system) of the safety functional unit F2, respectively. In this case, the classification unit F303 may perform the processing based on an output from the object recognition unit (the image system) and an output from the object recognition unit (the radar system) of the backup functional unit F3. - The route determination unit F308 determines the target route of the
vehicle 100 based on an output from the driver operation recognition unit F302 and an output from the route generation unit F306 (travel route passing through the free space). The route determination unit F308 selects any one out of a plurality of travel routes generated by the route generation unit F306 as the target route, for example. The route determination unit F308 may adjust the selected target route in accordance with the output from the driver operation recognition unit F302. - The target motion determination unit F309 determines the target motion of the
vehicle 100 based on an output from the critical status determination unit F307 (target route) and an output from the route determination unit F308. Unlike the target motion determination unit F112 of the main arithmetic unit F1, the target motion determination unit F309 determines the target motion of the vehicle 100 using an algorithm based on a rule set in advance without using any learning model generated by deep learning. The target motion determination unit F309 may, upon input of the target route generated by the critical status determination unit F307 (the target route for avoiding critical situations), determine the target motion of the vehicle 100 based on the target route generated by the critical status determination unit F307, for example. The target motion determination unit F309 may, when the target route generated by the critical status determination unit F307 is not input (the vehicle 100 is not in a critical status), determine the target motion of the vehicle 100 based on the target route generated by the route determination unit F308. - Like the vehicle motion energy setting unit F113 of the main arithmetic unit F1, the vehicle motion energy setting unit F310 calculates the driving torque required for the drive actuator, the steering torque required for the steering actuator, and the braking torque required for the braking actuator based on an output from the target motion determination unit F309. Each torque calculated by the vehicle motion energy setting unit F310 is output to the selector F115 of the main arithmetic unit F1.
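- The priority rule used by the target motion determination units (take the critical-avoidance route when one exists, otherwise the normally determined route) can be stated in a few lines. The sketch below is a simplification with assumed names; the real units also consult the vehicle model and rule set before producing a target motion.

```python
from typing import List, Optional, Tuple

Waypoint = Tuple[float, float]


def choose_route_for_target_motion(critical_avoidance_route: Optional[List[Waypoint]],
                                   determined_route: List[Waypoint]) -> List[Waypoint]:
    """Rule-based priority: a route that avoids a critical status, when present,
    overrides the route chosen during normal traveling."""
    if critical_avoidance_route is not None:
        return critical_avoidance_route
    return determined_route


if __name__ == "__main__":
    normal = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]
    evasive = [(0.0, 0.0), (4.0, 1.5), (9.0, 1.5)]
    print(choose_route_for_target_motion(None, normal))     # no critical status
    print(choose_route_for_target_motion(evasive, normal))  # critical status detected
```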
- The energy management unit F311 calculates the control amount of the actuator AC based on the output from the vehicle motion energy setting unit F310. Specifically, the energy management unit F311 calculates the control amount of the actuator AC that achieves the target motion determined by the target motion determination unit F309 with the highest energy efficiency. Unlike the energy management unit F114 of the main arithmetic unit F1, the energy management unit F311 calculates the control amount of the actuator AC using an algorithm based on a rule set in advance without using any learning model generated by deep learning. The control amount calculated by the energy management unit F311 is output to the selector F116 of the main arithmetic unit F1.
- The
control unit 20 performs a first operation, a second operation, and a third operation. The vehicle control system 1 is provided with a first signal system including a plurality of first cameras 11a, and a second signal system including a plurality of second cameras 11b. In this example, the control unit 20 performs the first operation when neither the first signal system nor the second signal system has an abnormality. The control unit 20 performs the second operation when, of the first and second signal systems, the second signal system has an abnormality. The control unit 20 performs the third operation when, of the first and second signal systems, the first signal system has an abnormality. A simplified, illustrative sketch of this switching logic is given at the end of the description. - During the first operation, the
control unit 20 outputs a control signal (a signal for cruise control of the vehicle 100) based on both of the output from the first cameras 11a and the output from the second cameras 11b. In this example, the control unit 20 outputs, during the first operation, the control signal based on both of the result of the recognition processing by the first recognition processing IC unit 22a and the result of the recognition processing by the second recognition processing IC unit 22b. - Specifically, during the first operation, the first signal
processing IC unit 21a processes the output from the first cameras 11a, and the first recognition processing IC unit 22a processes the output from the first signal processing IC unit 21a. The second signal processing IC unit 21b processes the output from the second cameras 11b, and the second recognition processing IC unit 22b processes the output from the second signal processing IC unit 21b. The second recognition processing IC unit 22b outputs the result of the recognition processing to the first recognition processing IC unit 22a. The first recognition processing IC unit 22a fuses the result of the recognition processing by the first recognition processing IC unit 22a and the result of the recognition processing by the second recognition processing IC unit 22b, and outputs a recognition result obtained by this fusing to the first control IC unit 23a. The first control IC unit 23a outputs the control signal based on the output from the first recognition processing IC unit 22a. - During the second operation, the
control unit 20 outputs the control signal based on the output from the first cameras 11a. In this example, the control unit 20 outputs, during the second operation, the control signal based on the result of the recognition processing by the first recognition processing IC unit 22a. - Specifically, during the second operation, the first signal
processing IC unit 21a processes the output from the first cameras 11a, and the first recognition processing IC unit 22a processes the output from the first signal processing IC unit 21a. The first recognition processing IC unit 22a outputs the result of the recognition processing by the first recognition processing IC unit 22a to the first control IC unit 23a. The first control IC unit 23a outputs the control signal based on the result of the recognition processing by the first recognition processing IC unit 22a. - During the third operation, the
control unit 20 outputs the control signal based on the output from the second cameras 11b. In this example, the control unit 20 outputs, during the third operation, the control signal based on the result of the recognition processing by the second recognition processing IC unit 22b. - Specifically, during the third operation, the second signal
processing IC unit 21b processes the output from the second cameras 11b, and the second recognition processing IC unit 22b processes the output from the second signal processing IC unit 21b. The second recognition processing IC unit 22b outputs the result of the recognition processing to the first recognition processing IC unit 22a. The first recognition processing IC unit 22a outputs the result of the recognition processing by the second recognition processing IC unit 22b to the first control IC unit 23a. The first control IC unit 23a outputs the control signal based on the result of the recognition processing by the second recognition processing IC unit 22b, which is supplied via the first recognition processing IC unit 22a. - As described above, the
vehicle 100 has two combinations of the cameras 11 provided in the vehicle 100 so as to surround the vehicle 100. Provision of the first cameras 11a so as to surround the vehicle 100 allows an environment spread around the vehicle 100 (the external environment of the vehicle 100) to be monitored based on the output from the first cameras 11a. Similarly, provision of the second cameras 11b so as to surround the vehicle 100 allows the external environment of the vehicle 100 to be monitored based on the output from the second cameras 11b. As described above, the external environment of the vehicle 100 may be monitored based on at least either one of the output from the first cameras 11a or the output from the second cameras 11b. Accordingly, even when either one of the first signal system including the first cameras 11a or the second signal system including the second cameras 11b has an abnormality, it is possible to continue the cruise control of the vehicle 100 based on the output from the cameras 11 in the other signal system by performing the second or third operation. This allows improvement in continuity of cruise control of the vehicle 100. - This configuration allows automatic switching among the first, second, and third operations according to the presence or absence of abnormality in the first and second signal systems. Accordingly, when either one of the first or second signal system has an abnormality, the second or third operation may be automatically continued. This allows automatic continuation of the cruise control of the
vehicle 100. - In this example, the
first cameras 11a include a first front camera 111a, a first diagonally backward right camera 112a, and a first diagonally backward left camera 113a. The second cameras 11b include a second front camera 111b, a second diagonally backward right camera 112b, and a second diagonally backward left camera 113b. This configuration allows the cruise control of the vehicle 100 to be continued based on at least an area in front of the vehicle 100, an area diagonally backward right of the vehicle 100, and an area diagonally backward left of the vehicle 100 in the environment spread around the vehicle 100 (the external environment of the vehicle 100). Accordingly, even when either one of the first or second signal system has an abnormality, this allows continuous cruise control based on the area in front of the vehicle 100 (e.g., control for maintaining an appropriate distance from other vehicles traveling in front of the subject vehicle) and control based on the area diagonally backward right of the vehicle 100 and the area diagonally backward left of the vehicle 100 (e.g., control for sensing critical situations when the subject vehicle performs lane changing). - The recognition processing by both of the first recognition
processing IC unit 22a and the second recognition processing IC unit 22b allows improvement of recognition accuracy, compared with recognition processing by either one of the first recognition processing IC unit 22a or the second recognition processing IC unit 22b. - As shown in
FIG. 8, the recognition processing unit 202 may be constituted as a single recognition processing IC unit (the first recognition processing IC unit 22a in the example of FIG. 8). In this example, the output from the second signal processing IC unit 21b is supplied to the first recognition processing IC unit 22a. The first recognition processing IC unit 22a performs recognition processing based on the output from the first signal processing IC unit 21a and the output from the second signal processing IC unit 21b. - Similarly, the
signal processing unit 201 may be constituted as a single signal processing IC unit (e.g., the first signal processing IC unit 21a). - The above description is made based on the case in which two combinations of
cameras 11 are provided in the vehicle 100 so as to surround the vehicle 100 as an example, but the configuration is not limited to this example. For example, three combinations of cameras 11 may be provided in the vehicle 100 so as to surround the vehicle 100. Similarly, the above description is made based on the case in which two combinations of the radars 12 are provided in the vehicle 100 so as to surround the vehicle 100 as an example, but the configuration is not limited to this example. For example, three combinations of radars 12 may be provided in the vehicle 100 so as to surround the vehicle 100. - The foregoing embodiments may be performed in combination as appropriate. The foregoing embodiments are merely exemplary ones in nature, and are not intended to limit the scope, applications, or use of the present invention.
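- As referenced above, the switching among the first, second, and third operations according to the health of the two camera signal systems can be summarized in a minimal sketch. The enum, the health flags, and the list-based stand-ins for recognition results below are illustrative assumptions only; they do not reflect the actual interfaces of the control unit 20 or how abnormalities are detected.

```python
from enum import Enum, auto
from typing import List


class Operation(Enum):
    FIRST = auto()    # both signal systems healthy: fuse both recognition results
    SECOND = auto()   # only the first signal system healthy
    THIRD = auto()    # only the second signal system healthy


def select_operation(first_ok: bool, second_ok: bool) -> Operation:
    """Map the health of the two camera signal systems to an operation mode."""
    if first_ok and second_ok:
        return Operation.FIRST
    if first_ok:
        return Operation.SECOND
    if second_ok:
        return Operation.THIRD
    raise RuntimeError("both signal systems abnormal; outside the described modes")


def recognition_for_control(op: Operation,
                            first_result: List[str],
                            second_result: List[str]) -> List[str]:
    """Pick the recognition result(s) that feed the control signal in each mode."""
    if op is Operation.FIRST:
        return sorted(set(first_result) | set(second_result))   # fused result
    if op is Operation.SECOND:
        return first_result
    return second_result


if __name__ == "__main__":
    op = select_operation(first_ok=True, second_ok=False)
    print(op, recognition_for_control(op, ["car ahead"], ["car ahead", "cyclist left"]))
```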
Claims (4)
- A vehicle control system for controlling a vehicle, the system comprising:
a plurality of first cameras disposed in the vehicle so as to surround the vehicle;
a plurality of second cameras disposed in the vehicle so as to surround the vehicle; and
a control unit that performs a first operation of outputting a control signal for cruise control of the vehicle based on both outputs from the plurality of the first cameras and outputs from the plurality of the second cameras, a second operation of outputting the control signal based on the outputs from the plurality of the first cameras, and a third operation of outputting the control signal based on the outputs from the plurality of the second cameras.
- The vehicle control system of claim 1, wherein
the control unit performs:
the first operation when a first signal system including the plurality of first cameras and a second signal system including the plurality of second cameras both do not have an abnormality,
the second operation when the second signal system between the first and second signal systems has an abnormality, and
the third operation when the first signal system between the first and second signal systems has an abnormality. - The vehicle control system of claim 1 or 2, wherein
the plurality of first cameras include a first front camera that images an area in front of the vehicle, a first diagonally backward right camera that images an area diagonally backward right of the vehicle, and a first diagonally backward left camera that images an area diagonally backward left of the vehicle, and
the plurality of second cameras include a second front camera that images an area in front of the vehicle, a second diagonally backward right camera that images an area diagonally backward right of the vehicle, and a second diagonally backward left camera that images an area diagonally backward left of the vehicle. - The vehicle control system of any one of claims 1 to 3, wherein
the control unit includes: a first recognition processing IC unit that performs recognition processing for recognizing an external environment of the vehicle based on an output from the plurality of first cameras; and a second recognition processing IC unit that performs the recognition processing based on the output of the plurality of second cameras, and
during the first operation, the control unit outputs the control signal based on both of a result of the recognition processing by the first recognition processing IC unit and a result of the recognition processing by the second recognition processing IC unit,
during the second operation, the control unit outputs the control signal based on the result of the recognition processing by the first recognition processing IC unit, and
during the third operation, the control unit outputs the control signal based on the result of the recognition processing by the second recognition processing IC unit.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020018014A JP2021123234A (en) | 2020-02-05 | 2020-02-05 | Vehicle control system |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3862240A1 true EP3862240A1 (en) | 2021-08-11 |
EP3862240B1 EP3862240B1 (en) | 2024-05-15 |
Family
ID=74285372
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP21153702.2A Active EP3862240B1 (en) | 2020-02-05 | 2021-01-27 | Vehicle control system |
Country Status (4)
Country | Link |
---|---|
US (1) | US20210241001A1 (en) |
EP (1) | EP3862240B1 (en) |
JP (1) | JP2021123234A (en) |
CN (1) | CN113291289B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4183655A1 (en) * | 2021-11-23 | 2023-05-24 | Tusimple, Inc. | Method and in-vehicle computing system for controlling operation of an autonomous vehicle |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1116099A (en) | 1997-06-27 | 1999-01-22 | Hitachi Ltd | Automobile traveling supporting device |
EP3299221A1 (en) * | 2016-05-10 | 2018-03-28 | Dura Operating, LLC | Scalable driver assistance system |
WO2019035458A1 (en) * | 2017-08-18 | 2019-02-21 | Sony Semiconductor Solutions Corporation | Control device and control system |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008269494A (en) * | 2007-04-24 | 2008-11-06 | Auto Network Gijutsu Kenkyusho:Kk | On-vehicle imaging system |
CN202169907U (en) * | 2011-07-29 | 2012-03-21 | 富士重工业株式会社 | Device used for identifying environment outside vehicle |
JP6381066B2 (en) * | 2013-12-27 | 2018-08-29 | 株式会社Subaru | Vehicle lane keeping control device |
JP6137081B2 (en) * | 2014-07-29 | 2017-05-31 | 株式会社デンソー | Car equipment |
EP3293543B1 (en) * | 2016-09-08 | 2021-06-09 | KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH | Apparatus for sensing a vehicular environment when fitted to a vehicle |
CN106708040B (en) * | 2016-12-09 | 2019-10-08 | 重庆长安汽车股份有限公司 | Sensor module, automated driving system and the method for automated driving system |
JP2019174168A (en) * | 2018-03-27 | 2019-10-10 | パナソニックIpマネジメント株式会社 | Controller, vehicle, control method, and control program |
JP6990137B2 (en) * | 2018-03-28 | 2022-01-12 | 本田技研工業株式会社 | Vehicle control device |
JP7165012B2 (en) * | 2018-09-28 | 2022-11-02 | 株式会社Subaru | Vehicle stereo camera device |
JP2020173744A (en) * | 2019-04-15 | 2020-10-22 | 株式会社日立製作所 | Image processing method using machine learning and electronic control device using it |
- 2020-02-05 JP JP2020018014A patent/JP2021123234A/en active Pending
- 2021-01-25 CN CN202110096172.3A patent/CN113291289B/en active Active
- 2021-01-27 US US17/159,178 patent/US20210241001A1/en not_active Abandoned
- 2021-01-27 EP EP21153702.2A patent/EP3862240B1/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN113291289A (en) | 2021-08-24 |
US20210241001A1 (en) | 2021-08-05 |
JP2021123234A (en) | 2021-08-30 |
CN113291289B (en) | 2023-12-05 |
EP3862240B1 (en) | 2024-05-15 |