US20160214607A1 - Predictive reasoning for controlling speed of a vehicle - Google Patents
- Publication number
- US20160214607A1 (application US14/827,578 / US201514827578A)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- autonomous vehicle
- control object
- traffic control
- speed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
- B60W30/16—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
- B60W30/143—Speed control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
- B60W30/16—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
- B60W30/17—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle with provision for special action when the preceding vehicle comes to a halt, e.g. stop and go
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
- B60W40/105—Speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0015—Planning or execution of driving tasks specially adapted for safety
- B60W60/0016—Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/865—Combination of radar systems with lidar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/22—Platooning, i.e. convoy of communicating vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B60W2550/22—
-
- B60W2550/308—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/801—Lateral distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/802—Longitudinal distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/803—Relative lateral speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/804—Relative longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/805—Azimuth angle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
- B60W2555/60—Traffic rules, e.g. speed limits or right of way
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2754/00—Output or target parameters relating to objects
- B60W2754/10—Spatial relation or speed relative to objects
- B60W2754/30—Longitudinal distance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9325—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles for inter-vehicle distance regulation, e.g. navigating in platoons
Definitions
- Autonomous vehicles use various computing systems to aid in transporting passengers from one location to another. Some autonomous vehicles may require some initial input or continuous input from an operator, such as a pilot, driver, or passenger. Other systems, for example autopilot systems, may be used only when the system has been engaged, which permits the operator to switch from a manual mode (where the operator exercises a high degree of control over the movement of the vehicle) to an autonomous mode (where the vehicle essentially drives itself) to modes that lie somewhere in between.
- the present application discloses embodiments that relate to predictive reasoning for controlling speed of a vehicle.
- the present application describes a method.
- the method may comprise identifying a first vehicle travelling ahead of an autonomous vehicle.
- the method may also comprise identifying a second vehicle ahead of the first vehicle, the first and second vehicles travelling in substantially a same lane as the autonomous vehicle.
- the method may also comprise determining a first buffer distance, the first buffer distance being a minimal distance behind the first vehicle at which the autonomous vehicle will substantially reach a speed of the first vehicle.
- the method may also comprise determining a second buffer distance, the second buffer distance being a minimal distance behind the second vehicle at which the first vehicle will substantially reach a speed of the second vehicle.
- the method may also comprise determining a distance at which to adjust a speed of the autonomous vehicle based on the first and second buffer distances and the speed of the autonomous vehicle.
- the method may further comprise providing instructions by a computing device to adjust the speed of the autonomous vehicle based on the distance.
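As a concrete illustration of the method just summarized, the following Python sketch computes the two buffer distances and a resulting adjustment distance. It is a minimal sketch under assumed constant-deceleration kinematics; the function names, the deceleration and reaction-time values, and the choice to sum the buffers are illustrative assumptions, since the patent does not fix a specific formula.

```python
# Hypothetical sketch of the claimed method; speeds in m/s, distances in m.

def buffer_distance(follower_speed, leader_speed, decel=3.0, reaction_s=1.0):
    """Minimal distance behind a leader at which the follower can
    substantially reach the leader's speed, using an assumed
    constant-deceleration model (not specified by the patent)."""
    closing = max(follower_speed - leader_speed, 0.0)
    return follower_speed * reaction_s + closing ** 2 / (2.0 * decel)

def adjustment_distance(av_speed, first_speed, second_speed):
    d1 = buffer_distance(av_speed, first_speed)      # first buffer distance
    d2 = buffer_distance(first_speed, second_speed)  # second buffer distance
    # The method bases the adjustment distance on both buffers and the
    # autonomous vehicle's own speed; summing them is one simple reading.
    return d1 + d2

print(adjustment_distance(av_speed=30.0, first_speed=25.0, second_speed=20.0))
```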
- the present application describes a non-transitory computer readable medium having stored thereon executable instructions that, upon execution by a computing device, cause the computing device to perform functions.
- the functions may comprise identifying a first vehicle travelling ahead of an autonomous vehicle.
- the functions may also comprise identifying a second vehicle ahead of the first vehicle, the first and second vehicles travelling in substantially a same lane as the autonomous vehicle.
- the functions may also comprise determining a first buffer distance, the first buffer distance being a minimal distance behind the first vehicle at which the autonomous vehicle will substantially reach a speed of the first vehicle.
- the functions may also comprise determining a second buffer distance, the second buffer distance being a minimal distance behind the second vehicle at which the first vehicle will substantially reach a speed of the second vehicle.
- the functions may also comprise determining a distance at which to adjust a speed of the autonomous vehicle based on the first and second buffer distances and the speed of the autonomous vehicle.
- the functions may further comprise providing instructions to adjust the speed of the autonomous vehicle based on the distance.
- the present application describes a system.
- the system may comprise at least one processor.
- the system also may comprise a memory having stored thereon instructions that, upon execution by the at least one processor, cause the system to perform functions.
- the functions may comprise identifying a first object ahead of an autonomous vehicle.
- the functions may also comprise identifying a second object ahead of the first object, where the first and second objects are in substantially a same lane as the autonomous vehicle.
- the functions may also comprise determining a first buffer distance, the first buffer distance being a minimal distance behind the first object at which the autonomous vehicle will substantially reach a speed of the first object.
- the functions may also comprise determining a second buffer distance, the second buffer distance being a minimal distance behind the second object at which the first object will substantially reach a speed of the second object.
- the functions may also comprise determining a distance at which to adjust a speed of the autonomous vehicle based on the first and second buffer distances and the speed of the autonomous vehicle.
- the functions may further comprise providing instructions to adjust the speed of the autonomous vehicle based on the distance.
- FIG. 1 is a simplified block diagram of an example automobile.
- FIG. 2 illustrates an example automobile.
- FIG. 3 is a flow chart of an example method for adjusting a speed of an autonomous vehicle.
- FIG. 4A illustrates an example for determining a distance at which to adjust the speed of the autonomous vehicle.
- FIG. 4B illustrates an example for determining a distance at which to adjust the speed of the autonomous vehicle when a traffic control object is present.
- FIG. 5 illustrates an implementation of the example method on a road of travel.
- FIG. 6 is a schematic illustrating a conceptual partial view of a computer program.
- An autonomous vehicle operating on a road or path of travel may be configured to identify objects within an environment of the autonomous vehicle in order to determine an adjustment to the autonomous vehicle's current speed.
- the objects can be other vehicles, traffic control objects, or other types of objects.
- each identified object may be considered independently, and the respective characteristics of the object, such as its current speed, acceleration, and range to the vehicle, may be used to determine a speed for the autonomous vehicle to adjust to.
- the autonomous vehicle, or a computing device associated with the autonomous vehicle, may be configured to predict behaviors of the identified objects based on the characteristics of the objects and a state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.), and the objects may all be considered together, each dependent on the others' behavior.
- the autonomous vehicle can then adjust its speed based on the predicted behaviors of the objects.
- the autonomous vehicle can determine what steady state the vehicle will need to adjust to (e.g., speed up, slow down, or stop) based on the predicted behaviors of the objects.
- Other characteristics/factors may be considered as well in order to determine the speed of the autonomous vehicle, such as a lateral position of the autonomous vehicle in a road/lane of travel, curvature of the road, proximity of static and dynamic objects, etc.
- a computing device configured to adjust the speed of an autonomous vehicle may identify multiple objects ahead of the vehicle.
- the objects may include, for example, other vehicles travelling ahead of the autonomous vehicle in the same lane as the autonomous vehicle, such as trucks, bicycles, and motorcycles.
- the objects may also include other types of static or dynamic objects, such as pedestrians, stop signs, a toll booth, trees, guard rails, etc.
- the computing device may be configured to estimate characteristics of each object, such as the object's speed, acceleration, size, weight, direction of travel, and longitudinal and lateral speeds.
- the computing device may determine a buffer distance for each object between the autonomous vehicle and the farthest identified object from the autonomous vehicle. For example, if the computing device identifies a first and second object ahead of the vehicle, the second object being at a greater distance from the autonomous vehicle than the first object, the computing device may determine a first buffer distance at which the autonomous vehicle will substantially reach a speed of the first object, and also determine a second buffer distance at which the first object will substantially reach a speed of the second object.
- the buffer distances may be based on the speeds of the identified objects. In some examples, the buffer distances may also be based on other characteristics of the identified objects.
- the computing device may then determine a distance at which to adjust the speed of the autonomous vehicle.
- the distance may also be a function of other characteristics of the objects and the autonomous vehicle, as well as any predetermined (e.g., calibrated) constants.
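One hedged way to write the dependence described above, with the buffer distances entering additively and calibrated constants absorbing the remaining characteristics, is the form below; this is an assumption for illustration, not a formula disclosed in the patent:

```latex
% b_1, b_2: first and second buffer distances; v: autonomous vehicle speed;
% k_0, k_1: predetermined (e.g., calibrated) constants -- all illustrative.
d_{\mathrm{adjust}} = b_1 + b_2 + k_0 + k_1\, v
```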
- the computing device may be configured to then provide instructions to adjust the speed of the autonomous vehicle based on the distance.
- the instructions may be provided prior to the computing device detecting a change of the speed of at least one of the objects ahead of the autonomous vehicle.
- the autonomous vehicle may adjust its speed based on an estimation of the change of the speed of at least one of the objects prior to such change occurring.
- Such a change in the speed of the object(s) may be evaluated differently in various embodiments. For example, the change in the speed may be indicated by the speed of the object(s) exceeding a given threshold. Other examples are also possible.
- the computing device may be configured to provide instructions to modify a steering angle of the autonomous vehicle so as to cause the autonomous vehicle to follow a given trajectory and/or maintain safe lateral and longitudinal distances with the objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on a road).
- the computing device may also be configured to implement heuristics to mimic human-like behavior to determine the distance and adjust the speed of the autonomous vehicle accordingly (and possibly control the autonomous vehicle in other manners, such as adjusting the autonomous vehicle's steering/trajectory).
- An example vehicle control system may be implemented in or may take the form of an automobile.
- a vehicle control system may be implemented in or take the form of other vehicles, such as cars, trucks, motorcycles, buses, boats, airplanes, helicopters, lawn mowers, recreational vehicles, amusement park vehicles, farm equipment, construction equipment, trams, golf carts, trains, and trolleys. Other vehicles are possible as well.
- an example system may take the form of a non-transitory computer-readable medium, which has program instructions stored thereon that are executable by at least one processor to provide the functionality described herein.
- An example system may also take the form of an automobile or a subsystem of an automobile that includes such a non-transitory computer-readable medium having such program instructions stored thereon.
- FIG. 1 is a simplified block diagram of an example automobile 100 , in accordance with an example embodiment.
- Components coupled to or included in the automobile 100 may include a propulsion system 102 , a sensor system 104 , a control system 106 , peripherals 108 , a power supply 110 , a computing device 111 , and a user interface 112 .
- the computing device 111 may include a processor 113 , and a memory 114 .
- the computing device 111 may be a controller, or part of the controller, of the automobile 100 .
- the memory 114 may include instructions 115 executable by the processor 113 , and may also store map data 116 .
- Components of the automobile 100 may be configured to work in an interconnected fashion with each other and/or with other components coupled to respective systems.
- the power supply 110 may provide power to all the components of the automobile 100 .
- the computing device 111 may be configured to receive information from and control the propulsion system 102 , the sensor system 104 , the control system 106 , and the peripherals 108 .
- the computing device 111 may be configured to generate a display of images on and receive inputs from the user interface 112 .
- the automobile 100 may include more, fewer, or different systems, and each system may include more, fewer, or different components. Additionally, the systems and components shown may be combined or divided in any number of ways.
- the propulsion system 102 may be configured to provide powered motion for the automobile 100 .
- the propulsion system 102 includes an engine/motor 118 , an energy source 120 , a transmission 122 , and wheels/tires 124 .
- the engine/motor 118 may be or include any combination of an internal combustion engine, an electric motor, a steam engine, and a Stirling engine. Other motors and engines are possible as well.
- the propulsion system 102 could include multiple types of engines and/or motors.
- a gas-electric hybrid car could include a gasoline engine and an electric motor. Other examples are possible.
- the energy source 120 may be a source of energy that powers the engine/motor 118 in full or in part. That is, the engine/motor 118 may be configured to convert the energy source 120 into mechanical energy. Examples of energy sources 120 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source(s) 120 could additionally or alternatively include any combination of fuel tanks, batteries, capacitors, and/or flywheels. In some examples, the energy source 120 may provide energy for other systems of the automobile 100 as well.
- the transmission 122 may be configured to transmit mechanical power from the engine/motor 118 to the wheels/tires 124 .
- the transmission 122 may include a gearbox, clutch, differential, drive shafts, and/or other elements.
- the drive shafts could include one or more axles that are configured to be coupled to the wheels/tires 124 .
- the wheels/tires 124 of automobile 100 could be configured in various formats, including a unicycle, bicycle/motorcycle, tricycle, or car/truck four-wheel format. Other wheel/tire formats are possible as well, such as those including six or more wheels.
- the wheels/tires 124 of automobile 100 may be configured to rotate differentially with respect to other wheels/tires 124 .
- the wheels/tires 124 may include at least one wheel that is fixedly attached to the transmission 122 and at least one tire coupled to a rim of the wheel that could make contact with the driving surface.
- the wheels/tires 124 may include any combination of metal and rubber, or combination of other materials.
- the propulsion system 102 may additionally or alternatively include components other than those shown.
- the sensor system 104 may include a number of sensors configured to sense information about an environment in which the automobile 100 is located. As shown, the sensors of the sensor system include a Global Positioning System (GPS) module 126 , an inertial measurement unit (IMU) 128 , a radio detection and ranging (RADAR) unit 130 , a laser rangefinder and/or light detection and ranging (LIDAR) unit 132 , a camera 134 , and actuators 136 configured to modify a position and/or orientation of the sensors.
- the sensor system 104 may include additional sensors as well, including, for example, sensors that monitor internal systems of the automobile 100 (e.g., an O 2 monitor, a fuel gauge, an engine oil temperature, etc.). Other sensors are possible as well.
- the GPS module 126 may be any sensor configured to estimate a geographic location of the automobile 100 .
- the GPS module 126 may include a transceiver configured to estimate a position of the automobile 100 with respect to the Earth, based on satellite-based positioning data.
- the computing device 111 may be configured to use the GPS module 126 in combination with the map data 116 to estimate a location of a lane boundary on road on which the automobile 100 may be travelling on.
- the GPS module 126 may take other forms as well.
- the IMU 128 may be any combination of sensors configured to sense position and orientation changes of the automobile 100 based on inertial acceleration.
- the combination of sensors may include, for example, accelerometers and gyroscopes. Other combinations of sensors are possible as well.
- the RADAR unit 130 may be considered as an object detection system that may be configured to use radio waves to determine characteristics of the object such as range, altitude, direction, or speed of the object.
- the RADAR unit 130 may be configured to transmit pulses of radio waves or microwaves that may bounce off any object in a path of the waves.
- the object may return a part of energy of the waves to a receiver (e.g., dish or antenna), which may be part of the RADAR unit 130 as well.
- the RADAR unit 130 also may be configured to perform digital signal processing of received signals (bouncing off the object) and may be configured to identify the object.
- the LIDAR unit 132 may include a sensor configured to sense or detect objects in an environment in which the automobile 100 is located using light.
- LIDAR is an optical remote sensing technology that can measure distance to, or other properties of, a target by illuminating the target with light.
- the LIDAR unit 132 may include a laser source and/or laser scanner configured to emit laser pulses and a detector configured to receive reflections of the laser pulses.
- the LIDAR unit 132 may include a laser rangefinder reflected by a rotating mirror, with the laser scanned around the scene being digitized, in one or two dimensions, gathering distance measurements at specified angle intervals.
- the LIDAR unit 132 may include components such as light (e.g., laser) source, scanner and optics, photo-detector and receiver electronics, and position and navigation system.
- the LIDAR unit 132 may be configured to use ultraviolet (UV), visible, or infrared light to image objects and can be used with a wide range of targets, including non-metallic objects.
- a narrow laser beam can be used to map physical features of an object with high resolution.
- wavelengths in a range from about 10 micrometers (infrared) to about 250 nm (UV) could be used.
- light is reflected via backscattering.
- Different types of scattering are used for different LIDAR applications, such as Rayleigh scattering, Mie scattering and Raman scattering, as well as fluorescence.
- LIDAR can be accordingly called Rayleigh LIDAR, Mie LIDAR, Raman LIDAR and Na/Fe/K Fluorescence LIDAR, as examples.
- Suitable combinations of wavelengths can allow for remote mapping of objects by looking for wavelength-dependent changes in intensity of reflected signals, for example.
- Three-dimensional (3D) imaging can be achieved using both scanning and non-scanning LIDAR systems.
- “3D gated viewing laser radar” is an example of a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera.
- Imaging LIDAR can also be performed using an array of high-speed detectors and a modulation-sensitive detector array, typically built on single chips using CMOS (complementary metal-oxide-semiconductor) and hybrid CMOS/CCD (charge-coupled device) fabrication techniques.
- a point cloud may include a set of vertices in a 3D coordinate system. These vertices may be defined by X, Y, and Z coordinates, for example, and may represent an external surface of an object.
- the LIDAR unit 132 may be configured to create the point cloud by measuring a large number of points on the surface of the object, and may output the point cloud as a data file. As the result of a 3D scanning process of the object by the LIDAR unit 132 , the point cloud can be used to identify and visualize the object.
- the point cloud can be directly rendered to visualize the object.
- the point cloud may be converted to polygon or triangle mesh models through a process that may be referred to as surface reconstruction.
- Example techniques for converting a point cloud to a 3D surface may include Delaunay triangulation, alpha shapes, and ball pivoting. These techniques include building a network of triangles over existing vertices of the point cloud.
- Other example techniques may include converting the point cloud into a volumetric distance field and reconstructing an implicit surface so defined through a marching cubes algorithm.
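The triangle-network techniques named above can be exercised with standard tools; the short sketch below triangulates a point cloud's ground-plane projection with SciPy's Delaunay routine, assuming a roughly 2.5D scene (a common simplification, not something the patent prescribes):

```python
import numpy as np
from scipy.spatial import Delaunay

points = np.random.rand(500, 3)    # placeholder X, Y, Z vertices of a cloud
tri = Delaunay(points[:, :2])      # build triangles over the X-Y projection
faces = tri.simplices              # (n_triangles, 3) indices into `points`
# Each row of `faces` picks three 3D vertices, yielding a triangle mesh
# that approximates the scanned surface.
print(faces.shape)
```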
- the camera 134 may be any camera (e.g., a still camera, a video camera, etc.) configured to capture images of the environment in which the automobile 100 is located. To this end, the camera may be configured to detect visible light, or may be configured to detect light from other portions of the spectrum, such as infrared or ultraviolet light. Other types of cameras are possible as well.
- the camera 134 may be a two-dimensional detector, or may have a three-dimensional spatial range. In some examples, the camera 134 may be, for example, a range detector configured to generate a two-dimensional image indicating a distance from the camera 134 to a number of points in the environment. To this end, the camera 134 may use one or more range detecting techniques.
- the camera 134 may be configured to use a structured light technique in which the automobile 100 illuminates an object in the environment with a predetermined light pattern, such as a grid or checkerboard pattern and uses the camera 134 to detect a reflection of the predetermined light pattern off the object. Based on distortions in the reflected light pattern, the automobile 100 may be configured to determine the distance to the points on the object.
- the predetermined light pattern may comprise infrared light, or light of another wavelength.
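Distance recovery from a structured-light pattern is typically a triangulation problem; the following sketch computes depth from the lateral shift (disparity) of a known pattern feature. The baseline, focal length, and disparity inputs are assumed quantities for illustration, as the patent gives no formula:

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Depth of a pattern feature from projector-camera triangulation."""
    if disparity_px <= 0:
        return float("inf")  # no measurable shift: effectively infinite range
    return baseline_m * focal_px / disparity_px

# Example: 0.2 m baseline, 700 px focal length, 14 px shift -> 10 m depth.
print(depth_from_disparity(0.2, 700.0, 14.0))
```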
- the actuators 136 may, for example, be configured to modify a position and/or orientation of the sensors.
- the sensor system 104 may additionally or alternatively include components other than those shown.
- the control system 106 may be configured to control operation of the automobile 100 and its components. To this end, the control system 106 may include a steering unit 138 , a throttle 140 , a brake unit 142 , a sensor fusion algorithm 144 , a computer vision system 146 , a navigation or pathing system 148 , and an obstacle avoidance system 150 .
- the steering unit 138 may be any combination of mechanisms configured to adjust the heading or direction of the automobile 100 .
- the throttle 140 may be any combination of mechanisms configured to control the operating speed and acceleration of the engine/motor 118 and, in turn, the speed and acceleration of the automobile 100 .
- the brake unit 142 may be any combination of mechanisms configured to decelerate the automobile 100 .
- the brake unit 142 may use friction to slow the wheels/tires 124 .
- the brake unit 142 may be configured to be regenerative and convert the kinetic energy of the wheels/tires 124 to electric current.
- the brake unit 142 may take other forms as well.
- the sensor fusion algorithm 144 may include an algorithm (or a computer program product storing an algorithm) executable by the computing device 111 , for example.
- the sensor fusion algorithm 144 may be configured to accept data from the sensor system 104 as an input.
- the data may include, for example, data representing information sensed at the sensors of the sensor system 104 .
- the sensor fusion algorithm 144 may include, for example, a Kalman filter, a Bayesian network, or another algorithm.
- the sensor fusion algorithm 144 further may be configured to provide various assessments based on the data from the sensor system 104 , including, for example, evaluations of individual objects and/or features in the environment in which the automobile 100 is located, evaluations of particular situations, and/or evaluations of possible impacts based on particular situations. Other assessments are possible as well.
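A Kalman filter is one of the algorithms named for the sensor fusion algorithm 144; the minimal one-dimensional filter below smooths a stream of noisy range measurements. The process and measurement variances are illustrative tuning assumptions, not values from the patent:

```python
import numpy as np

def kalman_1d(measurements, process_var=1e-2, meas_var=0.5):
    x, p = measurements[0], 1.0       # initial state estimate and variance
    out = []
    for z in measurements:
        p += process_var              # predict: uncertainty grows over time
        k = p / (p + meas_var)        # Kalman gain
        x += k * (z - x)              # update toward the new measurement
        p *= 1.0 - k
        out.append(x)
    return np.array(out)

noisy_ranges = 50.0 + np.random.randn(20) * 0.7   # simulated range readings
print(kalman_1d(noisy_ranges))
```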
- the computer vision system 146 may be any system configured to process and analyze images captured by the camera 134 in order to identify objects and/or features in the environment in which the automobile 100 is located, including, for example, lane information, traffic signals and obstacles. To this end, the computer vision system 146 may use an object recognition algorithm, a Structure from Motion (SFM) algorithm, video tracking, or other computer vision techniques. In some examples, the computer vision system 146 may additionally be configured to map the environment, track objects, estimate speed of objects, etc.
- the navigation and pathing system 148 may be any system configured to determine a driving path for the automobile 100 .
- the navigation and pathing system 148 may additionally be configured to update the driving path dynamically while the automobile 100 is in operation.
- the navigation and pathing system 148 may be configured to incorporate data from the sensor fusion algorithm 144 , the GPS module 126 , and one or more predetermined maps so as to determine the driving path for the automobile 100 .
- the obstacle avoidance system 150 may be any system configured to identify, evaluate, and avoid or otherwise negotiate obstacles in the environment in which the automobile 100 is located.
- the control system 106 may additionally or alternatively include components other than those shown.
- Peripherals 108 may be configured to allow the automobile 100 to interact with external sensors, other automobiles, and/or a user.
- the peripherals 108 may include, for example, a wireless communication system 152 , a touchscreen 154 , a microphone 156 , and/or a speaker 158 .
- the wireless communication system 152 may be any system configured to be wirelessly coupled to one or more other automobiles, sensors, or other entities, either directly or via a communication network. To this end, the wireless communication system 152 may include an antenna and a chipset for communicating with the other automobiles, sensors, or other entities either directly or over an air interface.
- the chipset or wireless communication system 152 in general may be arranged to communicate according to one or more types of wireless communication (e.g., protocols) such as Bluetooth, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), Zigbee, dedicated short range communications (DSRC), and radio frequency identification (RFID) communications, among other possibilities.
- the wireless communication system 152 may take other forms as well.
- the touchscreen 154 may be used by a user to input commands to the automobile 100 .
- the touchscreen 154 may be configured to sense at least one of a position and a movement of a user's finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities.
- the touchscreen 154 may be capable of sensing finger movement in a direction parallel or planar to the touchscreen surface, in a direction normal to the touchscreen surface, or both, and may also be capable of sensing a level of pressure applied to the touchscreen surface.
- the touchscreen 154 may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers.
- the touchscreen 154 may take other forms as well.
- the microphone 156 may be configured to receive audio (e.g., a voice command or other audio input) from a user of the automobile 100 .
- the speaker 158 may be configured to output audio to the user of the automobile 100 .
- the peripherals 108 may additionally or alternatively include components other than those shown.
- the power supply 110 may be configured to provide power to some or all of the components of the automobile 100 .
- the power supply 110 may include, for example, a rechargeable lithium-ion or lead-acid battery.
- one or more banks of batteries could be configured to provide electrical power.
- Other power supply materials and configurations are possible as well.
- the power supply 110 and energy source 120 may be implemented together, as in some all-electric cars.
- the processor 113 included in the computing device 111 may comprise one or more general-purpose processors and/or one or more special-purpose processors (e.g., image processor, digital signal processor, etc.). To the extent that the processor 113 includes more than one processor, such processors could work separately or in combination.
- the computing device 111 may be configured to control functions of the automobile 100 based on input received through the user interface 112 , for example.
- the memory 114 may comprise one or more volatile and/or one or more non-volatile storage components, such as optical, magnetic, and/or organic storage, and the memory 114 may be integrated in whole or in part with the processor 113 .
- the memory 114 may contain the instructions 115 (e.g., program logic) executable by the processor 113 to execute various automobile functions, including any of the functions or methods described herein.
- the components of the automobile 100 could be configured to work in an interconnected fashion with other components within and/or outside their respective systems. To this end, the components and systems of the automobile 100 may be communicatively linked together by a system bus, network, and/or other connection mechanism (not shown).
- one or more components or systems may be removably mounted on or otherwise connected (mechanically or electrically) to the automobile 100 using wired or wireless connections.
- the automobile 100 may include one or more elements in addition to or instead of those shown.
- the automobile 100 may include one or more additional interfaces and/or power supplies.
- Other additional components are possible as well.
- the memory 114 may further include instructions executable by the processor 113 to control and/or communicate with the additional components.
- FIG. 2 illustrates an example automobile 200 , in accordance with an embodiment.
- FIG. 2 shows a Right Side View, Front View, Back View, and Top View of the automobile 200 .
- While the automobile 200 is illustrated in FIG. 2 as a car, other examples are possible.
- the automobile 200 could represent a truck, a van, a semi-trailer truck, a motorcycle, a golf cart, an off-road vehicle, or a farm vehicle, among other examples.
- the automobile 200 includes a first sensor unit 202 , a second sensor unit 204 , a third sensor unit 206 , a wireless communication system 208 , and a camera 210 .
- Each of the first, second, and third sensor units 202 - 206 may include any combination of global positioning system sensors, inertial measurement units, RADAR units, LIDAR units, cameras, lane detection sensors, and acoustic sensors. Other types of sensors are possible as well.
- While the first, second, and third sensor units 202 - 206 are shown to be mounted in particular locations on the automobile 200 , in some examples a sensor unit may be mounted elsewhere on the automobile 200 , either inside or outside the automobile 200 . Further, while only three sensor units are shown, in some examples more or fewer sensor units may be included in the automobile 200 .
- one or more of the first, second, and third sensor units 202 - 206 may include one or more movable mounts on which the sensors may be movably mounted.
- the movable mount may include, for example, a rotating platform. Sensors mounted on the rotating platform could be rotated so that the sensors may obtain information from each direction around the automobile 200 .
- the movable mount may include a tilting platform. Sensors mounted on the tilting platform could be tilted within a particular range of angles and/or azimuths so that the sensors may obtain information from a variety of angles.
- the movable mount may take other forms as well.
- one or more of the first, second, and third sensor units 202 - 206 may include one or more actuators configured to adjust the position and/or orientation of sensors in the sensor unit by moving the sensors and/or movable mounts.
- Example actuators include motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and piezoelectric actuators. Other actuators are possible as well.
- the wireless communication system 208 may be any system configured to wirelessly couple to one or more other automobiles, sensors, or other entities, either directly or via a communication network as described above with respect to the wireless communication system 152 in FIG. 1 . While the wireless communication system 208 is shown to be positioned on a roof of the automobile 200 , in other examples the wireless communication system 208 could be located, fully or in part, elsewhere.
- the camera 210 may be any camera (e.g., a still camera, a video camera, etc.) configured to capture images of the environment in which the automobile 200 is located. To this end, the camera 210 may take any of the forms described above with respect to the camera 134 in FIG. 1 . While the camera 210 is shown to be mounted inside a front windshield of the automobile 200 , in other examples the camera 210 may be mounted elsewhere on the automobile 200 , either inside or outside the automobile 200 .
- the automobile 200 may include one or more other components in addition to or instead of those shown.
- a control system of the automobile 200 may be configured to control the automobile 200 in accordance with a control strategy from among multiple possible control strategies.
- the control system may be configured to receive information from sensors coupled to the automobile 200 (on or off the automobile 200 ), modify the control strategy (and an associated driving behavior) based on the information, and control the automobile 200 in accordance with the modified control strategy.
- the control system further may be configured to monitor the information received from the sensors, and continuously evaluate driving conditions; and also may be configured to modify the control strategy and driving behavior based on changes in the driving conditions.
- FIG. 3 is a flow chart of an example method 300 for adjusting a speed of a vehicle.
- the method 300 may include one or more operations, functions, or actions as illustrated by one or more of blocks 302 - 308 .
- Although the blocks are illustrated in a sequential order, these blocks may in some instances be performed in parallel, and/or in a different order than those described herein. Also, the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation.
- each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process.
- the program code may be stored on any type of computer readable medium or memory, for example, such as a storage device including a disk or hard drive.
- the computer readable medium may include a non-transitory computer readable medium, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache and Random Access Memory (RAM).
- the computer readable medium may also include non-transitory media or memory, such as secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example.
- the computer readable media may also be any other volatile or non-volatile storage systems.
- the computer readable medium may be considered a computer readable storage medium, a tangible storage device, or other article of manufacture, for example.
- each block in FIG. 3 may represent circuitry that is wired to perform the specific logical functions in the process.
- the method 300 shown in FIG. 3 will be described as implemented by an example computing device, such as the computing device 111 in FIG. 1 .
- the method 300 can also be described as implemented by an autonomous vehicle, as the computing device may be onboard the vehicle or may be off-board but in wireless communication with the vehicle. Therefore the terms “computing device” and “autonomous vehicle” can be interchangeable herein.
- the computing device may be configured to control the vehicle in an autonomous or semi-autonomous operation mode. It should be understood that other entities or combinations of entities can implement one or more steps of the example method 300 .
- At block 302 , the method 300 includes identifying a first object ahead of an autonomous vehicle. Further, at block 304 , the method includes identifying a second object ahead of the first object, where the first and second objects are substantially in a same lane as the autonomous vehicle. It should be understood, however, that additionally or alternatively to identifying objects ahead of (or substantially in front of) the autonomous vehicle and in substantially the same lane as the autonomous vehicle, the computing device can be configured to identify other objects within an environment of the autonomous vehicle, including objects to the side of the autonomous vehicle (e.g., in adjacent lanes on a road) and/or behind the autonomous vehicle, for example.
- other objects may be identified by the computing device between the autonomous vehicle and the second object, such as multiple vehicles travelling in front of the autonomous vehicle and behind the second object.
- the objects may be within a longitudinal distance threshold from the autonomous vehicle and/or within a lateral distance threshold from the autonomous vehicle.
- the autonomous vehicle may adjust its speed based on the behavior of other vehicles or objects in front of the autonomous vehicle that are in the same lane as the vehicle.
- the autonomous vehicle may adjust its speed based on the behavior of vehicles or other objects in adjacent lanes on the road of travel, such as an adjustment made when a nearby vehicle moves from its current lane to the lane in which the autonomous vehicle is travelling.
- it may not be desirable, however, for the autonomous vehicle to adjust its speed based on the behavior of objects at that same given distance laterally from the autonomous vehicle (e.g., objects in lanes that are beyond the adjacent lanes).
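The gating just described can be summarized as a simple filter over detected objects; in the sketch below, the threshold values and object fields are assumptions for illustration:

```python
def relevant_objects(objects, max_long_m=120.0, max_lat_m=4.0):
    """Keep objects ahead of the autonomous vehicle within a longitudinal
    threshold and within roughly one adjacent lane laterally."""
    return [
        o for o in objects
        if 0.0 < o["longitudinal_m"] <= max_long_m
        and abs(o["lateral_m"]) <= max_lat_m
    ]

objs = [
    {"longitudinal_m": 30.0, "lateral_m": 0.2},   # lead car in-lane: kept
    {"longitudinal_m": 25.0, "lateral_m": 7.5},   # two lanes over: dropped
]
print(relevant_objects(objs))
```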
- the second object ahead of the first object, as well as other objects between the autonomous vehicle and the second object may include another vehicle (e.g., cars, bicycles, etc.).
- the second object may include a traffic control object, such as a stop sign, traffic light, traffic cone, road surface marking, road boundary barrier, and the like.
- the second object may be a pedestrian, such as a pedestrian crossing the street at an upcoming intersection. Other examples are also possible.
- the computing device may be configured to determine respective characteristics of each object. For example, the computing device may be configured to determine a type of an object or classify the object (e.g., car or truck, car or motorcycle, traffic sign or a pedestrian, etc.). Further, the computing device can determine whether the object is moving or stationary. In some examples, at least one object identified between the autonomous vehicle and the second object may be a dynamic (e.g., moving) object.
- the computing device may be configured to estimate a size (e.g., width and length) and weight of the object. Further, the computing device may be configured to determine a direction of motion of the object, such as whether the object is moving towards the autonomous vehicle or away from the vehicle. Still further, the computing device may be configured to determine a transmission type (e.g., manual or automatic) and transmission mode of the object, such as whether the object is in park, drive, reverse, or neutral transmission mode. Yet still further, the computing device may be configured to determine a position of the object in a respective lane on the road or path of travel, and how close the object may be to lane boundaries. In some examples, the computing device may be configured to determine the relative longitudinal speed and lateral speed of the object with respect to the autonomous vehicle. These characteristics are examples for illustration, and other characteristics can be determined as well.
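- Purely as an illustrative sketch of the characteristics listed above, a record of the per-object estimates might be organized as follows; all field names and types are hypothetical, not part of this disclosure.

```python
from dataclasses import dataclass

@dataclass
class ObjectCharacteristics:
    """Hypothetical container for the per-object estimates described above."""
    object_type: str                 # e.g., "car", "truck", "motorcycle", "traffic sign"
    is_moving: bool                  # moving vs. stationary
    size_m: tuple[float, float]      # estimated (width, length)
    weight_kg: float                 # estimated weight
    moving_towards_vehicle: bool     # direction of motion relative to the autonomous vehicle
    transmission_mode: str           # "park", "drive", "reverse", or "neutral"
    lane_offset_m: float             # position relative to lane boundaries
    rel_long_speed_mps: float        # longitudinal speed relative to the autonomous vehicle
    rel_lat_speed_mps: float         # lateral speed relative to the autonomous vehicle
```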
- Characteristics of traffic control objects may also be determined, such as a color of a traffic light.
- In an example where a red traffic light is identified as an object substantially in front of the vehicle (e.g., the second object), the computing device may be configured to ignore objects beyond the traffic light (e.g., further away from the autonomous vehicle than the traffic light). The same configuration may apply in a scenario in which a stop sign is identified.
- If a green traffic light is identified instead, the computing device may be configured to ignore the green traffic light as an object and, as such, the speed of the autonomous vehicle may not be adjusted based on the presence of the green traffic light.
- If a yellow traffic light is identified by the computing device, the computing device may be configured to perform functions as if the traffic light were red or green, or may perform other functions.
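- The traffic-light handling just described can be summarized in a short sketch. The dict-based object schema and the default of treating yellow as red are assumptions for illustration only.

```python
def objects_to_consider(objects, light_state, light_range_m, treat_yellow_as="red"):
    """Prune detected objects per the traffic-light logic described above.
    `objects` is a list of dicts with "range_m" and "kind" keys (hypothetical
    schema): red -> ignore objects beyond the light; green -> ignore the light
    itself; yellow -> handle as red or green."""
    if light_state == "yellow":
        light_state = treat_yellow_as
    if light_state == "red":
        # The red light acts like a zero-speed object; drop anything beyond it.
        return [o for o in objects if o["range_m"] <= light_range_m]
    # Green: the light does not constrain speed; drop the light itself.
    return [o for o in objects if o["kind"] != "traffic_light"]
```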
- the computing device may also be configured to detect when a traffic light will change colors.
- To identify objects and their characteristics, the computing device may be configured to use the sensors and devices coupled to the autonomous vehicle.
- a camera, such as the camera 134 in FIG. 1 or the camera 210 in FIG. 2, or any other image-capture device, may be coupled to the autonomous vehicle and may be in communication with the computing device.
- the camera may be configured to capture images or a video of the path/road of travel and vicinity of the path/road of travel.
- the computing device may be configured to receive the images or video and identify, using image processing techniques for example, objects depicted in the image or the video.
- the computing device may be configured to compare portions of the images to templates of objects to identify the objects, for example.
- the computing device may be configured to receive, from a LIDAR device (e.g., the LIDAR unit 132 in FIG. 1 ) coupled to the autonomous vehicle and in communication with the computing device, LIDAR-based information that may include a three-dimensional (3D) point cloud.
- the 3D point cloud may include points corresponding to light emitted from the LIDAR device and reflected from objects on the road or in the vicinity of the road.
- operation of the LIDAR device may involve an optical remote sensing technology that enables measuring properties of scattered light to find range and/or other information of a distant target.
- the LIDAR device may be configured to emit laser pulses as a beam, and scan the beam to generate two dimensional or three dimensional range matrices.
- the range matrices may be used to determine distance to an object or surface by measuring time delay between transmission of a pulse and detection of a respective reflected signal.
- the LIDAR device may be configured to scan an environment surrounding the autonomous vehicle in three dimensions. In some examples, more than one LIDAR device may be coupled to the vehicle to scan a complete 360° horizon of the vehicle.
- the LIDAR device may be configured to provide to the computing device a cloud of point data representing obstacles or objects, which have been hit by the laser, on the road and the vicinity of the road.
- the points may be represented by the LIDAR device in terms of azimuth and elevation angles, in addition to range, which can be converted to (X, Y, Z) point data relative to a local coordinate frame attached to the autonomous vehicle.
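- The two conversions described above (time delay to range, and range plus azimuth and elevation to Cartesian coordinates) are standard; a minimal sketch follows. The axis convention (x forward, y left, z up) is an assumption, as the disclosure does not specify one.

```python
import math

SPEED_OF_LIGHT_MPS = 299_792_458.0

def range_from_time_delay(delay_s):
    """One-way range from round-trip time of flight: d = c * t / 2."""
    return SPEED_OF_LIGHT_MPS * delay_s / 2.0

def polar_to_cartesian(range_m, azimuth_rad, elevation_rad):
    """Convert a LIDAR return (range, azimuth, elevation) to (X, Y, Z) in a
    local coordinate frame attached to the vehicle (assumed here:
    x forward, y left, z up)."""
    horizontal = range_m * math.cos(elevation_rad)
    return (horizontal * math.cos(azimuth_rad),
            horizontal * math.sin(azimuth_rad),
            range_m * math.sin(elevation_rad))
```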
- the LIDAR device may be configured to provide to the computing device intensity values of the light or laser reflected off the obstacles that may be indicative of a surface type of a given object. Based on such information, the computing device may be configured to identify the objects and characteristics of the objects such as type of the object, size, speed, whether the object is a traffic sign with a retroreflective surface, etc.
- the computing device may be configured to receive, from a RADAR device (e.g., the RADAR unit 130 in FIG. 1 ) coupled to the autonomous vehicle and in communication with the computing device, RADAR-based information relating to location and characteristics of the objects.
- the RADAR device may be configured to emit radio waves and receive back the emitted radio waves that bounced off the surface of objects on the road and in the vicinity of the road.
- the received signals or RADAR-based information may be indicative, for example, of dimensional characteristics of a given object, and may indicate whether the given object is stationary or moving.
- the computing device may be configured to have access to map information that identifies static objects that are permanently placed on the road such as traffic lights, traffic signs, guard rails, etc.
- the map information may also be updated periodically, and may include information about accidents that have recently occurred and the resulting wreckage or traffic that may be in the vicinity of the autonomous vehicle.
- the computing device may be configured to detect and identify the objects and characteristics of the objects based on information received from multiple sources such as the image-capture device, the LIDAR device, the RADAR device, etc.
- the computing device may be configured to identify the objects based on information received from a subset of the multiple sources. For example, images captured by the image-capture device may be blurred due to a malfunction of the image-capture device, and in another example, details of the road may be obscured in the images because of fog.
- the computing device may be configured to identify the objects based on information received from the LIDAR and/or RADAR units and may be configured to disregard the information received from the image-capture device.
- the autonomous vehicle may be travelling in a portion of the road where some electric noise or jamming signals may cause the LIDAR device and/or RADAR device to operate incorrectly.
- the computing device may be configured to identify the objects based on information received from the image-capture device, and may be configured to disregard the information received from the LIDAR and/or RADAR units.
- the computing device may be configured to rank these sources of information based on a condition of the road (e.g., fog, electronic jamming, etc.).
- the ranking may be indicative of which device(s) to rely on or give more weight to in identifying the objects.
- In the fog example, the LIDAR and RADAR devices may be ranked higher than the image-capture device, and information received from the LIDAR and/or RADAR devices may be given more weight than respective information received from the image-capture device.
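- One simple way to realize such a ranking is a per-condition weight table; the disclosure describes the ranking only qualitatively, so the weights below are invented for illustration.

```python
# Hypothetical per-condition sensor weights; illustrative numbers only.
SENSOR_WEIGHTS = {
    "clear":   {"camera": 1.0, "lidar": 1.0, "radar": 1.0},
    "fog":     {"camera": 0.2, "lidar": 1.0, "radar": 1.0},
    "jamming": {"camera": 1.0, "lidar": 0.1, "radar": 0.1},
}

def rank_sources(road_condition):
    """Return sensor names ordered from most to least trusted for a condition."""
    weights = SENSOR_WEIGHTS.get(road_condition, SENSOR_WEIGHTS["clear"])
    return sorted(weights, key=weights.get, reverse=True)
```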
- the computing device may also be configured to receive, from sensors and devices coupled to the autonomous vehicle, information associated with, for example, condition of systems and subsystems of the autonomous vehicle. Further, the computing device may be configured to receive information associated with the surrounding environment of the autonomous vehicle, such as driving conditions and road conditions (e.g., rain, snow, etc.). For example, information indicating that the road is icy or wet ahead of the vehicle may cause the computing device to modify its adjustment of the autonomous vehicle's speed. Other examples are also possible.
- At block 306, the method 300 includes determining a first buffer distance, the first buffer distance being a minimal distance behind the first object at which the autonomous vehicle will substantially reach a speed of the first object. Further, at block 308, the method includes determining a second buffer distance, the second buffer distance being a minimal distance behind the second object at which the first object will substantially reach a speed of the second object. Still further, at block 310, the method includes determining a distance at which to adjust a speed of the autonomous vehicle based on the first and second buffer distances and the speed of the autonomous vehicle. It should be understood that other buffer distances may be determined as well when other objects are identified within an environment of the autonomous vehicle in addition to the first and second objects.
- FIG. 4A illustrates an example scenario for determining the buffer distances and the distance at which to adjust the speed of the vehicle.
- As shown in FIG. 4A, an autonomous vehicle 400 (e.g., a computing device of the autonomous vehicle) travelling in the +y-direction may identify a plurality of objects 402-408 substantially in front of the autonomous vehicle 400 and also travelling in the +y-direction.
- the plurality of objects 402 - 408 may include an object travelling ahead of the autonomous vehicle 400 , such as vehicle 402 , and multiple objects between the autonomous vehicle 400 and vehicle 402 , such as vehicle 404 , vehicle 406 , and vehicle 408 .
- each vehicle may have respective characteristics that can be used to determine the distance, such as a range from the vehicle to the autonomous vehicle 400, r_i, a speed/velocity of the vehicle, v_i, and an acceleration (or deceleration) of the vehicle, a_i.
- the autonomous vehicle 400 may also have characteristics that can be used to determine the distance, such as a speed/velocity of the autonomous vehicle 400, v_0, and an acceleration/deceleration of the autonomous vehicle 400, a_0. Further, the distance may be based on a longitudinal speed of the autonomous vehicle, v_LONG, and a lateral speed of the autonomous vehicle, v_LAT. In some examples, in addition to the distance being based on the speed of the autonomous vehicle (e.g., the current speed of the autonomous vehicle), the distance may be further based on other characteristics of the vehicle, including the autonomous vehicle's direction of motion, size, position on a path of travel, and type, among other characteristics.
- the autonomous vehicle 400 may also determine a buffer distance for each of the vehicles 402 - 408 and may use each buffer distance to predict the behavior of the vehicles 402 - 408 and determine a distance behind the vehicles that the autonomous vehicle 400 should adjust its speed.
- Each buffer distance may represent a minimal distance behind a given vehicle at which another vehicle directly following the given vehicle will match (or substantially reach) the speed of the given vehicle. By determining the buffer distances, the autonomous vehicle 400 can determine when (or at what distance from the nearest or furthest object) it will need to adjust its speed.
- the autonomous vehicle 400 may determine that vehicle 404 will match the speed of vehicle 402 at buffer distance b_4, that vehicle 406 will match the speed of vehicle 404 at buffer distance b_3, and that vehicle 408 will match the speed of vehicle 406 at buffer distance b_2.
- buffer distance b_1 is approximately zero since vehicle 408 is closest in proximity to the autonomous vehicle 400 and thus the autonomous vehicle 400 can be assumed to be matching the speed of vehicle 408.
- the autonomous vehicle 400 may determine the buffer distances and thus determine that, given the vehicles between the autonomous vehicle 400 and vehicle 402, the autonomous vehicle 400 will need to begin slowing down earlier (i.e., at a greater distance from vehicle 402). The autonomous vehicle 400 may then begin to slow down before vehicles 404-408 react to the speed change of vehicle 402 and begin to slow down as well.
- the buffer distance may be a function of the range from the autonomous vehicle 400 to the given vehicle, and may be shorter when the given vehicle is further away from the autonomous vehicle 400.
- the buffer distance of vehicle 402, b_4, may be shorter than the other buffer distances because vehicle 402 is the furthest away from the autonomous vehicle 400 and thus its future behavior does not affect the autonomous vehicle 400 as much as the behaviors of the vehicles closer to the autonomous vehicle 400.
- the buffer distance may be determined by multiplying a given time constant by the speed of the autonomous vehicle 400. Other examples of determining buffer distances are also possible.
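- A minimal sketch of these two ideas follows: each buffer is a time constant multiplied by speed, scaled down with range, and the adjustment decision compares the range to the farthest object against the stacked buffers. The blend of the two buffer rules and all constants are assumptions; the disclosure does not give a specific formula.

```python
def buffer_distance(ego_speed_mps, range_m, time_constant_s=1.5, falloff_m=150.0):
    """One plausible buffer model: a time-gap term (time constant * speed)
    that shrinks for objects farther away, per the description above."""
    scale = max(0.0, 1.0 - range_m / falloff_m)
    return time_constant_s * ego_speed_mps * scale

def should_begin_adjusting(range_to_farthest_m, buffers_m, ego_speed_mps,
                           reaction_time_s=1.0):
    """Begin adjusting speed once the range to the farthest object, less the
    stacked buffer distances, is within the distance covered in one
    (assumed) reaction time."""
    free_m = range_to_farthest_m - sum(buffers_m)
    return free_m <= ego_speed_mps * reaction_time_s
```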
- FIG. 4B illustrates another example scenario for determining the distance at which to adjust the speed of the vehicle.
- an autonomous vehicle 450 similar to the autonomous vehicle 400 of FIG. 4A that is travelling in the +y-direction may identify a plurality of objects 452 - 458 substantially in front of the autonomous vehicle 450 .
- the autonomous vehicle 450 may identify a red traffic light 452 (e.g., the second object) at a range of r_S from the vehicle 450.
- the autonomous vehicle 450 may identify vehicle 454 , vehicle 456 , and vehicle 458 , each travelling in the +y-direction, and determine characteristics of each vehicle, such as respective speed (longitudinal and lateral), acceleration, range from the autonomous vehicle 450 , and others.
- the autonomous vehicle 450 may be configured to determine a distance at which to adjust its speed. Further, the autonomous vehicle 450 may determine the distance before the vehicles 454 - 458 begin to slow down and stop due to the upcoming red traffic light ahead. In other words, the autonomous vehicle 450 may predict that the vehicles 454 - 458 will need to begin slowing down and stop before the red traffic light 452 and thus adjust its speed before the vehicles 454 - 458 come to a stop or begin to decelerate. In some scenarios, the red traffic light 452 may instead be a stopped vehicle or other object that is not moving, and would be identified and treated the same way by the autonomous vehicle 450 in such scenarios (e.g., as an object travelling at zero speed).
- Without other vehicles present, the autonomous vehicle 450 may determine that it needs to come to a stop over the distance r_S. However, in examples such as the one illustrated in FIG. 4B, the autonomous vehicle 450 may determine that it has a shorter distance in which to come to a stop since vehicles 454-458 are in front of it. As such, the autonomous vehicle 450 may determine the distance at which to adjust its speed based on the range, r_S, and a buffer distance, b_S.
- the buffer distance may be based on a length of a given object or objects and a predetermined minimal gap between each object at zero speed.
- each vehicle may have a length, y.
- the length of the objects may be different in other examples.
- the predetermined (e.g., estimated) minimal gap between each vehicle when stopped may be a gap, x.
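- Under this description, the buffer b_S behind a stopped queue can be written as n·(y + x) for n queued vehicles of length y with minimal gap x, leaving the autonomous vehicle the distance r_S − b_S in which to stop. A sketch, with assumed placeholder values for y and x:

```python
def available_stopping_distance(range_to_light_m, n_vehicles_ahead,
                                vehicle_length_y_m=4.5, min_gap_x_m=2.0):
    """Distance actually available for stopping behind a red light once the
    queue of stopped vehicles is accounted for: r_S - b_S, with
    b_S = n * (y + x). The 4.5 m / 2.0 m defaults are illustrative."""
    b_s = n_vehicles_ahead * (vehicle_length_y_m + min_gap_x_m)
    return range_to_light_m - b_s
```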
- the autonomous vehicle may determine the distance at which to adjust its speed based on the current state of the traffic light.
- the autonomous vehicle may be configured to predict a change in a state of the traffic light. For instance, the autonomous vehicle may determine that, while the traffic light is currently red, the traffic light may change to green after a given period of time. Depending on the given period of time, the autonomous vehicle may consider the traffic light as a green traffic light and predict that the vehicles between the autonomous vehicle and the traffic light will begin to increase their speed. The autonomous vehicle may then speed up, slow down, or maintain speed, depending on its current speed.
- the autonomous vehicle may be approaching a red traffic light, but may be a far enough distance away from the traffic light and stopped vehicles in front of the traffic light that it has not yet begun to slow down.
- the autonomous vehicle may determine that the traffic light will change from red to green after a short period of time and thus determine that the vehicles will begin to accelerate soon. Since the autonomous vehicle may still be a far distance from the traffic light and the vehicles, the autonomous vehicle may maintain its current speed (or increase its speed) if it determines that the vehicles will have sped up enough by the time the autonomous vehicle gets closer to them so that the autonomous vehicle's current speed (or increased speed) may substantially match the speed of the previously-stopped vehicle closest to the autonomous vehicle that the autonomous vehicle is approaching (e.g., vehicle 458 in FIG. 4B ). Other examples are also possible.
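- The decision in this example reduces to comparing the predicted time until the light turns green against the time the autonomous vehicle needs to reach the queue. A hedged sketch follows; the return labels and the simple time comparison are invented for illustration.

```python
def plan_for_predicted_green(range_to_queue_m, ego_speed_mps,
                             predicted_time_to_green_s):
    """If the queue should be moving again before the ego vehicle arrives,
    keep (or raise) speed; otherwise begin slowing down."""
    if ego_speed_mps <= 0.0:
        return "maintain"
    time_to_arrive_s = range_to_queue_m / ego_speed_mps
    if predicted_time_to_green_s < time_to_arrive_s:
        return "maintain_or_increase"
    return "decrease"
```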
- the reasoning applied in the example described with respect to FIG. 4B can also apply to examples where the plurality of identified objects are moving (e.g., no traffic control devices), such as the example described with respect to FIG. 4A .
- the distances/ranges between the autonomous vehicle 450 , the red traffic light 452 , and the vehicles 454 - 458 as illustrated in FIG. 4B may not be to scale. Specifically, it is likely that the distance between vehicle 454 and the red traffic light 452 would be much greater than the minimal gap distance between vehicles, x.
- At block 312, the method 300 includes providing instructions to adjust the speed of the autonomous vehicle based on the distance.
- the computing device may adjust the speed of the autonomous vehicle prior to determining a change of the speed of one or more of the dynamic objects.
- the control system of the autonomous vehicle may comprise multiple control strategies that may be predetermined or adaptive to changes in a driving environment of the autonomous vehicle, the driving environment including the predicted actions of objects substantially in front of the autonomous vehicle, behind the autonomous vehicle, and/or to the side of the autonomous vehicle.
- a control strategy may comprise sets of instructions or rules associated with traffic interaction in various driving contexts.
- the control strategy may comprise rules that determine a speed of the autonomous vehicle, steering angle, and a lane that the autonomous vehicle may travel on while taking into account safety and traffic rules and concerns (e.g., other vehicles stopped at an intersection and windows-of-opportunity in yield situation, lane tracking, speed control, distance from other vehicles on the road, passing other vehicles, and queuing in stop-and-go traffic, and avoiding areas that may result in unsafe behavior such as oncoming-traffic lanes, etc.).
- the computing device may be configured to determine, based on the distance determined at block 310 , a control strategy comprising rules for actions that control speed, steering angle, and lane of the autonomous vehicle.
- the control strategy may also be further based on a lateral distance between the autonomous vehicle and nearby objects (e.g., road boundaries and vehicles travelling in adjacent lanes). Further, a given control strategy (or multiple strategies) may comprise a program or computer instructions that characterize actuators controlling the autonomous vehicle (e.g., throttle, steering gear, brake, accelerator, or transmission shifter).
- the instructions provided by the computing device to adjust the control of the autonomous vehicle may be based on road geometry, such as if the road is straight, curving slightly, curving sharply, etc.
- FIG. 5 illustrates an implementation of the example method on a road of travel.
- As illustrated in FIG. 5, the autonomous vehicle 500 may be in a lane 501 on a road such as a highway.
- the computing device configured to control the autonomous vehicle 500 may be configured to identify a plurality of objects substantially in front of the autonomous vehicle 500 on the road of travel.
- the plurality of objects may include an object 502 , such as a moving object (e.g., cars, trucks, etc.), that is in the same lane 501 as the autonomous vehicle 500 .
- the plurality of objects may also include moving objects 504 , 506 , 508 , and 510 that are in an adjacent lane 511 to the lane 501 of the autonomous vehicle 500 .
- the computing device may not be configured to identify object 510 until the entirety of the length of the object 510 is in front of the autonomous vehicle 500 .
- the computing device may be configured to identify other objects within the environment of the autonomous vehicle 500 , such as object 512 located behind the autonomous vehicle 500 in an adjacent lane. In other examples, the computing device may be configured to ignore objects, such as object 514 , that may be beyond a threshold distance from the autonomous vehicle 500 . The computing device may also identify static objects such as a guard rail 516 . The computing device further may be configured to determine characteristics of the objects 502 - 516 , such as size, location, speed, etc.
- the autonomous vehicle 500 may be configured to only identify objects substantially in front of it, and may thus ignore objects 512 and 514 .
- the computing device may be configured to identify objects 512 and 514 , but may ignore them until they are within a threshold distance from the autonomous vehicle 500 .
- the computing device may monitor characteristics of objects 512 and 514 so as to predict their future behavior while not yet taking their characteristics into account in determining the distance at which to adjust the speed of the autonomous vehicle 500 until the objects are within the threshold distance from the autonomous vehicle 500 .
- the autonomous vehicle 500 may predict that object 512 will accelerate, exceed the speed of the autonomous vehicle 500 , and pass the autonomous vehicle 500 .
- the autonomous vehicle 500 may predict other actions of the object 512 as well (e.g., object 512 may pass the autonomous vehicle and move into the same lane as the autonomous vehicle).
- the computing device may determine the distance at which to adjust the speed of the autonomous vehicle 500 based on identified objects 502 - 516 , their characteristics, and respective buffer distances. In some examples, however, the computing device may determine that one or more objects substantially in front of the autonomous vehicle 500 , such as object 508 , may change lanes or are in the process of changing lanes. As such, the computing device may modify the distance to account for this (e.g., adjust the buffer distances). For instance, if object 508 changes lanes from lane 511 to lane 501 , object 508 may be closer in proximity to the autonomous vehicle 500 and thus the autonomous vehicle 500 may need to adjust its speed in order to match the speed of object 508 .
- the autonomous vehicle 500 may have been travelling at a higher speed since no objects were identified to be in the same lane 501 as the autonomous vehicle 500 and object 502 , and after detecting that object 508 has changed lanes, the autonomous vehicle 500 may reduce its speed. Further, the computing device may predict that object 510 will speed up to match the speed of object 506 once object 508 has fully or partially entered lane 501 . The computing device may be configured to make other determinations/predictions as well, and modify the distance accordingly.
- the computing device may be configured to prioritize amongst the identified objects 502 - 516 in order to determine the distance. For instance, if object 508 is in the same lane 501 as the autonomous vehicle 500 and object 502 , the behavior of object 508 may be taken more into account than the behaviors of objects 504 , 506 , and 510 , which are in the adjacent lane 511 . Such prioritization may take the form of modified buffer distances, for example (e.g., the buffer distance of object 510 may be shorter than the buffer distance of object 508 , despite object 510 being closer in proximity to the autonomous vehicle 500 ). Thus, the computing device may be configured to add or subtract a buffer amount of distance to the determined distance to account for or compensate for such lane changes, as well as for any other changes in the environment of the autonomous vehicle 500 . The prioritization may be implemented in other ways as well.
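- Such prioritization could be realized, for example, by scaling each object's buffer distance by a lane-relevance factor; the factors below are assumptions chosen for illustration, not values from this disclosure.

```python
def prioritized_buffer(base_buffer_m, same_lane, lanes_away):
    """Scale an object's buffer distance by lane relevance: same-lane objects
    count fully, adjacent-lane objects less, farther lanes less still."""
    if same_lane:
        return base_buffer_m
    return base_buffer_m * (0.5 if lanes_away == 1 else 0.2)
```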
- the method described above may only be implemented by the computing device when there is at least one moving object between the autonomous vehicle and the second object, in addition to the identified first object. In examples where there are no moving objects between the autonomous vehicle and the second object, the method described above may not be implemented, or may be implemented in accordance with another method or methods not described herein.
- FIG. 6 is a schematic illustrating a conceptual partial view of an example computer program product 600 that includes a computer program for executing a computer process on a computing device, arranged according to at least some embodiments presented herein.
- the example computer program product 600 is provided using a signal bearing medium 601 .
- the signal bearing medium 601 may include one or more program instructions 602 that, when executed by one or more processors (e.g., the processor 113 in the computing device 111), may provide the functionality or portions of the functionality described above with respect to FIGS. 1-5.
- one or more features of blocks 302 - 306 may be undertaken by one or more instructions associated with the signal bearing medium 601 .
- the program instructions 602 in FIG. 6 describe example instructions as well.
- the signal bearing medium 601 may encompass a computer-readable medium 603 , such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, memory, etc.
- the signal bearing medium 601 may encompass a computer recordable medium 604 , such as, but not limited to, memory, read/write (R/W) CDs, R/W DVDs, etc.
- the signal bearing medium 601 may encompass a communications medium 605 , such as, but not limited to, a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
- the signal bearing medium 601 may be conveyed by a wireless form of the communications medium 605 (e.g., a wireless communications medium conforming to the IEEE 802.11 standard or other transmission protocol).
- the one or more programming instructions 602 may be, for example, computer executable and/or logic implemented instructions.
- a computing device such as the computing device described with respect to FIGS. 1-5 may be configured to provide various operations, functions, or actions in response to the programming instructions 602 conveyed to the computing device by one or more of the computer readable medium 603 , the computer recordable medium 604 , and/or the communications medium 605 .
- arrangements described herein are for purposes of example only. As such, those skilled in the art will appreciate that other arrangements and other elements (e.g. machines, interfaces, functions, orders, and groupings of functions, etc.) can be used instead, and some elements may be omitted altogether according to the desired results. Further, many of the elements that are described are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, in any suitable combination and location.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Mathematical Physics (AREA)
- Human Computer Interaction (AREA)
- Traffic Control Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
- Business, Economics & Management (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Regulating Braking Force (AREA)
Abstract
Description
- The present application is a continuation of U.S. patent application Ser. No. 13/886,563, filed on May 3, 2013, and entitled “Predictive Reasoning for Controlling Speed of a Vehicle,” which is herein incorporated by reference as if fully set forth in this description.
- Autonomous vehicles use various computing systems to aid in transporting passengers from one location to another. Some autonomous vehicles may require some initial input or continuous input from an operator, such as a pilot, driver, or passenger. Other systems, for example autopilot systems, may be used only when the system has been engaged, which permits the operator to switch from a manual mode (where the operator exercises a high degree of control over the movement of the vehicle) to an autonomous mode (where the vehicle essentially drives itself) to modes that lie somewhere in between.
- The present application discloses embodiments that relate to predictive reasoning for controlling speed of a vehicle. In one aspect, the present application describes a method. The method may comprise identifying a first vehicle travelling ahead of an autonomous vehicle. The method may also comprise identifying a second vehicle ahead of the first vehicle, the first and second vehicles travelling in substantially a same lane as the autonomous vehicle. The method may also comprise determining a first buffer distance, the first buffer distance being a minimal distance behind the first vehicle at which the autonomous vehicle will substantially reach a speed of the first vehicle. The method may also comprise determining a second buffer distance, the second buffer distance being a minimal distance behind the second vehicle at which the first vehicle will substantially reach a speed of the second vehicle. The method may also comprise determining a distance at which to adjust a speed of the autonomous vehicle based on the first and second buffer distances and the speed of the autonomous vehicle. The method may further comprise providing instructions by a computing device to adjust the speed of the autonomous vehicle based on the distance.
- In another aspect, the present application describes a non-transitory computer readable medium having stored thereon executable instructions that, upon execution by a computing device, cause the computing device to perform functions. The functions may comprise identifying a first vehicle travelling ahead of an autonomous vehicle. The functions may also comprise identifying a second vehicle ahead of the first vehicle, the first and second vehicles travelling in substantially a same lane as the autonomous vehicle. The functions may also comprise determining a first buffer distance, the first buffer distance being a minimal distance behind the first vehicle at which the autonomous vehicle will substantially reach a speed of the first vehicle. The functions may also comprise determining a second buffer distance, the second buffer distance being a minimal distance behind the second vehicle at which the first vehicle will substantially reach a speed of the second vehicle. The functions may also comprise determining a distance at which to adjust a speed of the autonomous vehicle based on the first and second buffer distances and the speed of the autonomous vehicle. The functions may further comprise providing instructions to adjust the speed of the autonomous vehicle based on the distance.
- In still another aspect, the present application describes a system. The system may comprise at least one processor. The system also may comprise a memory having stored thereon instructions that, upon execution by the at least one processor, cause the system to perform functions. The functions may comprise identifying a first object ahead of an autonomous vehicle. The functions may also comprise identifying a second object ahead of the first object, where the first and second objects are in substantially a same lane as the autonomous vehicle. The functions may also comprise determining a first buffer distance, the first buffer distance being a minimal distance behind the first object at which the autonomous vehicle will substantially reach a speed of the first object. The functions may also comprise determining a second buffer distance, the second buffer distance being a minimal distance behind the second object at which the first object will substantially reach a speed of the second object. The functions may also comprise determining a distance at which to adjust a speed of the autonomous vehicle based on the first and second buffer distances and the speed of the autonomous vehicle. The functions may further comprise providing instructions to adjust the speed of the autonomous vehicle based on the distance.
- The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the figures and the following detailed description.
- FIG. 1 is a simplified block diagram of an example automobile.
- FIG. 2 illustrates an example automobile.
- FIG. 3 is a flow chart of an example method for adjusting a speed of an autonomous vehicle.
- FIG. 4A illustrates an example for determining a distance at which to adjust the speed of the autonomous vehicle.
- FIG. 4B illustrates an example for determining a distance at which to adjust the speed of the autonomous vehicle when a traffic control object is present.
- FIG. 5 illustrates an implementation of the example method on a road of travel.
- FIG. 6 is a schematic illustrating a conceptual partial view of a computer program.
- The following detailed description describes various features and functions of the disclosed systems and methods with reference to the accompanying figures. In the figures, similar symbols identify similar components, unless context dictates otherwise. The illustrative system and method embodiments described herein are not meant to be limiting. It may be readily understood that certain aspects of the disclosed systems and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
- An autonomous vehicle operating on a road or path of travel may be configured to identify objects within an environment of the autonomous vehicle in order to determine an adjustment to the autonomous vehicle's current speed. The objects can be other vehicles, traffic control objects, or other types of objects. In some examples, each identified object may be considered independently, and the respective characteristics of the object, such as its current speed, acceleration, and range to the vehicle, may be used to determine a speed for the autonomous vehicle to adjust to.
- However, in other examples, the autonomous vehicle, or a computing device associated with the autonomous vehicle, may be configured to predict behaviors of the identified objects based on the characteristics of the objects and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.), and the objects may all be considered together, with each object's predicted behavior dependent on the behavior of the others. The autonomous vehicle can then adjust its speed based on the predicted behaviors of the objects. In other words, the autonomous vehicle can determine what steady state the vehicle will need to adjust to (e.g., speed up, slow down, or stop) based on the predicted behaviors of the objects. Other characteristics/factors may be considered as well in order to determine the speed of the autonomous vehicle, such as a lateral position of the autonomous vehicle in a road/lane of travel, curvature of the road, proximity of static and dynamic objects, etc.
- In one example of predictive speed control, a computing device, configured to adjust the speed of an autonomous vehicle, may identify multiple objects ahead of the vehicle. The objects may include, for example, other vehicles travelling ahead of the autonomous vehicle in the same lane as the autonomous vehicle, such as trucks, bicycles, and motorcycles. The objects may also include other types of static or dynamic objects, such as pedestrians, stop signs, a toll booth, trees, guard rails, etc. Upon identifying the objects, the computing device may be configured to estimate characteristics of each object, such as the object's speed, acceleration, size, weight, direction of travel, and longitudinal and lateral speeds.
- After identifying the objects, the computing device may determine a buffer distance for each object between the autonomous vehicle and the farthest identified object from the autonomous vehicle. For example, if the computing device identifies a first and second object ahead of the vehicle, the second object being at a greater distance from the autonomous vehicle than the first object, the computing device may determine a first buffer distance at which the autonomous vehicle will substantially reach a speed of the first object, and also determine a second buffer distance at which the first object will substantially reach a speed of the second object. The buffer distances may be based on the speeds of the identified objects. In some examples, the buffer distances may also be based on other characteristics of the identified objects.
- Based on the buffer distances and the speed of the autonomous vehicle, the computing device may then determine a distance at which to adjust the speed of the autonomous vehicle. The distance may also be a function of other characteristics of the objects and the autonomous vehicle, as well as any predetermined (e.g., calibrated) constants. The computing device may be configured to then provide instructions to adjust the speed of the autonomous vehicle based on the distance.
- In some embodiments, the instructions may be provided prior to the computing device detecting a change of the speed of at least one of the objects ahead of the autonomous vehicle. As such, the autonomous vehicle may adjust its speed based on an estimation of the change of the speed of at least one of the objects prior to such change occurring. Such a change in the speed of the object(s) may be evaluated differently in various embodiments. For example, the change in the speed may be indicated by the speed of the object(s) exceeding a given threshold. Other examples are also possible.
- In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may be configured to provide instructions to modify a steering angle of the autonomous vehicle so as to cause the autonomous vehicle to follow a given trajectory and/or maintain safe lateral and longitudinal distances with the objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on a road). The computing device may also be configured to implement heuristics to mimic human-like behavior to determine the distance and adjust the speed of the autonomous vehicle accordingly (and possibly control the autonomous vehicle in other manners, such as adjusting the autonomous vehicle's steering/trajectory).
- An example vehicle control system may be implemented in or may take the form of an automobile. Alternatively, a vehicle control system may be implemented in or take the form of other vehicles, such as cars, trucks, motorcycles, buses, boats, airplanes, helicopters, lawn mowers, recreational vehicles, amusement park vehicles, farm equipment, construction equipment, trams, golf carts, trains, and trolleys. Other vehicles are possible as well.
- Further, an example system may take the form of a non-transitory computer-readable medium, which has program instructions stored thereon that are executable by at least one processor to provide the functionality described herein. An example system may also take the form of an automobile or a subsystem of an automobile that includes such a non-transitory computer-readable medium having such program instructions stored thereon.
- Referring now to the Figures,
FIG. 1 is a simplified block diagram of an example automobile 100, in accordance with an example embodiment. Components coupled to or included in the automobile 100 may include a propulsion system 102, a sensor system 104, a control system 106, peripherals 108, a power supply 110, a computing device 111, and a user interface 112. The computing device 111 may include a processor 113 and a memory 114. The computing device 111 may be a controller, or part of the controller, of the automobile 100. The memory 114 may include instructions 115 executable by the processor 113, and may also store map data 116. Components of the automobile 100 may be configured to work in an interconnected fashion with each other and/or with other components coupled to respective systems. For example, the power supply 110 may provide power to all the components of the automobile 100. The computing device 111 may be configured to receive information from and control the propulsion system 102, the sensor system 104, the control system 106, and the peripherals 108. The computing device 111 may be configured to generate a display of images on and receive inputs from the user interface 112.
- In other examples, the automobile 100 may include more, fewer, or different systems, and each system may include more, fewer, or different components. Additionally, the systems and components shown may be combined or divided in any number of ways.
- The propulsion system 102 may be configured to provide powered motion for the automobile 100. As shown, the propulsion system 102 includes an engine/motor 118, an energy source 120, a transmission 122, and wheels/tires 124.
- The engine/motor 118 may be or include any combination of an internal combustion engine, an electric motor, a steam engine, and a Stirling engine. Other motors and engines are possible as well. In some examples, the propulsion system 102 could include multiple types of engines and/or motors. For instance, a gas-electric hybrid car could include a gasoline engine and an electric motor. Other examples are possible.
- The energy source 120 may be a source of energy that powers the engine/motor 118 in full or in part. That is, the engine/motor 118 may be configured to convert the energy source 120 into mechanical energy. Examples of energy sources 120 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source(s) 120 could additionally or alternatively include any combination of fuel tanks, batteries, capacitors, and/or flywheels. In some examples, the energy source 120 may provide energy for other systems of the automobile 100 as well.
- The transmission 122 may be configured to transmit mechanical power from the engine/motor 118 to the wheels/tires 124. To this end, the transmission 122 may include a gearbox, clutch, differential, drive shafts, and/or other elements. In examples where the transmission 122 includes drive shafts, the drive shafts could include one or more axles that are configured to be coupled to the wheels/tires 124.
- The wheels/tires 124 of the automobile 100 could be configured in various formats, including a unicycle, bicycle/motorcycle, tricycle, or car/truck four-wheel format. Other wheel/tire formats are possible as well, such as those including six or more wheels. The wheels/tires 124 of the automobile 100 may be configured to rotate differentially with respect to other wheels/tires 124. In some examples, the wheels/tires 124 may include at least one wheel that is fixedly attached to the transmission 122 and at least one tire coupled to a rim of the wheel that could make contact with the driving surface. The wheels/tires 124 may include any combination of metal and rubber, or a combination of other materials.
- The propulsion system 102 may additionally or alternatively include components other than those shown.
- The sensor system 104 may include a number of sensors configured to sense information about an environment in which the automobile 100 is located. As shown, the sensors of the sensor system include a Global Positioning System (GPS) module 126, an inertial measurement unit (IMU) 128, a radio detection and ranging (RADAR) unit 130, a laser rangefinder and/or light detection and ranging (LIDAR) unit 132, a camera 134, and actuators 136 configured to modify a position and/or orientation of the sensors. The sensor system 104 may include additional sensors as well, including, for example, sensors that monitor internal systems of the automobile 100 (e.g., an O2 monitor, a fuel gauge, an engine oil temperature, etc.). Other sensors are possible as well.
- The GPS module 126 may be any sensor configured to estimate a geographic location of the automobile 100. To this end, the GPS module 126 may include a transceiver configured to estimate a position of the automobile 100 with respect to the Earth, based on satellite-based positioning data. In an example, the computing device 111 may be configured to use the GPS module 126 in combination with the map data 116 to estimate a location of a lane boundary on a road on which the automobile 100 may be travelling. The GPS module 126 may take other forms as well.
- The IMU 128 may be any combination of sensors configured to sense position and orientation changes of the automobile 100 based on inertial acceleration. In some examples, the combination of sensors may include, for example, accelerometers and gyroscopes. Other combinations of sensors are possible as well.
- The RADAR unit 130 may be considered an object detection system that may be configured to use radio waves to determine characteristics of an object, such as the range, altitude, direction, or speed of the object. The RADAR unit 130 may be configured to transmit pulses of radio waves or microwaves that may bounce off any object in the path of the waves. The object may return a part of the energy of the waves to a receiver (e.g., a dish or antenna), which may be part of the RADAR unit 130 as well. The RADAR unit 130 also may be configured to perform digital signal processing of received signals (bouncing off the object) and may be configured to identify the object.
- Other systems similar to RADAR have been used in other parts of the electromagnetic spectrum. One example is LIDAR (light detection and ranging), which may be configured to use visible light from lasers rather than radio waves.
- The LIDAR unit 132 may include a sensor configured to sense or detect objects in an environment in which the automobile 100 is located using light. Generally, LIDAR is an optical remote sensing technology that can measure the distance to, or other properties of, a target by illuminating the target with light. As an example, the LIDAR unit 132 may include a laser source and/or laser scanner configured to emit laser pulses, and a detector configured to receive reflections of the laser pulses. For example, the LIDAR unit 132 may include a laser range finder reflected by a rotating mirror, and the laser is scanned around a scene being digitized, in one or two dimensions, gathering distance measurements at specified angle intervals. In examples, the LIDAR unit 132 may include components such as a light (e.g., laser) source, scanner and optics, photo-detector and receiver electronics, and a position and navigation system.
- In an example, the LIDAR unit 132 may be configured to use ultraviolet (UV), visible, or infrared light to image objects, and can be used with a wide range of targets, including non-metallic objects. In one example, a narrow laser beam can be used to map physical features of an object with high resolution.
- In examples, wavelengths in a range from about 10 micrometers (infrared) to about 250 nm (UV) could be used. Typically, light is reflected via backscattering. Different types of scattering are used for different LIDAR applications, such as Rayleigh scattering, Mie scattering, and Raman scattering, as well as fluorescence. Based on different kinds of backscattering, LIDAR can accordingly be called Rayleigh LIDAR, Mie LIDAR, Raman LIDAR, and Na/Fe/K Fluorescence LIDAR, as examples. Suitable combinations of wavelengths can allow for remote mapping of objects by looking for wavelength-dependent changes in the intensity of reflected signals, for example.
- Three-dimensional (3D) imaging can be achieved using both scanning and non-scanning LIDAR systems. “3D gated viewing laser radar” is an example of a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera. Imaging LIDAR can also be performed using an array of high-speed detectors and a modulation-sensitive detector array, typically built on single chips using CMOS (complementary metal-oxide-semiconductor) and hybrid CMOS/CCD (charge-coupled device) fabrication techniques. In these devices, each pixel may be processed locally by demodulation or gating at high speed such that the array can be processed to represent an image from a camera. Using this technique, many thousands of pixels may be acquired simultaneously to create a 3D point cloud representing an object or scene being detected by the LIDAR unit 132.
- A point cloud may include a set of vertices in a 3D coordinate system. These vertices may be defined by X, Y, and Z coordinates, for example, and may represent an external surface of an object. The LIDAR unit 132 may be configured to create the point cloud by measuring a large number of points on the surface of the object, and may output the point cloud as a data file. As the result of a 3D scanning process of the object by the LIDAR unit 132, the point cloud can be used to identify and visualize the object.
- In one example, the point cloud can be directly rendered to visualize the object. In another example, the point cloud may be converted to polygon or triangle mesh models through a process that may be referred to as surface reconstruction. Example techniques for converting a point cloud to a 3D surface may include Delaunay triangulation, alpha shapes, and ball pivoting. These techniques include building a network of triangles over existing vertices of the point cloud. Other example techniques may include converting the point cloud into a volumetric distance field and reconstructing an implicit surface so defined through a marching cubes algorithm.
- The camera 134 may be any camera (e.g., a still camera, a video camera, etc.) configured to capture images of the environment in which the automobile 100 is located. To this end, the camera may be configured to detect visible light, or may be configured to detect light from other portions of the spectrum, such as infrared or ultraviolet light. Other types of cameras are possible as well. The camera 134 may be a two-dimensional detector, or may have a three-dimensional spatial range. In some examples, the camera 134 may be, for example, a range detector configured to generate a two-dimensional image indicating a distance from the camera 134 to a number of points in the environment. To this end, the camera 134 may use one or more range detecting techniques. For example, the camera 134 may be configured to use a structured light technique in which the automobile 100 illuminates an object in the environment with a predetermined light pattern, such as a grid or checkerboard pattern, and uses the camera 134 to detect a reflection of the predetermined light pattern off the object. Based on distortions in the reflected light pattern, the automobile 100 may be configured to determine the distance to the points on the object. The predetermined light pattern may comprise infrared light, or light of another wavelength.
- The actuators 136 may, for example, be configured to modify a position and/or orientation of the sensors.
- The sensor system 104 may additionally or alternatively include components other than those shown.
- The control system 106 may be configured to control operation of the automobile 100 and its components. To this end, the control system 106 may include a steering unit 138, a throttle 140, a brake unit 142, a sensor fusion algorithm 144, a computer vision system 146, a navigation or pathing system 148, and an obstacle avoidance system 150.
- The steering unit 138 may be any combination of mechanisms configured to adjust the heading or direction of the automobile 100.
- The throttle 140 may be any combination of mechanisms configured to control the operating speed and acceleration of the engine/motor 118 and, in turn, the speed and acceleration of the automobile 100.
- The brake unit 142 may be any combination of mechanisms configured to decelerate the automobile 100. For example, the brake unit 142 may use friction to slow the wheels/tires 124. As another example, the brake unit 142 may be configured to be regenerative and convert the kinetic energy of the wheels/tires 124 to electric current. The brake unit 142 may take other forms as well.
- The sensor fusion algorithm 144 may include an algorithm (or a computer program product storing an algorithm) executable by the computing device 111, for example. The sensor fusion algorithm 144 may be configured to accept data from the sensor system 104 as an input. The data may include, for example, data representing information sensed at the sensors of the sensor system 104. The sensor fusion algorithm 144 may include, for example, a Kalman filter, a Bayesian network, or another algorithm. The sensor fusion algorithm 144 further may be configured to provide various assessments based on the data from the sensor system 104, including, for example, evaluations of individual objects and/or features in the environment in which the automobile 100 is located, evaluations of particular situations, and/or evaluations of possible impacts based on particular situations. Other assessments are possible as well.
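- As a minimal illustration of the kind of fusion a Kalman filter performs (the sensor fusion algorithm 144 is not specified beyond the examples above), the following scalar sketch fuses two range measurements of the same object; all variances are invented placeholders.

```python
def kalman_update(x_est, p_est, z, r_meas, q_process=0.01):
    """One scalar Kalman-filter step fusing measurement z (variance r_meas)
    into the state estimate x_est (variance p_est)."""
    p_pred = p_est + q_process          # predict: uncertainty grows over time
    k = p_pred / (p_pred + r_meas)      # Kalman gain
    x_new = x_est + k * (z - x_est)     # correct toward the measurement
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Fusing a RADAR and then a LIDAR range reading of the same object:
x, p = 50.0, 4.0                                  # prior estimate and variance
x, p = kalman_update(x, p, z=49.2, r_meas=1.0)    # RADAR reading
x, p = kalman_update(x, p, z=49.6, r_meas=0.25)   # LIDAR reading
```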
computer vision system 146 may be any system configured to process and analyze images captured by thecamera 134 in order to identify objects and/or features in the environment in which theautomobile 100 is located, including, for example, lane information, traffic signals and obstacles. To this end, thecomputer vision system 146 may use an object recognition algorithm, a Structure from Motion (SFM) algorithm, video tracking, or other computer vision techniques. In some examples, thecomputer vision system 146 may additionally be configured to map the environment, track objects, estimate speed of objects, etc. - The navigation and
pathing system 148 may be any system configured to determine a driving path for theautomobile 100. The navigation andpathing system 148 may additionally be configured to update the driving path dynamically while theautomobile 100 is in operation. In some examples, the navigation andpathing system 148 may be configured to incorporate data from thesensor fusion algorithm 144, theGPS module 126, and one or more predetermined maps so as to determine the driving path for theautomobile 100. - The
obstacle avoidance system 150 may be any system configured to identify, evaluate, and avoid or otherwise negotiate obstacles in the environment in which theautomobile 100 is located. - The
- The control system 106 may additionally or alternatively include components other than those shown.
- Peripherals 108 may be configured to allow the automobile 100 to interact with external sensors, other automobiles, and/or a user. To this end, the peripherals 108 may include, for example, a wireless communication system 152, a touchscreen 154, a microphone 156, and/or a speaker 158.
- The wireless communication system 152 may be any system configured to be wirelessly coupled to one or more other automobiles, sensors, or other entities, either directly or via a communication network. To this end, the wireless communication system 152 may include an antenna and a chipset for communicating with the other automobiles, sensors, or other entities either directly or over an air interface. The chipset or the wireless communication system 152 in general may be arranged to communicate according to one or more types of wireless communication (e.g., protocols), such as Bluetooth, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), Zigbee, dedicated short range communications (DSRC), and radio frequency identification (RFID) communications, among other possibilities. The wireless communication system 152 may take other forms as well.
- The touchscreen 154 may be used by a user to input commands to the automobile 100. To this end, the touchscreen 154 may be configured to sense at least one of a position and a movement of a user's finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities. The touchscreen 154 may be capable of sensing finger movement in a direction parallel or planar to the touchscreen surface, in a direction normal to the touchscreen surface, or both, and may also be capable of sensing a level of pressure applied to the touchscreen surface. The touchscreen 154 may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. The touchscreen 154 may take other forms as well.
- The microphone 156 may be configured to receive audio (e.g., a voice command or other audio input) from a user of the automobile 100. Similarly, the speakers 158 may be configured to output audio to the user of the automobile 100.
- The peripherals 108 may additionally or alternatively include components other than those shown.
- The power supply 110 may be configured to provide power to some or all of the components of the automobile 100. To this end, the power supply 110 may include, for example, a rechargeable lithium-ion or lead-acid battery. In some examples, one or more banks of batteries could be configured to provide electrical power. Other power supply materials and configurations are possible as well. In some examples, the power supply 110 and the energy source 120 may be implemented together, as in some all-electric cars.
- The processor 113 included in the computing device 111 may comprise one or more general-purpose processors and/or one or more special-purpose processors (e.g., an image processor, a digital signal processor, etc.). To the extent that the processor 113 includes more than one processor, such processors could work separately or in combination. The computing device 111 may be configured to control functions of the automobile 100 based on input received through the user interface 112, for example.
- The memory 114, in turn, may comprise one or more volatile and/or one or more non-volatile storage components, such as optical, magnetic, and/or organic storage, and the memory 114 may be integrated in whole or in part with the processor 113. The memory 114 may contain the instructions 115 (e.g., program logic) executable by the processor 113 to execute various automobile functions, including any of the functions or methods described herein.
- The components of the automobile 100 could be configured to work in an interconnected fashion with other components within and/or outside their respective systems. To this end, the components and systems of the automobile 100 may be communicatively linked together by a system bus, network, and/or other connection mechanism (not shown).
- Further, while each of the components and systems is shown to be integrated in the automobile 100, in some examples, one or more components or systems may be removably mounted on or otherwise connected (mechanically or electrically) to the automobile 100 using wired or wireless connections.
- The automobile 100 may include one or more elements in addition to or instead of those shown. For example, the automobile 100 may include one or more additional interfaces and/or power supplies. Other additional components are possible as well. In these examples, the memory 114 may further include instructions executable by the processor 113 to control and/or communicate with the additional components.
- FIG. 2 illustrates an example automobile 200, in accordance with an embodiment. In particular, FIG. 2 shows a Right Side View, Front View, Back View, and Top View of the automobile 200. Although the automobile 200 is illustrated in FIG. 2 as a car, other examples are possible. For instance, the automobile 200 could represent a truck, a van, a semi-trailer truck, a motorcycle, a golf cart, an off-road vehicle, or a farm vehicle, among other examples. As shown, the automobile 200 includes a first sensor unit 202, a second sensor unit 204, a third sensor unit 206, a wireless communication system 208, and a camera 210.
- Each of the first, second, and third sensor units 202-206 may include any combination of global positioning system sensors, inertial measurement units, RADAR units, LIDAR units, cameras, lane detection sensors, and acoustic sensors. Other types of sensors are possible as well.
- While the first, second, and third sensor units 202-206 are shown to be mounted in particular locations on the automobile 200, in some examples the sensor units 202-206 may be mounted elsewhere on the automobile 200, either inside or outside the automobile 200. Further, while only three sensor units are shown, in some examples more or fewer sensor units may be included in the automobile 200.
- In some examples, one or more of the first, second, and third sensor units 202-206 may include one or more movable mounts on which the sensors may be movably mounted. The movable mount may include, for example, a rotating platform. Sensors mounted on the rotating platform could be rotated so that the sensors may obtain information from each direction around the automobile 200. Alternatively or additionally, the movable mount may include a tilting platform. Sensors mounted on the tilting platform could be tilted within a particular range of angles and/or azimuths so that the sensors may obtain information from a variety of angles. The movable mount may take other forms as well.
- Further, in some examples, one or more of the first, second, and third sensor units 202-206 may include one or more actuators configured to adjust the position and/or orientation of sensors in the sensor unit by moving the sensors and/or movable mounts. Example actuators include motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and piezoelectric actuators. Other actuators are possible as well.
- The wireless communication system 208 may be any system configured to wirelessly couple to one or more other automobiles, sensors, or other entities, either directly or via a communication network, as described above with respect to the wireless communication system 152 in FIG. 1. While the wireless communication system 208 is shown to be positioned on a roof of the automobile 200, in other examples the wireless communication system 208 could be located, fully or in part, elsewhere.
- The camera 210 may be any camera (e.g., a still camera, a video camera, etc.) configured to capture images of the environment in which the automobile 200 is located. To this end, the camera 210 may take any of the forms described above with respect to the camera 134 in FIG. 1. While the camera 210 is shown to be mounted inside a front windshield of the automobile 200, in other examples the camera 210 may be mounted elsewhere on the automobile 200, either inside or outside the automobile 200.
- The automobile 200 may include one or more other components in addition to or instead of those shown.
- A control system of the automobile 200 may be configured to control the automobile 200 in accordance with a control strategy from among multiple possible control strategies. The control system may be configured to receive information from sensors coupled to the automobile 200 (on or off the automobile 200), modify the control strategy (and an associated driving behavior) based on the information, and control the automobile 200 in accordance with the modified control strategy. The control system further may be configured to monitor the information received from the sensors and continuously evaluate driving conditions, and also may be configured to modify the control strategy and driving behavior based on changes in the driving conditions.
- FIG. 3 is a flow chart of an example method 300 for adjusting a speed of a vehicle. The method 300 may include one or more operations, functions, or actions as illustrated by one or more of blocks 302-312. Although the blocks are illustrated in a sequential order, these blocks may in some instances be performed in parallel and/or in a different order than described herein. Also, the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation.
- In addition, for the method 300 and other processes and methods disclosed herein, the flow chart shows the functionality and operation of one possible implementation of the present embodiments. In this regard, each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process. The program code may be stored on any type of computer-readable medium or memory, such as a storage device including a disk or hard drive. The computer-readable medium may include a non-transitory computer-readable medium, such as computer-readable media that store data for short periods of time, like register memory, processor cache, and Random Access Memory (RAM). The computer-readable medium may also include non-transitory media or memory, such as secondary or persistent long-term storage, like read-only memory (ROM), optical or magnetic disks, or compact-disc read-only memory (CD-ROM). The computer-readable media may also be any other volatile or non-volatile storage system. The computer-readable medium may be considered a computer-readable storage medium, a tangible storage device, or other article of manufacture, for example.
- In addition, for the method 300 and other processes and methods disclosed herein, each block in FIG. 3 may represent circuitry that is wired to perform the specific logical functions in the process. For the sake of example, the method 300 shown in FIG. 3 will be described as implemented by an example computing device, such as the computing device 111 in FIG. 1. The method 300 can also be described as implemented by an autonomous vehicle, as the computing device may be onboard the vehicle or may be off-board but in wireless communication with the vehicle; the terms "computing device" and "autonomous vehicle" are therefore used interchangeably herein. However, in some examples, the computing device may be configured to control the vehicle in an autonomous or semi-autonomous operation mode. It should be understood that other entities or combinations of entities can implement one or more steps of the example method 300.
- At block 302, the method 300 includes identifying a first object ahead of an autonomous vehicle. Further, at block 304, the method includes identifying a second object ahead of the first object, where the first and second objects are substantially in the same lane as the autonomous vehicle. It should be understood, however, that additionally or alternatively to identifying objects ahead of (or substantially in front of) the autonomous vehicle and in substantially the same lane as the autonomous vehicle, the computing device can be configured to identify other objects within an environment of the autonomous vehicle, including objects to the side of the autonomous vehicle (e.g., in adjacent lanes on a road) and/or behind the autonomous vehicle, for example.
- In some examples, in addition to the first object, other objects may be identified by the computing device between the autonomous vehicle and the second object, such as multiple vehicles travelling in front of the autonomous vehicle and behind the second object. The objects may be within a longitudinal distance threshold from the autonomous vehicle and/or within a lateral distance threshold from the autonomous vehicle. For example, on a road of travel, the autonomous vehicle may adjust its speed based on the behavior of other vehicles or objects in front of the autonomous vehicle that are in the same lane as the vehicle. Further, the autonomous vehicle may adjust its speed based on the behavior of vehicles or other objects in adjacent lanes on the road of travel, such as an adjustment made when a nearby vehicle moves from its current lane to the lane in which the autonomous vehicle is travelling. In some scenarios, while it may be desirable for the autonomous vehicle to adjust its speed based on a vehicle travelling in the same lane at a given longitudinal distance in front of or behind the autonomous vehicle, it may not be desirable for the autonomous vehicle to adjust its speed based on the behavior of objects at that same distance laterally from the autonomous vehicle (e.g., objects in lanes beyond the adjacent lanes).
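- The longitudinal/lateral gating described above can be summarized in a minimal sketch. The Obstacle fields and the threshold values below are illustrative assumptions; the embodiments do not prescribe particular numbers.

```python
# Sketch of the longitudinal/lateral distance-threshold gating described
# above; the thresholds and object fields are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Obstacle:
    longitudinal_m: float  # + ahead / - behind, relative to the ego vehicle
    lateral_m: float       # + left / - right, relative to the ego vehicle

def relevant_objects(objects, long_threshold_m=120.0, lat_threshold_m=4.0):
    """Keep objects ahead within the longitudinal threshold and within
    roughly one adjacent lane laterally; drop everything else."""
    return [o for o in objects
            if 0.0 <= o.longitudinal_m <= long_threshold_m
            and abs(o.lateral_m) <= lat_threshold_m]

scene = [Obstacle(40.0, 0.2),   # same lane, ahead -> kept
         Obstacle(40.0, 3.5),   # adjacent lane -> kept
         Obstacle(40.0, 7.5),   # two lanes over -> ignored
         Obstacle(-15.0, 0.0)]  # behind -> ignored in this sketch
print(len(relevant_objects(scene)))  # -> 2
```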
- In some examples, the second object ahead of the first object, as well as other objects between the autonomous vehicle and the second object, may include other vehicles (e.g., cars, bicycles, etc.). In other examples, the second object may include a traffic control object, such as a stop sign, traffic light, traffic cone, road surface marking, road boundary barrier, and the like. In still other examples, the second object may be a pedestrian, such as a pedestrian crossing the street at an upcoming intersection. Other examples are also possible.
- In addition to identifying the objects, the computing device may be configured to determine respective characteristics of each object. For example, the computing device may be configured to determine a type of an object or classify the object (e.g., car or truck, car or motorcycle, traffic sign or pedestrian, etc.). Further, the computing device can determine whether the object is moving or stationary. In some examples, at least one object identified between the autonomous vehicle and the second object may be a dynamic (e.g., moving) object.
- The computing device may be configured to estimate a size (e.g., width and length) and weight of the object. Further, the computing device may be configured to determine a direction of motion of the object, such as whether the object is moving towards or away from the autonomous vehicle. Still further, the computing device may be configured to determine a transmission type (e.g., manual or automatic) and transmission mode of the object, such as whether the object is in park, drive, reverse, or neutral transmission mode. Yet still further, the computing device may be configured to determine a position of the object in a respective lane on the road or path of travel, and how close the object may be to lane boundaries. In some examples, the computing device may be configured to determine the relative longitudinal speed and lateral speed of the object with respect to the autonomous vehicle. These characteristics are examples for illustration, and other characteristics can be determined as well.
- Characteristics of traffic control objects may also be determined, such as the color of a traffic light. In some examples, when a red traffic light is identified as an object substantially in front of the vehicle (e.g., the second object), the computing device may be configured to ignore objects beyond the traffic light (e.g., further away from the autonomous vehicle than the traffic light). The same configuration may apply in a scenario in which a stop sign is identified. In other examples, when a green traffic light is identified by the computing device, the computing device may be configured to ignore the green traffic light as an object and, as such, the speed of the autonomous vehicle may not be adjusted based on the presence of the green traffic light. In still other examples, when a yellow traffic light is identified by the computing device, the computing device may be configured to perform functions as if the traffic light were red or green, or may perform other functions. The computing device may also be configured to detect when a traffic light will change colors.
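- The traffic-light reasoning above can be condensed into a small decision function. The sketch below is illustrative only: the string states and the yellow-light policy flag are assumptions, and an implementation could equally incorporate the predicted state changes just described.

```python
# Sketch of the red/green/yellow handling described above; state names
# and the yellow-light policy flag are illustrative assumptions.

def filter_by_traffic_light(light_state: str, objects_before, objects_beyond,
                            treat_yellow_as_red: bool = True):
    """Return the set of objects the speed logic should consider."""
    if light_state == "red":
        return objects_before            # ignore everything beyond the light
    if light_state == "green":
        return objects_before + objects_beyond  # the light itself is ignored
    if light_state == "yellow":          # policy choice: act as red or green
        return objects_before if treat_yellow_as_red \
            else objects_before + objects_beyond
    raise ValueError(f"unknown light state: {light_state}")

print(filter_by_traffic_light("red", ["car_454"], ["car_beyond"]))  # ['car_454']
```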
- To identify the objects and their characteristics, the computing device may be configured to use the sensors and devices coupled to the autonomous vehicle. For example, a camera, such as the camera 134 in FIG. 1 or the camera 210 in FIG. 2 or any other image-capture device, may be coupled to the autonomous vehicle and may be in communication with the computing device. The camera may be configured to capture images or a video of the path or road of travel and its vicinity. The computing device may be configured to receive the images or video and to identify, using image processing techniques for example, objects depicted in the images or the video. The computing device may be configured to compare portions of the images to templates of objects to identify the objects, for example.
- In another example, the computing device may be configured to receive, from a LIDAR device (e.g., the LIDAR unit 132 in FIG. 1) coupled to the autonomous vehicle and in communication with the computing device, LIDAR-based information that may include a three-dimensional (3D) point cloud. The 3D point cloud may include points corresponding to light emitted from the LIDAR device and reflected from objects on the road or in the vicinity of the road.
- As described with respect to the LIDAR unit 132 in FIG. 1, operation of the LIDAR device may involve an optical remote-sensing technology that enables measuring properties of scattered light to find the range and/or other information of a distant target. The LIDAR device, for example, may be configured to emit laser pulses as a beam and scan the beam to generate two-dimensional or three-dimensional range matrices. In an example, the range matrices may be used to determine the distance to an object or surface by measuring the time delay between transmission of a pulse and detection of a respective reflected signal.
- In examples, the LIDAR device may be configured to scan an environment surrounding the autonomous vehicle in three dimensions. In some examples, more than one LIDAR device may be coupled to the vehicle to scan a complete 360° horizon of the vehicle. The LIDAR device may be configured to provide to the computing device a cloud of point data representing obstacles or objects, which have been hit by the laser, on the road and in the vicinity of the road. The points may be represented by the LIDAR device in terms of azimuth and elevation angles, in addition to range, which can be converted to (X, Y, Z) point data relative to a local coordinate frame attached to the autonomous vehicle. Additionally, the LIDAR device may be configured to provide to the computing device intensity values of the light or laser reflected off the obstacles, which may be indicative of the surface type of a given object. Based on such information, the computing device may be configured to identify the objects and characteristics of the objects, such as type of the object, size, speed, whether the object is a traffic sign with a retroreflective surface, etc.
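- Two of the LIDAR computations described above lend themselves to a short sketch: time-of-flight range (half the round-trip time multiplied by the speed of light) and conversion of (azimuth, elevation, range) returns into (X, Y, Z) points in a vehicle-attached frame. The code below is a minimal illustration of those textbook formulas, not the implementation of any particular LIDAR unit.

```python
# Sketch of the two LIDAR computations described above.

import math

C_M_PER_S = 299_792_458.0  # speed of light

def range_from_time_delay(round_trip_s: float) -> float:
    """Range is half the round-trip time multiplied by the speed of light."""
    return C_M_PER_S * round_trip_s / 2.0

def to_cartesian(azimuth_rad: float, elevation_rad: float, range_m: float):
    """Map a (azimuth, elevation, range) return to (x, y, z) in a local
    coordinate frame attached to the vehicle."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

r = range_from_time_delay(200e-9)  # a 200 ns round trip -> ~30 m
print(round(r, 1), to_cartesian(0.1, 0.02, r))
```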
- In still another example, the computing device may be configured to receive, from a RADAR device (e.g., the RADAR unit 130 in FIG. 1) coupled to the autonomous vehicle and in communication with the computing device, RADAR-based information relating to the location and characteristics of the objects. The RADAR device may be configured to emit radio waves and receive back the emitted radio waves that bounced off the surfaces of objects on the road and in the vicinity of the road. The received signals or RADAR-based information may be indicative, for example, of dimensional characteristics of a given object, and may indicate whether the given object is stationary or moving.
- In yet another example, the computing device may be configured to have access to map information that identifies static objects that are permanently placed on the road, such as traffic lights, traffic signs, guard rails, etc. The map information may also be updated periodically, and may include information about accidents that have recently occurred and the resulting wreckage or traffic that may be in the vicinity of the autonomous vehicle.
- In one example, the computing device may be configured to detect and identify the objects and characteristics of the objects based on information received from multiple sources such as the image-capture device, the LIDAR device, the RADAR device, etc. However, in another example, the computing device may be configured to identify the objects based on information received from a subset of the multiple sources. For example, images captured by the image-capture device may be blurred due to a malfunction of the image-capture device, and in another example, details of the road may be obscured in the images because of fog. In these examples, the computing device may be configured to identify the objects based on information received from the LIDAR and/or RADAR units and may be configured to disregard the information received from the image-capture device.
- In another example, the autonomous vehicle may be travelling in a portion of the road where electrical noise or jamming signals may cause the LIDAR device and/or RADAR device to operate incorrectly. In this case, the computing device may be configured to identify the objects based on information received from the image-capture device, and may be configured to disregard the information received from the LIDAR and/or RADAR units.
- In one example, the computing device may be configured to rank these sources of information based on a condition of the road (e.g., fog, electronic jamming, etc.). The ranking may be indicative of which device(s) to rely on or give more weight to in identifying the objects. As an example, if fog is present in a portion of the road, then the LIDAR and RADAR devices may be ranked higher than the image-capture device, and information received from the LIDAR and/or RADAR devices may be given more weight than respective information received from the image-capture device.
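- The condition-based ranking described above might be realized as a simple weight table, as in the following sketch. The weight values and condition names are illustrative assumptions, not values from this disclosure.

```python
# Sketch of condition-based ranking/weighting of sensor sources; the
# tables below are illustrative assumptions.

DEFAULT_WEIGHTS = {"camera": 1.0, "lidar": 1.0, "radar": 1.0}

CONDITION_OVERRIDES = {
    "fog":     {"camera": 0.2, "lidar": 1.0, "radar": 1.0},
    "jamming": {"camera": 1.0, "lidar": 0.1, "radar": 0.1},
}

def source_weights(condition: str) -> dict:
    """Return per-source weights, demoting sources the condition degrades."""
    weights = dict(DEFAULT_WEIGHTS)
    weights.update(CONDITION_OVERRIDES.get(condition, {}))
    return weights

# In fog, LIDAR/RADAR detections outweigh camera detections.
print(source_weights("fog"))  # {'camera': 0.2, 'lidar': 1.0, 'radar': 1.0}
```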
- The computing device may also be configured to receive, from sensors and devices coupled to the autonomous vehicle, information associated with, for example, the condition of systems and subsystems of the autonomous vehicle. Further, the computing device may be configured to receive information associated with the surrounding environment of the autonomous vehicle, such as driving conditions and road conditions (e.g., rain, snow, etc.). For example, information indicating that the road is icy or wet ahead of the vehicle may cause the computing device to modify its adjustment of the autonomous vehicle's speed. Other examples are also possible.
- At block 306, the method 300 includes determining a first buffer distance, the first buffer distance being a minimal distance behind the first object at which the autonomous vehicle will substantially reach the speed of the first object. Further, at block 308, the method includes determining a second buffer distance, the second buffer distance being a minimal distance behind the second object at which the first object will substantially reach the speed of the second object. Still further, at block 310, the method includes determining a distance at which to adjust the speed of the autonomous vehicle based on the first and second buffer distances and the speed of the autonomous vehicle. It should be understood that other buffer distances may be determined as well when other objects are identified within the environment of the autonomous vehicle in addition to the first and second objects.
- FIG. 4A illustrates an example scenario for determining the buffer distances and the distance at which to adjust the speed of the vehicle. As shown, an autonomous vehicle 400 (e.g., a computing device of the autonomous vehicle) travelling in the +y-direction may identify a plurality of objects 402-408 substantially in front of the autonomous vehicle 400 and also travelling in the +y-direction. The plurality of objects 402-408 may include an object travelling ahead of the autonomous vehicle 400, such as vehicle 402, and multiple objects between the autonomous vehicle 400 and vehicle 402, such as vehicle 404, vehicle 406, and vehicle 408. As shown, each vehicle may have respective characteristics that can be used to determine the distance, such as a range from the vehicle to the autonomous vehicle 400, ri, a speed/velocity of the vehicle, vi, and an acceleration (or deceleration) of the vehicle, ai.
- The autonomous vehicle 400 may also have characteristics that can be used to determine the distance, such as a speed/velocity of the autonomous vehicle 400, v0, and an acceleration/deceleration of the autonomous vehicle 400, a0. Further, the distance may be based on a longitudinal speed of the autonomous vehicle, vLONG, and a lateral speed of the autonomous vehicle, vLAT. In some examples, in addition to the distance being based on the speed of the autonomous vehicle (e.g., the current speed of the autonomous vehicle), the distance may be further based on other characteristics of the vehicle, including the autonomous vehicle's direction of motion, size, position on a path of travel, and type, among other characteristics.
- In some examples, the autonomous vehicle 400 may also determine a buffer distance for each of the vehicles 402-408, and may use each buffer distance to predict the behavior of the vehicles 402-408 and determine a distance behind the vehicles at which the autonomous vehicle 400 should adjust its speed. Each buffer distance may represent a minimal distance behind a given vehicle at which another vehicle directly following the given vehicle will match (or substantially reach) the speed of the given vehicle. By determining the buffer distances, the autonomous vehicle 400 can determine when (or at what distance from the nearest or furthest object) it will need to adjust its speed. For instance, the autonomous vehicle 400 may determine that vehicle 404 will match the speed of vehicle 402 at buffer distance b4, that vehicle 406 will match the speed of vehicle 404 at buffer distance b3, and that vehicle 408 will match the speed of vehicle 406 at buffer distance b2. Note that buffer distance b1 is approximately zero, since vehicle 408 is closest in proximity to the autonomous vehicle 400 and thus the autonomous vehicle 400 can be assumed to be matching the speed of vehicle 408. Further, if vehicle 402 begins to slow down, the autonomous vehicle 400 may determine the buffer distances and thus determine that, for each vehicle between the autonomous vehicle 400 and vehicle 402, the autonomous vehicle 400 will need to slow down earlier (at a certain distance from vehicle 402). The autonomous vehicle 400 may then begin to slow down before vehicles 404-408 react to the speed change of vehicle 402 and begin to slow down as well.
- The buffer distance may be a function of the range from the vehicle to the given vehicle, and may be shorter when the vehicle is further away from the autonomous vehicle 400. For example, the buffer distance of vehicle 402, b4, may be shorter than the other buffer distances because vehicle 402 is the furthest away from the autonomous vehicle 400, and thus its future behavior does not affect the autonomous vehicle 400 as much as the behaviors of the vehicles closer to the vehicle 400. In some examples, the buffer distance may be determined by multiplying a given time constant by the speed of the vehicle 400. Other examples of determining buffer distances are also possible.
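- A minimal sketch of this buffer-distance computation follows, assuming (as suggested above) a buffer equal to a time constant multiplied by the vehicle's speed, discounted for objects farther ahead. The time constant, discount rule, and example ranges are illustrative assumptions, not values from this disclosure.

```python
# Sketch of the buffer-distance rule: a time constant times the ego speed,
# discounted with range so that farther vehicles get shorter buffers.

def buffer_distance(ego_speed_mps: float, range_m: float,
                    time_const_s: float = 1.5,
                    far_range_m: float = 100.0) -> float:
    """Minimal following distance; shrinks for objects farther ahead,
    whose behavior affects the ego vehicle less."""
    discount = max(0.25, 1.0 - range_m / far_range_m)  # floor keeps a margin
    return time_const_s * ego_speed_mps * discount

ego_v = 25.0  # m/s
for rng in (10.0, 40.0, 90.0):  # assumed ranges to three leading vehicles
    print(rng, round(buffer_distance(ego_v, rng), 1))
```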
- FIG. 4B illustrates another example scenario for determining the distance at which to adjust the speed of the vehicle. As shown, an autonomous vehicle 450 (similar to the autonomous vehicle 400 of FIG. 4A) travelling in the +y-direction may identify a plurality of objects 452-458 substantially in front of the autonomous vehicle 450. The autonomous vehicle 450 may identify a red traffic light 452 (e.g., the second object) at a range of rS from the vehicle 450. Further, the autonomous vehicle 450 may identify vehicle 454, vehicle 456, and vehicle 458, each travelling in the +y-direction, and determine characteristics of each vehicle, such as respective speed (longitudinal and lateral), acceleration, range from the autonomous vehicle 450, and others.
- Upon the autonomous vehicle 450 identifying the red traffic light 452 and the moving vehicles 454-458 between the autonomous vehicle 450 and the red traffic light 452, the autonomous vehicle 450 may be configured to determine a distance at which to adjust its speed. Further, the autonomous vehicle 450 may determine the distance before the vehicles 454-458 begin to slow down and stop due to the upcoming red traffic light ahead. In other words, the autonomous vehicle 450 may predict that the vehicles 454-458 will need to begin slowing down and stop before the red traffic light 452, and may thus adjust its speed before the vehicles 454-458 come to a stop or begin to decelerate. In some scenarios, the red traffic light 452 may instead be a stopped vehicle or other object that is not moving, and it would be identified and treated the same way by the autonomous vehicle 450 in such scenarios (e.g., as an object travelling at zero speed).
- In some examples where the vehicles 454-458 are not present and no objects are between the autonomous vehicle 450 and the red traffic light 452, the autonomous vehicle 450 may determine that it needs to come to a stop over the distance rS. However, in examples such as the one illustrated in FIG. 4B, the autonomous vehicle 450 may determine that it has a shorter distance over which to come to a stop, since the vehicles 454-458 are in front of it. As such, the autonomous vehicle 450 may determine the distance at which to adjust its speed based on the range, rS, and a buffer distance, bS.
- In addition or alternatively to the factors affecting the buffer distance noted above, the buffer distance may be based on the length of a given object or objects and a predetermined minimal gap between the objects at zero speed. For instance, as shown, each vehicle may have a length, y. The lengths of the objects may differ in other examples. Further, the predetermined (e.g., estimated) minimal gap between the vehicles when stopped may be a gap, x. As such, the autonomous vehicle 450 may determine bS by adding the lengths of the vehicles to the predetermined gaps (e.g., bS = y + y + y + x + x = 3y + 2x). Therefore, the distance at which the vehicle 450 should adjust its speed may equal the range to the red traffic light 452, rS, reduced by the buffer distance, bS (e.g., rS − bS).
- In general, when a traffic light is identified, the autonomous vehicle (e.g., the computing device of the vehicle) may determine the distance at which to adjust its speed based on the current state of the traffic light. In some examples, however, the autonomous vehicle may be configured to predict a change in the state of the traffic light. For instance, the autonomous vehicle may determine that, while the traffic light is currently red, the light may change to green after a given period of time. Depending on the given period of time, the autonomous vehicle may treat the traffic light as a green traffic light and predict that the vehicles between the autonomous vehicle and the traffic light will begin to increase their speed. The autonomous vehicle may then speed up, slow down, or maintain speed, depending on its current speed.
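- The FIG. 4B arithmetic above can be made concrete with a short worked sketch; the vehicle-length and minimal-gap values below are assumptions for illustration only.

```python
# Worked version of the FIG. 4B arithmetic: with three queued vehicles of
# length y and minimal standing gaps x, the buffer behind the red light is
# b_s = 3*y + 2*x, and the ego vehicle should begin adjusting at r_s - b_s.

def stop_adjustment_distance(range_to_light_m: float, n_vehicles: int,
                             vehicle_len_m: float = 4.5,
                             min_gap_m: float = 1.0) -> float:
    buffer_m = (n_vehicles * vehicle_len_m
                + max(0, n_vehicles - 1) * min_gap_m)
    return range_to_light_m - buffer_m

# r_s = 80 m to the light, three queued vehicles:
# adjust at 80 - (3*4.5 + 2*1.0) = 64.5 m.
print(stop_adjustment_distance(80.0, 3))  # -> 64.5
```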
- As an example, the autonomous vehicle may be approaching a red traffic light, but may be far enough away from the traffic light and the stopped vehicles in front of the traffic light that it has not yet begun to slow down. The autonomous vehicle may determine that the traffic light will change from red to green after a short period of time, and thus that the vehicles will begin to accelerate soon. Since the autonomous vehicle may still be a far distance from the traffic light and the vehicles, the autonomous vehicle may maintain its current speed (or increase its speed) if it determines that the vehicles will have sped up enough by the time the autonomous vehicle gets closer to them that the autonomous vehicle's current (or increased) speed substantially matches the speed of the previously stopped vehicle closest to the autonomous vehicle (e.g., vehicle 458 in FIG. 4B). Other examples are also possible.
- It should be understood that the reasoning applied in the example described with respect to FIG. 4B can also apply to examples where the plurality of identified objects are moving (e.g., no traffic control devices are present), such as the example described with respect to FIG. 4A. It should also be understood that the distances/ranges between the autonomous vehicle 450, the red traffic light 452, and the vehicles 454-458 as illustrated in FIG. 4B may not be to scale. Specifically, it is likely that the distance between vehicle 454 and the red traffic light 452 would be much greater than the minimal gap distance between vehicles, x.
- Referring back to FIG. 3, at block 312, the method 300 includes providing instructions to adjust the speed of the autonomous vehicle based on the distance. In some examples where multiple dynamic objects are between the autonomous vehicle and the second object, the computing device may adjust the speed of the autonomous vehicle prior to determining a change of the speed of one or more of the dynamic objects.
- The control system of the autonomous vehicle may comprise multiple control strategies that may be predetermined or adaptive to changes in the driving environment of the autonomous vehicle, the driving environment including the predicted actions of objects substantially in front of, behind, and/or to the side of the autonomous vehicle. Generally, a control strategy may comprise sets of instructions or rules associated with traffic interaction in various driving contexts. The control strategy, for example, may comprise rules that determine the speed of the autonomous vehicle, the steering angle, and the lane in which the autonomous vehicle travels, while taking into account safety and traffic rules and concerns (e.g., other vehicles stopped at an intersection and windows of opportunity in yield situations, lane tracking, speed control, distance from other vehicles on the road, passing other vehicles, queuing in stop-and-go traffic, and avoiding areas that may result in unsafe behavior, such as oncoming-traffic lanes, etc.).
For instance, the computing device may be configured to determine, based on the distance determined at block 310, a control strategy comprising rules for actions that control the speed, steering angle, and lane of the autonomous vehicle. The control strategy may also be further based on the lateral distance between the autonomous vehicle and nearby objects (e.g., road boundaries and vehicles travelling in adjacent lanes). Further, a given control strategy (or multiple strategies) may comprise a program or computer instructions that characterize the actuators controlling the autonomous vehicle (e.g., throttle, steering gear, brake, accelerator, or transmission shifter).
- In some examples, the instructions provided by the computing device to adjust the control of the autonomous vehicle (e.g., speed, steering, etc.) may further be based on road geometry, such as whether the road is straight, curving slightly, curving sharply, etc.
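- As one hypothetical realization of the instructions provided at block 312, the sketch below converts the determined adjustment distance into a speed command using the constant-deceleration relation v² = v_t² + 2·a·d. The deceleration limit and the stopping-envelope rule are illustrative assumptions, not the control strategy of these embodiments.

```python
# Sketch of block 312: turn the determined adjustment distance into a
# speed command, bounded by a comfortable deceleration limit (assumed).

def speed_command(ego_speed_mps: float, target_speed_mps: float,
                  distance_to_adjust_m: float,
                  decel_limit_mps2: float = 3.0) -> float:
    """Return the speed to request so the ego vehicle can reach
    target_speed by the adjustment point without exceeding the limit."""
    if distance_to_adjust_m <= 0.0:
        return target_speed_mps  # already at the adjustment point
    # v^2 = v_t^2 + 2*a*d gives the highest speed from which target_speed
    # is still reachable over distance d at the deceleration limit.
    v_max = (target_speed_mps ** 2
             + 2.0 * decel_limit_mps2 * distance_to_adjust_m) ** 0.5
    return min(ego_speed_mps, v_max)

# 30 m/s ego, 10 m/s target, 64.5 m to the adjustment point -> ~22.1 m/s.
print(round(speed_command(30.0, 10.0, 64.5), 1))
```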
- FIG. 5 illustrates an implementation of the example method on a road of travel. The autonomous vehicle 500 may be in a lane 501 on a road such as a highway. The computing device configured to control the autonomous vehicle 500 may be configured to identify a plurality of objects substantially in front of the autonomous vehicle 500 on the road of travel. The plurality of objects may include an object 502, such as a moving object (e.g., a car or truck), that is in the same lane 501 as the autonomous vehicle 500. The plurality of objects may also include moving objects 504, 506, 508, and 510 in a lane 511 adjacent to the lane 501 of the autonomous vehicle 500. In some examples, the computing device may not be configured to identify object 510 until the entirety of the length of the object 510 is in front of the autonomous vehicle 500.
- In some examples, the computing device may be configured to identify other objects within the environment of the autonomous vehicle 500, such as object 512, located behind the autonomous vehicle 500 in an adjacent lane. In other examples, the computing device may be configured to ignore objects, such as object 514, that may be beyond a threshold distance from the autonomous vehicle 500. The computing device may also identify static objects such as a guard rail 516. The computing device further may be configured to determine characteristics of the objects 502-516, such as size, location, speed, etc.
- In some examples, the autonomous vehicle 500 may be configured to only identify objects substantially in front of it, and may thus ignore objects behind it, such as objects 512 and 514. In such examples, the computing device may monitor the characteristics of the objects behind the autonomous vehicle 500 until those objects are within the threshold distance from the autonomous vehicle 500. For instance, the autonomous vehicle 500 may predict that object 512 will accelerate, exceed the speed of the autonomous vehicle 500, and pass the autonomous vehicle 500. The autonomous vehicle 500 may predict other actions of the object 512 as well (e.g., object 512 may pass the autonomous vehicle and move into the same lane as the autonomous vehicle).
- The computing device may determine the distance at which to adjust the speed of the autonomous vehicle 500 based on the identified objects 502-516, their characteristics, and their respective buffer distances. In some examples, however, the computing device may determine that one or more objects substantially in front of the autonomous vehicle 500, such as object 508, may change lanes or are in the process of changing lanes. As such, the computing device may modify the distance to account for this (e.g., adjust the buffer distances). For instance, if object 508 changes lanes from lane 511 to lane 501, object 508 may be closer in proximity to the autonomous vehicle 500, and thus the autonomous vehicle 500 may need to adjust its speed in order to match the speed of object 508. Prior to detecting that object 508 is changing lanes, the autonomous vehicle 500 may have been travelling at a higher speed, since no objects had been identified in the same lane 501 as the autonomous vehicle 500 and object 502; after detecting that object 508 has changed lanes, the autonomous vehicle 500 may reduce its speed. Further, the computing device may predict that object 510 will speed up to match the speed of object 506 once object 508 has fully or partially entered lane 501. The computing device may be configured to make other determinations/predictions as well, and modify the distance accordingly.
- In some examples, the computing device may be configured to prioritize amongst the identified objects 502-516 in order to determine the distance. For instance, if object 508 is in the same lane 501 as the autonomous vehicle 500 and object 502, the behavior of object 508 may be taken more into account than the behaviors of the objects in the adjacent lane 511. Such prioritization may take the form of modified buffer distances, for example (e.g., the buffer distance of object 510 may be shorter than the buffer distance of object 508, despite object 510 being closer in proximity to the autonomous vehicle 500). Thus, the computing device may be configured to add or subtract a buffer amount of distance to or from the determined distance to account for or compensate for such lane changes, as well as for any other changes in the environment of the autonomous vehicle 500. The prioritization may be implemented in other ways as well.
- In some examples, the method described above may only be implemented by the computing device when there is at least one moving object between the autonomous vehicle and the second object, in addition to the identified first object. In examples where there are no moving objects between the autonomous vehicle and the second object, the method described above may not be implemented, or may be implemented in accordance with another method or methods not described herein.
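- The lane-based prioritization described above could, for example, be realized by scaling buffer distances by lane relevance, as in the following sketch; the scale factor and the flags are illustrative assumptions.

```python
# Sketch of lane-based prioritization: objects in (or entering) the ego
# lane keep their full buffer; adjacent-lane objects get a reduced one.

def prioritized_buffer(base_buffer_m: float, same_lane: bool,
                       changing_into_lane: bool) -> float:
    if same_lane or changing_into_lane:
        return base_buffer_m      # full weight: directly constrains ego
    return 0.5 * base_buffer_m    # adjacent lane: reduced weight (assumed)

# Object 508 mid lane change into lane 501 is treated like a same-lane object.
print(prioritized_buffer(30.0, same_lane=False, changing_into_lane=True))
```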
- In some embodiments, the disclosed methods may be implemented as computer program instructions encoded on a computer-readable storage medium in a machine-readable format, or on other non-transitory media or articles of manufacture.
FIG. 6 is a schematic illustrating a conceptual partial view of an example computer program product 600 that includes a computer program for executing a computer process on a computing device, arranged according to at least some embodiments presented herein. In one embodiment, the example computer program product 600 is provided using a signal bearing medium 601. The signal bearing medium 601 may include one or more program instructions 602 that, when executed by one or more processors (e.g., the processor 113 in the computing device 111), may provide the functionality or portions of the functionality described above with respect to FIGS. 1-5. Thus, for example, referring to the embodiments shown in FIG. 3, one or more features of blocks 302-312 may be undertaken by one or more instructions associated with the signal bearing medium 601. In addition, the program instructions 602 in FIG. 6 describe example instructions as well.
- In some examples, the signal bearing medium 601 may encompass a computer-readable medium 603, such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, memory, etc. In some implementations, the signal bearing medium 601 may encompass a computer recordable medium 604, such as, but not limited to, memory, read/write (R/W) CDs, R/W DVDs, etc. In some implementations, the signal bearing medium 601 may encompass a communications medium 605, such as, but not limited to, a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.). Thus, for example, the signal bearing medium 601 may be conveyed by a wireless form of the communications medium 605 (e.g., a wireless communications medium conforming to the IEEE 802.11 standard or other transmission protocol).
- The one or more programming instructions 602 may be, for example, computer executable and/or logic implemented instructions. In some examples, a computing device such as the computing device described with respect to FIGS. 1-5 may be configured to provide various operations, functions, or actions in response to the programming instructions 602 conveyed to the computing device by one or more of the computer-readable medium 603, the computer recordable medium 604, and/or the communications medium 605.
- It should be understood that the arrangements described herein are for purposes of example only. As such, those skilled in the art will appreciate that other arrangements and other elements (e.g., machines, interfaces, functions, orders, and groupings of functions, etc.) can be used instead, and some elements may be omitted altogether according to the desired results. Further, many of the elements that are described are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, in any suitable combination and location.
- While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims, along with the full scope of equivalents to which such claims are entitled. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/827,578 US9381917B1 (en) | 2013-05-03 | 2015-08-17 | Predictive reasoning for controlling speed of a vehicle |
US15/170,211 US9561797B2 (en) | 2013-05-03 | 2016-06-01 | Predictive reasoning for controlling speed of a vehicle |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/886,563 US9254846B2 (en) | 2013-05-03 | 2013-05-03 | Predictive reasoning for controlling speed of a vehicle |
US14/827,578 US9381917B1 (en) | 2013-05-03 | 2015-08-17 | Predictive reasoning for controlling speed of a vehicle |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/886,563 Continuation US9254846B2 (en) | 2013-05-03 | 2013-05-03 | Predictive reasoning for controlling speed of a vehicle |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/170,211 Continuation US9561797B2 (en) | 2013-05-03 | 2016-06-01 | Predictive reasoning for controlling speed of a vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
US9381917B1 US9381917B1 (en) | 2016-07-05 |
US20160214607A1 true US20160214607A1 (en) | 2016-07-28 |
Family
ID=51841907
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/886,563 Expired - Fee Related US9254846B2 (en) | 2013-05-03 | 2013-05-03 | Predictive reasoning for controlling speed of a vehicle |
US14/827,578 Active US9381917B1 (en) | 2013-05-03 | 2015-08-17 | Predictive reasoning for controlling speed of a vehicle |
US15/170,211 Active US9561797B2 (en) | 2013-05-03 | 2016-06-01 | Predictive reasoning for controlling speed of a vehicle |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/886,563 Expired - Fee Related US9254846B2 (en) | 2013-05-03 | 2013-05-03 | Predictive reasoning for controlling speed of a vehicle |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/170,211 Active US9561797B2 (en) | 2013-05-03 | 2016-06-01 | Predictive reasoning for controlling speed of a vehicle |
Country Status (6)
Country | Link |
---|---|
US (3) | US9254846B2 (en) |
EP (1) | EP2991875A4 (en) |
JP (3) | JP6192812B2 (en) |
KR (3) | KR101614677B1 (en) |
CN (3) | CN106828492B (en) |
WO (1) | WO2014179109A1 (en) |
Families Citing this family (190)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10520581B2 (en) | 2011-07-06 | 2019-12-31 | Peloton Technology, Inc. | Sensor fusion for autonomous or partially autonomous vehicle control |
US11334092B2 (en) | 2011-07-06 | 2022-05-17 | Peloton Technology, Inc. | Devices, systems, and methods for transmitting vehicle data |
US9582006B2 (en) | 2011-07-06 | 2017-02-28 | Peloton Technology, Inc. | Systems and methods for semi-autonomous convoying of vehicles |
WO2018039114A1 (en) * | 2016-08-22 | 2018-03-01 | Peloton Technology, Inc. | Systems for vehicular platooning and methods therefor |
US20170242443A1 (en) | 2015-11-02 | 2017-08-24 | Peloton Technology, Inc. | Gap measurement for vehicle convoying |
US10474166B2 (en) | 2011-07-06 | 2019-11-12 | Peloton Technology, Inc. | System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles |
US9381916B1 (en) | 2012-02-06 | 2016-07-05 | Google Inc. | System and method for predicting behaviors of detected objects through environment representation |
US11209286B2 (en) | 2013-02-26 | 2021-12-28 | Polaris Industies Inc. | Recreational vehicle interactive telemetry, mapping and trip planning system |
CA3216574A1 (en) | 2013-02-26 | 2014-09-04 | Polaris Industries Inc. | Recreational vehicle interactive telemetry, mapping, and trip planning system |
US11294396B2 (en) | 2013-03-15 | 2022-04-05 | Peloton Technology, Inc. | System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles |
US8825259B1 (en) * | 2013-06-21 | 2014-09-02 | Google Inc. | Detecting lane closures and lane shifts by an autonomous vehicle |
JP6105439B2 (en) * | 2013-08-22 | 2017-03-29 | アイシン・エィ・ダブリュ株式会社 | Deceleration setting system, method and program |
US20150100189A1 (en) * | 2013-10-07 | 2015-04-09 | Ford Global Technologies, Llc | Vehicle-to-infrastructure communication |
US10422649B2 (en) * | 2014-02-24 | 2019-09-24 | Ford Global Technologies, Llc | Autonomous driving sensing system and method |
US9349284B2 (en) * | 2014-04-24 | 2016-05-24 | International Business Machines Corporation | Regional driving trend modification using autonomous vehicles |
US9707960B2 (en) * | 2014-07-31 | 2017-07-18 | Waymo Llc | Traffic signal response for autonomous vehicles |
US9558659B1 (en) | 2014-08-29 | 2017-01-31 | Google Inc. | Determining the stationary state of detected vehicles |
KR101610502B1 (en) * | 2014-09-02 | 2016-04-07 | 현대자동차주식회사 | Apparatus and method for recognizing driving enviroment for autonomous vehicle |
US9881349B1 (en) * | 2014-10-24 | 2018-01-30 | Gopro, Inc. | Apparatus and methods for computerized object identification |
US9921307B2 (en) * | 2015-01-30 | 2018-03-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Combined RADAR sensor and LIDAR sensor processing |
US10678261B2 (en) | 2015-02-06 | 2020-06-09 | Aptiv Technologies Limited | Method and apparatus for controlling an autonomous vehicle |
US20180012492A1 (en) | 2015-02-06 | 2018-01-11 | Delphi Technologies, Inc. | Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure or other vehicles |
CN118816908A (en) * | 2015-02-10 | 2024-10-22 | 御眼视觉技术有限公司 | Sparse map for autonomous vehicle navigation |
US9555736B2 (en) | 2015-04-03 | 2017-01-31 | Magna Electronics Inc. | Vehicle headlamp control using sensing and communication systems |
US9555807B2 (en) * | 2015-05-01 | 2017-01-31 | Delphi Technologies, Inc. | Automated vehicle parameter modification based on operator override |
US9869560B2 (en) | 2015-07-31 | 2018-01-16 | International Business Machines Corporation | Self-driving vehicle's response to a proximate emergency vehicle |
US9785145B2 (en) | 2015-08-07 | 2017-10-10 | International Business Machines Corporation | Controlling driving modes of self-driving vehicles |
US9721397B2 (en) | 2015-08-11 | 2017-08-01 | International Business Machines Corporation | Automatic toll booth interaction with self-driving vehicles |
US9718471B2 (en) * | 2015-08-18 | 2017-08-01 | International Business Machines Corporation | Automated spatial separation of self-driving vehicles from manually operated vehicles |
US9896100B2 (en) | 2015-08-24 | 2018-02-20 | International Business Machines Corporation | Automated spatial separation of self-driving vehicles from other vehicles based on occupant preferences |
CN105128857B (en) * | 2015-09-02 | 2017-11-14 | 郑州宇通客车股份有限公司 | A kind of automobile autonomous driving control method and a kind of automobile autonomous driving system |
US9731726B2 (en) | 2015-09-02 | 2017-08-15 | International Business Machines Corporation | Redirecting self-driving vehicles to a product provider based on physiological states of occupants of the self-driving vehicles |
DE102015218196A1 (en) * | 2015-09-22 | 2017-03-23 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for operating a motor vehicle when approaching an intersection with a traffic signal system |
KR102365272B1 (en) * | 2015-09-24 | 2022-02-21 | 현대모비스 주식회사 | Apparatus and method for vehicle's automatic drive |
US9566986B1 (en) | 2015-09-25 | 2017-02-14 | International Business Machines Corporation | Controlling driving modes of self-driving vehicles |
JP6252575B2 (en) * | 2015-09-28 | 2017-12-27 | トヨタ自動車株式会社 | Automatic driving device |
US9950619B1 (en) | 2015-09-30 | 2018-04-24 | Waymo Llc | Occupant facing vehicle display |
CN106560367B (en) * | 2015-09-30 | 2018-11-30 | 上海汽车集团股份有限公司 | Adaptive cruise control device, method and system |
US9834224B2 (en) | 2015-10-15 | 2017-12-05 | International Business Machines Corporation | Controlling driving modes of self-driving vehicles |
US9811786B2 (en) | 2015-10-15 | 2017-11-07 | At&T Intellectual Property I, L.P. | Reservations-based intelligent roadway traffic management |
US10557939B2 (en) | 2015-10-19 | 2020-02-11 | Luminar Technologies, Inc. | Lidar system with improved signal-to-noise ratio in the presence of solar background noise |
US9944291B2 (en) | 2015-10-27 | 2018-04-17 | International Business Machines Corporation | Controlling driving modes of self-driving vehicles |
US9751532B2 (en) | 2015-10-27 | 2017-09-05 | International Business Machines Corporation | Controlling spacing of self-driving vehicles based on social network relationships |
US10080325B2 (en) | 2015-10-27 | 2018-09-25 | Cnh Industrial America Llc | Predictive overlap control model |
US10607293B2 (en) | 2015-10-30 | 2020-03-31 | International Business Machines Corporation | Automated insurance toggling for self-driving vehicles |
US9841495B2 (en) | 2015-11-05 | 2017-12-12 | Luminar Technologies, Inc. | Lidar system with improved scanning speed for high-resolution depth mapping |
US10176525B2 (en) | 2015-11-09 | 2019-01-08 | International Business Machines Corporation | Dynamically adjusting insurance policy parameters for a self-driving vehicle |
US9791861B2 (en) | 2015-11-12 | 2017-10-17 | International Business Machines Corporation | Autonomously servicing self-driving vehicles |
JP6852085B2 * | 2015-11-30 | 2021-03-31 | Luminar Technologies, Inc. | Photodetection and ranging systems with distributed lasers and multiple sensor heads, and pulsed lasers for photodetection and ranging systems |
US10061326B2 (en) | 2015-12-09 | 2018-08-28 | International Business Machines Corporation | Mishap amelioration based on second-order sensing by a self-driving vehicle |
US9946259B2 (en) * | 2015-12-18 | 2018-04-17 | Raytheon Company | Negative obstacle detector |
US10108864B2 (en) * | 2015-12-29 | 2018-10-23 | Texas Instruments Incorporated | Stationary-vehicle structure from motion |
BR102016024930B1 (en) | 2016-01-06 | 2021-08-24 | Control system for a tow vehicle and method for controlling an agricultural vehicle |
KR102530497B1 (en) * | 2016-01-21 | 2023-05-09 | HL Klemove Corp. | Self-control driving apparatus and method based on traffic signs |
US9836973B2 (en) | 2016-01-27 | 2017-12-05 | International Business Machines Corporation | Selectively controlling a self-driving vehicle's access to a roadway |
JP2017138694A (en) * | 2016-02-02 | 2017-08-10 | Sony Corporation | Image processing device and image processing method |
JP6429202B2 (en) * | 2016-02-10 | 2018-11-28 | Honda Motor Co., Ltd. | Vehicle, vehicle control apparatus, vehicle control method, and vehicle control program |
MX2018009169A (en) | 2016-02-10 | 2018-11-29 | Polaris Inc | Recreational vehicle group management system |
US10239529B2 (en) * | 2016-03-01 | 2019-03-26 | Ford Global Technologies, Llc | Autonomous vehicle operation based on interactive model predictive control |
US10317522B2 (en) * | 2016-03-01 | 2019-06-11 | GM Global Technology Operations LLC | Detecting long objects by sensor fusion |
US10037696B2 (en) * | 2016-03-31 | 2018-07-31 | Delphi Technologies, Inc. | Cooperative automated vehicle system |
US10685391B2 (en) | 2016-05-24 | 2020-06-16 | International Business Machines Corporation | Directing movement of a self-driving vehicle based on sales activity |
JP7005526B2 (en) | 2016-05-31 | 2022-01-21 | Peloton Technology, Inc. | State machine of platooning controller |
US20170349181A1 (en) * | 2016-06-02 | 2017-12-07 | Delphi Technologies, Inc. | Lane management system for an automated vehicle |
US10126136B2 (en) | 2016-06-14 | 2018-11-13 | nuTonomy Inc. | Route planning for an autonomous vehicle |
US11092446B2 (en) | 2016-06-14 | 2021-08-17 | Motional Ad Llc | Route planning for an autonomous vehicle |
US10309792B2 (en) | 2016-06-14 | 2019-06-04 | nuTonomy Inc. | Route planning for an autonomous vehicle |
CN106056936B (en) * | 2016-06-17 | 2019-01-01 | BOE Technology Group Co., Ltd. | A method, apparatus, and system for adjusting a traveling lane |
JP6550016B2 (en) * | 2016-06-27 | 2019-07-24 | Denso Corporation | Vehicle control apparatus and vehicle control method |
DE102016211587A1 (en) | 2016-06-28 | 2017-12-28 | Robert Bosch Gmbh | Method and device for controlling a vehicle |
US10093311B2 (en) * | 2016-07-06 | 2018-10-09 | Waymo Llc | Testing predictions for autonomous vehicles |
US10025318B2 (en) | 2016-08-05 | 2018-07-17 | Qualcomm Incorporated | Shape detecting autonomous vehicle |
US10107631B2 (en) * | 2016-08-19 | 2018-10-23 | GM Global Technology Operations LLC | Methods and systems for vehicle positioning feedback |
US10369998B2 (en) | 2016-08-22 | 2019-08-06 | Peloton Technology, Inc. | Dynamic gap control for automated driving |
DE102016215825A1 (en) * | 2016-08-23 | 2018-03-01 | Bayerische Motoren Werke Aktiengesellschaft | Method for externally providing map data for assistance systems of motor vehicles |
US10640111B1 (en) | 2016-09-07 | 2020-05-05 | Waymo Llc | Speed planning for autonomous vehicles |
US10154377B2 (en) * | 2016-09-12 | 2018-12-11 | Polaris Industries Inc. | Vehicle to vehicle communications device and methods for recreational vehicles |
US10093322B2 (en) | 2016-09-15 | 2018-10-09 | International Business Machines Corporation | Automatically providing explanations for actions taken by a self-driving vehicle |
US10643256B2 (en) | 2016-09-16 | 2020-05-05 | International Business Machines Corporation | Configuring a self-driving vehicle for charitable donations pickup and delivery |
FR3056532B1 (en) | 2016-09-28 | 2018-11-30 | Valeo Schalter Und Sensoren Gmbh | Driving assistance on highways with carriageways separated by a safety rail |
WO2018070475A1 (en) * | 2016-10-12 | 2018-04-19 | Pioneer Corporation | Travel control device, travel control method and program |
US10202118B2 (en) | 2016-10-14 | 2019-02-12 | Waymo Llc | Planning stopping locations for autonomous vehicles |
US10473470B2 (en) | 2016-10-20 | 2019-11-12 | nuTonomy Inc. | Identifying a stopping place for an autonomous vehicle |
US10681513B2 (en) | 2016-10-20 | 2020-06-09 | nuTonomy Inc. | Identifying a stopping place for an autonomous vehicle |
US10331129B2 (en) | 2016-10-20 | 2019-06-25 | nuTonomy Inc. | Identifying a stopping place for an autonomous vehicle |
US10857994B2 (en) | 2016-10-20 | 2020-12-08 | Motional Ad Llc | Identifying a stopping place for an autonomous vehicle |
EP3315998B1 (en) * | 2016-10-25 | 2021-12-08 | KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH | Apparatus and method for determining a speed of a vehicle |
US10421460B2 (en) * | 2016-11-09 | 2019-09-24 | Baidu Usa Llc | Evaluation framework for decision making of autonomous driving vehicle |
US10726640B2 (en) * | 2016-11-15 | 2020-07-28 | At&T Mobility Ii Llc | Facilitation of smart communications hub to support driverless vehicles in 5G networks or other next generation networks |
US10114374B2 (en) * | 2016-11-16 | 2018-10-30 | Baidu Usa Llc | Emergency handling system for an autonomous driving vehicle (ADV) |
US10730465B2 (en) | 2016-12-07 | 2020-08-04 | Joyson Safety Systems Acquisition Llc | 3D time of flight active reflecting sensing systems and methods |
US10366286B2 (en) * | 2016-12-13 | 2019-07-30 | Google Llc | Detection of traffic light signal changes |
DE102016224913A1 (en) * | 2016-12-14 | 2018-06-14 | Robert Bosch Gmbh | Method for automatically setting the speed of a motorcycle |
US10259452B2 (en) | 2017-01-04 | 2019-04-16 | International Business Machines Corporation | Self-driving vehicle collision management system |
US10529147B2 (en) | 2017-01-05 | 2020-01-07 | International Business Machines Corporation | Self-driving vehicle road safety flare deploying system |
US10363893B2 (en) | 2017-01-05 | 2019-07-30 | International Business Machines Corporation | Self-driving vehicle contextual lock control system |
WO2018132607A2 (en) * | 2017-01-12 | 2018-07-19 | Mobileye Vision Technologies Ltd. | Navigation based on vehicle activity |
US10152060B2 (en) | 2017-03-08 | 2018-12-11 | International Business Machines Corporation | Protecting contents of a smart vault being transported by a self-driving vehicle |
WO2018170074A1 (en) * | 2017-03-14 | 2018-09-20 | Starsky Robotics, Inc. | Vehicle sensor system and method of use |
US9810786B1 (en) | 2017-03-16 | 2017-11-07 | Luminar Technologies, Inc. | Optical parametric oscillator for lidar system |
US10124798B2 (en) * | 2017-03-16 | 2018-11-13 | Michael Hall | Performance of autonomous control |
US9905992B1 (en) | 2017-03-16 | 2018-02-27 | Luminar Technologies, Inc. | Self-Raman laser for lidar system |
US9810775B1 (en) | 2017-03-16 | 2017-11-07 | Luminar Technologies, Inc. | Q-switched laser for LIDAR system |
US9869754B1 (en) | 2017-03-22 | 2018-01-16 | Luminar Technologies, Inc. | Scan patterns for lidar systems |
US10545240B2 (en) | 2017-03-28 | 2020-01-28 | Luminar Technologies, Inc. | LIDAR transmitter and detector system using pulse encoding to reduce range ambiguity |
US10007001B1 (en) | 2017-03-28 | 2018-06-26 | Luminar Technologies, Inc. | Active short-wave infrared four-dimensional camera |
JP6940969B2 (en) | 2017-03-29 | 2021-09-29 | Panasonic Intellectual Property Corporation of America | Vehicle control device, vehicle control method and program |
US10643084B2 (en) | 2017-04-18 | 2020-05-05 | nuTonomy Inc. | Automatically perceiving travel signals |
DE102017208700A1 (en) * | 2017-05-23 | 2018-11-29 | Robert Bosch Gmbh | Method and device for object detection and LIDAR system |
JP6509279B2 (en) * | 2017-05-31 | 2019-05-08 | Honda Motor Co., Ltd. | Target recognition system, target recognition method, and program |
JP6580087B2 (en) * | 2017-06-02 | 2019-09-25 | Honda Motor Co., Ltd. | Traveling track determination device and automatic driving device |
JP2019003605A (en) * | 2017-06-19 | 2019-01-10 | Panasonic Intellectual Property Corporation of America | Information processing device and program |
US20180362047A1 (en) * | 2017-06-19 | 2018-12-20 | Panasonic Intellectual Property Corporation Of America | Information processing device and recording medium |
TWI643162B (en) * | 2017-06-22 | 2018-12-01 | H.P.B. Optoelectronics Co., Ltd. | Traffic sign control system |
US10705105B2 (en) | 2017-07-21 | 2020-07-07 | Applied Concepts, Inc. | Absolute speed detector |
KR102339776B1 (en) * | 2017-08-09 | 2021-12-15 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling a driving vehicle |
US10216189B1 (en) * | 2017-08-23 | 2019-02-26 | Uber Technologies, Inc. | Systems and methods for prioritizing object prediction for autonomous vehicles |
US11827219B2 (en) * | 2017-08-25 | 2023-11-28 | Hitachi Astemo, Ltd. | Motion control device for moving body |
US10545505B2 (en) | 2017-08-28 | 2020-01-28 | Toyota Research Institute, Inc. | Trajectory plan modification for an autonomous vehicle operation in a heterogeneous vehicle environment |
US20190061756A1 (en) * | 2017-08-30 | 2019-02-28 | GM Global Technology Operations LLC | System and method for following distance adjustment for an autonomous vehicle |
US10850732B2 (en) * | 2017-09-05 | 2020-12-01 | Aptiv Technologies Limited | Automated speed control system |
US10532741B2 (en) * | 2017-10-02 | 2020-01-14 | Deere & Company | Method of using predicted vehicle accelerations to improve roll and pitch measurement accuracy in a tracked machine |
WO2019069425A1 (en) * | 2017-10-05 | 2019-04-11 | Honda Motor Co., Ltd. | Vehicle control device, vehicle control method, and program |
WO2019073583A1 (en) * | 2017-10-12 | 2019-04-18 | Nissan Motor Co., Ltd. | Method and apparatus for controlling an automated vehicle |
JP6937218B2 (en) * | 2017-10-19 | 2021-09-22 | Toshiba Corporation | Information processing equipment, information processing methods, and programs |
CN111148675B (en) * | 2017-10-26 | 2023-07-25 | Nissan Motor Co., Ltd. | Control method and control device for automatic driving vehicle |
JP6941543B2 (en) * | 2017-11-17 | 2021-09-29 | Honda Motor Co., Ltd. | Vehicle control devices, vehicle control methods, and programs |
US10627825B2 (en) | 2017-11-22 | 2020-04-21 | Waymo Llc | Using discomfort for speed planning in autonomous vehicles |
US10967861B2 (en) | 2018-11-13 | 2021-04-06 | Waymo Llc | Using discomfort for speed planning in responding to tailgating vehicles for autonomous vehicles |
KR102417908B1 (en) * | 2017-12-19 | 2022-07-07 | Hyundai Motor Company | Apparatus and method for controlling autonomous driving of vehicle |
KR101895777B1 (en) * | 2017-12-26 | 2018-09-07 | E Autobahn Co., Ltd. | Apparatus for shift control of an unmanned vehicle suitable for automatic driving, and method using the same |
GB201802475D0 (en) * | 2018-02-15 | 2018-04-04 | Jaguar Land Rover Ltd | Controller and vehicle |
DE102018202615A1 (en) | 2018-02-21 | 2019-08-22 | Robert Bosch Gmbh | Subscriber station for a bus system and method for increasing the data rate of a bus system |
JP7000202B2 (en) * | 2018-02-27 | 2022-01-19 | Honda Motor Co., Ltd. | Vehicle control systems, vehicle control methods, and programs |
US10906536B2 (en) * | 2018-04-11 | 2021-02-02 | Aurora Innovation, Inc. | Control of autonomous vehicle based on determined yaw parameter(s) of additional vehicle |
CN109951791A (en) * | 2018-04-29 | 2019-06-28 | Zhongshan Aoduo Electronic Technology Co., Ltd. | An inflection-point supplementary-transmission algorithm applied to a TBOX |
CN108749815A (en) * | 2018-06-04 | 2018-11-06 | Suzhou Gemu Software Technology Co., Ltd. | A straight-line three-body balanced automatic driving method and system |
AU2018286580B2 (en) | 2018-06-25 | 2020-10-15 | Beijing Didi Infinity Technology And Development Co., Ltd. | A high-definition map acquisition system |
US10656647B2 (en) * | 2018-06-27 | 2020-05-19 | Aptiv Technologies Limited | Verification of vehicle operator awareness before transition from autonomous-mode to manual-mode |
DE102018005261A1 (en) * | 2018-07-02 | 2020-01-02 | Daimler Ag | Method and assistance system for operating an autonomous driving mode, and vehicle |
US10909866B2 (en) | 2018-07-20 | 2021-02-02 | Cybernet Systems Corp. | Autonomous transportation system and methods |
US11866042B2 (en) | 2018-08-20 | 2024-01-09 | Indian Motorcycle International, LLC | Wheeled vehicle adaptive speed control method and system |
US20210197816A1 (en) * | 2018-08-20 | 2021-07-01 | Indian Motorcycle International, LLC | Wheeled vehicle adaptive speed control method and system |
US10824155B2 (en) * | 2018-08-22 | 2020-11-03 | Ford Global Technologies, Llc | Predicting movement intent of objects |
US10768637B2 (en) * | 2018-08-30 | 2020-09-08 | Pony Ai Inc. | Prioritizing vehicle navigation |
JP7040399B2 (en) * | 2018-10-23 | 2022-03-23 | Toyota Motor Corporation | Information processing system and information processing method |
KR102518600B1 (en) * | 2018-10-26 | 2023-04-06 | Hyundai Motor Company | Method for controlling deceleration of environmentally friendly vehicle |
US10762791B2 (en) | 2018-10-29 | 2020-09-01 | Peloton Technology, Inc. | Systems and methods for managing communications between vehicles |
TWI680895B (en) * | 2018-11-09 | 2020-01-01 | Institute for Information Industry | Automatic braking system and method thereof |
DE102018221063A1 (en) * | 2018-12-05 | 2020-06-10 | Volkswagen Aktiengesellschaft | Configuration of a control system for an at least partially autonomous motor vehicle |
US11597394B2 (en) * | 2018-12-17 | 2023-03-07 | Sri International | Explaining behavior by autonomous devices |
JP2022028989A (en) * | 2018-12-18 | 2022-02-17 | Sony Semiconductor Solutions Corporation | Information processing device, information processing method, and program |
US10960886B2 (en) * | 2019-01-29 | 2021-03-30 | Motional Ad Llc | Traffic light estimation |
DK201970221A1 (en) | 2019-01-29 | 2020-08-05 | Aptiv Tech Ltd | Traffic light estimation |
FR3092304B1 (en) * | 2019-01-31 | 2021-01-01 | Psa Automobiles Sa | Managing autonomous driving via an equivalent speed in the presence of at least two target objects |
FR3092547B1 (en) * | 2019-02-12 | 2021-09-10 | Psa Automobiles Sa | Autonomous driving based on distance and speed of separate target objects |
FR3093057B1 (en) * | 2019-02-21 | 2021-02-19 | Renault Sas | Method of securing a vehicle |
JP7259939B2 (en) * | 2019-03-28 | 2023-04-18 | Nissan Motor Co., Ltd. | Behavior prediction method, behavior prediction device, and vehicle control device |
US11400924B2 (en) | 2019-03-31 | 2022-08-02 | Gm Cruise Holdings Llc | Autonomous vehicle maneuvering based upon risk associated with occluded regions |
US10962371B2 (en) * | 2019-04-02 | 2021-03-30 | GM Global Technology Operations LLC | Method and apparatus of parallel tracking and localization via multi-mode slam fusion process |
CN111275661B (en) * | 2019-04-09 | 2020-11-17 | Yang Li | Automatic data correction method |
US11427196B2 (en) | 2019-04-15 | 2022-08-30 | Peloton Technology, Inc. | Systems and methods for managing tractor-trailers |
US11001200B2 (en) * | 2019-05-30 | 2021-05-11 | Nissan North America, Inc. | Vehicle occupant warning system |
US11988758B2 (en) | 2019-06-18 | 2024-05-21 | Harley-Davidson Motor Company, Inc. | Global Positioning System assisted cruise control |
JP7166988B2 (en) * | 2019-06-26 | 2022-11-08 | Honda Motor Co., Ltd. | Vehicle control device, vehicle control method, and program |
KR102645062B1 (en) * | 2019-06-27 | 2024-03-11 | Hyundai Motor Company | Apparatus and method for controlling transmission of vehicle |
CN110209178B (en) * | 2019-07-08 | 2020-09-15 | Chen Kunyan | Unmanned vehicle control method based on big data |
CN111144211B (en) * | 2019-08-28 | 2023-09-12 | Huawei Technologies Co., Ltd. | Point cloud display method and device |
KR102628027B1 (en) * | 2019-09-10 | 2024-01-24 | HL Klemove Corp. | Apparatus and method for recognizing an object |
US11834045B2 (en) | 2019-10-08 | 2023-12-05 | Motional Ad Llc | Navigating multi-way stop intersections with an autonomous vehicle |
US11873000B2 (en) | 2020-02-18 | 2024-01-16 | Toyota Motor North America, Inc. | Gesture detection for transport control |
US11055998B1 (en) | 2020-02-27 | 2021-07-06 | Toyota Motor North America, Inc. | Minimizing traffic signal delays with transports |
JP7519792B2 (en) * | 2020-03-19 | 2024-07-22 | Honda Motor Co., Ltd. | Control device, control method and program |
US11994589B2 (en) * | 2020-03-30 | 2024-05-28 | Gm Cruise Holdings Llc | Vapor detection in lidar point cloud |
US11290856B2 (en) | 2020-03-31 | 2022-03-29 | Toyota Motor North America, Inc. | Establishing connections in transports |
US20210304595A1 (en) | 2020-03-31 | 2021-09-30 | Toyota Motor North America, Inc. | Traffic manager transports |
GB2594079A (en) * | 2020-04-16 | 2021-10-20 | Daimler Ag | A method for tracking at least one object in the surroundings of an autonomous motor vehicle with a Frenet frame, as well as an assistance system |
US11847919B2 (en) | 2020-05-19 | 2023-12-19 | Toyota Motor North America, Inc. | Control of transport en route |
EP4173918A4 (en) * | 2020-06-29 | 2023-12-27 | Sony Semiconductor Solutions Corporation | Control device, control method, storage medium, and control system |
EP4201745A4 (en) * | 2020-08-21 | 2024-02-28 | Koito Manufacturing Co., Ltd. | Automotive sensing system and gating camera |
US12106583B2 (en) | 2020-10-02 | 2024-10-01 | Magna Electronics Inc. | Vehicular lane marker determination system with lane marker estimation based in part on a LIDAR sensing system |
US11521394B2 (en) * | 2020-10-09 | 2022-12-06 | Motional Ad Llc | Ground plane estimation using LiDAR semantic network |
US11729016B2 (en) * | 2020-11-23 | 2023-08-15 | Institute For Information Industry | Vehicle data analysis device and vehicle data analysis method |
US12116018B2 (en) | 2020-12-28 | 2024-10-15 | Waymo Llc | Permeable speed constraints |
JP7505442B2 * | 2021-04-27 | 2024-06-25 | Toyota Motor Corporation | Vehicle |
US20230068703A1 (en) * | 2021-08-24 | 2023-03-02 | Waymo Llc | Planning system for autonomously navigating around lane-sharing road agents |
KR20230136794A (en) * | 2022-03-17 | 2023-09-27 | Hyundai Motor Company | Apparatus and method for controlling autonomous driving vehicle |
CN117333837A (en) * | 2022-06-22 | 2024-01-02 | Hon Hai Precision Industry Co., Ltd. | Driving safety assistance method, electronic device, and storage medium |
DE102022115620A1 (en) * | 2022-06-23 | 2023-12-28 | Bayerische Motoren Werke Aktiengesellschaft | Method and assistance system for assessing the relevance of surrounding objects in a motor vehicle |
FR3137982A1 (en) * | 2022-07-13 | 2024-01-19 | Psa Automobiles Sa | Method and device for managing a stop of an autonomous vehicle traveling on a traffic lane |
Family Cites Families (60)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05217099A (en) * | 1992-02-05 | 1993-08-27 | Toyota Motor Corp | Traveling control device for vehicle |
JPH06230131A (en) * | 1993-01-29 | 1994-08-19 | Omron Corp | Running device for following with constant vehicle gap |
DE4313568C1 (en) * | 1993-04-26 | 1994-06-16 | Daimler Benz Ag | Guiding motor vehicle driver when changing traffic lanes - using radar devices to detect velocity and spacing of vehicles in next lane and indicate when lane changing is possible |
US5983161A (en) | 1993-08-11 | 1999-11-09 | Lemelson; Jerome H. | GPS vehicle collision avoidance warning and control system and method |
JPH08132931A (en) * | 1994-11-14 | 1996-05-28 | Toyota Motor Corp | Travel control device for vehicle |
US7085637B2 (en) | 1997-10-22 | 2006-08-01 | Intelligent Technologies International, Inc. | Method and system for controlling a vehicle |
JP3732292B2 (en) * | 1996-11-27 | 2006-01-05 | Honda Motor Co., Ltd. | Vehicle group running control system |
US6292725B1 (en) | 1997-04-04 | 2001-09-18 | Komatsu Ltd. | Interference preventing device for vehicle |
JPH10338053A (en) * | 1997-06-06 | 1998-12-22 | Honda Motor Co Ltd | Travel controller for vehicle |
SE517480C2 (en) | 1997-08-11 | 2002-06-11 | Volvo Car Corp | Methods and systems for controlling the speed of a vehicle |
JP3473356B2 (en) * | 1997-10-31 | 2003-12-02 | Toyota Motor Corporation | Travel control device for self-driving vehicles |
JP4023059B2 (en) * | 2000-01-17 | 2007-12-19 | Denso Corporation | Vehicle travel control device |
JP3767353B2 (en) * | 2000-09-19 | 2006-04-19 | Nissan Motor Co., Ltd. | Vehicle tracking control device |
JP2002154347A (en) * | 2000-11-20 | 2002-05-28 | Honda Motor Co Ltd | Driving safety device for vehicle |
JP4252722B2 (en) * | 2000-12-01 | 2009-04-08 | Honda Motor Co., Ltd. | Auto cruise equipment |
JP4432270B2 (en) * | 2001-03-22 | 2010-03-17 | Nissan Motor Co., Ltd. | Inter-vehicle distance control device |
JP3630124B2 (en) * | 2001-07-26 | 2005-03-16 | Nissan Motor Co., Ltd. | Leading vehicle tracking control device |
JP3736400B2 (en) * | 2001-08-31 | 2006-01-18 | Denso Corporation | Vehicle travel control device |
DE10251037A1 (en) * | 2002-11-02 | 2004-05-19 | Robert Bosch Gmbh | Device for adaptive distance and speed control with jerk limiting, having a dynamic unit that detects sudden changes in the detected traffic situation and restricts operation of the jerk limiter accordingly |
DE10354073A1 (en) * | 2003-11-19 | 2005-06-09 | Daimlerchrysler Ag | Method for longitudinal movement control of a motor vehicle |
JP4134894B2 (en) * | 2003-12-09 | 2008-08-20 | Denso Corporation | Vehicle driving support device |
JP2005199930A (en) * | 2004-01-16 | 2005-07-28 | Denso Corp | Vehicle traveling control device |
JP2005231490A (en) * | 2004-02-19 | 2005-09-02 | Honda Motor Co Ltd | Follow-up traveling control device |
JP4872188B2 (en) * | 2004-05-26 | 2012-02-08 | Nissan Motor Co., Ltd. | Driving assistance device |
JP2005352636A (en) * | 2004-06-09 | 2005-12-22 | Alpine Electronics Inc | On-vehicle alarm generation device |
JP2006205860A (en) * | 2005-01-27 | 2006-08-10 | Advics Co., Ltd. | Vehicle traveling support device |
DE102005050277A1 (en) * | 2005-10-20 | 2007-04-26 | Robert Bosch Gmbh | Distance and speed controller with jam detection |
JP4844103B2 (en) * | 2005-11-30 | 2011-12-28 | Nissan Motor Co., Ltd. | Potential risk level warning device and potential risk level warning method |
SE0502820L (en) * | 2005-12-13 | 2006-12-19 | Scania Cv Abp | Adaptive cruise control system |
US8046146B2 (en) * | 2006-02-03 | 2011-10-25 | Kelsey-Hayes Company | Adaptive ABS control |
JP2008049917A (en) * | 2006-08-25 | 2008-03-06 | Toyota Motor Corp | Automatic stop control device |
JP4371137B2 (en) * | 2006-11-10 | 2009-11-25 | Toyota Motor Corporation | Automatic operation control device |
DE102007036787A1 (en) * | 2007-08-03 | 2009-02-05 | Robert Bosch Gmbh | Distance controller with automatic stop function |
JP2009070254A (en) * | 2007-09-14 | 2009-04-02 | Nissan Motor Co Ltd | Vehicle risk estimation device |
JP4715826B2 (en) * | 2007-09-28 | 2011-07-06 | Sumitomo Electric Industries, Ltd. | Vehicle driving support system, driving support device, vehicle, and vehicle driving support method |
JP5118468B2 (en) * | 2007-12-19 | 2013-01-16 | Fuji Heavy Industries Ltd. | Vehicle travel control device |
JP2009163434A (en) * | 2007-12-28 | 2009-07-23 | Toyota Motor Corp | Emergency evacuation system and method |
JP4538762B2 (en) | 2008-05-20 | 2010-09-08 | Toyota Motor Corporation | Inter-vehicle distance control device |
EP2535883B1 (en) | 2008-07-10 | 2014-03-19 | Mitsubishi Electric Corporation | Train-of-vehicle travel support device |
US8311720B2 (en) * | 2009-01-09 | 2012-11-13 | Robert Bosch Gmbh | Lost target function for adaptive cruise control |
WO2010084569A1 (en) * | 2009-01-20 | 2010-07-29 | Toyota Motor Corporation | Row-running control system and vehicle |
JP5172764B2 (en) * | 2009-03-30 | 2013-03-27 | Honda Motor Co., Ltd. | Road friction coefficient estimation device |
US8395529B2 (en) | 2009-04-02 | 2013-03-12 | GM Global Technology Operations LLC | Traffic infrastructure indicator on head-up display |
US8352111B2 (en) * | 2009-04-06 | 2013-01-08 | GM Global Technology Operations LLC | Platoon vehicle management |
US8744661B2 (en) * | 2009-10-21 | 2014-06-03 | Berthold K. P. Horn | Method and apparatus for reducing motor vehicle traffic flow instabilities and increasing vehicle throughput |
JP5273013B2 (en) * | 2009-10-27 | 2013-08-28 | Toyota Motor Corporation | Driving assistance device |
JP2011121417A (en) * | 2009-12-08 | 2011-06-23 | Hiroshima City Univ | Travel control system, control program, and recording medium |
JP2011192177A (en) * | 2010-03-16 | 2011-09-29 | Toyota Motor Corp | Forward situation prediction device |
JP5593800B2 (en) * | 2010-04-14 | 2014-09-24 | Toyota Motor Corporation | Travel control device |
JP2012066758A (en) * | 2010-09-27 | 2012-04-05 | Fuji Heavy Ind Ltd | Vehicle cruise control apparatus |
US8509982B2 (en) * | 2010-10-05 | 2013-08-13 | Google Inc. | Zone driving |
JP5427203B2 (en) * | 2011-03-30 | 2014-02-26 | Fuji Heavy Industries Ltd. | Vehicle driving support device |
US9605971B2 (en) * | 2011-06-17 | 2017-03-28 | Robert Bosch Gmbh | Method and device for assisting a driver in lane guidance of a vehicle on a roadway |
KR101281629B1 (en) * | 2011-06-30 | 2013-07-03 | Gyeongbuk Institute of IT Convergence Industry Technology | Driving guidance system using sensors |
KR20130005107A (en) * | 2011-07-05 | 2013-01-15 | Hyundai Motor Company | System for controlling vehicle interval automatically and method thereof |
DE102011082126B4 (en) * | 2011-09-05 | 2020-07-23 | Robert Bosch Gmbh | Safety device for motor vehicles |
US9180890B2 (en) * | 2012-02-27 | 2015-11-10 | Ford Global Technologies | Smart adaptive cruise control |
DE102012210608A1 (en) * | 2012-06-22 | 2013-12-24 | Robert Bosch Gmbh | Method and device for generating a control parameter for a distance assistance system of a vehicle |
US20140005907A1 (en) * | 2012-06-29 | 2014-01-02 | Magna Electronics Inc. | Vision-based adaptive cruise control system |
DE102012219449B4 (en) * | 2012-10-24 | 2019-01-24 | Bayerische Motoren Werke Aktiengesellschaft | Method for speed and/or distance control in motor vehicles |
2013
- 2013-05-03 US US13/886,563 patent/US9254846B2/en not_active Expired - Fee Related
2014
- 2014-04-22 CN CN201710000895.2A patent/CN106828492B/en active Active
- 2014-04-22 KR KR1020157031517A patent/KR101614677B1/en active IP Right Grant
- 2014-04-22 KR KR1020167009720A patent/KR102051090B1/en active IP Right Grant
- 2014-04-22 KR KR1020197034984A patent/KR20190133804A/en not_active IP Right Cessation
- 2014-04-22 WO PCT/US2014/034903 patent/WO2014179109A1/en active Application Filing
- 2014-04-22 CN CN201480037918.1A patent/CN105358397B/en not_active Expired - Fee Related
- 2014-04-22 EP EP14791472.5A patent/EP2991875A4/en active Pending
- 2014-04-22 JP JP2016512910A patent/JP6192812B2/en not_active Expired - Fee Related
- 2014-04-22 CN CN201710000906.7A patent/CN107097787B/en active Active
2015
- 2015-08-17 US US14/827,578 patent/US9381917B1/en active Active
2016
- 2016-06-01 US US15/170,211 patent/US9561797B2/en active Active
2017
- 2017-08-08 JP JP2017153343A patent/JP6411595B2/en active Active
- 2017-08-08 JP JP2017153342A patent/JP6736527B2/en active Active
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10126135B2 (en) | 2015-12-15 | 2018-11-13 | Nissan North America, Inc. | Traffic signal timing estimation using an artificial neural network model |
US10259458B2 (en) * | 2016-05-03 | 2019-04-16 | Hyundai Motor Company | Path planning apparatus and method for autonomous vehicle |
US11959740B2 (en) * | 2016-09-16 | 2024-04-16 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data creation method and three-dimensional data creation device |
US20190204076A1 (en) * | 2016-09-16 | 2019-07-04 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data creation method and three-dimensional data creation device |
JP2022141781A (en) * | 2016-09-20 | 2022-09-29 | Innoviz Technologies Ltd. | LIDAR system and method |
JP7241223B2 (en) | 2016-09-20 | 2023-03-16 | Innoviz Technologies Ltd. | LIDAR system and method |
US10115305B2 (en) | 2016-09-30 | 2018-10-30 | Nissan North America, Inc. | Optimizing autonomous car's driving time and user experience using traffic signal information |
WO2018063434A1 (en) * | 2016-09-30 | 2018-04-05 | Nissan North America, Inc. | Optimizing autonomous car's driving time and user experience using traffic signal information |
US10845592B2 (en) * | 2016-12-28 | 2020-11-24 | Ricoh Company, Ltd. | Head-up display, vehicle apparatus, display method, and recording medium |
US20180180880A1 (en) * | 2016-12-28 | 2018-06-28 | Keita KATAGIRI | Head-up display, vehicle apparatus, display method, and recording medium |
CN108255171 (en) * | 2016-12-29 | 2018-07-06 | Baidu USA LLC | Method and system for improving the stability of an autonomous land vehicle |
WO2018142394A3 (en) * | 2017-02-06 | 2018-10-11 | Vayavision Sensing Ltd. | Computer aided driving |
US10147193B2 (en) | 2017-03-10 | 2018-12-04 | TuSimple | System and method for semantic segmentation using hybrid dilated convolution (HDC) |
WO2018201162A1 (en) * | 2017-04-25 | 2018-11-01 | TuSimple | System and method for vehicle position and velocity estimation based on camera and lidar data |
AU2018256926B2 (en) * | 2017-04-25 | 2023-03-30 | Tusimple, Inc. | System and method for vehicle position and velocity estimation based on camera and lidar data |
CN107161213A (en) * | 2017-05-22 | 2017-09-15 | Wuhan University of Technology | An orderly automotive-safety driving system based on the Internet of Vehicles |
US11155274B2 (en) | 2017-07-20 | 2021-10-26 | Nissan Motor Co., Ltd. | Vehicle travel control method and vehicle travel control device |
US11453392B2 (en) | 2017-12-07 | 2022-09-27 | Waymo Llc | Early object detection for unprotected turns |
US10501085B2 (en) | 2017-12-07 | 2019-12-10 | Waymo Llc | Early object detection for unprotected turns |
WO2019112853A1 (en) * | 2017-12-07 | 2019-06-13 | Waymo Llc | Early object detection for unprotected turns |
US12005891B2 (en) | 2017-12-07 | 2024-06-11 | Waymo Llc | Early object detection for unprotected turns |
US11393123B2 (en) | 2018-01-15 | 2022-07-19 | Canon Kabushiki Kaisha | Information processing device, control method therefor, non-transitory computer-readable storage medium, and driving control system |
WO2019152054A1 (en) * | 2018-02-05 | 2019-08-08 | Cummins Inc. | System and method for tractor trailer dynamic load adjustment |
US11046310B2 (en) | 2018-02-27 | 2021-06-29 | Samsung Electronics Co., Ltd. | Method of planning traveling path and electronic device therefor |
EP3679445A4 (en) * | 2018-02-27 | 2020-11-18 | Samsung Electronics Co., Ltd. | Method of planning traveling path and electronic device therefor |
US10994748B2 (en) | 2018-02-28 | 2021-05-04 | Nissan North America, Inc. | Transportation network infrastructure for autonomous vehicle decision making |
CN108596396A (en) * | 2018-04-28 | 2018-09-28 | China Highway Engineering Consulting Group Co., Ltd. | A pavement performance prediction method and device based on maintenance-history and maintenance-process correction |
US11170238B2 (en) * | 2019-06-26 | 2021-11-09 | Woven Planet North America, Inc. | Approaches for determining traffic light state |
US11091101B2 (en) | 2019-06-28 | 2021-08-17 | Toyota Jidosha Kabushiki Kaisha | Vehicle capable of drive assist or automatic driving |
US11422266B2 (en) | 2020-07-21 | 2022-08-23 | Leddartech Inc. | Beam-steering devices and methods for LIDAR applications |
US11402510B2 (en) | 2020-07-21 | 2022-08-02 | Leddartech Inc. | Systems and methods for wide-angle LiDAR using non-uniform magnification optics |
US11474253B2 (en) | 2020-07-21 | 2022-10-18 | Leddartech Inc. | Beam-steering devices and methods for LIDAR applications |
US11543533B2 (en) | 2020-07-21 | 2023-01-03 | Leddartech Inc. | Systems and methods for wide-angle LiDAR using non-uniform magnification optics |
US11567179B2 (en) | 2020-07-21 | 2023-01-31 | Leddartech Inc. | Beam-steering device particularly for LIDAR systems |
US11828853B2 (en) | 2020-07-21 | 2023-11-28 | Leddartech Inc. | Beam-steering device particularly for LIDAR systems |
US12066576B2 (en) | 2020-07-21 | 2024-08-20 | Leddartech Inc. | Beam-steering device particularly for lidar systems |
Also Published As
Publication number | Publication date |
---|---|
CN107097787B (en) | 2019-05-28 |
JP6411595B2 (en) | 2018-10-24 |
WO2014179109A1 (en) | 2014-11-06 |
CN106828492B (en) | 2019-04-02 |
CN107097787A (en) | 2017-08-29 |
JP6736527B2 (en) | 2020-08-05 |
CN105358397B (en) | 2016-12-14 |
US9254846B2 (en) | 2016-02-09 |
KR20190133804A (en) | 2019-12-03 |
KR102051090B1 (en) | 2019-12-02 |
CN106828492A (en) | 2017-06-13 |
KR20150127745A (en) | 2015-11-17 |
US20140330479A1 (en) | 2014-11-06 |
JP6192812B2 (en) | 2017-09-06 |
CN105358397A (en) | 2016-02-24 |
JP2017202828A (en) | 2017-11-16 |
US20160272207A1 (en) | 2016-09-22 |
JP2017214065A (en) | 2017-12-07 |
EP2991875A4 (en) | 2017-06-14 |
KR20160049017A (en) | 2016-05-04 |
US9381917B1 (en) | 2016-07-05 |
US9561797B2 (en) | 2017-02-07 |
EP2991875A1 (en) | 2016-03-09 |
KR101614677B1 (en) | 2016-04-21 |
JP2016523751A (en) | 2016-08-12 |
Similar Documents
Publication | Title |
---|---|
US9561797B2 (en) | Predictive reasoning for controlling speed of a vehicle |
US9821807B2 (en) | Methods and systems for determining instructions for pulling over an autonomous vehicle |
US9090259B2 (en) | Controlling vehicle lateral lane positioning |
EP2958783B1 (en) | A method to detect nearby aggressive drivers and adjust driving modes |
US9261879B2 (en) | Use of uncertainty regarding observations of traffic intersections to modify behavior of a vehicle |
US9063548B1 (en) | Use of previous detections for lane marker detection |
EP2830922B1 (en) | Method for robust detection of traffic signals and their associated states |
US11079768B2 (en) | Use of a reference image to detect a road obstacle |
US20160114770A1 (en) | Methods and Systems for Steering-Based Oscillatory Vehicle Braking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DOLGOV, DMITRI;FERGUSON, DAVE;REEL/FRAME:036338/0306 Effective date: 20130503 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: WAYMO HOLDING INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GOOGLE INC.;REEL/FRAME:042084/0741 Effective date: 20170321 Owner name: WAYMO LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAYMO HOLDING INC.;REEL/FRAME:042085/0001 Effective date: 20170322 |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044144/0001 Effective date: 20170929 |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE CORRECTIVE BY NULLIFICATION TO CORRECT INCORRECTLY RECORDED APPLICATION NUMBERS PREVIOUSLY RECORDED ON REEL 044144 FRAME 0001. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:047894/0508 Effective date: 20170929 |
|
AS | Assignment |
Owner name: WAYMO LLC, CALIFORNIA Free format text: SUBMISSION TO CORRECT AN ERROR MADE IN A PREVIOUSLY RECORDED DOCUMENT THAT ERRONEOUSLY AFFECTS THE IDENTIFIED APPLICATIONS;ASSIGNOR:WAYMO LLC;REEL/FRAME:050978/0359 Effective date: 20191001 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE THE REMOVAL OF THE INCORRECTLY RECORDED APPLICATION NUMBERS 14/149802 AND 15/419313 PREVIOUSLY RECORDED AT REEL: 44144 FRAME: 1. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:068092/0502 Effective date: 20170929 |