
US20200239024A1 - Autonomous vehicle routing with roadway element impact - Google Patents


Info

Publication number
US20200239024A1
Authority
US
United States
Prior art keywords
routing graph
routing
vehicle
graph
modification
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/752,164
Inventor
Arvind Srinivasan
Jay Yuan
Valerie Chadha
Michael Voznesensky
Rei Chiang
Christopher James Lyons
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uber Technologies Inc
Original Assignee
Uatc LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Uatc LLC
Priority to US16/752,164
Assigned to UATC, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LYONS, CHRISTOPHER JAMES; YUAN, JAY; CHIANG, REI; SRINIVASAN, ARVIND; CHADHA, VALERIE; VOZNESENSKY, MICHAEL
Publication of US20200239024A1
Assigned to UBER TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: UATC, LLC
Assigned to UBER TECHNOLOGIES, INC. CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNMENT DOCUMENT PREVIOUSLY RECORDED AT REEL: 054790 FRAME: 0527. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: UATC, LLC


Classifications

    • B60W 60/0011: Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • B60W 50/029: Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
    • B60W 60/001: Planning or execution of driving tasks
    • G01C 21/343: Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G01C 21/3453: Special cost functions, i.e. other than distance or default speed limit of road segments
    • G05D 1/0011: Control of position, course, altitude or attitude of land, water, air or space vehicles associated with a remote control arrangement
    • G05D 1/0027: Remote control arrangements involving a plurality of vehicles, e.g. fleet or convoy travelling
    • G05D 1/0038: Remote control arrangements providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G05D 1/0088: Control characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G05D 1/0212: Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory
    • G05D 1/0231: Control of position or course in two dimensions using optical position detecting means
    • G05D 1/0276: Control of position or course in two dimensions using signals provided by a source external to the vehicle
    • G05D 1/223: Command input arrangements on the remote controller, e.g. joysticks or touch screens
    • G05D 1/228: Command input arrangements located on-board unmanned vehicles
    • G05D 1/247: Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G05D 1/249: Position or orientation from positioning sensors located off-board the vehicle, e.g. from cameras
    • G05D 1/65: Following a desired speed profile
    • G05D 1/69: Coordinated control of the position or course of two or more vehicles
    • G07C 5/008: Registering or indicating the working of vehicles, communicating information to a remotely located station
    • G08G 1/096783: Systems involving transmission of highway information, e.g. weather, speed limits, where the origin of the information is a roadside individual element
    • G05D 2201/0213

Definitions

  • This document pertains generally, but not by way of limitation, to devices, systems, and methods for routing, operating, and/or managing an autonomous vehicle.
  • An autonomous vehicle is a vehicle that is capable of sensing its environment and operating some or all of the vehicle's controls based on the sensed environment.
  • An autonomous vehicle includes sensors that capture signals describing the environment surrounding the vehicle. The autonomous vehicle processes the captured sensor signals to comprehend the environment and automatically operates some or all of the vehicle's controls based on the resulting information.
  • FIG. 1 is a diagram showing one example of an environment for routing autonomous vehicles considering roadway element impact.
  • FIG. 2 is a diagram showing one example of an environment illustrating a remote user utilizing an impact score.
  • FIG. 3 depicts a block diagram of an example vehicle according to example aspects of the present disclosure.
  • FIG. 4 is a flowchart showing one example of a process flow that may be executed, for example, in the environment 100 of FIG. 1 to route a vehicle using an impact score.
  • FIG. 5 is a flowchart showing one example of a process flow for determining whether to apply a routing graph modification to graph elements corresponding to a considered one or more roadway elements.
  • FIG. 6 is a flowchart showing one example of a process flow to process an impact score describing a considered roadway element.
  • FIG. 7 is a flowchart showing one example of a process flow that may be executed by the impact engine to determine an impact score.
  • FIG. 8 is a flowchart showing one example of a process flow that can be executed by the service assignment system if a considered routing graph modification has been applied.
  • FIG. 9 is a block diagram showing one example of a software architecture for a computing device.
  • FIG. 10 is a block diagram illustrating a computing device hardware architecture.
  • a transportation service includes transporting a payload, such as cargo or one or more passengers, from a service start location to a service end location.
  • Examples of cargo can include food, packages, or the like.
  • In an autonomous or semi-autonomous vehicle (collectively referred to as an autonomous vehicle (AV)), a vehicle autonomy system, sometimes referred to as an AV stack, controls one or more of braking, steering, or throttle of the vehicle. In a fully autonomous vehicle, the vehicle autonomy system assumes full control of the vehicle. In a semi-autonomous vehicle, the vehicle autonomy system assumes a portion of the vehicle control, with a human user (e.g., a vehicle operator) still providing some control input. Some autonomous vehicles can also operate in a manual mode, in which a human user provides all control inputs to the vehicle.
  • a service assignment system is configured to receive requests for transportation services from users.
  • When the service assignment system receives a request for a transportation service, it generates candidate routes for one or more vehicles to execute the requested transportation service.
  • the candidate routes may begin at a vehicle location and extend to the service start location and the service end location. If routes for more than one vehicle are generated, the service assignment system selects a vehicle to execute the requested transportation service based on the generated routes. For example, a vehicle that can achieve an earlier time of arrival at the service start location and/or a faster drop-off time to the service end location may be favored.
  • the service assignment system generates routes using a routing graph.
  • the routing graph is a representation of the roadways in a geographic area.
  • the routing graph represents the roadways as a set of graph elements.
  • a graph element is a component of a routing graph that represents a roadway element on which the autonomous vehicle can travel.
  • a graph element can be or include an edge, node, or other component of a routing graph.
  • a graph element represents a portion of roadway, referred to herein as a roadway element.
  • a roadway element is a component of a roadway that can be traversed by a vehicle.
  • a roadway element can be or include different subdivisions of a roadway, depending on the implementation.
  • the roadway elements are or include road segments.
  • a road segment is a portion of roadway including all lanes and directions of travel.
  • For example, a road segment of a four-lane divided highway includes a stretch of the highway including all four lanes and both directions of travel.
  • roadway elements are or include directed road segments.
  • a directed road segment is a portion of roadway where traffic travels in a common direction. Referring again to the four-lane divided highway example, a stretch of the highway would include at least two directed road segments: a first directed road segment including the two lanes of travel in one direction and a second directed road segment including the two lanes of travel in the other direction.
  • roadway elements are or include lane segments.
  • a lane segment is a portion of a roadway including one lane of travel in one direction.
  • a portion of the divided highway may include two lane segments in each direction.
  • Lane segments may be interconnected in the direction of travel and laterally. For example, a vehicle traversing a lane segment may travel in the direction to travel to the next connected lane segment or may make a lane change to move laterally to a different lane segment.
  • the routing graph indicates data describing directionality and connectivity for the graph elements.
  • the directionality of a graph element describes limitations, if any, on the direction in which a vehicle can traverse the roadway element corresponding to the graph element.
  • the connectivity of a given graph element describes other graph elements to which the autonomous vehicle can be routed from the given graph element.
  • the routing graph can also include cost data describing costs associated with graph elements.
  • the cost data indicates the cost for a vehicle to traverse a roadway element corresponding to a graph element or to transition between roadway elements corresponding to connected graph elements. Cost can be based on various factors including, for example, estimated driving time, danger risk, etc. In some examples, higher cost generally corresponds to more negative characteristics of a graph element or transition (e.g., longer estimated driving time, higher danger risk, etc.).
  • the routing engine generates routes for vehicles by finding a low cost combination of connected graph elements corresponding to a sequence of roadway elements between two locations.
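As a rough illustration of the structure just described (this is not part of the patent, and all names are hypothetical), a routing graph with per-element directionality, connectivity, and cost might be sketched in Python as follows:

```python
from dataclasses import dataclass, field
from typing import Dict, Iterator, Tuple

@dataclass
class GraphElement:
    """One routing-graph element, corresponding to a roadway element
    (e.g., a lane segment), with directionality, connectivity, and cost."""
    element_id: str
    heading_deg: float                                          # directionality of travel
    successors: Dict[str, float] = field(default_factory=dict)  # element_id -> transition cost
    traversal_cost: float = 1.0                                 # cost to traverse this element

@dataclass
class RoutingGraph:
    elements: Dict[str, GraphElement]

    def neighbors(self, element_id: str) -> Iterator[Tuple[str, float]]:
        """Yield (successor id, combined transition + traversal cost) pairs
        for the graph elements reachable from the given element."""
        for succ_id, transition_cost in self.elements[element_id].successors.items():
            yield succ_id, transition_cost + self.elements[succ_id].traversal_cost
```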
  • Conditions at a roadway element can sometimes change the suitability of the roadway element for travel, for example, for autonomous vehicles.
  • a delivery truck may be double parked at a roadway element, requiring that vehicles deviate from the middle of the lane or even cross the center line to traverse the roadway element.
  • a large pothole is present at a roadway element, making it desirable for vehicles to steer around the pothole or slow down to avoid damage from running over the pothole.
  • an autonomous vehicle utilizes a remote user, sometimes called a teleoperator, to assist the autonomous vehicle.
  • the remote user provides instructions to the autonomous vehicle about how to traverse the roadway element.
  • the remote user is provided with sensor data from the autonomous vehicle and the instruction from the remote user is based on the sensor data. For example, if the sensor data indicates that the autonomous vehicle can safely deviate from the middle of its lane to clear an obstruction, the remote user may instruct the autonomous vehicle to do so.
  • the remote user provides steering, throttle, braking, or other control inputs to the autonomous vehicle.
  • a routing graph modification can be applied, as described herein, to remove the connectivity between the graph element corresponding to the roadway element and the rest of the routing graph.
  • a routing graph modification may be applied to the routing graph to increase the cost of the graph element corresponding to the roadway element.
  • Determining an efficient way to respond to changing roadway conditions can be challenging. For example, closing or raising the cost of a problematic roadway element can prevent autonomous vehicles from losing time and/or utilizing remote user resources.
  • there is also a cost associated with constraining a roadway element. For example, if autonomous vehicles are not routed to a particular roadway element, alternative routes may have a higher cost, and sometimes a much higher cost. Further, in some examples, it is not possible to route an autonomous vehicle without using a particular roadway element. If such a roadway element is constrained in a way that removes its connectivity from the routing graph, one or more autonomous vehicles may be stranded without a possible route to some destinations.
  • the impact score indicates an impact of constraining a roadway element, for example, by changing a cost associated with a graph element representing the roadway element in the routing graph and/or removing the connectivity of the graph element from other graph elements in the routing graph.
  • FIG. 1 is a diagram showing one example of an environment 100 for routing autonomous vehicles considering roadway element impact.
  • the environment 100 includes the service assignment system 104 and vehicles 102 A, 102 B, 102 N, 103 A, 103 N.
  • the vehicles 102 A, 102 B, 102 N, 103 A, 103 N can include passenger vehicles, such as trucks, cars, buses, or other similar vehicles.
  • the vehicles 102 A, 102 B, 102 N, 103 A, 103 N can also include delivery vehicles, such as vans, trucks, tractor trailers, etc.
  • the vehicles 102 A, 102 B, 102 N, 103 A, 103 N include self-driving vehicles (SDVs) or autonomous vehicles (AVs) 102 A, 102 B, 102 N and human-driven vehicles 103 A, 103 N.
  • Although FIG. 1 shows three autonomous vehicles 102 A, 102 B, 102 N and two human-driven vehicles 103 A, 103 N, any suitable number of vehicles may be used in any suitable proportion between autonomous vehicles 102 A, 102 B, 102 N and human-driven vehicles 103 A, 103 N.
  • Each of the autonomous vehicles 102 A, 102 B, 102 N includes a vehicle autonomy system, described in more detail with respect to FIG. 3 .
  • the vehicle autonomy system is configured to operate some or all of the controls of the vehicle 102 A, 102 B, 102 N (e.g., acceleration, braking, steering).
  • one or more of the autonomous vehicles 102 A, 102 B, 102 N are operable in different modes, where the vehicle autonomy system has differing levels of control over the vehicle 102 A, 102 B, 102 N.
  • Some autonomous vehicles 102 A, 102 B, 102 N may be operable in a fully autonomous mode in which the vehicle autonomy system has responsibility for all or most of the controls of the vehicle 102 A, 102 B, 102 N.
  • Some autonomous vehicles 102 A, 102 B, 102 N are operable in a semiautonomous mode that is in addition to or instead of the fully autonomous mode.
  • a semiautonomous mode the vehicle autonomy system of a vehicle 102 A, 102 B, 102 N is responsible for some of the vehicle controls while a human user or driver is responsible for other vehicle controls.
  • one or more of the autonomous vehicles 102 A, 102 B, 102 N are operable in a manual mode in which the human user is responsible for all control of the vehicle 102 A, 102 B, 102 N.
  • the autonomous vehicles 102 A, 102 B, 102 N include one or more remote-detection sensor sets 106 A, 106 B, 106 N.
  • the remote-detection sensor sets 106 A, 106 B, 106 N include one or more remote-detection sensors that receive signals from the environment 100 .
  • the signals may be emitted by and/or reflected from objects in the environment 100 , such as the ground, buildings, trees, etc.
  • the remote-detection sensor sets 106 A, 106 B, 106 N may include one or more active sensors, such as light imaging detection and ranging (LIDAR) sensors, radio detection and ranging (RADAR) sensors, and/or sound navigation and ranging (SONAR) sensors, that emit sound or electromagnetic radiation in the form of light or radio waves to generate return signals. Information about the environment 100 is extracted from the received signals.
  • the remote-detection sensor sets 106 A, 106 B, 106 N include one or more passive sensors that receive signals that originated from other sources of sound or electromagnetic radiation.
  • the remote-detection sensor sets 106 A, 106 B, 106 N provide remote-detection sensor data that describes the environment 100 .
  • the autonomous vehicles 102 A, 102 B, 102 N can also include other types of sensors, for example, as described in more detail with respect to FIG. 2 .
  • the autonomous vehicles 102 A, 102 B, 102 N may be of different types. Different types of AVs may have different capabilities. For example, the different types of autonomous vehicles 102 A, 102 B, 102 N can have different vehicle autonomy systems. This can include, for example, vehicle autonomy systems made by different manufacturers or designers, vehicle autonomy systems having different software versions or revisions, etc. Also, in some examples, the different types of autonomous vehicles 102 A, 102 B, 102 N can have different remote-detection sensor sets 106 A, 106 B, 106 N.
  • one type of autonomous vehicle 102 A, 102 B, 102 N may include a LIDAR remote-detection sensor, while another type of vehicle 102 A, 102 B, 102 N may include stereoscopic cameras and omit a LIDAR remote-detection sensor.
  • different types of autonomous vehicles 102 A, 102 B, 102 N can also have different mechanical particulars. For example, one type of vehicle may have all-wheel drive, while another type may have front-wheel drive.
  • the service assignment system 104 is programmed to assign transportation services to the autonomous vehicles 102 A, 102 B, 102 N as described herein.
  • the service assignment system 104 can be or include one or more servers or other suitable computing devices.
  • the service assignment system 104 is configured to receive transportation service requests from one or more users 114 A, 114 B, 114 N.
  • the users 114 A, 114 B, 114 N make transportation service requests with user computing devices 116 A, 116 B, 116 N.
  • the user computing devices 116 A, 116 B, 116 N can be or include any suitable computing device such as, for example, tablet computers, mobile telephone devices, laptop computers, desktop computers, etc.
  • the user computing devices 116 A, 116 B, 116 N execute an application associated with a transportation service implemented with the service assignment system 104 .
  • the users 114 A, 114 B, 114 N launch the application on the respective user computing devices 116 A, 116 B, 116 N and utilize functionality of the application to make transportation service requests.
  • the service assignment system 104 comprises a transportation service selection engine 112 and a routing engine 110 .
  • the transportation service selection engine 112 is programmed to receive and process transportation service requests.
  • the routing engine 110 generates routes for candidate vehicles 102 A, 102 B, 102 N, 103 A, 103 N to execute a requested transportation service.
  • the transportation service selection engine 112 When the transportation service selection engine 112 receives a transportation service request, it identifies a set of candidate vehicles 102 A, 102 B, 102 N, 103 A, 103 N for executing the transportation service.
  • the set of candidate vehicles 102 A, 102 B, 102 N, 103 A, 103 N can include vehicles 102 A, 102 B, 102 N, 103 A, 103 N that are best suited for executing the transportation service.
  • the set of candidate vehicles 102 A, 102 B, 102 N, 103 A, 103 N can include vehicles 102 A, 102 B, 102 N, 103 A, 103 N that are near to a transportation service start point (e.g., within a threshold distance, within a threshold drive time, etc.).
  • the candidate vehicles 102 A, 102 B, 102 N, 103 A, 103 N are limited to vehicles capable of executing the transportation service.
  • a transportation service that involves moving a large cargo object may be executable only by vehicles 102 A, 102 B, 102 N, 103 A, 103 N having sufficient space to carry the large object.
  • a transportation service that involves moving, for example, five passengers may be executable only by autonomous vehicles 102 A, 102 B, 102 N having sufficient space to carry five passengers.
  • a transportation service that involves traversing a portion of roadway that is not accessible to an autonomous vehicle 102 A, 102 B, 102 N may be executable only by a human-driven vehicle 103 A, 103 N.
  • the transportation service selection engine 112 provides an indication of the candidate vehicles 102 A, 102 B, 102 N, 103 A, 103 N to the routing engine 110 .
  • the routing engine 110 generates candidate routes for some or all of the set of candidate vehicles 102 A, 102 B, 102 N, 103 A, 103 N.
  • the candidate routes may begin at the location of a candidate vehicle and extend to the transportation service start point and transportation service end point. If the transportation service includes one or more waypoints, the candidate routes will also pass these waypoints.
  • Candidate routes determined by the routing engine 110 are provided to the transportation service selection engine 112 .
  • the transportation service selection engine 112 uses the candidate routes to select a vehicle 102 A, 102 B, 102 N, 103 A, 103 N best suited to execute the transportation service.
  • the candidate vehicle 102 A, 102 B, 102 N, 103 A, 103 N best suited to execute a transportation service may be the candidate vehicle 102 A, 102 B, 102 N, 103 A, 103 N having the lowest-cost route for the transportation service.
  • the transportation service selection engine 112 uses other metrics associated with particular types of autonomous vehicles 102 A, 102 B, 102 N in addition to or instead of the candidate routes to select an autonomous vehicle 102 A, 102 B, 102 N for executing a transportation service.
  • the transportation service selection engine 112 can weigh the cost of the candidate routes based on type metrics associated with the candidate autonomous vehicles 102 A, 102 B, 102 N.
  • Non-limiting examples of type metrics include, for example, an estimated time of arrival (ETA) at the service start location, an estimated drop-off time (ETD) at the service end location, a price to the user 114 A, 114 B, 114 N, an average customer rating for the vehicle 102 A, 102 B, 102 N, 103 A, 103 N and/or a manufacturer or manager of the vehicle, an availability status of the vehicles 102 A, 102 B, 102 N, 103 A, 103 N, an acceptance rate for the vehicles 102 A, 102 B, 102 N, 103 A, 103 N, etc.
  • the transportation service selection engine 112 offers the requested transportation service to the selected vehicle 102 A, 102 B, 102 N, 103 A, 103 N and instructs the vehicle 102 A, 102 B, 102 N, 103 A, 103 N to begin traversing the route associated with the transportation service.
  • the selected vehicle 102 A, 102 B, 102 N, 103 A, 103 N may optionally decline the transportation service.
  • the transportation service selection engine 112 may offer the transportation service to another vehicle 102 A, 102 B, 102 N, 103 A, 103 N, for example, a vehicle having the next-lowest cost candidate route and/or the next most favorable combination of candidate route and other metrics.
  • the routing engine 110 generates routes utilizing, for example, a routing graph 124 in conjunction with routing graph modification data 120 such as, for example, policy data, vehicle capability data, and/or operational routing graph modification data.
  • the routing graph 124 is a representation of the roadways in a geographic area.
  • the routing graph 124 represents the roadway as a set of graph elements, where the graph elements correspond to roadway elements as described herein.
  • the routing graph 124 also indicates directionality, connectivity, and cost for the various corresponding roadway elements.
  • Directionality indicates the direction of travel in a roadway element.
  • Connectivity describes roadway element connections indicating possible transitions between roadway elements.
  • Cost describes the cost for a vehicle 102 A, 102 B, 102 N, 103 A, 103 N to traverse a roadway element and/or transition between two roadway elements.
  • a break-out window 126 shows example roadway elements that can correspond to the graph elements of the routing graph 124 .
  • Roadway elements in the break-out window 126 are illustrated as shapes with arrows indicating the directionality of the roadway elements. Roadway elements can be connected to one another according to their directionality.
  • the routing engine 110 is configured to utilize routing graph modification data 120 to generate constrained routing graph 109 data.
  • Routing graph modification data 120 indicates routing graph modifications that are applied to the routing graph 124 to generate a constrained routing graph 109 .
  • a routing graph modification is a change to a routing graph (e.g., a general-purpose routing graph) that reflects various factors including, for example, capabilities of the vehicle that is to execute a route, current roadway conditions, business policy considerations, and so on.
  • a routing graph modification includes a graph element descriptor and a constraint.
  • a graph element descriptor is data describing one or more graph elements that are the subject of a routing graph modification.
  • a graph element descriptor can describe graph elements using one or more graph element properties.
  • a graph element property is anything that describes a graph element and/or its corresponding roadway element.
  • Example graph element properties include, for example, a unique identifier for the graph element, a roadway type of the corresponding roadway element (e.g., divided highway, urban street, etc.), a driving rule of the roadway element associated with the graph element (e.g., speed limit, access limitations), a type of maneuver necessary to enter, exit, and/or traverse the corresponding roadway element, whether the corresponding roadway element leads to a specific type of roadway element (e.g., dead end, divided highway, etc.), and so on.
  • a graph element descriptor including a unique indicator of a particular graph element can be used to generate a routing graph modification that is applied to the particular graph element.
  • a constraint is an action applied to graph elements at a routing graph that are described by the graph element descriptor of a routing graph modification.
  • Example constraints that may be applied to a graph element include removing the graph element from the routing graph, modifying (e.g., removing) transitions to or from a graph element, changing a cost associated with a graph element or transitions involving the graph element, etc. Costs can be changed up or down. For example, if the routing graph modification data 120 indicates that graph elements having a particular graph element property or set of graph element properties are disfavored, the costs to traverse and/or transition to the corresponding roadway elements can be increased. On the other hand, if the routing graph modification data 120 indicates that graph elements having a particular graph element property or set of graph element properties are favored, the costs to traverse and/or transition to the corresponding roadway elements can be decreased.
  • Another example constraint can include changing a required or recommended autonomous vehicle mode.
  • a graph element can be modified to indicate that an autonomous vehicle traversing the roadway element corresponding to the graph element should be operated in a semi-autonomous or manual mode.
  • a routing graph modification may include graph element descriptor data identifying graph elements that correspond to roadway elements having a school zone.
  • a corresponding constraint includes removing the graph elements corresponding to such school zone roadway elements from the routing graph 124 and/or removing transitions to such school zone roadway elements.
  • a constraint can be applied to graph elements other than those indicated by the graph element descriptor data.
  • the associated constraint could involve removing connectivity to graph elements corresponding to cul-de-sac roadway elements and also removing graph elements corresponding to roadway elements that do not include cul-de-sacs, but can lead only to other roadway elements that do include cul-de-sacs.
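The descriptor-plus-constraint pattern described above can be sketched as follows. This is a minimal, hypothetical illustration that reuses the RoutingGraph structure sketched earlier; the patent does not prescribe an implementation, and the penalty factor is an assumption:

```python
import copy

def apply_modification(graph, matches, constraint, penalty=10.0):
    """Apply one routing graph modification to produce a constrained copy of
    the general-purpose routing graph. `matches` plays the role of the graph
    element descriptor (a predicate over graph element properties); the
    constraint either removes connectivity to the matched elements or scales
    the cost of transitions into them."""
    constrained = copy.deepcopy(graph)
    affected = {eid for eid, el in constrained.elements.items() if matches(el)}
    for element in constrained.elements.values():
        for eid in list(element.successors):
            if eid in affected:
                if constraint == "remove":
                    del element.successors[eid]          # close off the transition
                elif constraint == "penalize":
                    element.successors[eid] *= penalty   # disfavor the transition
    return constrained
```

A school-zone policy modification, for example, could pass a predicate matching a (hypothetical) school-zone property of graph elements together with the "remove" constraint.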
  • Routing graph modification data can also include routing graph constraints related to vehicle capability, which can differ for vehicles of different types (e.g., autonomous vehicles, human-driven vehicles, different types of autonomous vehicles, etc.).
  • Vehicle capability of an autonomous vehicle 102 A, 102 B, 102 N may be and/or be derived from operation domain (OD) and/or operational design domain (ODD) data, if any, provided by the vehicle's manufacturer.
  • vehicle capability is supplemented based on the performance of an autonomous vehicle 102 A, 102 B, 102 N or type of autonomous vehicle in executing transportation services.
  • Routing graph modifications based on vehicle capability can include, for example, routing graph modifications that identify graph elements corresponding to roadway elements having a property or properties (e.g., including an unprotected left, being part of a controlled access highway, etc.) and constraint data indicating what is to be done to route components having the indicated property or properties.
  • the graph elements corresponding to roadway elements that a particular type of autonomous vehicle 102 A, 102 B, 102 N is not capable of traversing can be removed from the routing graph or can have connectivity data modified to remove transitions to those graph elements. For example, one or more connections to a graph element may be removed. If the properties of a graph element indicate that it corresponds to a roadway element including a maneuver that is undesirable for a vehicle, but not forbidden, then the routing engine 110 can increase the cost of the graph element and/or transitions thereto.
  • routing graph modifications that can be described by the routing graph modification data 120 may include, for example, policy routing graph modifications and operational routing graph modifications.
  • Policy routing graph modifications include graph element properties that identify the roadway elements subject to the modification, along with the corresponding constraints. Policy routing graph modifications refer to types of roadway elements that it is desirable for a vehicle to avoid or prioritize.
  • An example policy routing graph modification is to avoid roadway elements that are in or pass through school zones.
  • Another example policy routing graph modification is to avoid routing vehicles through residential neighborhoods.
  • Yet another example policy routing graph modification is to favor routing vehicles on controlled-access highways, if available. Policy routing graph modifications can apply to some vehicles, some vehicle types, all vehicles, or all vehicle types.
  • Operational routing graph modifications can be based, for example, on the state of one or more roadways. For example, if a roadway is to be closed for a parade or for construction, an operational routing graph modification identifies properties (e.g., names or locations) of roadway elements that are part of the closure and an associated routing graph modification (e.g., removing the corresponding graph elements, removing transitions to the corresponding graph elements, etc.).
  • the routing engine 110 applies the routing graph modification data 120 to generate the constrained routing graph 109 .
  • the constrained routing graph 109 is used to generate a route for a vehicle 102 A, 102 B, 102 N, 103 A, 103 N.
  • different constrained routing graphs 109 are generated for different types of vehicles 102 A, 102 B, 102 N, 103 A, 103 N.
  • a human-driven vehicle 103 A, 103 N may have a different set of routing graph modifications than an autonomous vehicle 102 A, 102 B, 102 N.
  • the constrained routing graph 109 can be pre-generated and/or generated on an as-needed basis as routes are determined.
  • the routing engine 110 determines a route for the autonomous vehicle 102 A, 102 B, 102 N, for example, by applying a path-planning algorithm to the constrained routing graph 109 to find the lowest-cost route for the vehicle.
  • a path-planning algorithm can be used, such as, for example, A*, D*, Focused D*, D*Lite, GD*, or Dijkstra's algorithm.
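As one concrete example of the path-planning step, the sketch below applies Dijkstra's algorithm (one of the algorithms named above) to the hypothetical routing graph structure from the earlier sketch:

```python
import heapq

def lowest_cost_route(graph, start_id, goal_id):
    """Dijkstra's algorithm over the constrained routing graph: returns the
    lowest-cost string of connected graph elements from the vehicle start
    location to the vehicle end location, or None if no route exists (e.g.,
    because connectivity was removed by a routing graph modification)."""
    frontier = [(0.0, start_id, [start_id])]
    best_cost = {start_id: 0.0}
    while frontier:
        cost, current, path = heapq.heappop(frontier)
        if current == goal_id:
            return path
        if cost > best_cost.get(current, float("inf")):
            continue  # stale queue entry
        for succ_id, step_cost in graph.neighbors(current):
            new_cost = cost + step_cost
            if new_cost < best_cost.get(succ_id, float("inf")):
                best_cost[succ_id] = new_cost
                heapq.heappush(frontier, (new_cost, succ_id, path + [succ_id]))
    return None
```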
  • a generated route can include a string of connected graph elements that correspond to roadway elements between a vehicle start location and a vehicle end location.
  • a vehicle start location is an initial roadway element of a route.
  • a vehicle end location is a last roadway element of a route.
  • the vehicle start location is a current location of the relevant vehicle 102 A, 102 B, 102 N, 103 A, 103 N
  • the vehicle end location is the end location for the requested transportation service.
  • the autonomous vehicle 102 A, 102 B, 102 N can travel from its current location to the transportation service start location, and then proceed to the transportation service end location traversing transportation service waypoints (if any) along the way.
  • FIG. 1 also shows a remote user 118 and remote user computing device 119 .
  • the remote user 118 may assist an autonomous vehicle 102 A, 102 B, 102 N. For example, if the autonomous vehicle 102 A, 102 B, 102 N encounters a roadway element that the autonomous vehicle 102 A, 102 B, 102 N cannot traverse without assistance, it may send a request for assistance to the remote user 118 .
  • the remote user 118 utilizes the remote user computing device 119 to receive data indicating the state of the autonomous vehicle 102 A, 102 B, 102 N and provide instructions to the autonomous vehicle 102 A, 102 B, 102 N.
  • the remote user computing device 119 can be or include any suitable computing device such as, for example, tablet computers, mobile telephone devices, laptop computers, desktop computers, etc. Additional details describing the remote user 118 are provided herein with respect to FIG. 2 .
  • the service assignment system 104 can also include an impact engine 108 that is configured to generate impact scores for graph elements of the routing graph 124 .
  • Although the impact engine 108 is shown in FIG. 1 as a component of the service assignment system 104, in some examples, the impact engine 108 is implemented as an independent system or as a component of another system.
  • the impact score for a roadway element indicates the impact of applying a considered routing graph modification to the graph element corresponding to the roadway element.
  • the considered routing graph modification can be, for example, a change to the cost associated with the graph element and/or a change to the connectivity of the graph element to other parts of the routing graph.
  • the impact score can be expressed in various different ways including, for example, an amount of lost time or an amount of lost revenue due to the considered routing graph modification. In some examples, the impact score is expressed as a cost value that is based on multiple factors such as, for example, lost time or lost revenue.
  • An impact score can be generated for a single roadway element or for multiple roadway elements. For example, if a particular roadway condition affects more than one roadway element, a single impact score can be generated for multiple affected roadway elements.
  • the impact engine 108 can determine impact scores in various different ways.
  • the impact engine 108 generates a set of predicted transportation service requests over a future time period (e.g., the next 30 minutes, the next hour, the next day).
  • the future time period may be selected based on a time period when the considered corresponding graph element will be constrained. For example, if the considered corresponding graph element will be constrained between 4:00 p.m. and 5:00 p.m. on a Wednesday, the future time period matches that.
  • the set of predicted transportation service requests can be generated, for example, from data describing historic service requests.
  • the impact engine 108 generates the set of predicted transportation service requests considering other conditions.
  • the impact engine 108 may generate the set of predicted transportation service requests based on historical data describing transportation service requests in weather and/or traffic conditions similar to those forecast for the future time period. For example, if the future time period includes a county fair, festival, or other activity at or near the considered corresponding graph element, the impact engine 108 generates the set of predicted transportation requests considering historical data describing transportation service requests during similar events.
  • the impact engine 108 may also generate a set of predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N that are available for executing requested transportation services during the future time period.
  • the set of predicted vehicles can include autonomous vehicles 102 A, 102 B, 102 N and, in some examples, human-driven vehicles 103 A, 103 N.
  • the set of predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N can be generated in a manner similar to that used to generate the set of predicted transportation service requests.
  • the impact engine 108 may consider historical data describing the number of vehicles 102 A, 102 B, 102 N, 103 A, 103 N available during time periods similar to the future time period such as, for example, times on a similar day of the week, on similar times of day, having similar traffic and/or weather conditions, etc.
  • generating the set of predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N also includes generating predicted initial vehicle locations. This can also be performed using historical data, as described herein.
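A minimal sketch of the historical-matching idea described above might look like the following; the field names are assumptions for illustration:

```python
def predict_requests(historical_requests, future_window):
    """Assemble a set of predicted transportation service requests by
    selecting historical requests whose context matches the future time
    period (same weekday and hour band here; weather, traffic, and special
    events could be matched the same way)."""
    return [
        r for r in historical_requests
        if r["weekday"] == future_window["weekday"]
        and future_window["start_hour"] <= r["hour"] < future_window["end_hour"]
    ]
```

The set of predicted vehicles and their initial locations could be drawn from historical availability data with the same kind of filter.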
  • the impact engine 108 uses the set of predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N and the set of predicted transportation service requests to generate first and second simulations of the future time period.
  • For each predicted service request of the set of predicted service requests, the service assignment system 104 generates candidate routes and selects a vehicle 102 A, 102 B, 102 N, 103 A, 103 N for executing the candidate route.
  • the first simulation generates the candidate routes without applying the considered routing graph modification to the graph elements corresponding to the considered one or more roadway elements.
  • the considered one or more roadway elements may be the subject of other routing graph modifications, described by routing graph modification data 120 , but may not have the considered routing graph modification applied for the first simulation.
  • the second simulation generates candidate routes based on the considered routing graph modification. For example, if the considered routing graph modification will close the considered roadway element (e.g., by removing routing graph connectivity to its corresponding graph element), then the version of the routing graph 124 used for the first simulation may not close the considered roadway element. In the second simulation, the considered routing graph modification is applied to the considered roadway element (e.g., to its corresponding graph element).
  • the impact score for the considered roadway element is determined based on a comparison between the first simulation and the second simulation.
  • the impact score may take different forms in different embodiments.
  • the impact score is a comparison between metrics, such as time of arrival, drop-off time, proportion of transportation services assigned to autonomous vehicles, completed transportation services, etc.
  • the impact score is an aggregation based on a combination of metrics.
  • the considered routing graph modification used to generate the second simulation applies to all of the considered roadway elements.
  • the second simulation may be generated based on a version of the routing graph 124 in which the considered routing graph modification is applied to all of the considered roadway elements (e.g., to their corresponding graph elements).
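Putting the two simulations together, an impact score could be computed along the following lines. This is a sketch under stated assumptions: the `simulate` function, the metrics record fields, and the weighting are all hypothetical:

```python
def impact_score(simulate, base_graph, modified_graph, requests, vehicles):
    """Score a considered routing graph modification by comparing the two
    simulations described above. `simulate` is assumed to replay the
    predicted requests against the predicted vehicles and return one
    metrics record per completed transportation service."""
    baseline = simulate(base_graph, requests, vehicles)
    constrained = simulate(modified_graph, requests, vehicles)

    def total_minutes(results):
        return sum(r["eta_minutes"] + r["etd_minutes"] for r in results)

    lost_time = total_minutes(constrained) - total_minutes(baseline)
    lost_services = len(baseline) - len(constrained)
    return lost_time + 100.0 * lost_services  # assumed per-service weight
```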
  • the impact engine 108 is also configured to apply an impact score for a considered roadway element.
  • the generation of the impact score is initiated when a vehicle 102 A, 102 B, 102 N, 103 A, 103 N encounters difficulties traversing a roadway element or roadway elements and places a request for assistance to the remote user 118 .
  • the impact engine 108 may generate an impact score for the roadway element or elements that were the subject of the request for assistance and provide the impact score to the remote user 118 .
  • the remote user 118 may utilize the provided impact score to determine whether to apply a routing graph modification to the graph element or elements corresponding to the subject roadway element or elements.
  • the impact engine 108 is configured to automatically apply a routing graph modification based on a calculated impact score. For example, if the impact score for a roadway element or elements meets a threshold value, the impact engine 108 may apply a routing graph modification to the corresponding graph element or elements.
  • the impact engine 108 takes different actions based on different thresholds. For example, if the impact score meets a first threshold, then the impact engine 108 may take no action. If the impact score meets a second threshold indicating a lower impact than the first threshold, then the impact engine 108 may provide the remote user 118 with discretion to apply or not apply a routing graph modification to the corresponding graph element or elements. If the impact score meets a third threshold indicating a lower impact than the second threshold, the impact engine 108 may automatically apply a routing graph modification to the corresponding graph element or elements.
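The tiered thresholds described above might be expressed as a simple dispatch; the threshold values here are placeholders, not values from the patent:

```python
def action_for_score(score, high_impact=500.0, moderate_impact=100.0):
    """Tiered handling of an impact score, mirroring the thresholds above:
    high impact -> take no action, moderate impact -> leave the decision to
    the remote user, low impact -> apply the modification automatically."""
    if score >= high_impact:
        return "no_action"
    if score >= moderate_impact:
        return "remote_user_discretion"
    return "auto_apply_modification"
```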
  • FIG. 2 is a diagram showing one example of an environment 200 illustrating a remote user 118 utilizing an impact score.
  • the environment 200 shows the autonomous vehicle 102 A and remote detection sensor set 106 A.
  • the vehicle 102 A is traveling on a roadway 204 .
  • the vehicle 102 A has encountered road construction 206 .
  • a human traffic director 208 is directing traffic around the road construction 206 .
  • the human traffic director 208 may direct traffic, including the autonomous vehicle 102 A, over a detour route 209 .
  • a vehicle autonomy system of the vehicle 102 A may detect that it is unable to successfully navigate the road construction 206 .
  • the vehicle autonomy system makes a request for assistance to an autonomous vehicle operations center 212 via a wireless communications link 214 .
  • Although the wireless communications link 214 is illustrated as a cellular communications link that includes a cellular tower 216, other embodiments may use other types of wireless communications links, such as those provided via satellite or other communications technologies.
  • the request for assistance identifies the vehicle 102 A and/or a location of the vehicle 102 A.
  • the location may be indicated in any suitable manner such as, for example, a current corresponding graph element, a latitude and longitude, etc.
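A request for assistance carrying the fields mentioned above could be modeled as a small record; the field names are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AssistanceRequest:
    """Hypothetical payload for a request for assistance: it identifies the
    vehicle and carries the location in one of the forms mentioned above."""
    vehicle_id: str
    graph_element_id: Optional[str] = None  # current corresponding graph element
    latitude: Optional[float] = None
    longitude: Optional[float] = None
```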
  • the vehicle 102 A is connected to the remote user 118 at the vehicle operations center 212 .
  • the remote user 118 receives output from and provides input to a console 222 .
  • the console 222 may make up all or part of the remote user computing device 119 shown in FIG. 1 .
  • the console 222 is configured to receive input from the remote user 118 and provide control signals to the autonomous vehicle 102 A based on the input via the communications link 214 .
  • the remote user 118 is also presented with visual output derived from at least one of the sensors 106 A of the vehicle 102 A. This visual output is presented on an operator display screen 221 .
  • the operator display screen 221 may also be a component of the remote user computing device 119 .
  • Based at least on observations of the operator display screen 221 , the remote user 118 provides input via the operator console 222 . After receiving control signals that are derived from the operator's input, the autonomous vehicle 102 A then proceeds according to those control signals. For example, the input from the remote user 118 via the console 222 can cause the vehicle 102 A to follow direction from the human traffic director 208 and, for example, follow the detour route 209 when so directed (e.g., by the remote user 118 ). Control signals provided to the vehicle 102 A can include, for example, direct control of the brakes, throttle, steering, or other controls of the vehicle 102 A.
  • control signals provided to the vehicle 102 A can include a general instruction (e.g., deviate from the center of the lane to go around the road construction 206 ).
  • the remote user 118 may provide input to the operator console 222 indicating that the autonomous vehicle 102 A is released from manual direction. The autonomous vehicle 102 A may then continue navigating toward its destination without further operator assistance.
  • FIG. 3 depicts a block diagram of an example vehicle 300 according to example aspects of the present disclosure.
  • the vehicle 300 includes one or more sensors 301 , a vehicle autonomy system 302 , and one or more vehicle controls 307 .
  • the vehicle 300 is an autonomous vehicle, as described herein.
  • the example vehicle 300 shows just one example arrangement of an autonomous vehicle. In some examples, autonomous vehicles of different types can have different arrangements.
  • the vehicle autonomy system 302 includes a commander system 311 , a navigator system 313 , a perception system 303 , a prediction system 304 , a motion planning system 305 , and a localizer system 330 that cooperate to perceive the surrounding environment of the vehicle 300 and determine a motion plan for controlling the motion of the vehicle 300 accordingly.
  • the vehicle autonomy system 302 is engaged to control the vehicle 300 or to assist in controlling the vehicle 300 .
  • the vehicle autonomy system 302 receives sensor data from the one or more sensors 301 , attempts to comprehend the environment surrounding the vehicle 300 by performing various processing techniques on data collected by the sensors 301 , and generates an appropriate route through the environment.
  • the vehicle autonomy system 302 sends commands to control the one or more vehicle controls 307 to operate the vehicle 300 according to the route.
  • the vehicle autonomy system 302 receives sensor data from the one or more sensors 301 .
  • the sensors 301 may include remote-detection sensors as well as motion sensors such as an inertial measurement unit (IMU), one or more encoders, or one or more odometers.
  • the sensor data includes information that describes the location of objects within the surrounding environment of the vehicle 300 , information that describes the motion of the vehicle 300 , etc.
  • the sensors 301 may also include one or more remote-detection sensors or sensor systems, such as a LIDAR system, a RADAR system, one or more cameras, etc.
  • a LIDAR system of the one or more sensors 301 generates sensor data (e.g., remote-detection sensor data) that includes the location (e.g., in three-dimensional space relative to the LIDAR system) of a number of points that correspond to objects that have reflected a ranging laser.
  • the LIDAR system measures distances by measuring the Time of Flight (TOF) that it takes a short laser pulse to travel from the sensor to an object and back, calculating the distance from the known speed of light.
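  • As a concrete illustration of the TOF relationship, the following minimal Python sketch (names are hypothetical; the disclosure does not specify an implementation) converts a measured round-trip pulse time to a one-way range:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # meters per second

def lidar_range_from_tof(round_trip_seconds: float) -> float:
    """Convert a round-trip laser pulse time to a one-way distance.

    The pulse travels to the object and back, so the one-way range
    is half the total distance covered at the speed of light.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

# A 0.5-microsecond round trip corresponds to roughly 75 m.
print(lidar_range_from_tof(0.5e-6))  # ~74.9
```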
  • a RADAR system of the one or more sensors 301 generates sensor data (e.g., remote-detection sensor data) that includes the location (e.g., in three-dimensional space relative to the RADAR system) of a number of points that correspond to objects that have reflected ranging radio waves.
  • radio waves (e.g., pulsed or continuous) transmitted by the RADAR system reflect off an object and return to a receiver of the RADAR system, giving information about the object's location and speed.
  • a RADAR system provides useful information about the current speed of an object.
  • one or more cameras of the one or more sensors 301 may generate sensor data (e.g., remote-detection sensor data) including still or moving images.
  • various processing techniques (e.g., range imaging techniques such as structure from motion, structured light, stereo triangulation, and/or other techniques) can be performed to identify the location (e.g., in three-dimensional space relative to the one or more cameras) of points that correspond to objects depicted in the still or moving images.
  • Other sensor systems can identify the location of points that correspond to objects as well.
  • the one or more sensors 301 can include a positioning system.
  • the positioning system determines a current position of the vehicle 300 .
  • the positioning system can be any device or circuitry for analyzing the position of the vehicle 300 .
  • the positioning system can determine a position by using one or more of inertial sensors, a satellite positioning system such as the Global Positioning System (GPS), a positioning system based on IP address, triangulation and/or proximity to network access points or other network components (e.g., cellular towers, Wi-Fi access points), and/or other suitable techniques.
  • the position of the vehicle 300 can be used by various systems of the vehicle autonomy system 302 .
  • the one or more sensors 301 are used to collect sensor data that includes information that describes the location (e.g., in three-dimensional space relative to the vehicle 300 ) of points that correspond to objects within the surrounding environment of the vehicle 300 .
  • the sensors 301 can be positioned at various different locations on the vehicle 300 .
  • one or more cameras and/or LIDAR sensors can be located in a pod or other structure that is mounted on a roof of the vehicle 300
  • one or more RADAR sensors can be located in or behind the front and/or rear bumper(s) or body panel(s) of the vehicle 300 .
  • one or more cameras can be located at the front or rear bumper(s) of the vehicle 300 .
  • Other locations can be used as well.
  • the localizer system 330 receives some or all of the sensor data from the sensors 301 and generates vehicle poses for the vehicle 300 .
  • a vehicle pose describes a position and attitude of the vehicle 300 .
  • the vehicle pose (or portions thereof) can be used by various other components of the vehicle autonomy system 302 including, for example, the perception system 303 , the prediction system 304 , the motion planning system 305 , and the navigator system 313 .
  • the position of the vehicle 300 is a point in a three-dimensional space. In some examples, the position is described by values for a set of Cartesian coordinates, although any other suitable coordinate system may be used.
  • the attitude of the vehicle 300 generally describes the way in which the vehicle 300 is oriented at its position. In some examples, attitude is described by a yaw about the vertical axis, a pitch about a first horizontal axis, and a roll about a second horizontal axis.
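  • For illustration only, a vehicle pose combining the position and attitude described above might be represented as follows (a sketch with hypothetical names; the disclosure does not mandate any particular data layout):

```python
from dataclasses import dataclass

@dataclass
class VehiclePose:
    """Position and attitude of the vehicle at a given time stamp."""
    timestamp: float  # seconds; the point in time the pose describes
    # Position: a point in three-dimensional space (Cartesian here,
    # although any other suitable coordinate system may be used).
    x: float
    y: float
    z: float
    # Attitude: how the vehicle is oriented at that position.
    yaw: float    # rotation about the vertical axis (radians)
    pitch: float  # rotation about a first horizontal axis (radians)
    roll: float   # rotation about a second horizontal axis (radians)
```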
  • the localizer system 330 generates vehicle poses periodically (e.g., every second, every half second). The localizer system 330 appends time stamps to vehicle poses, where the time stamp for a pose indicates the point in time that is described by the pose. The localizer system 330 generates vehicle poses by comparing sensor data (e.g., remote-detection sensor data) to map data 326 describing the surrounding environment of the vehicle 300 .
  • the localizer system 330 includes one or more pose estimators and a pose filter.
  • Pose estimators generate pose estimates by comparing remote-detection sensor data (e.g., LIDAR, RADAR) to map data.
  • the pose filter receives pose estimates from the one or more pose estimators as well as other sensor data such as, for example, motion sensor data from an IMU, encoder, or odometer.
  • the pose filter executes a Kalman filter or machine learning algorithm to combine pose estimates from the one or more pose estimators with motion sensor data to generate vehicle poses.
  • pose estimators generate pose estimates at a frequency less than the frequency at which the localizer system 330 generates vehicle poses. Accordingly, the pose filter generates some vehicle poses by extrapolating from a previous pose estimate utilizing motion sensor data.
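  • The extrapolation step might look like the following sketch, which reuses the hypothetical VehiclePose above and assumes a planar constant-velocity motion model (the disclosure requires only that motion sensor data be combined with a previous pose estimate, e.g., by a Kalman filter):

```python
import math
from dataclasses import replace  # works on the VehiclePose dataclass above

def extrapolate_pose(last_pose, speed_m_s, yaw_rate_rad_s, dt_s):
    """Advance a pose by dt_s seconds using odometer/IMU readings.

    Stands in for the pose filter's prediction step between the
    less frequent map-based pose estimates.
    """
    new_yaw = last_pose.yaw + yaw_rate_rad_s * dt_s
    return replace(
        last_pose,
        timestamp=last_pose.timestamp + dt_s,
        x=last_pose.x + speed_m_s * dt_s * math.cos(new_yaw),
        y=last_pose.y + speed_m_s * dt_s * math.sin(new_yaw),
        yaw=new_yaw,
    )
```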
  • Vehicle poses and/or vehicle positions generated by the localizer system 330 are provided to various other components of the vehicle autonomy system 302 .
  • the commander system 311 may utilize a vehicle position to determine whether to respond to a call from a service assignment system 340 .
  • the commander system 311 determines a set of one or more target locations that are used for routing the vehicle 300 .
  • the target locations are determined based on user input received via a user interface 309 of the vehicle 300 .
  • the user interface 309 may include and/or use any suitable input/output device or devices.
  • the commander system 311 determines the one or more target locations considering data received from the service assignment system 340 .
  • the service assignment system 340 is programmed to provide instructions to multiple vehicles, for example, as part of a fleet of vehicles for moving passengers and/or cargo. Data from the service assignment system 340 can be provided via a wireless network, for example.
  • the navigator system 313 receives one or more target locations from the commander system 311 and map data 326 .
  • the map data 326 provides detailed information about the surrounding environment of the vehicle 300 .
  • the map data 326 provides information regarding identity and location of different roadways and roadway elements.
  • a roadway is a place where the vehicle 300 can drive and may include, for example, a road, a street, a highway, a lane, a parking lot, or a driveway.
  • Routing graph data is a type of map data 326 .
  • From the one or more target locations and the map data 326 , the navigator system 313 generates route data describing a route for the vehicle 300 to take to arrive at the one or more target locations. In some implementations, the navigator system 313 determines route data using one or more path-planning algorithms based on costs for graph elements/corresponding roadway elements, as described herein and as sketched in the example below. For example, a cost for a route can indicate a time of travel, risk of danger, or other factor associated with adhering to a particular candidate route. Route data describing a route is provided to the motion planning system 305 , which commands the vehicle controls 307 to implement the route or route extension, as described herein. The navigator system 313 can generate routes as described herein using a general-purpose routing graph and routing graph modification data. Also, in examples where route data is received from the service assignment system 340 , that route data can also be provided to the motion planning system 305 .
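  • The disclosure does not prescribe a particular path-planning algorithm. As one hedged illustration, a uniform-cost (Dijkstra) search over a routing graph whose edges carry the costs described above could be sketched as follows (all names hypothetical; graph elements are keyed by string identifiers):

```python
import heapq

def plan_route(routing_graph, start, goal):
    """Find a lowest-cost path over a routing graph.

    routing_graph: dict mapping a graph element ID to a list of
        (neighbor_id, cost) pairs, where cost can reflect travel
        time, risk of danger, or other factors.
    Returns the list of graph elements from start to goal, or None
    if the goal is unreachable (e.g., because a routing graph
    modification removed connectivity).
    """
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost_so_far, element, path = heapq.heappop(frontier)
        if element == goal:
            return path
        if element in visited:
            continue
        visited.add(element)
        for neighbor, edge_cost in routing_graph.get(element, []):
            if neighbor not in visited:
                heapq.heappush(
                    frontier,
                    (cost_so_far + edge_cost, neighbor, path + [neighbor]),
                )
    return None
```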
  • the perception system 303 detects objects in the surrounding environment of the vehicle 300 based on sensor 301 data, the map data 326 , and/or vehicle poses provided by the localizer system 330 .
  • the map data 326 used by the perception system 303 describes roadways and segments thereof and may also describe buildings or other items or objects (e.g., lampposts, crosswalks, curbing); location and directions of traffic lanes or lane segments (e.g., the location and direction of a parking lane, a turning lane, a bicycle lane, or other lanes within a particular roadway); traffic control data (e.g., the location and instructions of signage, traffic lights, or other traffic control devices); and/or any other map data that provides information that assists the vehicle autonomy system 302 in comprehending and perceiving its surrounding environment and its relationship thereto.
  • the perception system 303 determines state data for one or more of the objects in the surrounding environment of the vehicle 300 .
  • State data describes a current state of an object (also referred to as features of the object).
  • the state data for each object describes, for example, an estimate of the object's current location (also referred to as position); current speed (also referred to as velocity); current acceleration; current heading; current orientation; size/shape/footprint (e.g., as represented by a bounding shape such as a bounding polygon or polyhedron); type/class (e.g., vehicle, pedestrian, bicycle, or other); yaw rate; distance from the vehicle 300 ; minimum path to interaction with the vehicle 300 ; minimum time duration to interaction with the vehicle 300 ; and/or other state information.
  • the perception system 303 determines state data for each object over a number of iterations. In particular, the perception system 303 updates the state data for each object at each iteration. Thus, the perception system 303 detects and tracks objects, such as other vehicles, that are proximate to the vehicle 300 over time.
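  • A minimal sketch of this iterative tracking (the names and record structure are hypothetical), in which state data is overwritten each perception cycle and stale objects are dropped:

```python
def update_tracks(tracks, detections, timestamp, stale_after_s=1.0):
    """Update per-object state data for one perception iteration.

    tracks: dict mapping object ID -> state data (location, speed,
        heading, type/class, etc.).
    detections: list of (object_id, state) pairs from this cycle.
    """
    for object_id, state in detections:
        state['last_seen'] = timestamp
        tracks[object_id] = state  # replace with the latest state
    # Drop objects that are no longer observed (no longer proximate).
    for object_id in [oid for oid, s in tracks.items()
                      if timestamp - s['last_seen'] > stale_after_s]:
        del tracks[object_id]
    return tracks
```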
  • the prediction system 304 is configured to predict one or more future positions for an object or objects in the environment surrounding the vehicle 300 (e.g., an object or objects detected by the perception system 303 ).
  • the prediction system 304 generates prediction data associated with one or more of the objects detected by the perception system 303 .
  • the prediction system 304 generates prediction data describing each of the respective objects detected by the perception system 303 .
  • Prediction data for an object is indicative of one or more predicted future locations of the object.
  • the prediction system 304 may predict where the object will be located within the next 5 seconds, 30 seconds, 200 seconds, etc.
  • Prediction data for an object may indicate a predicted trajectory (e.g., predicted path) for the object within the surrounding environment of the vehicle 300 .
  • the prediction system 304 generates prediction data for an object, for example, based on state data generated by the perception system 303 . In some examples, the prediction system 304 also considers one or more vehicle poses generated by the localizer system 330 and/or map data 326 .
  • the prediction system 304 uses state data indicative of an object type or classification to predict a trajectory for the object.
  • the prediction system 304 can use state data provided by the perception system 303 to determine that a particular object (e.g., an object classified as a vehicle) approaching an intersection and maneuvering into a left-turn lane intends to turn left. In such a situation, the prediction system 304 predicts a trajectory (e.g., path) corresponding to a left turn for the vehicle such that the vehicle turns left at the intersection.
  • the prediction system 304 determines predicted trajectories for other objects, such as bicycles, pedestrians, parked vehicles, etc.
  • the prediction system 304 provides the predicted trajectories associated with the object(s) to the motion planning system 305 .
  • the prediction system 304 is a goal-oriented prediction system 304 that generates one or more potential goals, selects one or more of the most likely potential goals, and develops one or more trajectories by which the object can achieve the one or more selected goals.
  • the prediction system 304 can include a scenario generation system that generates and/or scores the one or more goals for an object, and a scenario development system that determines the one or more trajectories by which the object can achieve the goals.
  • the prediction system 304 can include a machine-learned goal-scoring model, a machine-learned trajectory development model, and/or other machine-learned models.
  • the motion planning system 305 commands the vehicle controls 307 based at least in part on the predicted trajectories associated with the objects within the surrounding environment of the vehicle 300 , the state data for the objects provided by the perception system 303 , vehicle poses provided by the localizer system 330 , the map data 326 , and route or route extension data provided by the navigator system 313 . Stated differently, given information about the current locations of objects and/or predicted trajectories of objects within the surrounding environment of the vehicle 300 , the motion planning system 305 determines control commands for the vehicle 300 that best navigate the vehicle 300 along the route or route extension relative to the objects at such locations and their predicted trajectories on acceptable roadways.
  • the motion planning system 305 can also evaluate one or more cost functions and/or one or more reward functions for each of one or more candidate control commands or sets of control commands for the vehicle 300 .
  • the motion planning system 305 can determine a total cost (e.g., a sum of the cost(s) and/or reward(s) provided by the cost function(s) and/or reward function(s)) of adhering to a particular candidate control command or set of control commands.
  • the motion planning system 305 can select or determine a control command or set of control commands for the vehicle 300 based at least in part on the cost function(s) and the reward function(s). For example, the motion plan that minimizes the total cost can be selected or otherwise determined.
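  • Schematically, the selection might reduce to the following sketch (hypothetical names; the actual cost and reward functions are implementation-specific):

```python
def select_control_command(candidates, cost_functions, reward_functions):
    """Pick the candidate command (or command set) with the lowest
    total cost, where rewards offset costs."""
    def total_cost(candidate):
        costs = sum(f(candidate) for f in cost_functions)
        rewards = sum(f(candidate) for f in reward_functions)
        return costs - rewards  # minimizing this trades off the two

    return min(candidates, key=total_cost)
```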
  • the motion planning system 305 can be configured to iteratively update the route or route extension for the vehicle 300 as new sensor data is obtained from the one or more sensors 301 .
  • the sensor data can be analyzed by the perception system 303 , the prediction system 304 , and the motion planning system 305 to determine the motion plan.
  • the motion planning system 305 can provide control commands to the one or more vehicle controls 307 .
  • the one or more vehicle controls 307 can include throttle systems, brake systems, steering systems, and other control systems, each of which can include various vehicle controls (e.g., actuators or other devices that control gas flow, steering, and braking) to control the motion of the vehicle 300 .
  • the various vehicle controls 307 can include one or more controllers, control devices, motors, and/or processors.
  • the vehicle controls 307 include a brake control module 320 .
  • the brake control module 320 is configured to receive a braking command and bring about a response by applying (or not applying) the vehicle brakes.
  • the brake control module 320 includes a primary system and a secondary system.
  • the primary system receives braking commands and, in response, brakes the vehicle 300 .
  • the secondary system may be configured to determine a failure of the primary system to brake the vehicle 300 in response to receiving the braking command.
  • a steering control system 332 is configured to receive a steering command and bring about a response in the steering mechanism of the vehicle 300 .
  • the steering command is provided to a steering system to provide a steering input to steer the vehicle 300 .
  • a lighting/auxiliary control module 336 receives a lighting or auxiliary command. In response, the lighting/auxiliary control module 336 controls a lighting and/or auxiliary system of the vehicle 300 . Controlling a lighting system may include, for example, turning on, turning off, or otherwise modulating headlights, parking lights, running lights, etc. Controlling an auxiliary system may include, for example, modulating windshield wipers, a defroster, etc.
  • a throttle control system 334 is configured to receive a throttle command and bring about a response in the engine speed or other throttle mechanism of the vehicle.
  • the throttle control system 334 can instruct an engine and/or engine controller, or other propulsion system component, to control the engine or other propulsion system of the vehicle 300 to accelerate, decelerate, or remain at its current speed.
  • Each of the perception system 303 , the prediction system 304 , the motion planning system 305 , the commander system 311 , the navigator system 313 , and the localizer system 330 can be included in or otherwise be a part of the vehicle autonomy system 302 configured to control the vehicle 300 based at least in part on data obtained from the one or more sensors 301 .
  • data obtained by the one or more sensors 301 can be analyzed by each of the perception system 303 , the prediction system 304 , and the motion planning system 305 in a consecutive fashion in order to control the vehicle 300 .
  • While FIG. 3 depicts elements suitable for use in a vehicle autonomy system according to example aspects of the present disclosure, one of ordinary skill in the art will recognize that other vehicle autonomy systems can be configured to control an autonomous vehicle based on sensor data.
  • the vehicle autonomy system 302 includes one or more computing devices, which may implement all or parts of the perception system 303 , the prediction system 304 , the motion planning system 305 , and/or the localizer system 330 . Descriptions of hardware and software configurations for computing devices to implement the vehicle autonomy system 302 and/or the service assignment system 104 of FIG. 1 are provided herein with reference to FIGS. 9 and 10 .
  • FIG. 4 is a flowchart showing one example of a process flow 400 that may be executed, for example, in the environment 100 of FIG. 1 to route a vehicle using an impact score.
  • the process flow 400 can be performed by the service assignment system 104 and/or the impact engine 108 .
  • the service assignment system 104 receives an indication of one or more roadway elements for which an impact score is to be determined, also referred to herein as the considered one or more roadway elements.
  • an autonomous vehicle 102 A, 102 B, 102 N provides a request for assistance indicating that the autonomous vehicle 102 A, 102 B, 102 N is requesting assistance to traverse one or more roadway elements.
  • the request for assistance includes or is otherwise associated with an indication of the one or more roadway elements with which the autonomous vehicle 102 A, 102 B, 102 N is requesting assistance.
  • the one or more corresponding graph elements may include a condition that affects suitability for travel.
  • the indication of the one or more roadway elements can be reported by a vehicle 102 A, 102 B, 102 N, 103 A, 103 N without a request for assistance.
  • a vehicle 102 A, 102 B, 102 N, 103 A, 103 N may detect a condition at a roadway element that is near the vehicle 102 A, 102 B, 102 N, 103 A, 103 N, but that the vehicle 102 A, 102 B, 102 N, 103 A, 103 N is not traversing or set to traverse.
  • the indication of the one or more roadway elements can be received from a traffic sensor, or other remote monitoring device.
  • the indication of the one or more roadway elements can be generated by the service assignment system 104 .
  • the service assignment system 104 , or other suitable system, may determine that vehicle speeds at one or more roadway elements are reduced and/or may detect another factor indicating a condition at the one or more roadway elements that degrades their suitability for travel.
  • that system may then provide an indication of the one or more roadway elements to the service assignment system 104 .
  • the service assignment system 104 determines an impact score for the considered one or more roadway elements. This can be performed in any suitable manner, for example, as described herein. Additional details of how the impact score can be generated are provided with respect to FIG. 7 and the accompanying description.
  • the considered routing graph modification for the impact score can be generated or received in any suitable manner. In some examples, the considered routing graph modification is generated automatically and/or received from the remote user 118 , a vehicle 102 A, 102 B, 102 N, 103 A, 103 N, or any other suitable source.
  • the service assignment system 104 determines whether to apply the considered routing graph modification. In some examples, this includes providing the determined impact score to the remote user 118 assisting an autonomous vehicle 102 A, 102 B, 102 N and receiving from the remote user 118 an indication of whether to apply the considered routing graph modification. For example, if the impact score indicates a low impact of applying the considered routing graph modification, the remote user 118 may choose to apply it. In other examples, deciding whether to apply the considered routing graph modification includes automatically applying the considered routing graph modification, for example, if the impact score meets one or more thresholds. Further details of determining whether to apply a routing graph modification to the graph elements corresponding to the considered one or more roadway elements are provided herein with respect to FIGS. 5 and 6 .
  • If the service assignment system 104 determines to apply the considered routing graph modification, the service assignment system 104 , at operation 408 , generates a constrained routing graph 109 that incorporates the applied routing graph modification. If the service assignment system 104 determines not to apply the considered routing graph modification, the service assignment system 104 , at operation 407 , generates a constrained routing graph 109 that does not incorporate it. (Other routing graph modifications, such as those described by routing graph modification data 120 , may be applied.) At operation 410 , the service assignment system 104 generates a route for a first autonomous vehicle 102 A, 102 B, 102 N using the constrained routing graph 109 as generated at operation 408 or at operation 407 . In some examples, the route at operation 410 is generated to determine whether the autonomous vehicle 102 A, 102 B, 102 N is assigned a transportation service.
  • the service assignment system 104 moves autonomous vehicles 102 A, 102 B, 102 N that may be stranded or otherwise have their movement limited by the considered routing graph modification.
  • the service assignment system 104 may identify autonomous vehicles 102 A, 102 B, 102 N that are likely to be assigned a future transportation service that requires the vehicle 102 A, 102 B, 102 N to traverse the considered one or more roadway elements. This can be performed, for example, by considering the first and second simulations used to generate the impact score.
  • the service assignment system 104 may determine that such vehicles 102 A, 102 B, 102 N will have their movement limited by the considered routing graph modification. Accordingly, the service assignment system 104 may instruct such vehicles 102 A, 102 B, 102 N to move to a different location before the considered routing graph modification is applied.
  • the service assignment system 104 instructs the autonomous vehicle 102 A, 102 B, 102 N to begin executing the route determined at operation 410 .
  • the autonomous vehicle 102 A, 102 B, 102 N may be selected for executing a transportation service.
  • the service assignment system 104 provides the instruction in a message offering the transportation service to the autonomous vehicle 102 A, 102 B, 102 N.
  • FIG. 5 is a flowchart showing one example of a process flow 500 for determining whether to apply a routing graph modification to graph elements corresponding to a considered one or more roadway elements.
  • the process flow 500 may be executed by the service assignment system 104 including, for example, by the impact engine 108 .
  • the service assignment system 104 provides an impact score for a considered one or more roadway elements to the remote user 118 .
  • This can include, for example, providing the impact score to the remote user computing device 119 , which may display the impact score at a user interface.
  • the impact score is provided as part of a user interface that is used by the remote user 118 to assist vehicles 102 A, 102 B, 102 N, 103 A, 103 N.
  • the considered routing graph modification is also provided to the remote user 118 .
  • the operation 502 can be executed, for example, before, during, or after the remote user 118 assists an autonomous vehicle 102 A, 102 B, 102 N at or near the considered one or more roadway elements.
  • the service assignment system 104 determines whether the remote user 118 has provided a prompt to apply the considered routing graph modification. If not, the service assignment system 104 (e.g., the routing engine 110 thereof) generates at least one route for an autonomous vehicle 102 A, 102 B, 102 N at operation 506 without applying the considered routing graph modification.
  • If the remote user 118 has prompted application of the considered routing graph modification, the service assignment system 104 applies it at operation 508 .
  • the service assignment system 104 generates at least one route for an autonomous vehicle applying the considered routing graph modification.
  • FIG. 6 is a flowchart showing one example of a process flow 600 that may be executed by the service assignment system 104 to process an impact score describing a considered roadway element.
  • the service assignment system 104 determines whether the impact score meets a first threshold. If the impact score does meet the first threshold, then the service assignment system 104 takes no further action at operation 604 .
  • the impact score may meet the first threshold if it indicates an impact greater than a determined amount. For example, if the considered one or more roadway elements have a very high impact, the considered routing graph modification may not be used.
  • If the impact score does not meet the first threshold, the service assignment system 104 determines at operation 606 whether the impact score meets a second threshold.
  • the second threshold indicates a lower impact of the considered routing graph modification than the first threshold. For example, an impact score could meet the second threshold without indicating a sufficiently high impact to meet the first threshold.
  • If the impact score meets the second threshold, the service assignment system 104 takes a first remedial action at operation 608 .
  • the first remedial action can include, for example, prompting the remote user 118 for instructions about whether to apply a routing graph modification to graph elements corresponding to the one or more roadway elements. In other examples, the first remedial action can include automatically applying the routing graph modification.
  • If the impact score does not meet the second threshold, the service assignment system 104 executes a second remedial action at operation 610 .
  • This can include, for example, automatically applying the considered routing graph modification (e.g., without input from the remote user 118 ). For example, if the impact of a considered routing graph modification is not high enough to meet the second threshold, the considered one or more roadway elements may not be significant, justifying automatic application of a routing graph modification.
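  • The two-threshold decision of the process flow 600 reduces to a short dispatch. The sketch below assumes higher scores indicate higher impact and uses hypothetical handler names:

```python
def handle_impact_score(score, first_threshold, second_threshold,
                        prompt_remote_user, apply_automatically):
    """Dispatch on an impact score per the FIG. 6 flow.

    Meeting the first (higher-impact) threshold means the impact is
    too great, so no further action is taken. Meeting only the
    second threshold triggers the first remedial action (e.g.,
    prompting the remote user). Meeting neither triggers the second
    remedial action (e.g., applying the modification automatically).
    """
    if score >= first_threshold:
        return None                   # operation 604: no further action
    if score >= second_threshold:
        return prompt_remote_user()   # operation 608: first remedial action
    return apply_automatically()      # operation 610: second remedial action
```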
  • the thresholds and remedial actions of the process flow 600 can be determined based on the behavior of the remote user 118 or other remote users in handling requests for assistance.
  • impact scores may be provided to remote users 118 as described herein with respect to FIG. 5 .
  • the service assignment system 104 may record the impact scores provided to the remote users 118 as well as the way that the remote users 118 acted on the provided impact scores. This can include, for example, whether the remote user 118 chose to take no action, whether the remote user 118 chose to apply a routing graph modification that raised the cost of the graph element or elements corresponding to the considered one or more roadway elements, as well as whether the remote user chose to apply a routing graph modification that changed routing graph connectivity.
  • the recorded data can be used as training data to train a machine learning model executed by the service assignment system 104 .
  • the machine learning model may return values for the thresholds of FIG. 6 and/or determine remedial actions (if any) based on the impact score and/or other variables.
  • FIG. 7 is a flowchart showing one example of a process flow 700 that may be executed by the impact engine 108 to determine an impact score.
  • the impact engine 108 determines a set of predicted service requests and a set of predicted vehicles for responding to the service requests.
  • the set of predicted service requests and set of predicted vehicles may be determined for a future time period, for example, as described herein.
  • the impact engine 108 generates a first simulation of the set of predicted service requests using the set of predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N. This can include, for each of the set of predicted service requests, determining candidate vehicles from the set of predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N and then generating routes for the set of predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N without using the considered routing graph modification.
  • the service assignment system 104 assigns the predicted service requests to various predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N, for example, in the same way that the service assignment system would assign actual service requests.
  • the first simulation may consider other routing graph modifications such as, for example, routing graph modifications described by routing graph modification data 120 .
  • the first simulation applies an alternate routing graph modification to the graph element or elements corresponding to the considered one or more roadway elements.
  • the alternate routing graph modification may reflect the costs associated with routing autonomous vehicles 102 A, 102 B, 102 N to the considered one or more roadway elements in view of current conditions. For example, if the considered one or more roadway elements are under construction, the alternate routing graph modification may increase the cost of the corresponding graph elements to reflect the (slower) travel time for the roadway element.
  • the alternate routing graph modification may raise the cost of the graph element or elements corresponding to the considered one or more roadway elements to indicate the resources of the remote user 118 used to navigate autonomous vehicles 102 A, 102 B, 102 N through the considered one or more roadway elements.
  • the alternate routing graph modification involves removing the connectivity of one or more graph elements from the routing graph 124 .
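  • For illustration, both kinds of modification discussed here, raising a graph element's cost and removing its connectivity, could be applied to a routing graph of the shape used in the path-planning sketch above (names hypothetical):

```python
def apply_modification(routing_graph, element, cost_multiplier=None,
                       remove=False):
    """Return a constrained copy of routing_graph.

    If remove is True, every edge into `element` is dropped so that
    no route can traverse it; otherwise the costs of edges into
    `element` are scaled by cost_multiplier to discourage (but not
    forbid) routes through it.
    """
    constrained = {}
    for node, edges in routing_graph.items():
        new_edges = []
        for neighbor, cost in edges:
            if neighbor == element:
                if remove:
                    continue  # connectivity removed
                if cost_multiplier is not None:
                    cost = cost * cost_multiplier
            new_edges.append((neighbor, cost))
        constrained[node] = new_edges
    return constrained
```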
  • the impact engine 108 generates a second simulation of the set of predicted service requests using the set of predicted vehicles.
  • the considered routing graph modification is applied to generate the candidate routes.
  • the considered service requests may be assigned to various vehicles of the set of predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N as described herein.
  • some or all of the considered service requests may not be met. For example, it may not be possible to complete one or more of the requested transportation services without traversing a removed roadway element.
  • the impact engine 108 generates the impact score using the first and second simulations. For example, the impact engine 108 may compare metrics describing the first and second simulations, such as the time of arrival of vehicles 102 A, 102 B, 102 N, 103 A, 103 N at transportation service start locations, the drop-off time at the service end location, etc. In some examples, the times of arrival are normalized. One example way to normalize a time of arrival is to find the time between when a transportation service is requested and the subsequent arrival of the vehicle 102 A, 102 B, 102 N, 103 A, 103 N executing the service. The impact engine 108 can aggregate normalized times of arrival over all (or some) of the set of predicted transportation service requests.
  • the impact engine 108 may find an average, median, or other aggregation of the normalized times of arrival.
  • the aggregated normalized time of arrival in the first simulation may be compared to the aggregated normalized time of arrival in the second simulation.
  • the impact score may be based on the results of the comparison.
  • the impact score may also be generated by considering a difference in how many of the set of predicted service requests are met in the second simulation versus the first.
  • one or more roadway elements may have the connectivity of their corresponding graph elements removed from the routing graph 124 by the considered routing graph modification. Some or all of the set of predicted service requests may then be impossible to meet without routing an autonomous vehicle 102 A, 102 B, 102 N via the removed roadway element. When this is the case, the affected service requests are not met in the second simulation.
  • the considered routing graph modification may cause one or more autonomous vehicles 102 A, 102 B, 102 N to be stranded within a sub-portion of the routing graph. These autonomous vehicles 102 A, 102 B, 102 N may not be available to perform transportation services that require routing outside of that sub-portion. This can affect whether transportation service requests are met at all in the second simulation as well as the time of arrival, drop-off time, etc. over the second simulation.
  • the drop-off times may also be normalized.
  • the impact engine 108 may find a difference between the time that a transportation service is requested and the time of drop-off.
  • the impact engine 108 normalizes the drop-off times by finding a difference between the time of arrival for the transportation service and the drop-off time.
  • the impact engine 108 may also aggregate normalized drop-off times over the first and second simulations. For example, the impact engine 108 may compare a mean, median or other aggregation of normalized drop-off times in the first simulation with similarly aggregated normalized drop-off times in the second simulation. The impact score may be based on the results of the comparison.
  • Yet another example metric for comparison is the portion of the set of predicted transportation services assigned to autonomous vehicles 102 A, 102 B, 102 N versus human-driven vehicles 103 A, 103 N.
  • the considered routing graph modification may apply to autonomous vehicles 102 A, 102 B, 102 N but not to human-driven vehicles 103 A, 103 N. This may cause the routes of autonomous vehicles 102 A, 102 B, 102 N to take longer than the routes of human-driven vehicles 103 A, 103 N. Accordingly, the times of arrival and/or drop-off times may be later for the second simulation than the first.
  • using the modified routing graph may cause a larger proportion of the set of predicted transportation services to be assigned to human-driven vehicles 103 A, 103 N versus autonomous vehicles 102 A, 102 B, 102 N.
  • the amount of this difference may be or affect the impact score.
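  • Pulling these metrics together, a toy impact-score computation comparing the two simulations could look like the following sketch, which assumes each simulation reports a per-request normalized arrival time and a served/unserved flag (all names and the weighting are hypothetical):

```python
from statistics import mean

def impact_score(first_sim, second_sim):
    """Compare a baseline simulation to one that applies the
    considered routing graph modification.

    Each simulation is a list of records with:
      'normalized_arrival': seconds from request to vehicle arrival
      'met': whether the transportation service request was served
    """
    def aggregate_arrival(sim):
        times = [r['normalized_arrival'] for r in sim if r['met']]
        return mean(times) if times else float('inf')

    def fraction_met(sim):
        return sum(1 for r in sim if r['met']) / len(sim)

    arrival_delta = aggregate_arrival(second_sim) - aggregate_arrival(first_sim)
    unmet_delta = fraction_met(first_sim) - fraction_met(second_sim)
    # The weighting here is arbitrary; a deployed system would tune it.
    return arrival_delta + 1000.0 * unmet_delta
```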
  • the process flow 700 is performed by the impact engine 108 in conjunction with other components of the service assignment system 104 .
  • the routing engine 110 may be utilized to generate candidate routes for the various simulated service requests and the transportation service selection engine 112 may be used to assign service requests to particular predicted vehicles 102 A, 102 B, 102 N, 103 A, 103 N.
  • FIG. 8 is a flowchart showing one example of a process flow 800 that can be executed by the service assignment system 104 if a considered routing graph modification has been applied.
  • When a considered routing graph modification is applied, transportation service requests received in real time are assigned to vehicles 102 A, 102 B, 102 N, 103 A, 103 N based on the considered routing graph modification.
  • candidate routes used to assign real time transportation service requests may be generated using a constrained routing graph 109 that incorporates the considered routing graph modification.
  • the service assignment system 104 reconsiders whether to apply the considered routing graph modification.
  • the service assignment system 104 determines that a period of time has elapsed since the considered routing graph modification was applied.
  • the service assignment system 104 generates a test route for a vehicle 102 A, 102 B, 102 N, 103 A, 103 N.
  • the test route includes the considered one or more roadway elements associated with the considered routing graph modification.
  • the service assignment system 104 can select the vehicle 102 A, 102 B, 102 N, 103 A, 103 N for the test route in any suitable manner.
  • the service assignment system 104 may select a vehicle 102 A, 102 B, 102 N, 103 A, 103 N that is near the considered one or more roadway elements.
  • the service assignment system 104 instructs the selected vehicle 102 A, 102 B, 102 N, 103 A, 103 N to execute the test route.
  • the service assignment system 104 determines whether the roadway condition that prompted the considered routing graph modification is still present. For example, a driver, passenger, or other human user associated with the selected vehicle 102 A, 102 B, 102 N, 103 A, 103 N may report whether the condition is still present.
  • remote detection sensors of the selected vehicle 102 A, 102 B, 102 N, 103 A, 103 N are used to determine if the condition is still present.
  • the remote detection sensor data may be used by the vehicle autonomy system to determine whether the condition is still present.
  • remote detection sensor data from the selected vehicle 102 A, 102 B, 102 N, 103 A, 103 N is provided to the remote user 118 and the remote user 118 determines whether the condition is present.
  • If the condition is still present, the considered routing graph modification is maintained at operation 810 .
  • the service assignment system 104 may continue to assign transportation service requests to vehicles 102 A, 102 B, 102 N, 103 A, 103 N using a constrained routing graph 109 that reflects the considered routing graph modification.
  • If the condition is no longer present, the considered routing graph modification is removed at operation 812 .
  • the service assignment system 104 begins to assign transportation service requests to vehicles 102 A, 102 B, 102 N, 103 A, 103 N using a constrained routing graph 109 that does not reflect the considered routing graph modification.
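  • As a sketch of this reconsideration loop (hypothetical helper names and a hypothetical `roadway_elements` attribute; the condition check may come from a human report, from the vehicle autonomy system, or from the remote user reviewing sensor data):

```python
def reconsider_modification(modification, elapsed_s, reconsider_after_s,
                            select_nearby_vehicle, run_test_route,
                            condition_still_present):
    """Decide whether to keep or remove an applied routing graph
    modification, per the FIG. 8 flow."""
    if elapsed_s < reconsider_after_s:
        return 'keep'  # too soon to reconsider
    vehicle = select_nearby_vehicle(modification.roadway_elements)
    report = run_test_route(vehicle, modification.roadway_elements)
    if condition_still_present(report):
        return 'keep'    # operation 810: maintain the modification
    return 'remove'      # operation 812: remove the modification
```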
  • FIG. 9 is a block diagram 900 showing one example of a software architecture 902 for a computing device.
  • the software architecture 902 may be used in conjunction with various hardware architectures, for example, as described herein.
  • FIG. 9 is merely a non-limiting example of a software architecture 902 , and many other architectures may be implemented to facilitate the functionality described herein.
  • a representative hardware layer 904 is illustrated and can represent, for example, any of the above-referenced computing devices.
  • the hardware layer 904 may be implemented according to an architecture 1000 of FIG. 10 and/or the software architecture 902 of FIG. 9 .
  • the representative hardware layer 904 comprises one or more processing units 906 having associated executable instructions 908 .
  • the executable instructions 908 represent the executable instructions of the software architecture 902 , including implementation of the methods, modules, components, and so forth of FIGS. 1-8 .
  • the hardware layer 904 also includes memory and/or storage modules 910 , which also have the executable instructions 908 .
  • the hardware layer 904 may also comprise other hardware 912 , which represents any other hardware of the hardware layer 904 , such as the other hardware illustrated as part of the architecture 1000 .
  • the software architecture 902 may be conceptualized as a stack of layers where each layer provides particular functionality.
  • the software architecture 902 may include layers such as an operating system 914 , libraries 916 , frameworks/middleware 918 , applications 920 , and a presentation layer 944 .
  • the applications 920 and/or other components within the layers may invoke application programming interface (API) calls 924 through the software stack and receive a response, returned values, and so forth (illustrated as messages 926 ) in response to the API calls 924 .
  • the layers illustrated are representative in nature, and not all software architectures have all layers. For example, some mobile or special-purpose operating systems may not provide a frameworks/middleware 918 layer, while others may provide such a layer. Other software architectures may include additional or different layers.
  • the operating system 914 may manage hardware resources and provide common services.
  • the operating system 914 may include, for example, a kernel 928 , services 930 , and drivers 932 .
  • the kernel 928 may act as an abstraction layer between the hardware and the other software layers.
  • the kernel 928 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so on.
  • the services 930 may provide other common services for the other software layers.
  • the services 930 include an interrupt service.
  • the interrupt service may detect the receipt of a hardware or software interrupt and, in response, cause the software architecture 902 to pause its current processing and execute an interrupt service routine (ISR) when an interrupt is received.
  • the drivers 932 may be responsible for controlling or interfacing with the underlying hardware.
  • the drivers 932 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), WiFi® drivers, near-field communication (NFC) drivers, audio drivers, power management drivers, and so forth depending on the hardware configuration.
  • the libraries 916 may provide a common infrastructure that may be used by the applications 920 and/or other components and/or layers.
  • the libraries 916 typically provide functionality that allows other software modules to perform tasks in an easier fashion than by interfacing directly with the underlying operating system 914 functionality (e.g., kernel 928 , services 930 , and/or drivers 932 ).
  • the libraries 916 may include system libraries 934 (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematic functions, and the like.
  • libraries 916 may include API libraries 936 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphic content on a display), database libraries (e.g., SQLite that may provide various relational database functions), web libraries (e.g., WebKit that may provide web browsing functionality), and the like.
  • the libraries 916 may also include a wide variety of other libraries 938 to provide many other APIs to the applications 920 and other software components/modules.
  • the frameworks 918 may provide a higher-level common infrastructure that may be used by the applications 920 and/or other software components/modules.
  • the frameworks 918 may provide various graphical user interface (GUI) functions, high-level resource management, high-level location services, and so forth.
  • the frameworks 918 may provide a broad spectrum of other APIs that may be used by the applications 920 and/or other software components/modules, some of which may be specific to a particular operating system or platform.
  • the applications 920 include built-in applications 940 and/or third-party applications 942 .
  • built-in applications 940 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, and/or a game application.
  • the third-party applications 942 may include any of the built-in applications 940 as well as a broad assortment of other applications.
  • the third-party application 942 (e.g., an application developed using the Android™ or iOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or another computing device operating system.
  • the third-party application 942 may invoke the API calls 924 provided by the mobile operating system such as the operating system 914 to facilitate functionality described herein.
  • the applications 920 may use built-in operating system functions (e.g., kernel 928 , services 930 , and/or drivers 932 ), libraries (e.g., system libraries 934 , API libraries 936 , and other libraries 938 ), or frameworks/middleware 918 to create user interfaces to interact with users of the system.
  • interactions with a user may occur through a presentation layer, such as the presentation layer 944 .
  • the application/module “logic” can be separated from the aspects of the application/module that interact with a user.
  • Some software architectures use virtual machines. For example, systems described herein may be executed using one or more virtual machines executed at one or more server computing machines. In the example of FIG. 9 , this is illustrated by a virtual machine 948 .
  • a virtual machine creates a software environment where applications/modules can execute as if they were executing on a hardware computing device.
  • the virtual machine 948 is hosted by a host operating system (e.g., the operating system 914 ) and typically, although not always, has a virtual machine monitor 946 , which manages the operation of the virtual machine 948 as well as the interface with the host operating system (e.g., the operating system 914 ).
  • a software architecture executes within the virtual machine 948 , such as an operating system 950 , libraries 952 , frameworks/middleware 954 , applications 956 , and/or a presentation layer 958 . These layers of software architecture executing within the virtual machine 948 can be the same as corresponding layers previously described or may be different.
  • FIG. 10 is a block diagram illustrating a computing device hardware architecture 1000 , within which a set or sequence of instructions can be executed to cause a machine to perform examples of any one of the methodologies discussed herein.
  • the hardware architecture 1000 describes a computing device for executing the vehicle autonomy system, described herein.
  • the architecture 1000 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the architecture 1000 may operate in the capacity of either a server or a client machine in server-client network environments, or it may act as a peer machine in peer-to-peer (or distributed) network environments.
  • the architecture 1000 can be implemented in a personal computer (PC), a tablet PC, a hybrid tablet, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing instructions (sequential or otherwise) that specify operations to be taken by that machine.
  • the example architecture 1000 includes a processor unit 1002 comprising at least one processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both, processor cores, compute nodes).
  • the architecture 1000 may further comprise a main memory 1004 and a static memory 1006 , which communicate with each other via a link 1008 (e.g., a bus).
  • the architecture 1000 can further include a video display unit 1010 , an input device 1012 (e.g., a keyboard), and a UI navigation device 1014 (e.g., a mouse).
  • the video display unit 1010 , input device 1012 , and UI navigation device 1014 are incorporated into a touchscreen display.
  • the architecture 1000 may additionally include a storage device 1016 (e.g., a drive unit), a signal generation device 1018 (e.g., a speaker), a network interface device 1020 , and one or more sensors (not shown), such as a Global Positioning System (GPS) sensor, compass, accelerometer, or other sensor.
  • the processor unit 1002 or another suitable hardware component may support a hardware interrupt.
  • the processor unit 1002 may pause its processing and execute an ISR, for example, as described herein.
  • the storage device 1016 includes a machine-readable medium 1022 on which is stored one or more sets of data structures and instructions 1024 (e.g., software) embodying or used by any one or more of the methodologies or functions described herein.
  • the instructions 1024 can also reside, completely or at least partially, within the main memory 1004 , within the static memory 1006 , and/or within the processor unit 1002 during execution thereof by the architecture 1000 , with the main memory 1004 , the static memory 1006 , and the processor unit 1002 also constituting machine-readable media.
  • the various memories (i.e., the main memory 1004 , the static memory 1006 , and/or memory of the processor unit(s) 1002 ) and/or the storage device 1016 may store one or more sets of instructions and data structures (e.g., the instructions 1024 ) embodying or used by any one or more of the methodologies or functions described herein. These instructions, when executed by the processor unit(s) 1002 , cause various operations to implement the disclosed examples.
  • As used herein, the terms “machine-storage medium,” “device-storage medium,” and “computer-storage medium” (referred to collectively as “machine-storage medium”) mean the same thing and may be used interchangeably.
  • the terms refer to a single or multiple storage devices and/or media (e.g., a centralized or distributed database, and/or associated caches and servers) that store executable instructions and/or data, as well as cloud-based storage systems or storage networks that include multiple storage apparatus or devices.
  • the terms shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media, including memory internal or external to processors.
  • examples of machine-storage media include non-volatile memory, including by way of example semiconductor memory devices (e.g., erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), field-programmable gate array (FPGA), and flash memory devices); magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the terms “signal medium” and “transmission medium” shall be taken to include any form of modulated data signal, carrier wave, and so forth.
  • the term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • the terms “machine-readable medium,” “computer-readable medium,” and “device-readable medium” mean the same thing and may be used interchangeably in this disclosure.
  • the terms are defined to include both non-transitory machine-storage media and signal media.
  • the terms include both storage devices/media and carrier waves/modulated data signals.
  • the instructions 1024 can further be transmitted or received over a communications network 1026 using a transmission medium via the network interface device 1020 using any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP)).
  • Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, plain old telephone service (POTS) networks, and wireless data networks (e.g., Wi-Fi, 3G, 4G Long-Term Evolution (LTE)/LTE-A, 5G, or WiMAX networks).
  • a component may be configured in any suitable manner.
  • a component that is or that includes a computing device may be configured with suitable software instructions that program the computing device.
  • a component may also be configured by virtue of its hardware arrangement or in any other suitable manner.


Abstract

Various examples are directed to systems and methods for routing autonomous vehicles. A system may receive an indication of a roadway element associated with a routing graph for routing autonomous vehicles and may determine an impact score for the roadway element. The impact score may describe an impact of applying a routing graph modification to the routing graph to modify routing to the roadway element. Based at least in part on the impact score, the system may apply the routing graph modification to the routing graph to generate a constrained routing graph and generate a route for a first autonomous vehicle based at least in part on the constrained routing graph. The system may instruct the first autonomous vehicle to begin traversing the route.

Description

    CLAIM FOR PRIORITY
  • This application claims priority to U.S. Provisional Application No. 62/868,347, filed Jun. 28, 2019; U.S. Provisional Application No. 62/836,936, filed Apr. 22, 2019; and U.S. Provisional Application No. 62/796,882, filed Jan. 25, 2019; the contents of each of which are incorporated herein by reference in their entireties.
  • FIELD
  • This document pertains generally, but not by way of limitation, to devices, systems, and methods for routing, operating, and/or managing an autonomous vehicle.
  • BACKGROUND
  • An autonomous vehicle is a vehicle that is capable of sensing its environment and operating some or all of the vehicle's controls based on the sensed environment. An autonomous vehicle includes sensors that capture signals describing the environment surrounding the vehicle. The autonomous vehicle processes the captured sensor signals to comprehend the environment and automatically operates some or all of the vehicle's controls based on the resulting information.
  • DRAWINGS
  • In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. Some embodiments are illustrated by way of example, and not of limitation, in the figures of the accompanying drawings.
  • FIG. 1 is a diagram showing one example of an environment for routing autonomous vehicles considering roadway element impact.
  • FIG. 2 is a diagram showing one example of an environment illustrating a remote user utilizing an impact score.
  • FIG. 3 depicts a block diagram of an example vehicle according to example aspects of the present disclosure.
  • FIG. 4 is a flowchart showing one example of a process flow that may be executed, for example, in the environment 100 of FIG. 1 to route a vehicle using an impact score.
  • FIG. 5 is a flowchart showing one example of a process flow for determining whether to apply a routing graph modification to graph elements corresponding to a considered one or more roadway elements.
  • FIG. 6 is a flowchart showing one example of a process flow to process an impact score describing a considered roadway element.
  • FIG. 7 is a flowchart showing one example of a process flow that may be executed by the impact engine to determine an impact score.
  • FIG. 8 is a flowchart showing one example of a process flow that can be executed by the service assignment system if a considered routing graph modification has been applied.
  • FIG. 9 is a block diagram showing one example of a software architecture for a computing device.
  • FIG. 10 is a block diagram illustrating a computing device hardware architecture.
  • DESCRIPTION
  • Examples described herein are directed to systems and methods for routing autonomous vehicles, for example, in the context of assigning transportation services to the autonomous vehicles. A transportation service includes transporting a payload, such as cargo or one or more passengers, from a service start location to a service end location. Examples of cargo can include food, packages, or the like.
  • In an autonomous or semi-autonomous vehicle (collectively referred to as an autonomous vehicle (AV)), a vehicle autonomy system, sometimes referred to as an AV stack, controls one or more of braking, steering, or throttle of the vehicle. In a fully autonomous vehicle, the vehicle autonomy system assumes full control of the vehicle. In a semi-autonomous vehicle, the vehicle autonomy system assumes a portion of the vehicle control, with a human user (e.g., a vehicle operator) still providing some control input. Some autonomous vehicles can also operate in a manual mode, in which a human user provides all control inputs to the vehicle.
  • A service assignment system is configured to receive requests for transportation services from users. When the service assignment system receives a request for a transportation service, it generates candidate routes for one or more vehicles to execute the requested transportation service. The candidate routes may begin at a vehicle location and extend to the service start location and the service end location. If routes for more than one vehicle are generated, the service assignment system selects a vehicle to execute the requested transportation service based on the generated routes. For example, a vehicle that can achieve an earlier time of arrival at the service start location and/or a faster drop-off time to the service end location may be favored.
  • The service assignment system generates routes using a routing graph. The routing graph is a representation of the roadways in a geographic area. The routing graph represents the roadways as a set of graph elements. A graph element is a component of a routing graph that represents a roadway element on which the autonomous vehicle can travel. A graph element can be or include an edge, node, or other component of a routing graph. A graph element represents a portion of roadway, referred to herein as a roadway element. A roadway element is a component of a roadway that can be traversed by a vehicle.
  • A roadway element can be or include different subdivisions of a roadway, depending on the implementation. In some examples, the roadway elements are or include road segments. A road segment is a portion of roadway including all lanes and directions of travel. Consider a four-lane divided highway. A road segment of the four-lane divided highway includes a stretch of the highway including all four lanes and both directions of travel.
  • In some examples, roadway elements are or include directed road segments. A directed road segment is a portion of roadway where traffic travels in a common direction. Referring again to the four-lane divided highway example, a stretch of the highway would include at least two directed road segments: a first directed road segment including the two lanes of travel in one direction and a second directed road segment including the two lanes of travel in the other direction.
  • In some examples, roadway elements are or include lane segments. A lane segment is a portion of a roadway including one lane of travel in one direction. Referring again to the four-lane divided highway example, a portion of the divided highway may include two lane segments in each direction. Lane segments may be interconnected in the direction of travel and laterally. For example, a vehicle traversing a lane segment may continue in the direction of travel to the next connected lane segment or may make a lane change to move laterally to a different lane segment.
  • The routing graph includes data describing the directionality and connectivity of the graph elements. The directionality of a graph element describes limitations, if any, on the direction in which a vehicle can traverse the roadway element corresponding to the graph element. The connectivity of a given graph element describes other graph elements to which the autonomous vehicle can be routed from the given graph element.
  • The routing graph can also include cost data describing costs associated with graph elements. The cost data indicates the cost for a vehicle to traverse a roadway element corresponding to a graph element or to transition between roadway elements corresponding to connected graph elements. Cost can be based on various factors including, for example, estimated driving time, danger risk, etc. In some examples, higher cost generally corresponds to more negative characteristics of a graph element or transition (e.g., longer estimated driving time, higher danger risk, etc.). The routing engine generates routes for vehicles by finding a low-cost combination of connected graph elements corresponding to a sequence of roadway elements between two locations.
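  • For illustration only, graph data of the kind described above might be encoded as in the following sketch. This is a minimal, hypothetical representation: the element identifiers, costs, and transition values are invented, and the patent does not prescribe any particular encoding.

```python
# Minimal, hypothetical routing graph encoding: each graph element carries a
# traversal cost and a transition map (connectivity) giving the cost to move
# to each successor element. Directionality is implicit in the one-way
# transition maps. All identifiers and values are invented.
routing_graph = {
    "lane_a": {"cost": 2.0, "transitions": {"lane_b": 0.5, "lane_c": 0.5}},
    "lane_b": {"cost": 1.0, "transitions": {"lane_d": 0.5}},
    "lane_c": {"cost": 4.0, "transitions": {"lane_d": 0.5}},  # e.g., longer drive time
    "lane_d": {"cost": 1.0, "transitions": {}},
}
```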
  • Conditions at a roadway element can sometimes change the suitability of the roadway element for travel, for example, for autonomous vehicles. For example, a delivery truck may be double parked at a roadway element, requiring that vehicles deviate from the middle of the lane or even cross the center line to traverse the roadway element. Consider another example in which a large pothole is present at a roadway element, making it desirable for vehicles to steer around the pothole or slow down to avoid damage from running over the pothole.
  • In some examples, an autonomous vehicle utilizes a remote user, sometimes called a teleoperator, to assist the autonomous vehicle. For example, if an autonomous vehicle encounters roadway conditions at a roadway element that make it difficult for the autonomous vehicle to proceed, the autonomous vehicle makes a request for assistance to a remote user. The remote user provides instructions to the autonomous vehicle about how to traverse the roadway element. In some examples, the remote user is provided with sensor data from the autonomous vehicle and the instruction from the remote user is based on the sensor data. For example, if the sensor data indicates that the autonomous vehicle can safely deviate from the middle of its lane to clear an obstruction, the remote user may instruct the autonomous vehicle to do so. In other examples, the remote user provides steering, throttle, braking, or other control inputs to the autonomous vehicle.
  • When the conditions at a roadway element render it less suitable or even unsuitable for travel by an autonomous vehicle, it is sometimes desirable to prevent autonomous vehicles from being routed to the roadway element and/or make it less likely that autonomous vehicles will be routed to the roadway element. This can be done, for example, by constraining the graph element at the routing graph that corresponds to the less suitable roadway element. To prevent routing to the roadway element, a routing graph modification can be applied, as described herein, to remove the connectivity between the graph element corresponding to the roadway element and the rest of the routing graph. To reduce the likelihood that autonomous vehicles will be routed to the roadway element, a routing graph modification may be applied to the routing graph to increase the cost of the graph element corresponding to the roadway element.
  • Determining an efficient way to respond to changing roadway conditions, however, can be challenging. For example, closing or raising the cost of a problematic roadway element can prevent autonomous vehicles from losing time and/or utilizing remote user resources. On the other hand, there is also a cost associated with constraining a roadway element. For example, if autonomous vehicles are not routed to a particular roadway element, alternative routes may have a higher cost, and sometimes a much higher cost. Further, in some examples, it is not possible to route an autonomous vehicle without using a particular roadway element. If such a roadway element is constrained in a way that removes its connectivity from the routing graph, one or more autonomous vehicles may be stranded without a possible route to some destinations.
  • Various examples address these and other challenges by generating and using an impact score for a roadway element and using the impact score to route and/or direct autonomous vehicles. The impact score indicates an impact of constraining a roadway element, for example, by changing a cost associated with a graph element representing the roadway element in the routing graph and/or removing the connectivity of the graph element from other graph elements in the routing graph.
  • FIG. 1 is a diagram showing one example of an environment 100 for routing autonomous vehicles considering roadway element impact. The environment 100 includes the service assignment system 104 and vehicles 102A, 102B, 102N, 103A, 103N. The vehicles 102A, 102B, 102N, 103A, 103N can include passenger vehicles, such as trucks, cars, buses, or other similar vehicles. The vehicles 102A, 102B, 102N, 103A, 103N can also include delivery vehicles, such as vans, trucks, tractor trailers, etc. The vehicles 102A, 102B, 102N, 103A, 103N include self-driving vehicles (SDVs) or autonomous vehicles (AVs) 102A, 102B, 102N and human-driven vehicles 103A, 103N. Although FIG. 1 shows three autonomous vehicles 102A, 102B, 102N and two human-driven vehicles 103A, 103N, any suitable number of vehicles may be used in any suitable proportion between autonomous vehicles 102A, 102B, 102N and human-driven vehicles 103A, 103N.
  • Each of the autonomous vehicles 102A, 102B, 102N includes a vehicle autonomy system, described in more detail with respect to FIG. 3. The vehicle autonomy system is configured to operate some or all of the controls of the vehicle 102A, 102B, 102N (e.g., acceleration, braking, steering). In some examples, one or more of the autonomous vehicles 102A, 102B, 102N are operable in different modes, where the vehicle autonomy system has differing levels of control over the vehicle 102A, 102B, 102N. Some autonomous vehicles 102A, 102B, 102N may be operable in a fully autonomous mode in which the vehicle autonomy system has responsibility for all or most of the controls of the vehicle 102A, 102B, 102N. Some autonomous vehicles 102A, 102B, 102N are operable in a semiautonomous mode that is in addition to or instead of the fully autonomous mode. In a semiautonomous mode, the vehicle autonomy system of a vehicle 102A, 102B, 102N is responsible for some of the vehicle controls while a human user or driver is responsible for other vehicle controls. In some examples, one or more of the autonomous vehicles 102A, 102B, 102N are operable in a manual mode in which the human user is responsible for all control of the vehicle 102A, 102B, 102N.
  • The autonomous vehicles 102A, 102B, 102N include one or more remote-detection sensor sets 106A, 106B, 106N. The remote-detection sensor sets 106A, 106B, 106N include one or more remote-detection sensors that receive signals from the environment 100. The signals may be emitted by and/or reflected from objects in the environment 100, such as the ground, buildings, trees, etc. The remote-detection sensor sets 106A, 106B, 106N may include one or more active sensors, such as light imaging detection and ranging (LIDAR) sensors, radio detection and ranging (RADAR) sensors, and/or sound navigation and ranging (SONAR) sensors, that emit sound or electromagnetic radiation in the form of light or radio waves to generate return signals. Information about the environment 100 is extracted from the received signals. In some examples, the remote-detection sensor sets 106A, 106B, 106N include one or more passive sensors that receive signals that originated from other sources of sound or electromagnetic radiation. The remote-detection sensor sets 106A, 106B, 106N provide remote-detection sensor data that describes the environment 100. The autonomous vehicles 102A, 102B, 102N can also include other types of sensors, for example, as described in more detail with respect to FIG. 3.
  • The autonomous vehicles 102A, 102B, 102N may be of different types. Different types of AVs may have different capabilities. For example, the different types of autonomous vehicles 102A, 102B, 102N can have different vehicle autonomy systems. This can include, for example, vehicle autonomy systems made by different manufacturers or designers, vehicle autonomy systems having different software versions or revisions, etc. Also, in some examples, the different types of autonomous vehicles 102A, 102B, 102N can have different remote-detection sensor sets 106A, 106B, 106N. For example, one type of autonomous vehicle 102A, 102B, 102N may include a LIDAR remote-detection sensor, while another type of vehicle 102A, 102B, 102N may include stereoscopic cameras and omit a LIDAR remote-detection sensor. In some examples, different types of autonomous vehicles 102A, 102B, 102N can also have different mechanical particulars. For example, one type of vehicle may have all-wheel drive, while another type may have front-wheel drive.
  • The service assignment system 104 is programmed to assign transportation services to the autonomous vehicles 102A, 102B, 102N as described herein. The service assignment system 104 can be or include one or more servers or other suitable computing devices. The service assignment system 104 is configured to receive transportation service requests from one or more users 114A, 114B, 114N. The users 114A, 114B, 114N make transportation service requests with user computing devices 116A, 116B, 116N. The user computing devices 116A, 116B, 116N can be or include any suitable computing device such as, for example, tablet computers, mobile telephone devices, laptop computers, desktop computers, etc. In some examples, the user computing devices 116A, 116B, 116N execute an application associated with a transportation service implemented with the service assignment system 104. The users 114A, 114B, 114N launch the application on the respective user computing devices 116A, 116B, 116N and utilize functionality of the application to make transportation service requests.
  • The service assignment system 104 comprises a transportation service selection engine 112 and a routing engine 110. The transportation service selection engine 112 is programmed to receive and process transportation service requests. The routing engine 110 generates routes for candidate vehicles 102A, 102B, 102N, 103A, 103N to execute a requested transportation service.
  • When the transportation service selection engine 112 receives a transportation service request, it identifies a set of candidate vehicles 102A, 102B, 102N, 103A, 103N for executing the transportation service. The set of candidate vehicles 102A, 102B, 102N, 103A, 103N can include vehicles 102A, 102B, 102N, 103A, 103N that are best suited for executing the transportation service. For example, the set of candidate vehicles 102A, 102B, 102N, 103A, 103N can include vehicles 102A, 102B, 102N, 103A, 103N that are near to a transportation service start point (e.g., within a threshold distance, within a threshold drive time, etc.). In some examples, the candidate vehicles 102A, 102B, 102N, 103A, 103N are limited to vehicles capable of executing the transportation service. For example, a transportation service that involves moving a large cargo object may be executable only by vehicles 102A, 102B, 102N, 103A, 103N having sufficient space to carry the large object. A transportation service that involves moving, for example, five passengers may be executable only by autonomous vehicles 102A, 102B, 102N having sufficient space to carry five passengers. As another example, a transportation service that involves traversing a portion of roadway that is not accessible to an autonomous vehicle 102A, 102B, 102N may be executable only by a human-driven vehicle 103A, 103N.
  • The transportation service selection engine 112 provides an indication of the candidate vehicles 102A, 102B, 102N, 103A, 103N to the routing engine 110. The routing engine 110 generates candidate routes for some or all of the set of candidate vehicles 102A, 102B, 102N, 103A, 103N. The candidate routes may begin at the location of a candidate vehicle and extend to the transportation service start point and transportation service end point. If the transportation service includes one or more waypoints, the candidate routes will also pass these waypoints.
  • Candidate routes determined by the routing engine 110 are provided to the transportation service selection engine 112. The transportation service selection engine 112 uses the candidate routes to select a vehicle 102A, 102B, 102N, 103A, 103N best suited to execute the transportation service. For example, the candidate vehicle 102A, 102B, 102N, 103A, 103N best suited to execute a transportation service may be the candidate vehicle 102A, 102B, 102N, 103A, 103N having the lowest-cost route for the transportation service.
  • In some examples, the transportation service selection engine 112 uses other metrics associated with particular types of autonomous vehicles 102A, 102B, 102N in addition to or instead of the candidate routes to select an autonomous vehicle 102A, 102B, 102N for executing a transportation service. The transportation service selection engine 112 can weigh the cost of the candidate routes based on type metrics associated with the candidate autonomous vehicles 102A, 102B, 102N. Non-limiting examples of type metrics include, for example, an estimated time of arrival (ETA) at the service start location, an estimated drop-off time (ETD) at the service end location, a price to the user 114A, 114B, 114N, an average customer rating for the vehicle 102A, 102B, 102N, 103A, 103N and/or a manufacturer or manager of the vehicle, an availability status of the vehicles 102A, 102B, 102N, 103A, 103N, an acceptance rate for the vehicles 102A, 102B, 102N, 103A, 103N, etc.
  • The transportation service selection engine 112 offers the requested transportation service to the selected vehicle 102A, 102B, 102N, 103A, 103N and instructs the vehicle 102A, 102B, 102N, 103A, 103N to begin traversing the route associated with the transportation service. In some examples, the selected vehicle 102A, 102B, 102N, 103A, 103N may optionally decline the transportation service. If the selected vehicle 102A, 102B, 102N, 103A, 103N declines the transportation service, the transportation service selection engine 112 may offer the transportation service to another vehicle 102A, 102B, 102N, 103A, 103N, for example, a vehicle having the next-lowest cost candidate route and/or the next most favorable combination of candidate route and other metrics.
  • The routing engine 110 generates routes utilizing, for example, a routing graph 124 in conjunction with routing graph modification data 120 such as, for example, policy data, vehicle capability data, and/or operational routing graph modification data. The routing graph 124 is a representation of the roadways in a geographic area. The routing graph 124, as described above, represents the roadway as a set of graph elements, where the graph elements correspond to roadway elements as described herein. The routing graph 124 also indicates directionality, connectivity, and cost for the various corresponding roadway elements. Directionality indicates the direction of travel in a roadway element. Connectivity describes roadway element connections indicating possible transitions between roadway elements. Cost describes the cost for a vehicle 102A, 102B, 102N, 103A, 103N to traverse a roadway element and/or transition between two roadway elements.
  • In FIG. 1, a break-out window 126 shows example roadway elements that can correspond to the graph elements of the routing graph 124. Roadway elements in the break-out window 126 are illustrated as shapes with arrows indicating the directionality of the roadway elements. Roadway elements can be connected to one another according to their directionality.
  • The routing engine 110 is configured to utilize routing graph modification data 120 to generate constrained routing graph 109 data. Routing graph modification data 120 indicates routing graph modifications that are applied to the routing graph 124 to generate a constrained routing graph 109. A routing graph modification is a change to a routing graph (e.g., a general-purpose routing graph) that reflects various factors including, for example, capabilities of the vehicle that is to execute a route, current roadway conditions, business policy considerations, and so on. A routing graph modification includes a graph element descriptor and a constraint.
  • A graph element descriptor is data describing one or more graph elements that are the subject of a routing graph modification. For example, a graph element descriptor can describe graph elements using one or more graph element properties. A graph element property is anything that describes a graph element and/or its corresponding roadway element. Example graph element properties include, for example, a unique identifier for the graph element, a roadway type of the corresponding roadway element (e.g., divided highway, urban street, etc.), a driving rule of the roadway element associated with the graph element (e.g., speed limit, access limitations), a type of maneuver necessary to enter, exit, and/or traverse the corresponding roadway element, whether the corresponding roadway element leads to a specific type of roadway element (e.g., dead end, divided highway, etc.), and so on. In some examples, a graph element descriptor including a unique indicator of a particular graph element can be used to generate a routing graph modification that is applied to the particular graph element.
  • A constraint is an action applied to graph elements at a routing graph that are described by the graph element descriptor of a routing graph modification. Example constraints that may be applied to a graph element include removing the graph element from the routing graph, modifying (e.g., removing) transitions to or from a graph element, changing a cost associated with a graph element or transitions involving the graph element, etc. Costs can be changed up or down. For example, if the routing graph modification data 120 indicates that graph elements having a particular graph element property or set of graph element properties are disfavored, the costs to traverse and/or transition to the corresponding roadway elements can be increased. On the other hand, if the routing graph modification data 120 indicates that graph elements having a particular graph element property or set of graph element properties are favored, the costs to traverse and/or transition to the corresponding roadway elements can be decreased.
  • Another example constraint can include changing a required or recommended autonomous vehicle mode. For example, a graph element can be modified to indicate that an autonomous vehicle traversing the roadway element corresponding to the graph element should be operated in a semi-autonomous or manual mode.
  • Consider an example in which a routing policy forbids routing a vehicle through roadway elements that include or are in a school zone. A routing graph modification may include graph element descriptor data identifying graph elements that correspond to roadway elements having a school zone. A corresponding constraint includes removing the graph elements corresponding to such school zone roadway elements from the routing graph 124 and/or removing transitions to such school zone roadway elements.
  • In some examples, a constraint can be applied to graph elements other than those indicated by the graph element descriptor data. Consider an example routing graph modification that is to avoid cul-de-sacs. The associated constraint could involve removing connectivity to graph elements corresponding to cul-de-sac roadway elements and also removing graph elements corresponding to roadway elements that do not include cul-de-sacs, but can lead only to other roadway elements that do include cul-de-sacs.
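  • The sketch below illustrates one plausible way to apply a routing graph modification of the kind described above: the graph element descriptor is modeled as a predicate over element properties, and the constraint either removes matching elements, removes transitions into them, or scales their costs. The data schema, function names, and the school-zone property are assumptions for illustration, not the patent's actual data model.

```python
# Hedged sketch: apply a routing graph modification. The graph element
# descriptor is a predicate over element properties; the constraint removes
# matching elements, removes transitions into them, or scales their costs.
import copy

def apply_modification(graph, matches, constraint, factor=10.0):
    constrained = copy.deepcopy(graph)
    targets = {eid for eid, elem in constrained.items() if matches(elem)}
    if constraint == "remove_element":
        for eid in targets:
            constrained.pop(eid, None)
    for elem in constrained.values():
        for succ in list(elem["transitions"]):
            if succ in targets:
                if constraint in ("remove_element", "remove_transitions"):
                    del elem["transitions"][succ]   # sever connectivity
                elif constraint == "increase_cost":
                    elem["transitions"][succ] *= factor
    if constraint == "increase_cost":
        for eid in targets:
            constrained[eid]["cost"] *= factor
    return constrained

graph = {
    "a": {"cost": 1.0, "school_zone": False, "transitions": {"b": 0.5}},
    "b": {"cost": 1.0, "school_zone": True, "transitions": {}},
}
# Policy example from the text: do not route through school zones.
constrained = apply_modification(graph, lambda e: e["school_zone"], "remove_element")
print(constrained)  # element "b" and the transition into it are gone
```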
  • Routing graph modification data can also include routing graph constraints related to vehicle capability. For example, vehicles of different types (e.g., autonomous vehicles, human-driven vehicles, different types of autonomous vehicles, etc.) can have different capabilities and, therefore, can be associated with different vehicle-capability-related routing graph modifications. Vehicle capability of an autonomous vehicle 102A, 102B, 102N may be, and/or may be derived from, operation domain (OD) and/or operational design domain (ODD) data, if any, provided by the vehicle's manufacturer. In some examples, vehicle capability is supplemented based on the performance of an autonomous vehicle 102A, 102B, 102N or type of autonomous vehicle in executing transportation services. Routing graph modifications based on vehicle capability can include, for example, routing graph modifications that identify graph elements corresponding to roadway elements that have a property or properties (e.g., includes an unprotected left, is part of a controlled-access highway, etc.) and constraint data indicating what is to be done to graph elements having the indicated property or properties. The graph elements corresponding to roadway elements that a particular type of autonomous vehicle 102A, 102B, 102N is not capable of traversing can be removed from the routing graph or can have connectivity data modified to remove transitions to those graph elements. For example, one or more connections to a graph element may be removed. If the properties of a graph element indicate that it corresponds to a roadway element including a maneuver that is undesirable for a vehicle, but not forbidden, then the routing engine 110 can increase the cost of the graph element and/or transitions thereto.
  • Other routing graph modifications that can be described by the routing graph modification data 120 may include, for example, policy routing graph modifications and operational routing graph modifications. Policy routing graph modifications include graph element properties that identify roadway elements subject to a policy routing graph modification and corresponding routing graph modifications. Policy routing graph modifications refer to types of roadway elements that are desirable for a vehicle to avoid or prioritize. An example policy routing graph modification is to avoid roadway elements that are in or pass through school zones. Another example policy routing graph modification is to avoid routing vehicles through residential neighborhoods. Yet another example policy routing graph modification is to favor routing vehicles on controlled-access highways, if available. Policy routing graph modifications can apply to some vehicles, some vehicle types, all vehicles, or all vehicle types.
  • Operational routing graph modifications can be based, for example, on the state of one or more roadways. For example, if a roadway is to be closed for a parade or for construction, an operational routing graph modification identifies properties (e.g., names or locations) of roadway elements that are part of the closure and an associated routing graph modification (e.g., removing the corresponding graph elements, removing transitions to the corresponding graph elements, etc.).
  • The routing engine 110 applies the routing graph modification data 120 to generate the constrained routing graph 109. The constrained routing graph 109 is used to generate a route for a vehicle 102A, 102B, 102N, 103A, 103N. In some examples, different constrained routing graphs 109 are generated for different types of vehicles 102A, 102B, 102N, 103A, 103N. For example, a human-driven vehicle 103A, 103N may have a different set of routing graph modifications than an autonomous vehicle 102A, 102B, 102N. The constrained routing graph 109 can be pre-generated and/or generated on an as-needed basis as routes are determined.
  • The routing engine 110 determines a route for the autonomous vehicle 102A, 102B, 102N, for example, by applying a path-planning algorithm to the constrained routing graph 109 to find the lowest-cost route for the vehicle. Any suitable path-planning algorithm can be used, such as, for example, A*, D*, Focused D*, D*Lite, GD*, or Dijkstra's algorithm. A generated route can include a string of connected graph elements that correspond to roadway elements between a vehicle start location and a vehicle end location. A vehicle start location is an initial roadway element of a route. A vehicle end location is a last roadway element of a route. In some examples, the vehicle start location is a current location of the relevant vehicle 102A, 102B, 102N, 103A, 103N, and the vehicle end location is the end location for the requested transportation service. For example, on the route, the autonomous vehicle 102A, 102B, 102N can travel from its current location to the transportation service start location, and then proceed to the transportation service end location traversing transportation service waypoints (if any) along the way.
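  • As a hedged sketch of this route-generation step, the following applies Dijkstra's algorithm (one of the path-planning algorithms named above) to a small dictionary-encoded constrained routing graph and returns the lowest-cost sequence of graph elements. The graph contents are invented.

```python
# Hedged sketch: lowest-cost route via Dijkstra's algorithm over a
# dictionary-encoded constrained routing graph. Contents are invented.
import heapq

def lowest_cost_route(graph, start, goal):
    # Heap entries: (accumulated cost, element id, path of element ids).
    frontier = [(graph[start]["cost"], start, [start])]
    settled = {}
    while frontier:
        cost, eid, path = heapq.heappop(frontier)
        if eid == goal:
            return cost, path          # first pop of the goal is the cheapest
        if settled.get(eid, float("inf")) <= cost:
            continue
        settled[eid] = cost
        for succ, t_cost in graph[eid]["transitions"].items():
            if succ in graph:          # successor may have been removed by a modification
                heapq.heappush(
                    frontier, (cost + t_cost + graph[succ]["cost"], succ, path + [succ]))
    return float("inf"), []            # no route, e.g., connectivity removed

graph = {
    "start": {"cost": 1.0, "transitions": {"mid_1": 0.5, "mid_2": 0.5}},
    "mid_1": {"cost": 5.0, "transitions": {"end": 0.5}},  # constrained: cost raised
    "mid_2": {"cost": 1.0, "transitions": {"end": 0.5}},
    "end": {"cost": 1.0, "transitions": {}},
}
print(lowest_cost_route(graph, "start", "end"))  # (4.0, ['start', 'mid_2', 'end'])
```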
  • FIG. 1 also shows a remote user 118 and remote user computing device 119. The remote user 118 may assist an autonomous vehicle 102A, 102B, 102N. For example, if the autonomous vehicle 102A, 102B, 102N encounters a roadway element that the autonomous vehicle 102A, 102B, 102N cannot traverse without assistance, it may send a request for assistance to the remote user 118. The remote user 118 utilizes the remote user computing device 119 to receive data indicating the state of the autonomous vehicle 102A, 102B, 102N and provide instructions to the autonomous vehicle 102A, 102B, 102N. The remote user computing device 119 can be or include any suitable computing device such as, for example, tablet computers, mobile telephone devices, laptop computers, desktop computers, etc. Additional details describing the remote user 118 are provided herein with respect to FIG. 2.
  • The service assignment system 104 can also include an impact engine 108 that is configured to generate impact scores for graph elements of the routing graph 124. Although the impact engine 108 is shown in FIG. 1 as a component of the service assignment system 104, in some examples, the impact engine 108 is implemented as an independent system or as a component of another system. The impact score for a roadway element indicates the impact of applying a considered routing graph modification to the graph element corresponding to the roadway element. The considered routing graph modification can be, for example, a change to the cost associated with the graph element and/or a change to the connectivity of the graph element to other parts of the routing graph.
  • The impact score can be expressed in various different ways including, for example, an amount of lost time or an amount of lost revenue due to the considered routing graph modification. In some examples, the impact score is expressed as a cost value that is based on multiple factors such as, for example, lost time or lost revenue. An impact score can be generated for a single roadway element or for multiple roadway elements. For example, if a particular roadway condition affects more than one roadway element, a single impact score can be generated for multiple affected roadway elements.
  • The impact engine 108 can determine impact scores in various different ways. In some examples, the impact engine 108 generates a set of predicted transportation service requests over a future time period (e.g., the next 30 minutes, the next hour, the next day). The future time period may be selected based on the time period during which the considered corresponding graph element will be constrained. For example, if the considered corresponding graph element will be constrained between 4:00 p.m. and 5:00 p.m. on a Wednesday, the future time period is selected to match that window.
  • The set of predicted transportation service requests can be generated, for example, from data describing historic service requests. In some examples, the impact engine 108 generates the set of predicted transportation service requests considering other conditions. In some examples, the impact engine 108 may generate the set of predicted transportation service requests based on historical data describing transportation service requests in weather and/or traffic conditions similar to those forecast for the future time period. For example, if the future time period includes a county fair, festival, or other activity at or near the considered corresponding graph element, the impact engine 108 generates the set of predicted transportation requests considering historical data describing transportation service requests during similar events.
  • The impact engine 108 may also generate a set of predicted vehicles 102A, 102B, 102N, 103A, 103N that are available for executing requested transportation services during the future time period. The set of predicted vehicles can include autonomous vehicles 102A, 102B, 102N and, in some examples, human-driven vehicles 103A, 103N. The set of predicted vehicles 102A, 102B, 102N, 103A, 103N can be generated in a manner similar to that used to generate the set of predicted transportation service requests. For example, the impact engine 108 may consider historical data describing the number of vehicles 102A, 102B, 102N, 103A, 103N available during time periods similar to the future time period such as, for example, times on a similar day of the week, on similar times of day, having similar traffic and/or weather conditions, etc. In some examples, generating the set of predicted vehicles 102A, 102B, 102N, 103A, 103N also includes generating predicted initial vehicle locations. This can also be performed using historical data, as described herein.
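  • A hedged sketch of how such predicted sets might be drawn from historical data appears below: past records are selected when their context (day of week, hour, weather) matches the future time period. The field names and the matching rule are assumptions, not the patent's method.

```python
# Hedged sketch: build a predicted set from historical records whose context
# (day of week, hour, weather) matches the future time period. Field names
# and the matching rule are invented for illustration.
def matches_context(record, day_of_week, hour, weather):
    return (record["day_of_week"] == day_of_week
            and abs(record["hour"] - hour) <= 1
            and record["weather"] == weather)

def predict_set(history, day_of_week, hour, weather):
    return [r for r in history if matches_context(r, day_of_week, hour, weather)]

historical_requests = [
    {"day_of_week": "Wed", "hour": 16, "weather": "rain", "start": "elem_12"},
    {"day_of_week": "Sat", "hour": 22, "weather": "clear", "start": "elem_40"},
]
# Predicted requests for a rainy Wednesday 4:00-5:00 p.m. window.
print(predict_set(historical_requests, "Wed", 16, "rain"))
```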
  • The impact engine 108 uses the set of predicted vehicles 102A, 102B, 102N, 103A, 103N and the set of predicted transportation service requests to generate first and second simulations of the future time period. In the respective simulations, the service assignment system 104, for each predicted service request of the set of predicted service requests, generates candidate routes and selects a vehicle 102A, 102B, 102N, 103A, 103N for executing the candidate route. The first simulation generates the candidate routes without applying the considered routing graph modification to the graph elements corresponding to the considered one or more roadway elements. In some examples, the considered one or more roadway elements may be the subject of other routing graph modifications, described by routing graph modification data 120, but may not have the considered routing graph modification applied for the first simulation. The second simulation generates candidate routes based on the considered routing graph modification. For example, if the considered routing graph modification will close the considered roadway element (e.g., by removing routing graph connectivity to its corresponding graph element), then the version of the routing graph 124 used for the first simulation may not close the considered roadway element. In the second simulation, the considered routing graph modification is applied to the considered roadway element (e.g., to its corresponding graph element).
  • The impact score for the considered roadway element is determined based on a comparison between the first simulation and the second simulation. The impact score may take different forms in different embodiments. In some examples, the impact score is a comparison between metrics, such as time of arrival, drop-off time, proportion of transportation services assigned to autonomous vehicles, completed transportation services, etc. In other examples, the impact score is an aggregation based on a combination of metrics.
  • When the impact score is based on more than one considered roadway element, the considered routing graph modification used to generate the second simulation applies to all of the considered roadway elements. For example, the second simulation may be generated based on a version of the routing graph 124 in which the considered routing graph modification is applied to all of the considered roadway elements (e.g., to their corresponding graph elements).
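  • The following sketch shows one plausible way to reduce the two simulations to an impact score: each simulation yields per-request metrics, and the score is a weighted aggregate difference. The metric names, weights, and numbers are invented for illustration.

```python
# Hedged sketch: reduce the two simulations to an impact score. Each
# simulation yields per-request metrics; the score is a weighted aggregate
# difference. Metric names, weights, and values are invented.
def impact_score(baseline_runs, modified_runs, weights=None):
    weights = weights or {"route_cost": 1.0, "dropoff_minutes": 0.5}
    def total(runs):
        return sum(w * sum(r[m] for r in runs) for m, w in weights.items())
    # Positive score: the considered modification made outcomes worse.
    return total(modified_runs) - total(baseline_runs)

first_sim = [{"route_cost": 10.0, "dropoff_minutes": 12.0},
             {"route_cost": 8.0, "dropoff_minutes": 9.0}]
second_sim = [{"route_cost": 13.0, "dropoff_minutes": 15.0},
              {"route_cost": 8.5, "dropoff_minutes": 10.0}]
print(impact_score(first_sim, second_sim))  # 3.5 + 0.5 * 4.0 = 5.5
```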
  • In some examples, the impact engine 108 is also configured to act on an impact score for a considered roadway element. In some examples, the generation of the impact score is initiated when a vehicle 102A, 102B, 102N, 103A, 103N encounters difficulties traversing a roadway element or roadway elements and places a request for assistance to the remote user 118. The impact engine 108 may generate an impact score for the roadway element or elements that were the subject of the request for assistance and provide the impact score to the remote user 118. The remote user 118 may utilize the provided impact score to determine whether to apply a routing graph modification to the graph element or elements corresponding to the subject roadway element or elements.
  • In another example, the impact engine 108 is configured to automatically apply a routing graph modification based on a calculated impact score. For example, if the impact score for a roadway element or elements meets a threshold value, the impact engine 108 may apply a routing graph modification to the corresponding graph element or elements. In another example, the impact engine 108 takes different actions based on different thresholds. For example, if the impact score meets a first threshold, then the impact engine 108 may take no action. If the impact score meets a second threshold indicating a lower impact than the first threshold, then the impact engine 108 may provide the remote user 118 with discretion to apply or not apply a routing graph modification to the corresponding graph element or elements. If the impact score meets a third threshold indicating a lower impact than the second threshold, the impact engine 108 may automatically apply a routing graph modification to the corresponding graph element or elements.
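  • A minimal sketch of the tiered thresholds just described appears below; the threshold values and action names are illustrative assumptions, with the third tier modeled simply as scores below the second cutoff.

```python
# Hedged sketch of the tiered thresholds: a higher impact score means
# constraining the element is more disruptive. Cutoff values and action
# names are illustrative assumptions.
def action_for(impact_score, first=100.0, second=25.0):
    if impact_score >= first:
        return "no_action"               # too disruptive to constrain
    if impact_score >= second:
        return "remote_user_discretion"  # remote user decides
    return "auto_apply_modification"     # low impact: apply automatically

for score in (150.0, 40.0, 3.0):
    print(score, "->", action_for(score))
```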
  • FIG. 2 is a diagram showing one example of an environment 200 illustrating a remote user 118 utilizing an impact score. The environment 200 shows the autonomous vehicle 102A and its remote-detection sensor set 106A. The vehicle 102A is traveling on a roadway 204. As shown in the example environment 200, the vehicle 102A has encountered road construction 206. A human traffic director 208 is directing traffic around the road construction 206. For example, the human traffic director 208 may direct traffic, including the autonomous vehicle 102A, over a detour route 209.
  • A vehicle autonomy system of the vehicle 102A may detect that it is unable to successfully navigate the road construction 206. In response, the vehicle autonomy system makes a request for assistance to an autonomous vehicle operations center 212 via a wireless communications link 214. While the wireless communications link 214 is illustrated as a cellular communications link that includes a cellular tower 216, other embodiments may use other types of wireless communications links, such as those provided via satellite or other communications technologies.
  • The request for assistance identifies the vehicle 102A and/or a location of the vehicle 102A. The location may be indicated in any suitable manner such as, for example, a current corresponding graph element, a latitude and longitude, etc. The vehicle 102A is connected to the remote user 118 at the vehicle operations center 212. The remote user 118 receives output from and provides input to a console 222. The console 222 may make up all or part of the remote user computing device 119 shown in FIG. 1. The console 222 is configured to receive input from the remote user 118 and provide control signals to the autonomous vehicle 102A based on the input via the communications link 214. The remote user 118 is also presented with visual output derived from at least one of the sensors 106A of the vehicle 102A. This visual output is presented on an operator display screen 221. The operator display screen 221 may also be a component of the remote user computing device 119.
  • Based at least on observations of the operator display screen 221, the remote user 118 provides input via the operator console 222. After receiving control signals that are derived from the operator's input, the autonomous vehicle 102A proceeds according to those control signals. For example, the input from the remote user 118 via the console 222 can cause the vehicle 102A to follow directions from the human traffic director 208 and, for example, follow the detour route 209 when so directed (e.g., by the remote user 118). Control signals provided to the vehicle 102A can include, for example, direct control of the brakes, throttle, steering, or other controls of the vehicle 102A. In other examples, control signals provided to the vehicle 102A can include a general instruction (e.g., deviate from the center of the lane to go around the road construction 206). After the road construction 206 has been successfully navigated, the remote user 118 may provide input to the operator console 222 indicating that the autonomous vehicle 102A is released from manual direction. The autonomous vehicle 102A may then continue navigating toward its destination without further operator assistance.
  • FIG. 3 depicts a block diagram of an example vehicle 300 according to example aspects of the present disclosure. The vehicle 300 includes one or more sensors 301, a vehicle autonomy system 302, and one or more vehicle controls 307. The vehicle 300 is an autonomous vehicle, as described herein. The example vehicle 300 shows just one example arrangement of an autonomous vehicle. In some examples, autonomous vehicles of different types can have different arrangements.
  • The vehicle autonomy system 302 includes a commander system 311, a navigator system 313, a perception system 303, a prediction system 304, a motion planning system 305, and a localizer system 330 that cooperate to perceive the surrounding environment of the vehicle 300 and determine a motion plan for controlling the motion of the vehicle 300 accordingly.
  • The vehicle autonomy system 302 is engaged to control the vehicle 300 or to assist in controlling the vehicle 300. In particular, the vehicle autonomy system 302 receives sensor data from the one or more sensors 301, attempts to comprehend the environment surrounding the vehicle 300 by performing various processing techniques on data collected by the sensors 301, and generates an appropriate route through the environment. The vehicle autonomy system 302 sends commands to control the one or more vehicle controls 307 to operate the vehicle 300 according to the route.
  • Various portions of the vehicle autonomy system 302 receive sensor data from the one or more sensors 301. For example, the sensors 301 may include remote-detection sensors as well as motion sensors such as an inertial measurement unit (IMU), one or more encoders, or one or more odometers. The sensor data includes information that describes the location of objects within the surrounding environment of the vehicle 300, information that describes the motion of the vehicle 300, etc.
  • The sensors 301 may also include one or more remote-detection sensors or sensor systems, such as a LIDAR system, a RADAR system, one or more cameras, etc. As one example, a LIDAR system of the one or more sensors 301 generates sensor data (e.g., remote-detection sensor data) that includes the location (e.g., in three-dimensional space relative to the LIDAR system) of a number of points that correspond to objects that have reflected a ranging laser. For example, the LIDAR system measures distances by measuring the Time of Flight (TOF) that it takes a short laser pulse to travel from the sensor to an object and back, calculating the distance from the known speed of light.
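  • As a small worked example of the time-of-flight relation described above (the pulse travels to the object and back, so the distance is half the round trip at the speed of light):

```python
# Worked example of the LIDAR time-of-flight relation:
# distance = speed_of_light * round_trip_time / 2.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_seconds: float) -> float:
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

print(tof_distance_m(667e-9))  # ~100 m for a ~667 ns round trip
```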
  • As another example, a RADAR system of the one or more sensors 301 generates sensor data (e.g., remote-detection sensor data) that includes the location (e.g., in three-dimensional space relative to the RADAR system) of a number of points that correspond to objects that have reflected ranging radio waves. For example, radio waves (e.g., pulsed or continuous) transmitted by the RADAR system reflect off an object and return to a receiver of the RADAR system, giving information about the object's location and speed. Thus, a RADAR system provides useful information about the current speed of an object.
  • As yet another example, one or more cameras of the one or more sensors 301 may generate sensor data (e.g., remote-detection sensor data) including still or moving images. Various processing techniques (e.g., range imaging techniques such as structure from motion, structured light, stereo triangulation, and/or other techniques) can be performed to identify the location (e.g., in three-dimensional space relative to the one or more cameras) of a number of points that correspond to objects that are depicted in an image or images captured by the one or more cameras. Other sensor systems can identify the location of points that correspond to objects as well.
  • As another example, the one or more sensors 301 can include a positioning system. The positioning system determines a current position of the vehicle 300. The positioning system can be any device or circuitry for analyzing the position of the vehicle 300. For example, the positioning system can determine a position by using one or more of inertial sensors, a satellite positioning system such as the Global Positioning System (GPS), a positioning system based on IP address, triangulation and/or proximity to network access points or other network components (e.g., cellular towers, Wi-Fi access points), and/or other suitable techniques. The position of the vehicle 300 can be used by various systems of the vehicle autonomy system 302.
  • Thus, the one or more sensors 301 are used to collect sensor data that includes information that describes the location (e.g., in three-dimensional space relative to the vehicle 300) of points that correspond to objects within the surrounding environment of the vehicle 300. In some implementations, the sensors 301 can be positioned at various different locations on the vehicle 300. As an example, in some implementations, one or more cameras and/or LIDAR sensors can be located in a pod or other structure that is mounted on a roof of the vehicle 300, while one or more RADAR sensors can be located in or behind the front and/or rear bumper(s) or body panel(s) of the vehicle 300. As another example, one or more cameras can be located at the front or rear bumper(s) of the vehicle 300. Other locations can be used as well.
  • The localizer system 330 receives some or all of the sensor data from the sensors 301 and generates vehicle poses for the vehicle 300. A vehicle pose describes a position and attitude of the vehicle 300. The vehicle pose (or portions thereof) can be used by various other components of the vehicle autonomy system 302 including, for example, the perception system 303, the prediction system 304, the motion planning system 305, and the navigator system 313.
  • The position of the vehicle 300 is a point in a three-dimensional space. In some examples, the position is described by values for a set of Cartesian coordinates, although any other suitable coordinate system may be used. The attitude of the vehicle 300 generally describes the way in which the vehicle 300 is oriented at its position. In some examples, attitude is described by a yaw about the vertical axis, a pitch about a first horizontal axis, and a roll about a second horizontal axis. In some examples, the localizer system 330 generates vehicle poses periodically (e.g., every second, every half second). The localizer system 330 appends time stamps to vehicle poses, where the time stamp for a pose indicates the point in time that is described by the pose. The localizer system 330 generates vehicle poses by comparing sensor data (e.g., remote-detection sensor data) to map data 326 describing the surrounding environment of the vehicle 300.
  • In some examples, the localizer system 330 includes one or more pose estimators and a pose filter. Pose estimators generate pose estimates by comparing remote-detection sensor data (e.g., LIDAR, RADAR) to map data. The pose filter receives pose estimates from the one or more pose estimators as well as other sensor data such as, for example, motion sensor data from an IMU, encoder, or odometer. In some examples, the pose filter executes a Kalman filter or machine learning algorithm to combine pose estimates from the one or more pose estimators with motion sensor data to generate vehicle poses. In some examples, pose estimators generate pose estimates at a frequency less than the frequency at which the localizer system 330 generates vehicle poses. Accordingly, the pose filter generates some vehicle poses by extrapolating from a previous pose estimate utilizing motion sensor data.
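  • The toy sketch below illustrates the pose-filter behavior described above in one dimension: poses are extrapolated from motion sensor data between lower-frequency map-matched pose estimates, and an arriving estimate is blended in. A simple complementary blend stands in for the Kalman filter or machine learning algorithm; all values are invented.

```python
# Toy 1-D sketch of the pose filter: extrapolate poses from motion sensor
# data between lower-frequency map-matched pose estimates, then blend an
# arriving estimate in. A complementary blend stands in for the Kalman
# filter or learned model; all values are invented.
def extrapolate(pose, velocity, dt):
    return pose + velocity * dt          # constant-velocity motion model

def blend(predicted, estimate, gain=0.3):
    return predicted + gain * (estimate - predicted)

pose = 0.0
velocity = 10.0                          # m/s, e.g., from IMU/odometry
for _ in range(5):                       # 10 Hz motion updates
    pose = extrapolate(pose, velocity, dt=0.1)
estimate = 5.2                           # lower-rate LIDAR-vs-map pose estimate
pose = blend(pose, estimate)
print(round(pose, 3))                    # 5.0 + 0.3 * (5.2 - 5.0) = 5.06
```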
  • Vehicle poses and/or vehicle positions generated by the localizer system 330 are provided to various other components of the vehicle autonomy system 302. For example, the commander system 311 may utilize a vehicle position to determine whether to respond to a call from a service assignment system 340.
  • The commander system 311 determines a set of one or more target locations that are used for routing the vehicle 300. The target locations are determined based on user input received via a user interface 309 of the vehicle 300. The user interface 309 may include and/or use any suitable input/output device or devices. In some examples, the commander system 311 determines the one or more target locations considering data received from the service assignment system 340. The service assignment system 340 is programmed to provide instructions to multiple vehicles, for example, as part of a fleet of vehicles for moving passengers and/or cargo. Data from the service assignment system 340 can be provided via a wireless network, for example.
  • The navigator system 313 receives one or more target locations from the commander system 311 and map data 326. The map data 326, for example, provides detailed information about the surrounding environment of the vehicle 300. The map data 326 provides information regarding identity and location of different roadways and roadway elements. A roadway is a place where the vehicle 300 can drive and may include, for example, a road, a street, a highway, a lane, a parking lot, or a driveway. Routing graph data is a type of map data 326.
  • From the one or more target locations and the map data 326, the navigator system 313 generates route data describing a route for the vehicle 300 to take to arrive at the one or more target locations. In some implementations, the navigator system 313 determines route data using one or more path-planning algorithms based on costs for graph elements/corresponding roadway elements, as described herein. For example, a cost for a route can indicate a time of travel, risk of danger, or other factor associated with adhering to a particular candidate route. Route data describing a route is provided to the motion planning system 305, which commands the vehicle controls 307 to implement the route or route extension, as described herein. The navigator system 313 can generate routes as described herein using a general-purpose routing graph and routing graph modification data. Also, in examples where route data is received from the service assignment system 340, that route data can also be provided to the motion planning system 305.
  • The perception system 303 detects objects in the surrounding environment of the vehicle 300 based on sensor 301 data, the map data 326, and/or vehicle poses provided by the localizer system 330. For example, the map data 326 used by the perception system 303 describes roadways and segments thereof and may also describe buildings or other items or objects (e.g., lampposts, crosswalks, curbing); location and directions of traffic lanes or lane segments (e.g., the location and direction of a parking lane, a turning lane, a bicycle lane, or other lanes within a particular roadway); traffic control data (e.g., the location and instructions of signage, traffic lights, or other traffic control devices); and/or any other map data that provides information that assists the vehicle autonomy system 302 in comprehending and perceiving its surrounding environment and its relationship thereto.
  • In some examples, the perception system 303 determines state data for one or more of the objects in the surrounding environment of the vehicle 300. State data describes a current state of an object (also referred to as features of the object). The state data for each object describes, for example, an estimate of the object's current location (also referred to as position); current speed (also referred to as velocity); current acceleration; current heading; current orientation; size/shape/footprint (e.g., as represented by a bounding shape such as a bounding polygon or polyhedron); type/class (e.g., vehicle, pedestrian, bicycle, or other); yaw rate; distance from the vehicle 300; minimum path to interaction with the vehicle 300; minimum time duration to interaction with the vehicle 300; and/or other state information.
  • In some implementations, the perception system 303 determines state data for each object over a number of iterations. In particular, the perception system 303 updates the state data for each object at each iteration. Thus, the perception system 303 detects and tracks objects, such as other vehicles, that are proximate to the vehicle 300 over time.
  • The prediction system 304 is configured to predict one or more future positions for an object or objects in the environment surrounding the vehicle 300 (e.g., an object or objects detected by the perception system 303). The prediction system 304 generates prediction data associated with one or more of the objects detected by the perception system 303. In some examples, the prediction system 304 generates prediction data describing each of the respective objects detected by the perception system 303.
  • Prediction data for an object is indicative of one or more predicted future locations of the object. For example, the prediction system 304 may predict where the object will be located within the next 5 seconds, 30 seconds, 200 seconds, etc. Prediction data for an object may indicate a predicted trajectory (e.g., predicted path) for the object within the surrounding environment of the vehicle 300. For example, the predicted trajectory (e.g., path) can indicate a path along which the respective object is predicted to travel over time (and/or the speed at which the object is predicted to travel along the predicted path). The prediction system 304 generates prediction data for an object, for example, based on state data generated by the perception system 303. In some examples, the prediction system 304 also considers one or more vehicle poses generated by the localizer system 330 and/or map data 326.
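  • The disclosure does not limit prediction to any particular algorithm. As a hedged illustration only, a simple constant-velocity baseline could extrapolate predicted future locations from the state record sketched above; the machine-learned predictors described below would typically replace such a baseline.

    import math

    def predict_constant_velocity(state, horizon_s=5.0, step_s=0.5):
        """Extrapolate future positions assuming the object holds its
        current speed and heading (a simple baseline for illustration,
        not the disclosed predictor)."""
        x, y = state.position
        vx = state.velocity * math.cos(state.heading)
        vy = state.velocity * math.sin(state.heading)
        trajectory = []
        t = step_s
        while t <= horizon_s:
            trajectory.append((x + vx * t, y + vy * t, t))
            t += step_s
        return trajectory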
  • In some examples, the prediction system 304 uses state data indicative of an object type or classification to predict a trajectory for the object. As an example, the prediction system 304 can use state data provided by the perception system 303 to determine that a particular object (e.g., an object classified as a vehicle) approaching an intersection and maneuvering into a left-turn lane intends to turn left. In such a situation, the prediction system 304 predicts a trajectory (e.g., path) corresponding to a left turn for the vehicle such that the vehicle turns left at the intersection. Similarly, the prediction system 304 determines predicted trajectories for other objects, such as bicycles, pedestrians, parked vehicles, etc. The prediction system 304 provides the predicted trajectories associated with the object(s) to the motion planning system 305.
  • In some implementations, the prediction system 304 is a goal-oriented prediction system 304 that generates one or more potential goals, selects one or more of the most likely potential goals, and develops one or more trajectories by which the object can achieve the one or more selected goals. For example, the prediction system 304 can include a scenario generation system that generates and/or scores the one or more goals for an object, and a scenario development system that determines the one or more trajectories by which the object can achieve the goals. In some implementations, the prediction system 304 can include a machine-learned goal-scoring model, a machine-learned trajectory development model, and/or other machine-learned models.
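  • The following is a schematic sketch of how such a goal-oriented predictor could be organized, under the assumption that goal scoring and trajectory development are supplied as callables; score_goal and develop_trajectory are hypothetical stand-ins for the machine-learned models mentioned above.

    def goal_oriented_prediction(state, candidate_goals, score_goal,
                                 develop_trajectory, top_k=1):
        """Score candidate goals for an object, keep the most likely,
        and develop a trajectory by which the object can achieve each
        kept goal."""
        scored = sorted(candidate_goals,
                        key=lambda goal: score_goal(state, goal),
                        reverse=True)
        return [develop_trajectory(state, goal) for goal in scored[:top_k]]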
  • The motion planning system 305 commands the vehicle controls 307 based at least in part on the predicted trajectories associated with the objects within the surrounding environment of the vehicle 300, the state data for the objects provided by the perception system 303, vehicle poses provided by the localizer system 330, the map data 326, and route or route extension data provided by the navigator system 313. Stated differently, given information about the current locations of objects and/or predicted trajectories of objects within the surrounding environment of the vehicle 300, the motion planning system 305 determines control commands for the vehicle 300 that best navigate the vehicle 300 along the route or route extension relative to the objects at such locations and their predicted trajectories on acceptable roadways.
  • In some implementations, the motion planning system 305 can also evaluate one or more cost functions and/or one or more reward functions for each of one or more candidate control commands or sets of control commands for the vehicle 300. Thus, given information about the current locations and/or predicted future locations/trajectories of objects, the motion planning system 305 can determine a total cost (e.g., a sum of the cost(s) and/or reward(s) provided by the cost function(s) and/or reward function(s)) of adhering to a particular candidate control command or set of control commands. The motion planning system 305 can select or determine a control command or set of control commands for the vehicle 300 based at least in part on the cost function(s) and the reward function(s). For example, the motion plan that minimizes the total cost can be selected or otherwise determined.
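  • As a minimal illustration of the cost/reward selection just described, and assuming the cost and reward functions are supplied as callables, candidate command sets could be compared as follows (the convention that rewards subtract from cost is an assumption):

    def select_control_commands(candidates, cost_fns, reward_fns):
        """Pick the candidate command set with the lowest total cost,
        where total cost is the sum of the cost functions minus the sum
        of the reward functions."""
        def total_cost(cmds):
            return (sum(f(cmds) for f in cost_fns)
                    - sum(r(cmds) for r in reward_fns))
        return min(candidates, key=total_cost)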
  • In some implementations, the motion planning system 305 can be configured to iteratively update the route or route extension for the vehicle 300 as new sensor data is obtained from the one or more sensors 301. For example, as new sensor data is obtained from the one or more sensors 301, the sensor data can be analyzed by the perception system 303, the prediction system 304, and the motion planning system 305 to determine the motion plan.
  • The motion planning system 305 can provide control commands to the one or more vehicle controls 307. For example, the one or more vehicle controls 307 can include throttle systems, brake systems, steering systems, and other control systems, each of which can include various vehicle controls (e.g., actuators or other devices that control gas flow, steering, and braking) to control the motion of the vehicle 300. The various vehicle controls 307 can include one or more controllers, control devices, motors, and/or processors.
  • The vehicle controls 307 include a brake control module 320. The brake control module 320 is configured to receive a braking command and bring about a response by applying (or not applying) the vehicle brakes. In some examples, the brake control module 320 includes a primary system and a secondary system. The primary system receives braking commands and, in response, brakes the vehicle 300. The secondary system may be configured to determine a failure of the primary system to brake the vehicle 300 in response to receiving the braking command.
  • A steering control system 332 is configured to receive a steering command and bring about a response in the steering mechanism of the vehicle 300. The steering command is provided to a steering system to provide a steering input to steer the vehicle 300.
  • A lighting/auxiliary control module 336 receives a lighting or auxiliary command. In response, the lighting/auxiliary control module 336 controls a lighting and/or auxiliary system of the vehicle 300. Controlling a lighting system may include, for example, turning on, turning off, or otherwise modulating headlights, parking lights, running lights, etc. Controlling an auxiliary system may include, for example, modulating windshield wipers, a defroster, etc.
  • A throttle control system 334 is configured to receive a throttle command and bring about a response in the engine speed or other throttle mechanism of the vehicle. For example, the throttle control system 334 can instruct an engine and/or engine controller, or other propulsion system component, to control the engine or other propulsion system of the vehicle 300 to accelerate, decelerate, or remain at its current speed.
  • Each of the perception system 303, the prediction system 304, the motion planning system 305, the commander system 311, the navigator system 313, and the localizer system 330 can be included in or otherwise be a part of the vehicle autonomy system 302 configured to control the vehicle 300 based at least in part on data obtained from the one or more sensors 301. For example, data obtained by the one or more sensors 301 can be analyzed by each of the perception system 303, the prediction system 304, and the motion planning system 305 in a consecutive fashion in order to control the vehicle 300. While FIG. 3 depicts elements suitable for use in a vehicle autonomy system according to example aspects of the present disclosure, one of ordinary skill in the art will recognize that other vehicle autonomy systems can be configured to control an autonomous vehicle based on sensor data.
  • The vehicle autonomy system 302 includes one or more computing devices, which may implement all or parts of the perception system 303, the prediction system 304, the motion planning system 305, and/or the localizer system 330. Descriptions of hardware and software configurations for computing devices to implement the vehicle autonomy system 302 and/or the service assignment system 104 of FIG. 1 are provided herein with reference to FIGS. 9 and 10.
  • FIG. 4 is a flowchart showing one example of a process flow 400 that may be executed, for example, in the environment 100 of FIG. 1 to route a vehicle using an impact score. The process flow 400 can be performed by the service assignment system 104 and/or the impact engine 108. At operation 402, the service assignment system 104 receives an indication of one or more roadway elements for which an impact score is to be determined, also referred to herein as the considered roadway element.
  • The indication of the considered roadway element can be received in any suitable way. In some examples, an autonomous vehicle 102A, 102B, 102N provides a request for assistance indicating that the autonomous vehicle 102A, 102B, 102N is requesting assistance to traverse one or more roadway elements. The request for assistance includes or is otherwise associated with an indication of the one or more roadway elements with which the autonomous vehicle 102A, 102B, 102N is requesting assistance. For example, the one or more roadway elements may be subject to a condition that affects their suitability for travel.
  • In another example, the indication of the one or more roadway elements can be reported by a vehicle 102A, 102B, 102N, 103A, 103N without a request for assistance. For example, a vehicle 102A, 102B, 102N, 103A, 103N may detect a condition at a roadway element that is near the vehicle 102A, 102B, 102N, 103A, 103N, but that the vehicle 102A, 102B, 102N, 103A, 103N is not traversing or set to traverse.
  • In other examples, the indication of the one or more roadway elements can be received from a traffic sensor or other remote monitoring device. In still other examples, the indication of the one or more roadway elements can be generated by the service assignment system 104 itself. For example, the service assignment system 104, or another suitable system, may determine that vehicle speeds at one or more roadway elements are reduced and/or may detect another factor indicating a condition that deteriorates the suitability of the one or more roadway elements for travel. Where another suitable system makes this determination, that system provides an indication of the one or more roadway elements to the service assignment system 104.
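  • One illustrative, non-limiting way the service assignment system 104 could flag such reduced vehicle speeds is sketched below; the data shapes and the 50% ratio threshold are assumptions.

    def flag_slow_roadway_elements(observed_speeds, baseline_speeds, ratio=0.5):
        """Return roadway elements whose observed average speed has
        fallen below a fraction of their baseline speed."""
        return [element
                for element, speed in observed_speeds.items()
                if speed < ratio * baseline_speeds.get(element, float("inf"))]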
  • At operation 404, the service assignment system 104 determines an impact score for the considered one or more roadway elements. This can be performed in any suitable manner; additional details of how the impact score can be generated are provided with respect to FIG. 7. The considered routing graph modification for the impact score can be generated or received in any suitable manner. In some examples, the considered routing graph modification is generated automatically and/or received from the remote user 118, a vehicle 102A, 102B, 102N, 103A, 103N, or any other suitable source.
  • At operation 406, the service assignment system 104 determines whether to apply the considered routing graph modification. In some examples, this includes providing the determined impact score to the remote user 118 assisting an autonomous vehicle 102A, 102B, 102N and receiving from the remote user 118 an indication of whether to apply the considered routing graph modification. For example, if the impact score indicates a low impact of applying the considered routing graph modification, the remote user 118 may choose to apply it. In other examples, deciding whether to apply the considered routing graph modification includes automatically applying the considered routing graph modification, for example, if the impact score meets one or more thresholds. Further details of determining whether to apply a routing graph modification to the graph elements corresponding to the considered one or more roadway elements are provided herein with respect to FIGS. 5 and 6.
  • If the service assignment system 104 determines to apply the considered routing graph modification, the service assignment system 104, at operation 408, generates a constrained routing graph 109 considering the applied routing graph modification. If the service assignment system 104 determines not to apply the considered routing graph modification, the service assignment system 104, at operation 407, generates a constrained routing graph 109 that does not consider the considered routing graph modification. (Other routing graph modifications, such as those described by routing graph modification data 120, may still be applied.) At operation 410, the service assignment system 104 generates a route for a first autonomous vehicle 102A, 102B, 102N using the constrained routing graph 109 as generated at operation 408 or at operation 407, for example, as in the sketch below. In some examples, the route at operation 410 is generated to determine whether the autonomous vehicle 102A, 102B, 102N is assigned a transportation service.
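  • The sketch below schematically ties operations 404-410 together; the compute_impact_score, should_apply, constrained_graph, and generate_route helpers are hypothetical stand-ins for the components described herein, not an actual interface of the service assignment system 104.

    def route_with_impact_check(service, element, modification, start, goal):
        """Score a considered modification, decide whether to apply it,
        and route on the resulting constrained routing graph."""
        score = service.compute_impact_score(element, modification)
        if service.should_apply(score, modification):
            graph = service.constrained_graph(extra_modifications=[modification])
        else:
            graph = service.constrained_graph(extra_modifications=[])
        return service.generate_route(graph, start, goal)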
  • In some examples, before applying the considered routing graph modification, the service assignment system 104 moves autonomous vehicles 102A, 102B, 102N that may be stranded or otherwise have their movement limited by the considered routing graph modification. For example, the service assignment system 104 may identify autonomous vehicles 102A, 102B, 102N that are likely to be assigned a future transportation service that requires the vehicle 102A, 102B, 102N to traverse the considered one or more roadway elements. This can be performed, for example, by considering the first and second simulations used to generate the impact score. For example, if a vehicle or vehicles 102A, 102B, 102N remain in a particular region or sector of the routing graph 124 for more than a threshold time in the second simulation and/or are assigned fewer than a threshold number of transportation services, the service assignment system 104 may determine that such vehicles 102A, 102B, 102N will have their movement limited by the considered routing graph modification. Accordingly, the service assignment system 104 may instruct such vehicles 102A, 102B, 102N to move to a different location before the considered routing graph modification is applied.
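  • As an illustrative sketch only, vehicles at risk of being stranded could be identified from the second simulation as follows; the record fields and thresholds are assumptions.

    def vehicles_to_relocate(second_sim, max_dwell_s=3600, min_services=1):
        """Identify vehicles that the second (modified-graph) simulation
        suggests would have limited movement: those that remain in one
        region longer than a threshold time or are assigned fewer than a
        threshold number of transportation services."""
        stranded = []
        for vehicle in second_sim.vehicles:
            if (vehicle.longest_dwell_in_region_s > max_dwell_s
                    or vehicle.services_assigned < min_services):
                stranded.append(vehicle.vehicle_id)
        return stranded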
  • At operation 412, the service assignment system 104 instructs the autonomous vehicle 102A, 102B, 102N to begin executing the route determined at operation 410. For example, the autonomous vehicle 102A, 102B, 102N may be selected for executing a transportation service. The service assignment system 104 provides the instruction in a message offering the transportation service to the autonomous vehicle 102A, 102B, 102N.
  • FIG. 5 is a flowchart showing one example of a process flow 500 for determining whether to apply a routing graph modification to graph elements corresponding to a considered one or more roadway elements. The process flow 500 may be executed by the service assignment system 104 including, for example, by the impact engine 108.
  • At operation 502, the service assignment system 104 provides an impact score for a considered one or more roadway elements to the remote user 118. This can include, for example, providing the impact score to the remote user computing device 119, which may display the impact score at a user interface. In some examples, the impact score is provided as part of a user interface that is used by the remote user 118 to assist vehicles 102A, 102B, 102N, 103A, 103N. In some examples, the considered routing graph modification is also provided to the remote user 118. The operation 502 can be executed, for example, before, during, or after the remote user 118 assists an autonomous vehicle 102A, 102B, 102N at or near the considered one or more roadway elements.
  • At operation 504, the service assignment system 104 determines whether the remote user 118 has provided a prompt to apply the considered routing graph modification. If not, the service assignment system 104 (e.g., the routing engine 110 thereof) generates at least one route for an autonomous vehicle 102A, 102B, 102N at operation 506 without applying the considered routing graph modification.
  • If the remote user 118 does prompt the service assignment system 104 to apply the considered routing graph modification, the service assignment system 104 does so at operation 508. At operation 510, the service assignment system 104 generates at least one route for an autonomous vehicle applying the considered routing graph modification.
  • FIG. 6 is a flowchart showing one example of a process flow 600 that may be executed by the service assignment system 104 to process an impact score describing a considered roadway element. At operation 602, the service assignment system 104 determines whether the impact score meets a first threshold. If the impact score does meet the first threshold, then the service assignment system 104 takes no further action at operation 604. The impact score, for example, may meet the first threshold if it indicates an impact greater than a determined amount. For example, if the considered one or more roadway elements have a very high impact, the considered routing graph modification may not be used.
  • If the impact score does not meet the first threshold, the service assignment system 104 determines at operation 606 whether the impact score meets a second threshold. The second threshold indicates a lower impact of the considered routing graph modification than the first threshold. For example, an impact score could meet the second threshold without indicating a sufficiently high impact to meet the first threshold. If the impact score meets the second threshold, the service assignment system 104 takes a first remedial action at operation 608. The first remedial action can include, for example, prompting the remote user 118 for instructions about whether to apply a routing graph modification to graph elements corresponding to the one or more roadway elements. In other examples, the first remedial action can include automatically applying the routing graph modification.
  • In the example of FIG. 6, if the impact score does not meet the second threshold, then the service assignment system 104 executes a second remedial action at operation 610. This can include, for example, automatically applying the considered routing graph modification (e.g., without input from the remote user 118). For example, if the impact of a considered routing graph modification is not high enough to meet the second threshold, the considered one or more roadway elements may not be significant, justifying automatic application of a routing graph modification.
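  • The two-threshold cascade of FIG. 6 can be summarized by the following sketch, which assumes that higher scores indicate higher impact and that the first remedial action is prompting the remote user 118:

    def handle_impact_score(score, first_threshold, second_threshold,
                            prompt_remote_user, apply_modification):
        """Very high impact -> no action; moderate impact -> first
        remedial action (prompt the remote user); low impact -> second
        remedial action (apply the modification automatically)."""
        if score >= first_threshold:
            return "no_action"
        if score >= second_threshold:
            prompt_remote_user()
            return "prompted_remote_user"
        apply_modification()
        return "applied_automatically"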
  • In some examples, the thresholds and remedial actions of the process flow 600 can be determined based on the behavior of the remote user 118 or other remote users in handling requests for assistance. For example, impact scores may be provided to remote users 118 as described herein with respect to FIG. 5. The service assignment system 104 may record the impact scores provided to the remote users 118 as well as the way that the remote users 118 acted on the provided impact scores. This can include, for example, whether the remote user 118 chose to take no action, whether the remote user 118 chose to apply a routing graph modification that raised the cost of the graph element or elements corresponding to the considered one or more roadway elements, as well as whether the remote user chose to apply a routing graph modification that changed routing graph connectivity. The recorded data can be used as training data to train a machine learning model executed by the service assignment system 104. The machine learning model may return values for the thresholds of FIG. 6 and/or determine remedial actions (if any) based on the impact score and/or other variables.
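  • As a simplified stand-in for the machine learning model described above, a single threshold could be fit to recorded operator behavior by grid search; the sketch below assumes operators tend to apply a modification when the impact score is low.

    def fit_threshold(scores, operator_applied):
        """Fit one decision threshold to recorded operator behavior by
        searching over observed scores and maximizing agreement with the
        operators' apply/no-apply choices."""
        best_threshold, best_agreement = None, -1
        for candidate in sorted(set(scores)):
            agreement = sum(
                (score < candidate) == applied
                for score, applied in zip(scores, operator_applied))
            if agreement > best_agreement:
                best_threshold, best_agreement = candidate, agreement
        return best_threshold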
  • FIG. 7 is a flowchart showing one example of a process flow 700 that may be executed by the impact engine 108 to determine an impact score. At operation 702, the impact engine 108 determines a set of predicted service requests and a set of predicted vehicles for responding to the service requests. The set of predicted service requests and set of predicted vehicles may be determined for a future time period, for example, as described herein.
  • At operation 704, the impact engine 108 generates a first simulation of the set of predicted service requests using the set of predicted vehicles 102A, 102B, 102N, 103A, 103N. This can include, for each of the set of predicted service requests, determining candidate vehicles from the set of predicted vehicles 102A, 102B, 102N, 103A, 103N and then generating routes for the set of predicted vehicles 102A, 102B, 102N, 103A, 103N without using the considered routing graph modification. In some examples, the service assignment system 104 assigns the predicted service requests to various predicted vehicles 102A, 102B, 102N, 103A, 103N, for example, in the same way that the service assignment system 104 would assign actual service requests.
  • Although the first simulation is generated without using the considered routing graph modification, the first simulation may consider other routing graph modifications such as, for example, routing graph modifications described by routing graph modification data 120. In some examples, the first simulation applies an alternate routing graph modification to the graph element or elements corresponding to the considered one or more roadway elements. The alternate routing graph modification may reflect the costs associated with routing autonomous vehicles 102A, 102B, 102N to the considered one or more roadway elements in view of current conditions. For example, if the considered one or more roadway elements are under construction, the alternate routing graph modification may increase the cost of the corresponding graph element or elements to reflect a (slower) travel time for the roadway element. In another example, the alternate routing graph modification may raise the cost of the graph element or elements corresponding to the considered one or more roadway elements to indicate the resources of the remote user 118 used to navigate autonomous vehicles 102A, 102B, 102N through the considered one or more roadway elements. In some examples, the alternate routing graph modification involves removing the connectivity of one or more graph elements from the routing graph 124.
  • At operation 706, the impact engine 108 generates a second simulation of the set of predicted service requests using the set of predicted vehicles. In the second simulation, the considered routing graph modification is applied to generate the candidate routes. The considered service requests may be assigned to various vehicles of the set of predicted vehicles 102A, 102B, 102N, 103A, 103N as described herein. In some examples, such as when the connectivity of one or more graph elements is removed by the considered routing graph modification, some or all of the considered service requests may not be met. For example, it may not be possible to complete one or more of the requested transportation services without traversing a removed roadway element.
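  • The first and second simulations could each be produced by a loop of the following schematic form; the assign and route callables are hypothetical stand-ins for the transportation service selection engine 112 and the routing engine 110.

    def run_simulation(requests, vehicles, graph, assign, route):
        """Run one simulation over a given routing graph. For each
        predicted request, route every predicted vehicle to the service
        start location, then hand the feasible candidates to the
        assignment callable, which may also record that a request goes
        unmet."""
        results = []
        for request in requests:
            candidates = {vehicle: route(graph, vehicle.location, request.start)
                          for vehicle in vehicles}
            feasible = {v: r for v, r in candidates.items() if r is not None}
            results.append(assign(request, feasible))
        return results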
  • At operation 708, the impact engine 108 generates the impact score using the first and second simulations. For example, the impact engine 108 may compare metrics describing the first and second simulations, such as times of arrival of vehicles 102A, 102B, 102N, 103A, 103N at transportation service start locations, drop-off times at the service end locations, etc. In some examples, the times of arrival are normalized. One example way to normalize a time of arrival is to find the time between when a transportation service is requested and the subsequent time of arrival of the vehicle 102A, 102B, 102N, 103A, 103N executing the service. The impact engine 108 can aggregate normalized times of arrival over all (or some) of the set of predicted transportation service requests. For example, the impact engine 108 may find an average, median, or other aggregation of the normalized times of arrival. The aggregated normalized time of arrival in the first simulation may be compared to the aggregated normalized time of arrival in the second simulation. The impact score may be based on the results of the comparison.
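  • A minimal sketch of the normalization and comparison just described follows; the use of a mean and of a simple ratio between the two simulations is one illustrative choice among the aggregations mentioned above.

    from statistics import mean

    def normalized_arrival_times(sim_results):
        """Time from service request to vehicle arrival, per met request."""
        return [r.arrival_time - r.request_time for r in sim_results if r.met]

    def impact_score(first_sim, second_sim):
        """Compare aggregated normalized times of arrival between the
        two simulations; both simulations are assumed to meet at least
        one request."""
        base = mean(normalized_arrival_times(first_sim))
        modified = mean(normalized_arrival_times(second_sim))
        return modified / base  # values above 1.0 indicate degraded service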
  • The impact score may also be generated by considering a difference in how many of the set of predicted service requests are met in the second simulation versus the first. Consider an example in which one or more roadway elements have connectivity for their corresponding graph elements removed from the routing graph 124 by the considered routing graph modification. It is possible that some or all of the set of predicted service requests may not be met without routing an autonomous vehicle 102A, 102B, 102N via the removed roadway element. When this is the case, the affected service requests may not be met in the second simulation.
  • Further, in some examples, if the considered routing graph modification removes the connectivity of one or more graph elements, it may cause one or more autonomous vehicles 102A, 102B, 102N to be stranded within a sub-portion of the routing graph. These autonomous vehicles 102A, 102B, 102N may not be available to perform transportation services that require routing outside of that sub-portion. This can affect whether transportation service requests are met at all in the second simulation as well as the time of arrival, drop-off time, etc. over the second simulation.
  • Another example metric for comparison of the first and second simulations is the drop-off times for the transportation services. The drop-off times may also be normalized. For example, the impact engine 108 may find a difference between the time that a transportation service is requested and the time of drop-off. In another example, the impact engine 108 normalizes the drop-off times by finding a difference between the time of arrival for the transportation service and the drop-off time. The impact engine 108 may also aggregate normalized drop-off times over the first and second simulations. For example, the impact engine 108 may compare a mean, median, or other aggregation of normalized drop-off times in the first simulation with similarly aggregated normalized drop-off times in the second simulation. The impact score may be based on the results of the comparison.
  • Yet another example metric for comparison is the portion of the set of predicted transportation services assigned to autonomous vehicles 102A, 102B, 102N versus human-driven vehicles 103A, 103N. The considered routing graph modification may apply to autonomous vehicles 102A, 102B, 102N but not to human-driven vehicles 103A, 103N. This may cause the routes of autonomous vehicles 102A, 102B, 102N to take longer than the routes of human-driven vehicles 103A, 103N. Accordingly, the times of arrival and/or drop-off times may be later for the second simulation than the first. As a result, using the modified routing graph may cause a larger proportion of the set of predicted transportation services to be assigned to human-driven vehicles 103A, 103N versus autonomous vehicles 102A, 102B, 102N. The amount of this difference may be or affect the impact score.
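  • The share comparison described in this paragraph could be computed as sketched below; the record fields are assumptions.

    def autonomous_share(sim_results):
        """Fraction of met requests assigned to autonomous vehicles."""
        met = [r for r in sim_results if r.met]
        if not met:
            return 0.0
        return sum(1 for r in met if r.assigned_to_autonomous) / len(met)

    # The shift in this share between simulations can feed the impact score:
    # share_shift = autonomous_share(first_sim) - autonomous_share(second_sim)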
  • In some examples, the process flow 700 is performed by the impact engine 108 in conjunction with other components of the service assignment system 104. For example, the routing engine 110 may be utilized to generate candidate routes for the various simulated service requests and the transportation service selection engine 112 may be used to assign service requests to particular predicted vehicles 102A, 102B, 102N, 103A, 103N.
  • FIG. 8 is a flowchart showing one example of a process flow 800 that can be executed by the service assignment system 104 if a considered routing graph modification has been applied. When a considered routing graph modification is applied, transportation service requests received in real time are assigned to vehicles 102A, 102B, 102N, 103A, 103N based on the considered routing graph modification. For example, candidate routes used to assign real-time transportation service requests may be generated using a constrained routing graph 109 that incorporates the considered routing graph modification. In the example of FIG. 8, the service assignment system 104 reconsiders whether to apply the considered routing graph modification.
  • At operation 802, the service assignment system 104 determines that a period of time has elapsed since the considered routing graph modification was applied. At operation 804, the service assignment system 104 generates a test route for a vehicle 102A, 102B, 102N, 103A, 103N. The test route includes the considered one or more roadway elements associated with the considered routing graph modification. The service assignment system 104 can select the vehicle 102A, 102B, 102N, 103A, 103N for the test route in any suitable manner. For example, the service assignment system 104 may select a vehicle 102A, 102B, 102N, 103A, 103N that is near the considered one or more roadway elements. At operation 806, the service assignment system 104 instructs the selected vehicle 102A, 102B, 102N, 103A, 103N to execute the test route.
  • At operation 808, the service assignment system 104 determines whether the roadway condition that prompted the considered routing graph modification is still present. For example, a driver, passenger, or other human user associated with the selected vehicle 102A, 102B, 102N, 103A, 103N may report whether the condition is still present. In another example, remote detection sensors of the selected vehicle 102A, 102B, 102N, 103A, 103N are used to determine if the condition is still present. For example, the remote detection sensor data may be used by the vehicle autonomy system to determine whether the condition is still present. In another example, remote detection sensor data from the selected vehicle 102A, 102B, 102N, 103A, 103N is provided to the remote user 118 and the remote user 118 determines whether the condition is present.
  • If the condition is still present, the considered routing graph modification is maintained at operation 810. For example, the service assignment system 104 may continue to assign transportation service requests to vehicles 102A, 102B, 102N, 103A, 103N using a constrained routing graph 109 that reflects the considered routing graph modification. If the condition is not present at operation 808, the considered routing graph modification is removed at operation 812. When the considered routing graph modification is removed, the service assignment system 104 begins to assign transportation service requests to vehicles 102A, 102B, 102N, 103A, 103N using a constrained routing graph 109 that does not reflect the considered routing graph modification.
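  • The recheck loop of FIG. 8 is summarized schematically below; all helper names on the service object are hypothetical stand-ins for the operations described with respect to operations 802-812.

    import time

    def recheck_modification(service, modification, applied_at, wait_s=3600):
        """After a waiting period, send a vehicle down a test route
        through the affected roadway elements and keep or remove the
        modification based on whether the condition persists."""
        if time.time() - applied_at < wait_s:
            return "waiting"
        vehicle = service.select_nearby_vehicle(modification.roadway_elements)
        test_route = service.generate_test_route(vehicle,
                                                 modification.roadway_elements)
        service.instruct(vehicle, test_route)
        if service.condition_still_present(vehicle, modification.roadway_elements):
            return "maintained"
        service.remove_modification(modification)
        return "removed"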
  • FIG. 9 is a block diagram 900 showing one example of a software architecture 902 for a computing device. The software architecture 902 may be used in conjunction with various hardware architectures, for example, as described herein. FIG. 9 is merely a non-limiting example of a software architecture 902, and many other architectures may be implemented to facilitate the functionality described herein. A representative hardware layer 904 is illustrated and can represent, for example, any of the above-referenced computing devices. In some examples, the hardware layer 904 may be implemented according to an architecture 1000 of FIG. 10 and/or the software architecture 902 of FIG. 9.
  • The representative hardware layer 904 comprises one or more processing units 906 having associated executable instructions 908. The executable instructions 908 represent the executable instructions of the software architecture 902, including implementation of the methods, modules, components, and so forth of FIGS. 1-8. The hardware layer 904 also includes memory and/or storage modules 910, which also have the executable instructions 908. The hardware layer 904 may also comprise other hardware 912, which represents any other hardware of the hardware layer 904, such as the other hardware illustrated as part of the architecture 1000.
  • In the example architecture of FIG. 9, the software architecture 902 may be conceptualized as a stack of layers where each layer provides particular functionality. For example, the software architecture 902 may include layers such as an operating system 914, libraries 916, frameworks/middleware 918, applications 920, and a presentation layer 944. Operationally, the applications 920 and/or other components within the layers may invoke application programming interface (API) calls 924 through the software stack and receive a response, returned values, and so forth illustrated as messages 926 in response to the API calls 924. The layers illustrated are representative in nature, and not all software architectures have all layers. For example, some mobile or special-purpose operating systems may not provide a frameworks/middleware 918 layer, while others may provide such a layer. Other software architectures may include additional or different layers.
  • The operating system 914 may manage hardware resources and provide common services. The operating system 914 may include, for example, a kernel 928, services 930, and drivers 932. The kernel 928 may act as an abstraction layer between the hardware and the other software layers. For example, the kernel 928 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so on. The services 930 may provide other common services for the other software layers. In some examples, the services 930 include an interrupt service. The interrupt service may detect the receipt of a hardware or software interrupt and, in response, cause the software architecture 902 to pause its current processing and execute an interrupt service routine (ISR) when an interrupt is received. The ISR may generate an alert.
  • The drivers 932 may be responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 932 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), WiFi® drivers, near-field communication (NFC) drivers, audio drivers, power management drivers, and so forth depending on the hardware configuration.
  • The libraries 916 may provide a common infrastructure that may be used by the applications 920 and/or other components and/or layers. The libraries 916 typically provide functionality that allows other software modules to perform tasks in an easier fashion than by interfacing directly with the underlying operating system 914 functionality (e.g., kernel 928, services 930, and/or drivers 932). The libraries 916 may include system libraries 934 (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematic functions, and the like. In addition, the libraries 916 may include API libraries 936 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphic content on a display), database libraries (e.g., SQLite that may provide various relational database functions), web libraries (e.g., WebKit that may provide web browsing functionality), and the like. The libraries 916 may also include a wide variety of other libraries 938 to provide many other APIs to the applications 920 and other software components/modules.
  • The frameworks 918 (also sometimes referred to as middleware) may provide a higher-level common infrastructure that may be used by the applications 920 and/or other software components/modules. For example, the frameworks 918 may provide various graphical user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks 918 may provide a broad spectrum of other APIs that may be used by the applications 920 and/or other software components/modules, some of which may be specific to a particular operating system or platform.
  • The applications 920 include built-in applications 940 and/or third-party applications 942. Examples of representative built-in applications 940 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, and/or a game application. The third-party applications 942 may include any of the built-in applications 940 as well as a broad assortment of other applications. In a specific example, the third-party application 942 (e.g., an application developed using the Android™ or iOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or other computing device operating systems. In this example, the third-party application 942 may invoke the API calls 924 provided by the mobile operating system such as the operating system 914 to facilitate functionality described herein.
  • The applications 920 may use built-in operating system functions (e.g., kernel 928, services 930, and/or drivers 932), libraries (e.g., system libraries 934, API libraries 936, and other libraries 938), or frameworks/middleware 918 to create user interfaces to interact with users of the system. Alternatively, or additionally, in some systems, interactions with a user may occur through a presentation layer, such as the presentation layer 944. In these systems, the application/module “logic” can be separated from the aspects of the application/module that interact with a user.
  • Some software architectures use virtual machines. For example, systems described herein may be executed using one or more virtual machines executed at one or more server computing machines. In the example of FIG. 9, this is illustrated by a virtual machine 948. A virtual machine creates a software environment where applications/modules can execute as if they were executing on a hardware computing device. The virtual machine 948 is hosted by a host operating system (e.g., the operating system 914) and typically, although not always, has a virtual machine monitor 946, which manages the operation of the virtual machine 948 as well as the interface with the host operating system (e.g., the operating system 914). A software architecture executes within the virtual machine 948, such as an operating system 950, libraries 952, frameworks/middleware 954, applications 956, and/or a presentation layer 958. These layers of software architecture executing within the virtual machine 948 can be the same as corresponding layers previously described or may be different.
  • FIG. 10 is a block diagram illustrating a computing device hardware architecture 1000, within which a set or sequence of instructions can be executed to cause a machine to perform examples of any one of the methodologies discussed herein. The hardware architecture 1000 describes a computing device for executing the vehicle autonomy system described herein.
  • The architecture 1000 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the architecture 1000 may operate in the capacity of either a server or a client machine in server-client network environments, or it may act as a peer machine in peer-to-peer (or distributed) network environments. The architecture 1000 can be implemented in a personal computer (PC), a tablet PC, a hybrid tablet, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing instructions (sequential or otherwise) that specify operations to be taken by that machine.
  • The example architecture 1000 includes a processor unit 1002 comprising at least one processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both, processor cores, compute nodes). The architecture 1000 may further comprise a main memory 1004 and a static memory 1006, which communicate with each other via a link 1008 (e.g., a bus). The architecture 1000 can further include a video display unit 1010, an input device 1012 (e.g., a keyboard), and a UI navigation device 1014 (e.g., a mouse). In some examples, the video display unit 1010, input device 1012, and UI navigation device 1014 are incorporated into a touchscreen display. The architecture 1000 may additionally include a storage device 1016 (e.g., a drive unit), a signal generation device 1018 (e.g., a speaker), a network interface device 1020, and one or more sensors (not shown), such as a Global Positioning System (GPS) sensor, compass, accelerometer, or other sensor.
  • In some examples, the processor unit 1002 or another suitable hardware component may support a hardware interrupt. In response to a hardware interrupt, the processor unit 1002 may pause its processing and execute an ISR, for example, as described herein.
  • The storage device 1016 includes a machine-readable medium 1022 on which is stored one or more sets of data structures and instructions 1024 (e.g., software) embodying or used by any one or more of the methodologies or functions described herein. The instructions 1024 can also reside, completely or at least partially, within the main memory 1004, within the static memory 1006, and/or within the processor unit 1002 during execution thereof by the architecture 1000, with the main memory 1004, the static memory 1006, and the processor unit 1002 also constituting machine-readable media.
  • Executable Instructions and Machine-Storage Medium
  • The various memories (i.e., 1004, 1006, and/or memory of the processor unit(s) 1002) and/or the storage device 1016 may store one or more sets of instructions and data structures (e.g., the instructions 1024) embodying or used by any one or more of the methodologies or functions described herein. These instructions, when executed by the processor unit(s) 1002, cause various operations to implement the disclosed examples.
  • As used herein, the terms “machine-storage medium,” “device-storage medium,” and “computer-storage medium” (referred to collectively as “machine-storage medium”) mean the same thing and may be used interchangeably. The terms refer to a single or multiple storage devices and/or media (e.g., a centralized or distributed database, and/or associated caches and servers) that store executable instructions and/or data, as well as cloud-based storage systems or storage networks that include multiple storage apparatus or devices. The terms shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media, including memory internal or external to processors. Specific examples of machine-storage media, computer-storage media, and/or device-storage media include non-volatile memory, including by way of example semiconductor memory devices, e.g., erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), field-programmable gate array (FPGA), and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The terms “machine-storage media,” “computer-storage media,” and “device-storage media” specifically exclude carrier waves, modulated data signals, and other such media, at least some of which are covered under the term “signal medium” discussed below.
  • Signal Medium
  • The term “signal medium” or “transmission medium” shall be taken to include any form of modulated data signal, carrier wave, and so forth. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • Computer-Readable Medium
  • The terms “machine-readable medium,” “computer-readable medium” and “device-readable medium” mean the same thing and may be used interchangeably in this disclosure. The terms are defined to include both non-transitory machine-storage media and signal media. Thus, the terms include both storage devices/media and carrier waves/modulated data signals.
  • The instructions 1024 can further be transmitted or received over a communications network 1026 using a transmission medium via the network interface device 1020 using any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP)). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, plain old telephone service (POTS) networks, and wireless data networks (e.g., Wi-Fi, 3G, 4G Long-Term Evolution (LTE)/LTE-A, 5G, or WiMAX networks).
  • Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
  • Various components are described in the present disclosure as being configured in a particular way. A component may be configured in any suitable manner. For example, a component that is or that includes a computing device may be configured with suitable software instructions that program the computing device. A component may also be configured by virtue of its hardware arrangement or in any other suitable manner.
  • The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) can be used in combination with others. Other examples can be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is to allow the reader to quickly ascertain the nature of the technical disclosure, for example, to comply with 37 C.F.R. § 1.72(b) in the United States of America. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.
  • Also, in the above Detailed Description, various features can be grouped together to streamline the disclosure. However, the claims cannot set forth every feature disclosed herein, as examples can feature a subset of said features. Further, examples can include fewer features than those disclosed in a particular example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate example. The scope of the examples disclosed herein is to be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims (20)

1. A computerized system for routing autonomous vehicles, the system comprising:
a processor unit; and
a storage device comprising instructions stored thereon that, when executed by the processor unit, cause the processor unit to execute operations comprising:
receiving an indication of a roadway element associated with a routing graph for routing autonomous vehicles;
determining an impact score for the roadway element, the impact score describing an impact of applying a routing graph modification to the routing graph to modify routing to the roadway element;
based at least in part on the impact score, applying the routing graph modification to the routing graph to generate a constrained routing graph;
generating a route for a first autonomous vehicle based at least in part on the constrained routing graph; and
instructing the first autonomous vehicle to begin traversing the route.
2. The computerized system of claim 1, the operations further comprising:
providing an indication of the impact score to a remote user computing device; and
receiving, from the remote user computing device, an instruction to apply the routing graph modification.
3. The computerized system of claim 1, wherein applying the routing graph modification comprises increasing a cost associated with a graph element that corresponds to the roadway element.
4. The computerized system of claim 1, wherein applying the routing graph modification comprises changing a connectivity of the routing graph to prevent routing via the roadway element.
5. The computerized system of claim 1, the operations further comprising receiving a request for assistance from a second autonomous vehicle, the request for assistance associated with the roadway element.
6. The computerized system of claim 1, wherein determining the impact score comprises generating a set of predicted service requests for a first future time period.
7. The computerized system of claim 6, the operations further comprising:
generating a first set of routes using the set of predicted service requests and a first routing graph; and
generating a second set of routes using the set of predicted service requests and a second routing graph, wherein the second routing graph is based at least in part on the routing graph modification.
8. The computerized system of claim 6, the operations further comprising:
generating a set of routes using the set of predicted service requests and a second routing graph, wherein the second routing graph is based at least in part on the routing graph modification; and
determining a portion of the set of predicted service requests assigned to the autonomous vehicles.
9. The computerized system of claim 1, the operations further comprising determining that the impact score meets a first threshold, wherein applying the routing graph modification is responsive to the determining that the impact score meets the first threshold.
10. The computerized system of claim 1, the operations further comprising:
determining that the impact score meets a second threshold;
prompting a device associated with a human user to indicate whether to apply the routing graph modification; and
receiving, from the device associated with the human user, an instruction to apply the routing graph modification.
11. The computerized system of claim 1, the operations further comprising:
determining that a first time period has elapsed since applying the routing graph modification;
generating a test route including the roadway element;
instructing a second autonomous vehicle to begin traversing the test route;
determining, based at least in part on data received from the second autonomous vehicle, that a condition at the roadway element has cleared; and
responsive to determining that the condition has cleared, removing the routing graph modification.
12. The computerized system of claim 1, the operations further comprising, before applying the routing graph modification, instructing a second autonomous vehicle to move from its current location.
13. A computer-implemented method for routing autonomous vehicles, the method comprising:
receiving, by a processor unit, an indication of a roadway element associated with a routing graph for routing autonomous vehicles;
determining, by the processor unit, an impact score for the roadway element, the impact score describing an impact of applying a routing graph modification to the routing graph to modify routing to the roadway element;
based at least in part on the impact score, applying, by the processor unit, the routing graph modification to the routing graph to generate a constrained routing graph;
generating, by the processor unit, a route for a first autonomous vehicle based at least in part on the constrained routing graph; and
instructing, by the processor unit, the first autonomous vehicle to begin traversing the route.
14. The method of claim 13, further comprising:
providing an indication of the impact score to a remote user computing device; and
receiving, from the remote user computing device, an instruction to apply the routing graph modification.
15. The method of claim 13, further comprising receiving a request for assistance from a second autonomous vehicle, the request for assistance associated with the roadway element.
16. The method of claim 13, wherein determining the impact score comprises generating a set of predicted service requests for a first future time period.
17. The method of claim 16, further comprising:
generating a first set of routes using the set of predicted service requests and a first routing graph; and
generating a second set of routes using the set of predicted service requests and a second routing graph, wherein the second routing graph is based at least in part on the routing graph modification.
18. The method of claim 16, further comprising:
generating a set of routes using the set of predicted service requests and a second routing graph, wherein the second routing graph is based at least in part on the routing graph modification; and
determining a portion of the set of predicted service requests assigned to autonomous vehicles.
19. The method of claim 13, further comprising determining that the impact score meets a first threshold, wherein applying the routing graph modification is responsive to the determining that the impact score meets the first threshold.
20. A machine-storage medium comprising instructions thereon that, when executed by a processor unit, cause the processor unit to execute operations comprising:
receiving an indication of a roadway element associated with a routing graph for routing autonomous vehicles;
determining an impact score for the roadway element, the impact score describing an impact of applying a routing graph modification to the routing graph to modify routing to the roadway element;
based at least in part on the impact score, applying the routing graph modification to the routing graph to generate a constrained routing graph;
generating a route for a first autonomous vehicle based at least in part on the constrained routing graph; and
instructing the first autonomous vehicle to begin traversing the route.
US16/752,164 2019-01-25 2020-01-24 Autonomous vehicle routing with roadway element impact Abandoned US20200239024A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/752,164 US20200239024A1 (en) 2019-01-25 2020-01-24 Autonomous vehicle routing with roadway element impact

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962796882P 2019-01-25 2019-01-25
US201962836936P 2019-04-22 2019-04-22
US201962868347P 2019-06-28 2019-06-28
US16/752,164 US20200239024A1 (en) 2019-01-25 2020-01-24 Autonomous vehicle routing with roadway element impact

Publications (1)

Publication Number Publication Date
US20200239024A1 true US20200239024A1 (en) 2020-07-30

Family

ID=69724123

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/752,164 Abandoned US20200239024A1 (en) 2019-01-25 2020-01-24 Autonomous vehicle routing with roadway element impact
US16/752,292 Active US11884293B2 (en) 2019-01-25 2020-01-24 Operator assistance for autonomous vehicles

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/752,292 Active US11884293B2 (en) 2019-01-25 2020-01-24 Operator assistance for autonomous vehicles

Country Status (5)

Country Link
US (2) US20200239024A1 (en)
EP (1) EP3914982A1 (en)
AU (1) AU2020211604A1 (en)
CA (1) CA3127637A1 (en)
WO (1) WO2020154676A1 (en)


Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018119417A1 (en) * 2016-12-22 2018-06-28 Nissan North America, Inc. Autonomous vehicle service system
JP7234872B2 (en) * 2019-09-12 2023-03-08 Toyota Motor Corporation Vehicle remote indication system
JP7172937B2 (en) * 2019-09-30 2022-11-16 Denso Corporation Remote support device, method and program
US11240730B2 (en) * 2020-02-28 2022-02-01 At&T Intellectual Property I, L.P. Selectively sending routing information to routing devices in a fifth generation (5G) or other next generation network
DE102020003073B3 (en) * 2020-05-22 2021-11-04 Daimler AG Method and device for automated driving of a vehicle and vehicle
WO2021261058A1 (en) * 2020-06-26 2021-12-30 Panasonic Intellectual Property Management Co., Ltd. Information processing method, information processing terminal, and information processing system
US11513498B2 (en) * 2020-08-03 2022-11-29 Caterpillar Paving Products Inc. Transitioning between manned control mode and unmanned control mode based on assigned priority
US11535276B2 (en) * 2020-09-08 2022-12-27 Waymo LLC Methods and systems for using remote assistance to maneuver an autonomous vehicle to a location
US11787438B2 (en) * 2020-12-17 2023-10-17 Zoox, Inc. Collaborative vehicle path generation
US20220204028A1 (en) * 2020-12-30 2022-06-30 HERE Global B.V. Autonomous driving dual mode control
US20220207995A1 (en) * 2020-12-30 2022-06-30 HERE Global B.V. Origination destination route analytics of road lanes
CN112732085A (en) * 2021-01-14 2021-04-30 Guangdong Coros Sports Technology Co., Ltd. Wearable device, control method and device thereof, and computer storage medium
JP7425975B2 (en) * 2021-04-23 2024-02-01 Toyota Motor Corporation Remote function selection device
JP2022174596A (en) * 2021-05-11 2022-11-24 Toyota Motor Corporation Automatic driving system, automatic driving control method, and automatic driving control program
KR20230001073A (en) * 2021-06-25 2023-01-04 Hyundai Motor Company Autonomous vehicle, control system for remotely controlling the same, and method thereof
US11639180B1 (en) * 2021-06-30 2023-05-02 GM Cruise Holdings LLC Notifications from an autonomous vehicle to a driver
CN115729226A (en) * 2021-09-01 2023-03-03 Lingdong Technology (Beijing) Co., Ltd. Method and system for controlling robot according to scheduling information and corresponding robot
US20230409025A1 (en) * 2022-06-17 2023-12-21 GM Cruise Holdings LLC Proactive simulation-based remote assistance resolutions

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9047774B2 (en) 2013-03-12 2015-06-02 Ford Global Technologies, LLC Method and apparatus for crowd-sourced traffic reporting
US9720410B2 (en) * 2014-03-03 2017-08-01 Waymo LLC Remote assistance for autonomous vehicles in predetermined situations
US9798323B2 (en) * 2014-07-28 2017-10-24 GM Global Technology Operations LLC Crowd-sourced transfer-of-control policy for automated vehicles
US20180038701A1 (en) * 2015-03-03 2018-02-08 Pioneer Corporation Route search device, control method, program and storage medium
CN111367287B (en) * 2015-05-13 2024-06-21 Uber Technologies, Inc. Autonomous vehicle operated by guidance assistance
US11283877B2 (en) * 2015-11-04 2022-03-22 Zoox, Inc. Software application and logic to modify configuration of an autonomous vehicle
US10401852B2 (en) * 2015-11-04 2019-09-03 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
US9958864B2 (en) * 2015-11-04 2018-05-01 Zoox, Inc. Coordination of dispatching and maintaining fleet of autonomous vehicles
US9612123B1 (en) * 2015-11-04 2017-04-04 Zoox, Inc. Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
WO2017079341A2 (en) 2015-11-04 2017-05-11 Zoox, Inc. Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US10379533B2 (en) 2016-01-04 2019-08-13 GM Global Technology Operations LLC System and method for autonomous vehicle fleet routing
JP6565859B2 (en) * 2016-10-14 2019-08-28 Toyota Motor Corporation Vehicle control system
JP6695999B2 (en) * 2016-11-11 2020-05-20 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and vehicle control program
US10042362B2 (en) 2016-11-18 2018-08-07 Waymo LLC Dynamic routing for autonomous vehicles
EP3577528B1 (en) * 2017-02-06 2021-10-27 Telefonaktiebolaget LM Ericsson (PUBL) Enabling remote control of a vehicle
US10779194B2 (en) * 2017-03-27 2020-09-15 Qualcomm Incorporated Preferred path network scheduling in multi-modem setup
US10564638B1 (en) * 2017-07-07 2020-02-18 Zoox, Inc. Teleoperator situational awareness
US10818187B2 (en) 2017-07-17 2020-10-27 UATC, LLC Systems and methods for deploying an autonomous vehicle to oversee autonomous navigation
US10437247B2 (en) * 2017-08-10 2019-10-08 Udelv Inc. Multi-stage operation of autonomous vehicles
US10514697B2 (en) * 2017-09-15 2019-12-24 GM Global Technology Operations LLC Vehicle remote assistance mode
JP6839770B2 (en) * 2017-10-20 2021-03-10 Hitachi, Ltd. Mobile control system and control device
US20190163176A1 (en) * 2017-11-30 2019-05-30 drive.ai Inc. Method for transferring control of an autonomous vehicle to a remote operator
US10732625B2 (en) * 2017-12-04 2020-08-04 GM Global Technology Operations LLC Autonomous vehicle operations with automated assistance
US10503165B2 (en) * 2017-12-22 2019-12-10 Toyota Research Institute, Inc. Input from a plurality of teleoperators for decision making regarding a predetermined driving situation
US10501014B2 (en) * 2018-03-02 2019-12-10 UATC, LLC Remote assist feedback system for autonomous vehicles
WO2019191313A1 (en) * 2018-03-27 2019-10-03 Nvidia Corporation Remote operation of vehicles using immersive virtual reality environments
US10663977B2 (en) * 2018-05-16 2020-05-26 Direct Current Capital LLC Method for dynamically querying a remote operator for assistance
US11176831B2 (en) * 2018-06-15 2021-11-16 Phantom Auto Inc. Restricting areas available to autonomous and teleoperated vehicles
US11099561B1 (en) * 2018-06-19 2021-08-24 Zoox, Inc. Control of an autonomous vehicle in unmapped regions
US11726473B2 (en) * 2018-11-08 2023-08-15 Zoox, Inc. Autonomous vehicle guidance authority framework
US20200239024A1 (en) 2019-01-25 2020-07-30 UATC, LLC Autonomous vehicle routing with roadway element impact
US11325591B2 (en) * 2019-03-07 2022-05-10 Honda Motor Co., Ltd. System and method for teleoperation service for vehicle

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12046145B2 (en) * 2018-07-20 2024-07-23 Cybernet Systems Corporation Autonomous transportation system and methods
US20210248915A1 (en) * 2018-07-20 2021-08-12 Cybernet Systems Corp. Autonomous transportation system and methods
US11884293B2 (en) 2019-01-25 2024-01-30 Uber Technologies, Inc. Operator assistance for autonomous vehicles
US11754408B2 (en) * 2019-10-09 2023-09-12 Argo AI, LLC Methods and systems for topological planning in autonomous driving
US20220242447A1 (en) * 2019-10-22 2022-08-04 Denso Corporation Remote support system, on-vehicle apparatus, remote support method, remote support program
US11624629B2 (en) * 2020-03-24 2023-04-11 HERE Global B.V. Method, apparatus, and computer program product for generating parking lot geometry
US11409292B2 (en) 2020-03-24 2022-08-09 HERE Global B.V. Method, apparatus, and computer program product for generating a map of road links of a parking lot
US20210302194A1 (en) * 2020-03-24 2021-09-30 HERE Global B.V. Method, apparatus, and computer program product for generating parking lot geometry
GB2598758B (en) * 2020-09-10 2023-03-29 Kabushiki Kaisha Toshiba Task performing agent systems and methods
GB2598758A (en) * 2020-09-10 2022-03-16 Kabushiki Kaisha Toshiba Task performing agent systems and methods
US12085947B2 (en) 2020-09-10 2024-09-10 Kabushiki Kaisha Toshiba Task performing agent systems and methods
US11921504B1 (en) * 2020-12-29 2024-03-05 Zoox, Inc. Vehicle controller validation
CN115311876A (en) * 2021-05-07 2022-11-08 Toyota Motor Corporation Remote assistance management system, remote assistance management method, and remote assistance management program
US20230094255A1 (en) * 2021-09-27 2023-03-30 7-Eleven, Inc. Autonomous delivery mechanism data integration in an application platform
US12062004B2 (en) * 2021-09-27 2024-08-13 7-Eleven, Inc. Autonomous delivery mechanism data integration in an application platform

Also Published As

Publication number Publication date
EP3914982A1 (en) 2021-12-01
CA3127637A1 (en) 2020-07-30
US20200239023A1 (en) 2020-07-30
US11884293B2 (en) 2024-01-30
AU2020211604A1 (en) 2021-09-09
WO2020154676A1 (en) 2020-07-30

Similar Documents

Publication Publication Date Title
US20200239024A1 (en) Autonomous vehicle routing with roadway element impact
US20230358554A1 (en) Routing graph management in autonomous vehicle routing
US11619502B2 (en) Monitoring autonomous vehicle route conformance for improved efficiency
US11859990B2 (en) Routing autonomous vehicles using temporal data
US20220412755A1 (en) Autonomous vehicle routing with local and general routes
US12118837B2 (en) Responding to autonomous vehicle error states
US12025450B2 (en) Route comparison for vehicle routing
US11829135B2 (en) Tuning autonomous vehicle dispatch using vehicle performance
US12049235B2 (en) Routing feature flags
US20210356965A1 (en) Vehicle routing using third party vehicle capabilities
US20220065647A1 (en) Autonomous vehicle planned route prediction
US20210095977A1 (en) Revising self-driving vehicle routes in response to obstructions
US12013704B2 (en) Autonomous vehicle control system testing
US20210097587A1 (en) Managing self-driving vehicles with parking support
US20220065638A1 (en) Joint routing of transportation services for autonomous vehicles
US20230351896A1 (en) Transportation service provision with a vehicle fleet

Legal Events

Date Code Title Description
AS Assignment

Owner name: UATC, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SRINIVASAN, ARVIND;YUAN, JAY;CHADHA, VALERIE;AND OTHERS;SIGNING DATES FROM 20200214 TO 20200407;REEL/FRAME:052415/0689

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: UBER TECHNOLOGIES, INC., MINNESOTA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:UATC, LLC;REEL/FRAME:054790/0526

Effective date: 20201204

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

AS Assignment

Owner name: UBER TECHNOLOGIES, INC., CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNMENT DOCUMENT PREVIOUSLY RECORDED AT REEL: 054790 FRAME: 0527. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:UATC, LLC;REEL/FRAME:059692/0421

Effective date: 20201002

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION