CN118742868A - Mobile terminal system for autonomous vehicle
- Publication number
- CN118742868A (application CN202280091743.7A)
- Authority
- CN
- China
- Prior art keywords
- autonomous vehicle
- landing
- inbound
- landing stage
- route
- Prior art date
- 2021-12-20
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The mobile terminal system includes equipment for establishing a terminal site having at least one landing pad sized and shaped to accommodate autonomous vehicles of a fleet. A control subsystem determines that an autonomous vehicle of the fleet is inbound to the established terminal site. After the inbound autonomous vehicle is identified, a landing instruction is determined that indicates the landing pad in which the inbound autonomous vehicle is to park and a route along which the inbound autonomous vehicle is to travel to reach that landing pad. The landing instruction is provided to the inbound autonomous vehicle and causes it to travel along the route to the landing pad.
Description
Priority
The present application claims priority from U.S. provisional patent application No. 63/265,728, entitled "MOBILE TERMINAL SYSTEM FOR AUTONOMOUS VEHICLES," filed on December 20, 2021, and U.S. provisional patent application No. 63/265,734, entitled "SYSTEM FOR RAPID DEPLOYMENT OF TERMINALS FOR AUTONOMOUS VEHICLES," filed on December 20, 2021, both of which are incorporated herein by reference.
Technical Field
The present disclosure relates generally to autonomous vehicles (AVs). More specifically, in certain embodiments, the present disclosure relates to a mobile terminal system for autonomous vehicles.
Background
One goal of autonomous vehicle technology is to provide a vehicle capable of safely navigating toward a destination with limited or no driver assistance. In some cases, a driver may operate the autonomous vehicle as a conventional vehicle by controlling the steering, throttle, clutch, gear shifts, and/or other vehicle controls. In other cases, the driver may engage the autonomous navigation technology and allow the vehicle to drive itself. There is a need for safer and more reliable operation of autonomous vehicles.
Disclosure of Invention
The present disclosure recognizes various problems and previously unmet needs associated with autonomous vehicle navigation and driving, including the lack of tools for efficiently establishing and operating the resources needed to reliably launch autonomous vehicles from, and land them at, a given location. For example, when an autonomous vehicle leaves a given location, a driver may currently be required to maneuver it along the initial portion of its route (e.g., until the autonomous vehicle can begin driving autonomously on an appropriate road). As another example, when an autonomous vehicle reaches its destination, there is no effective and reliable way to bring it to rest at a given location. In these situations, a driver typically maneuvers the vehicle to an appropriate stop.
Certain embodiments of the present disclosure address these and other problems, including those described above, by using a mobile terminal system to facilitate the efficient, safe, and reliable setup and operation of terminal sites for an autonomous vehicle fleet. The mobile terminal system includes equipment for setting up and operating a terminal site at which an autonomous vehicle can land (e.g., to drop off transported items, people, etc.) and launch (e.g., to begin a trip transporting items, people, etc.). A terminal site set up by the mobile terminal system includes a landing pad capable of holding or accommodating an inbound autonomous vehicle and/or a launch pad capable of holding or accommodating an outbound autonomous vehicle exiting the terminal site. A control subsystem of the mobile terminal system helps direct the launch and landing operations of the autonomous vehicles. The disclosed mobile terminal system provides several technical advantages, for example: 1) improved availability of items, such as location delineators, sensors, and the like, for quickly and efficiently setting up a terminal site with landing pad(s) and/or launch pad(s); 2) improved landing of an autonomous vehicle at a specially designated landing pad, which facilitates efficiently and reliably guiding the autonomous vehicle to an appropriate, unobstructed parking position; 3) improved launching of an autonomous vehicle from a specially designated launch pad, which facilitates efficiently and reliably launching or "sending off" the autonomous vehicle to begin moving along its route; 4) an improved ability to efficiently generate route data that an autonomous vehicle can follow to reach a terminal site newly established by the mobile terminal system; and 5) the ability to quickly and efficiently establish new terminal sites, or to provide supplemental control resources to existing terminal sites, when needed. As such, the present disclosure may improve the functioning of computer systems used to support operation of a fleet of autonomous vehicles and may improve autonomous vehicle navigation during at least a portion of a trip.
In some embodiments, the mobile terminal system described in this disclosure is integrated into the practical application of a vehicle carrying equipment for rapidly deploying or setting up new terminal sites as the need arises. The equipment allows terminal sites to be deployed on demand to support an autonomous vehicle fleet. The present disclosure is also integrated into the practical application of a control subsystem that directs autonomous vehicles into and out of a rapidly deployed terminal site more effectively and reliably than was previously possible. The mobile terminal system facilitates efficient, safe, and reliable routing and landing (e.g., parking or stopping) of autonomous vehicles at unobstructed, appropriate landing pads at rapidly deployed terminal sites. The mobile terminal system also, or alternatively, facilitates reliable and efficient launching and routing of autonomous vehicles from terminal sites.
In some embodiments, the control subsystem communicates with sensors located in, near, and/or around the landing pad and/or the launch pad. Information from these sensors (e.g., alone or in combination with information from the autonomous vehicle's own sensors) is used to effectively and reliably direct the autonomous vehicle into an appropriate landing pad and/or out of a launch pad. For example, when an autonomous vehicle is driving into a terminal site, the control subsystem of the mobile terminal system may receive information from the sensors and use it to identify a landing pad available to receive the inbound autonomous vehicle and/or a route to the identified landing pad. If the route to the identified landing pad becomes blocked, the control subsystem may identify a different, unobstructed landing pad and/or a different route to the landing pad. A landing instruction is provided to the inbound autonomous vehicle that causes it to follow the route to the landing pad. The mobile terminal system may reduce or eliminate the practical technical barriers or bottlenecks to receiving a large number of autonomous vehicles at a rapidly deployed terminal site, such as a location to which cargo is transported, with little or no human intervention.
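For illustration only, the landing-pad selection logic described above might be sketched as follows. This is a minimal sketch in Python, assuming boolean occupancy reports from pad sensors (e.g., sensor 312 of FIG. 3) and per-route blockage reports (e.g., from sensors 316); the names Pad and pick_landing_pad are hypothetical and are not drawn from the disclosure.

```python
# Minimal sketch: choose an unoccupied landing pad reachable by an
# unblocked route; identifiers are assumptions for the example.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pad:
    pad_id: str
    occupied: bool   # occupancy report from pad sensors (cf. sensor 312)
    routes: dict     # route_id -> blocked? (cf. route sensors 316)

def pick_landing_pad(pads: list) -> Optional[tuple]:
    """Return (pad_id, route_id) for the first free pad with a clear route."""
    for pad in pads:
        if pad.occupied:
            continue
        for route_id, blocked in pad.routes.items():
            if not blocked:
                return pad.pad_id, route_id
    return None  # no pad currently reachable; retry when sensor data changes

pads = [
    Pad("pad-1", occupied=True,  routes={"314a": False}),
    Pad("pad-2", occupied=False, routes={"314a": True, "314b": False}),
]
print(pick_landing_pad(pads))  # ('pad-2', '314b')
```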
As another example, when an autonomous vehicle needs to exit the launch pad, information from sensors in, on, or near the launch pad (e.g., alone and/or in combination with the autonomous vehicle's sensor data) may be used to determine whether the space around the autonomous vehicle and the launch pad is sufficiently clear to begin movement. Launch instructions provided by the mobile terminal system facilitate improved automatic launching of the autonomous vehicle, allowing it to begin moving along its route without requiring action from a driver. The launch instruction may indicate a clear path for exiting the terminal site. Such an approach may reduce or eliminate the practical technical barriers to launching autonomous vehicles from a rapidly deployed terminal site, such as those commonly encountered when moving cargo and/or personnel.
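A similarly hedged sketch of the pre-launch clearance check: launch is authorized only when the terminal-side sensors and, where used, the autonomous vehicle's own sensors both report the pad and surrounding area clear. The function name, field names, and route identifier are assumptions for the example.

```python
# Minimal sketch: authorize launch only if both sensor sources see a
# clear area; otherwise hold and report the obstructions.
def authorize_launch(pad_area_objects: list, av_detected_objects: list) -> dict:
    clear = not pad_area_objects and not av_detected_objects
    if clear:
        return {"action": "launch", "route": "318a"}  # route id is illustrative
    return {"action": "hold", "reason": "area not clear",
            "obstructions": pad_area_objects + av_detected_objects}

print(authorize_launch([], []))          # launch approved
print(authorize_launch(["pallet"], []))  # hold until the area is cleared
```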
Some embodiments of the present disclosure may include some, all, or none of these advantages. These and other features will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.
In one embodiment, a mobile terminal system includes a vehicle that holds (e.g., is capable of storing) location delineators that, when deployed, are configured to establish a terminal site within a physical space. The established terminal site includes at least one landing pad sized and shaped to accommodate autonomous vehicles of a fleet. The mobile terminal system includes a control subsystem with a hardware processor that determines that an autonomous vehicle of the fleet is inbound to the established terminal site. After the inbound autonomous vehicle is identified, a landing instruction is determined that indicates the landing pad in which the inbound autonomous vehicle is to park and the route the inbound autonomous vehicle is to travel to reach that landing pad. The landing instruction is provided to the inbound autonomous vehicle and causes it to travel along the route to the landing pad (e.g., after being received by an onboard control system of the inbound autonomous vehicle).
In another embodiment, a mobile terminal system includes a vehicle that holds (e.g., is capable of storing) location delineators that, when deployed, are configured to establish a terminal site within a physical space. The established terminal site includes at least one launch pad sized and shaped to accommodate autonomous vehicles of a fleet. The mobile terminal system includes a control subsystem with a hardware processor that determines that an autonomous vehicle of the fleet is requesting to depart from the launch pad. After this determination, a launch instruction is determined that indicates whether the autonomous vehicle may exit the launch pad and the route the autonomous vehicle is to travel after exiting. The launch instruction is provided to the autonomous vehicle and causes it to exit the launch pad and move along the route (e.g., after being received by an onboard control system of the autonomous vehicle).
Drawings
For a more complete understanding of the present disclosure, reference is now made to the following brief description, taken in connection with the accompanying drawings and detailed description, wherein like reference numerals represent like parts.
FIG. 1 is a diagram of an example mobile terminal system;
FIG. 2 is a diagram illustrating example routes that autonomous vehicles can travel between terminal sites established by the mobile terminal system of FIG. 1;
FIG. 3 is a diagram illustrating the example terminal sites of FIG. 2 in more detail;
FIG. 4 is a flow chart of an example method of operating the mobile terminal system of FIG. 1;
FIG. 5 is a diagram of an example autonomous vehicle configured to implement autonomous driving operations;
FIG. 6 is a diagram of an example system for providing autonomous driving operations used by the autonomous vehicle of FIG. 5;
FIG. 7 is a diagram of an onboard control computer included in an autonomous vehicle;
FIG. 8 is a diagram illustrating operation of an example launch pad;
FIG. 9 is a flow chart of an example method of operating a launch pad;
FIG. 10 is a diagram illustrating operation of an example landing pad;
FIG. 11 is a flow chart of an example method of operating a landing pad;
FIG. 12A is a diagram illustrating an example mobile relaunch operation;
FIG. 12B is a diagram illustrating the example mobile relaunch equipment of FIG. 12A; and
FIG. 13 is a flow chart illustrating an example mobile relaunch method.
Detailed Description
Generally, a terminal is an area in which the autonomous driving system of each autonomous vehicle can be safely engaged and disengaged. The terminal also provides a space in which activities can be performed, such as inspecting mechanical components of an autonomous vehicle, cleaning the autonomous vehicle's sensors, calibrating the sensors, repairing the autonomous vehicle, adding fluids, refueling, performing hauling operations (e.g., loading, inspecting, weighing, and sealing a trailer), offloading data storage from the autonomous vehicle (e.g., by removing physical memory and/or transmitting via wireless communication), and attaching/detaching a trailer to/from the autonomous vehicle.
The present disclosure recognizes a previously unrecognized and unmet need for tools to quickly, efficiently, and reliably establish a new terminal (also referred to herein as a terminal site) to support movement of an autonomous vehicle fleet. Such a rapidly deployed terminal site (which may use the mobile terminal system of the present disclosure) can meet short-term needs for routes, such as when a proof-of-concept route is being tested for an autonomous vehicle fleet, or when a temporary route is needed to avoid an area (e.g., where a previous route is unavailable or no longer suitable). As an example, a mobile terminal site can help support an alternative route when a natural disaster or road construction makes the previous route unsustainable. As another example, a mobile terminal site can meet short-term increases in shipping demand, such as during particular times of year when shipments increase, or when shipments begin to grow at a given location. In some of these cases and others, a terminal site may need to be established quickly and used for hours or days, whereas establishing a conventional terminal may take weeks. The mobile terminal system of the present disclosure may be used in these situations to help support movement of an autonomous vehicle fleet, and it is capable of establishing a functional terminal site without any fixed structures.
The present disclosure provides practical applications of a mobile terminal system that solve the problems described above, among others. In addition to providing resources for rapidly deploying new terminal sites and/or augmenting existing sites, the mobile terminal system is configured to help direct autonomous vehicles moving toward, away from, and within the terminal sites. By enabling autonomous vehicles to travel as much as possible without operator intervention, the present disclosure allows them to travel more efficiently and reliably than was previously possible. The mobile terminal system also includes a control subsystem that not only helps direct autonomous vehicle landing and launching movements, but also improves the performance of unloading, loading, inspection, and maintenance tasks on the autonomous vehicles.
Mobile terminal system
FIG. 1 illustrates an example mobile terminal system 100. The mobile terminal system 100 includes a vehicle 132, a control subsystem 102, one or more sensors 104, and equipment 106 for setting up a new terminal site (e.g., sites 202, 206, 216 of FIGS. 2 and 3). The vehicle 132 may generally be any type of vehicle capable of transporting the control subsystem 102, sensors 104, and equipment 106. For example, the vehicle 132 may be a van, as shown in the example of FIG. 1, or any other suitably sized vehicle. In some embodiments, the vehicle 132 is a larger vehicle, such as a camper truck or bus, and may include, for example, bathroom facilities.
Control subsystem 102 is a device that coordinates operation of the mobile terminal system 100 and provides information to an autonomous vehicle fleet to improve the fleet's performance (see autonomous vehicle 502 of FIG. 5, described below). For example, landing instructions 116 and launch instructions 118 may be determined by the control subsystem 102 and provided to inbound and outbound autonomous vehicles to better direct their movement when landing at and exiting the terminal site, as described in more detail below with respect to FIGS. 3 and 4. The fleet management data 114 may be used to determine when and where terminal sites are needed, and route data 120 is collected by the mobile terminal system 100 and used by the autonomous vehicle fleet to navigate to the terminal sites. The route data 120 may also capture environmental changes around a terminal site and/or establish one or more new lanes in or around a terminal site. The control subsystem 102 includes a processor 108, a memory 110, and a communication interface 112. The processor 108 includes one or more processors, each of which may be any electronic circuit, including, but not limited to, a state machine, one or more central processing unit (CPU) chips, a logic unit, a core (e.g., of a multi-core processor), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or a digital signal processor (DSP). The processor 108 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination of the preceding. The processor 108 is communicatively coupled to, and in signal communication with, the memory 110, the communication interface 112, and the sensor(s) 104 (described further below). The one or more processors are configured to process data and may be implemented in hardware and/or software. For example, the processor 108 may have an 8-bit, 16-bit, 32-bit, 64-bit, or any other suitable architecture. The processor 108 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from the memory 110 and executes them by directing the coordinated operations of the ALU, registers, and other components.
The memory 110 is operable to store fleet management data 114, landing instructions 116, launch instructions 118, route data 120 (e.g., data describing new routes or updates to existing routes), and/or any other data, instructions, logic, rules, or code operable to perform the functions of the mobile terminal system 100. The fleet management data 114 may include the current locations and planned destinations of autonomous vehicles of the fleet, as well as the planned routes the autonomous vehicles will travel to reach those destinations. The fleet management data 114 can be used to determine when and where new terminal sites should be deployed, as described below with respect to FIG. 2. The landing instructions 116 indicate the movements an inbound autonomous vehicle of the fleet may perform to reach a landing pad within the terminal site (see FIGS. 3 and 4). For example, the landing instructions 116, when executed by the autonomous vehicle, may direct at least a portion of the autonomous vehicle's operations so that it reaches a landing pad in the terminal site. The launch instructions 118 indicate the movements an outbound autonomous vehicle on a launch pad in the terminal site may perform to exit the launch pad. For example, the launch instructions 118, when executed by an autonomous vehicle, may direct at least a portion of the outbound autonomous vehicle's operations so that it leaves the launch pad and reaches a hauling route (e.g., a road). Route data 120 may indicate a route that an autonomous vehicle of the fleet travels to reach a terminal site (e.g., routes 204, 214 of FIG. 2), and may include data collected by the sensors 104, such as road condition information, travel obstacles, traffic, and the like. Memory 110 includes one or more disks, tape drives, or solid-state drives, and may serve as an overflow data storage device to store programs when such programs are selected for execution, and to store instructions and data that are read during program execution. The memory 110 may be volatile or non-volatile and may include read-only memory (ROM), random-access memory (RAM), ternary content-addressable memory (TCAM), dynamic random-access memory (DRAM), and static random-access memory (SRAM).
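As one illustrative assumption of how these stored records might be shaped, the following sketch models the landing instructions 116, launch instructions 118, and route data 120 as simple data classes; the field names are invented for the example and are not specified by the disclosure.

```python
# Illustrative data shapes for the records held in memory 110; all field
# names are assumptions for the sketch.
from dataclasses import dataclass, field

@dataclass
class LandingInstruction:      # cf. landing instructions 116
    vehicle_id: str
    pad_id: str                # pad the inbound AV should park in
    route_waypoints: list      # path to follow inside the terminal site

@dataclass
class LaunchInstruction:       # cf. launch instructions 118
    vehicle_id: str
    cleared: bool              # whether the AV may exit the launch pad
    route_waypoints: list      # path out of the terminal site to the road

@dataclass
class RouteRecord:             # cf. route data 120
    coordinates: list          # GPS trace collected by the mobile terminal
    hazards: list = field(default_factory=list)  # closures, obstacles, traffic
```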
The communication interface 112 is configured to communicate data between the control subsystem 102 and other devices, systems, or domain(s), such as the autonomous vehicles 502 of the fleet and a fleet management system (see fleet management system 208 of FIG. 2). The communication interface 112 is an electronic circuit configured to enable communication between devices. For example, the communication interface 112 may include one or more serial ports (e.g., USB ports) and/or parallel ports (e.g., any type of multi-pin port) for facilitating communication with local devices such as the sensors 104. As further examples, the communication interface 112 may be a network interface including a cellular communication transceiver, a WiFi interface, a local area network (LAN) interface, a wide area network (WAN) interface, a modem, a switch, and/or a router. The processor 108 is configured to send and receive data using the communication interface 112, which may be configured to use any suitable type of communication protocol. The communication interface 112 communicates the fleet management data 114, landing instructions 116, launch instructions 118, and route data 120.
The sensors 104 may include any number of sensors configured to sense information about the location, environment, or other conditions around the vehicle 132 of the mobile terminal system 100. The sensors 104 may include one or more of the sensors 546 shown in FIG. 5 and described further below. The sensors 104 may include one or more cameras or other image capture devices, RADAR units, temperature sensors, inertial measurement units (IMUs), laser rangefinder/LIDAR units, and/or Global Positioning System (GPS) transceivers. An IMU of the sensors 104 may include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense changes in the position and orientation of the vehicle 132 (e.g., along the routes 204, 214 of FIG. 2). A GPS transceiver of the sensors 104 may be any sensor configured to estimate the geographic location of the vehicle 132 (e.g., while traveling along the routes 204, 214 of FIG. 2), and may include a receiver/transmitter operable to provide information about the position of the vehicle 132 relative to the Earth. A RADAR unit of the sensors 104 may be configured to use radio signals to sense objects within the local environment of the vehicle 132 (e.g., along the routes 204, 214 of FIG. 2). A laser rangefinder or LIDAR unit of the sensors 104 may be any sensor(s) configured to use lasers to sense objects in the environment of the vehicle 132. A camera of the sensors 104 may include one or more devices configured to capture images (e.g., still images or video) of the environment of the vehicle 132.
Information collected and/or generated by the sensors 104 may be included in the route data 120. For example, the route data 120 may provide the coordinates (e.g., from a GPS transceiver of the sensors 104) to be traveled to reach the established terminal site 206. The route data 120 may include information about route conditions detected by the sensors 104, such as closed lanes, obstacles on or near the route, traffic, and the like. Such detailed route information can further improve the operation of autonomous vehicles in a fleet, because autonomous vehicles operate more efficiently and reliably when more is known about the planned route than its geographic coordinates alone.
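A minimal sketch of how route data 120 could accumulate as the vehicle 132 drives a route, pairing each GPS fix with anything the perception sensors flag; the tick format and hazard labels are assumptions for the example.

```python
# Sketch: each tick pairs a GPS fix with detections from the perception
# sensors; hazards are recorded with the location where they were seen.
def collect_route_data(ticks):
    """ticks: iterable of (gps_fix, detections) tuples from the sensor suite."""
    route = {"coordinates": [], "hazards": []}
    for gps_fix, detections in ticks:
        route["coordinates"].append(gps_fix)   # from the GPS transceiver
        for d in detections:                   # from camera/LIDAR/RADAR
            if d in ("closed lane", "obstacle", "heavy traffic"):
                route["hazards"].append({"at": gps_fix, "type": d})
    return route

sample = [((33.10, -96.80), []), ((33.20, -96.90), ["closed lane"])]
print(collect_route_data(sample))
```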
Equipment 106 includes any materials, supplies, and resources that can be used to deploy a new (e.g., short-term or temporary) terminal site (see FIG. 3 and its corresponding description below for a more detailed description of an example terminal site). The equipment 106 of the mobile terminal system 100 may be deployed to any suitable location (e.g., sufficiently flat, large enough, and located in an area along the mapped routes 204, 214 of FIG. 2) to rapidly add capacity or capability to an autonomous vehicle fleet. The equipment 106 may include a secure data store 122, an autonomous vehicle maintenance/repair kit 124, one or more portable devices 126, and a site setup kit 128. The equipment 106 may be packaged for efficient transportation and deployment (e.g., the equipment 106 may be collapsible, modular, and/or of less permanent construction), as suits its on-demand or temporary use. The equipment 106 may comprise less than the full set of equipment needed to establish a complete conventional terminal site. For example, the secure data store 122 may have reduced capacity compared to that of a complete conventional terminal site, and the autonomous vehicle maintenance/repair kit 124 may include fewer tools and replacement parts than a permanent conventional terminal site would. At least some of the equipment 106 (such as cameras, lights, and traffic barriers) may improve safety and the sense of security within the terminal site. Other equipment 106, such as the portable device(s) 126, improves the efficiency of autonomous vehicle operations at the terminal site by allowing alerts (see alert 340 of FIG. 3) to be sent to the appropriate technician or other person responsible for supporting autonomous vehicle landing and launching activities, as described further below.
The equipment 106 may facilitate both setting up a physical space to operate as a new (e.g., short-term) terminal site (see FIG. 3) and performing fleet support actions at the terminal site, such as inspecting mechanical components of an autonomous vehicle, cleaning and/or calibrating autonomous vehicle sensors (see sensors 546 of FIG. 5), repairing the autonomous vehicle, adding fluids, refueling, performing hauling operations (e.g., loading, inspecting, weighing, and sealing a trailer), offloading data storage from the autonomous vehicle (e.g., by removing physical memory and/or transmitting via wireless communication), and attaching/detaching a trailer to/from the autonomous vehicle.
Secure data store 122 may be any secure data store (e.g., the same as or similar to the memory 110 described above) used to offload data from autonomous vehicles at the terminal site. For example, when an autonomous vehicle lands at the terminal site, data about the last trip performed by the autonomous vehicle may be offloaded to the secure data store 122.
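For illustration, offloading trip data to the secure data store 122 on landing might look like the following sketch; the integrity digest is an assumption added for the example, not a requirement stated in the disclosure.

```python
# Sketch: serialize the trip log, compute a digest for an (assumed)
# integrity check, and store the payload keyed by vehicle and digest.
import hashlib
import json

def offload_trip_data(store: dict, vehicle_id: str, trip_log: dict) -> str:
    payload = json.dumps(trip_log, sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    store[(vehicle_id, digest)] = payload
    return digest

store = {}
print(offload_trip_data(store, "AV-17", {"route": "204", "miles": 212}))
```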
The autonomous vehicle maintenance/repair kit 124 may include any tools and/or components for performing autonomous vehicle maintenance, including equipment to calibrate sensors (e.g., of the sensor subsystem 544 of the autonomous vehicle 502 shown in FIG. 5). In some cases, the mobile terminal system 100 may be configured to perform roaming maintenance, as described in more detail below with respect to FIGS. 2 and 4. For example, after establishing a terminal site, the mobile terminal system 100 may receive a request for maintenance along an autonomous vehicle route (e.g., routes 204, 214 of FIG. 2). The vehicle 132 of the mobile terminal system 100 may then travel to the autonomous vehicle in need of repair, and an operator or technician may use the autonomous vehicle maintenance/repair kit 124 to efficiently repair the autonomous vehicle and help it get moving again. The autonomous vehicle maintenance/repair kit 124 may also include tools for inspecting the autonomous vehicle. In some cases, results from an inspection may be provided to the control subsystem 102, which in turn may provide the inspection results to a centralized fleet management system (e.g., the fleet management system 208 of FIG. 2).
The portable device(s) 126 are typically smartphones, tablets, or other handheld and/or lightweight devices that can operate within a deployed terminal site. A portable device 126 may receive alerts or other notifications from the control subsystem 102 about actions to be taken to improve the reliability and efficiency of operations at the terminal site. For example, the portable device 126 may receive an alert indicating an inbound autonomous vehicle, so that the device's user can begin preparing for inspection and unloading of the autonomous vehicle. As another example, the portable device 126 may receive an alert indicating that the launch pad is not clear for an autonomous vehicle requesting to exit the terminal site; the user of the portable device 126 can then take action to clear the area around the launch pad (see FIG. 3). As a further example, the portable device 126 may receive an alert of an inbound autonomous vehicle, provide access to an autonomous vehicle route schedule, or provide information to support inspection and/or verification of the readiness of an autonomous vehicle prior to departure. The portable device 126 may give the user visibility into the health and/or location of an autonomous vehicle or its associated trailer. In some cases, the portable device 126 may host application services that support operation of the autonomous vehicle fleet from the terminal during preparation for departure and/or arrival, and may support workflows that improve the efficiency of terminal operations.
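A sketch of how the control subsystem 102 might route such alerts (see alert 340 of FIG. 3) to the appropriate portable device 126; the event names, recipients, and dispatch mechanism are assumptions for the example.

```python
# Sketch: map alert events to the role responsible for handling them,
# then hand the message to a caller-supplied notify function.
ALERT_ROUTING = {
    "inbound_av": "inspection_crew",     # prepare inspection/unloading
    "launch_pad_blocked": "yard_crew",   # clear the area around the pad
}

def send_alert(event: str, vehicle_id: str, notify) -> None:
    recipient = ALERT_ROUTING.get(event, "terminal_operator")
    notify(recipient, f"{event}: {vehicle_id}")

send_alert("inbound_av", "AV-17", lambda who, msg: print(who, "<-", msg))
```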
The site setup kit 128 includes materials for establishing landing pads and launch pads (see landing pad/launch pad 310 of FIG. 3). The site setup kit 128 may include location delineators or markers 130, which may include traffic cones, traffic barriers, paint, and anything else that provides visual and/or physical separation of regions within the space. The site setup kit 128 may also include sensors 312, 316, 320, which can be deployed within a terminal site to enhance autonomous vehicle performance there (see FIG. 3). The items in the site setup kit 128 may be collapsible, expandable, and/or modular as needed to fit the available space of the vehicle 132. Other items in the site setup kit 128 may include tools that facilitate site management, such as lights, security cameras, tents (or other shade structures), chairs, space heaters or coolers, fans, portable restrooms, and the like. Other equipment 106 may be used for vehicle and trailer inspection; for example, the equipment 106 may include components that provide a weigh station at the terminal site. As described above, results from inspections performed using the equipment 106 may be provided to the control subsystem 102, which in turn may provide the inspection results to a centralized fleet management system (e.g., the fleet management system 208 of FIG. 2). A third party may be called on to provide supplemental fuel at a mobile terminal site established using the mobile terminal system 100.
Terminal routing and fleet management
FIG. 2 illustrates an autonomous vehicle fleet system 200 operating in a geographic area in which a plurality of terminal sites 202, 206, 216 are deployed using the mobile terminal system 100 of FIG. 1. The area of the autonomous vehicle fleet system 200 includes a first terminal site 202, a second terminal site 206, and a third terminal site 216. The terminal sites 202, 206, 216 may be operational sites that support loading and/or unloading items from an autonomous vehicle 502, weighing the autonomous vehicle 502, inspecting it, repairing it, and the like. The terminal sites 202, 206, 216 may include resources to prepare an autonomous vehicle 502 for travel to other locations (e.g., another of the terminal sites 202, 206, 216 shown in FIG. 2 or some other destination). The terminal sites 202, 206, 216 are not limited to particular physical structures or pre-constructed locations or buildings. Further details of the example terminal sites 202, 206, 216 are described below with respect to FIG. 3.
The autonomous vehicles 502 may travel autonomously between the terminal sites 202, 206, 216 using routes 204, 214, which may have been determined by the mobile terminal system 100. For example, the sensors 104 may have generated route data 120 while the mobile terminal system 100 traveled along the routes 204, 214 to establish the various terminal sites 202, 206, 216. As described above, the route data 120 may include the locations of the newly established terminal sites 202, 206, 216, the geographic coordinates of the routes 204, 214, and information about obstacles, traffic, and lane or road closures along the routes 204, 214. Route data 120 may be provided by the mobile terminal system 100 to the autonomous vehicles 502 of the fleet traveling in a given geographic area and/or to a fleet management system 208 that helps track and manage the movement of autonomous vehicles 502 in that area. The fleet management system 208 is described in more detail below.
In the example of FIG. 2, the mobile terminal system 100 has established the first terminal site 202 and is no longer located there. More specifically, the mobile terminal system 100 has traveled to the location of the second terminal site 206 and established or deployed the second terminal site 206 using the equipment 106. The location of the new terminal site 206 may be any suitable location (e.g., suitably flat, unobstructed, and near a road or cargo receiving location) where increased support for the autonomous vehicles 502 is needed.
In some cases, a request may have been sent for the mobile terminal system 100 to establish the terminal site 206 at the corresponding location. In some cases, the location of the terminal site 206 may have been determined at least in part from the fleet management data 114, which may indicate that autonomous vehicles 502 require additional support at that location. For example, if the fleet management data 114 indicates that autonomous vehicle 502 traffic is increasing at a location lacking sufficient terminal capacity, a new mobile terminal site 206 may be deployed at that location using the mobile terminal system 100. The fleet management data 114 may be provided by the autonomous vehicles 502 of the fleet and/or by the fleet management system 208. As a further example, the mobile terminal system 100 may be deployed to establish a new terminal site 206 when route 204 requires one or more temporary supporting terminal sites (whether because permanent terminals are lacking or to support a short-term increase in fleet size, e.g., an increase in transportation demand). The mobile terminal system 100 can establish these supporting terminal sites 202, 206, 216 efficiently and quickly to meet such short-term or dynamic needs. In general, the mobile terminal system 100 allows the full functionality of a conventional terminal to be deployed quickly in any available and appropriate location, whereas conventional terminals require long lead times and high costs to install more permanent infrastructure.
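The deployment decision described above can be illustrated with a minimal sketch, assuming the fleet management data 114 reduces to per-location traffic counts and terminal capacities; the threshold logic is an assumption for the example.

```python
# Sketch: flag any location where autonomous vehicle traffic exceeds
# the terminal capacity available there.
def needs_mobile_terminal(traffic_by_location: dict, capacity_by_location: dict):
    return [loc for loc, traffic in traffic_by_location.items()
            if traffic > capacity_by_location.get(loc, 0)]

print(needs_mobile_terminal({"Dallas": 12, "El Paso": 3},
                            {"Dallas": 8, "El Paso": 6}))  # ['Dallas']
```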
Route data 120 is collected for route 204 as the vehicle 132 of the mobile terminal system 100 travels along route 204 from the first terminal site 202 to the location of the second terminal site 206. The route data 120 may include the geographic coordinates of the routes 204, 214 (e.g., from a GPS transceiver of the sensors 104 of the mobile terminal system 100), information about obstacles along the routes 204, 214 (e.g., from a camera, LIDAR, or RADAR of the sensors 104), information about traffic along the routes 204, 214, information about lane or road closures along the routes 204, 214 (e.g., from a camera, LIDAR, or RADAR of the sensors 104), and the like. The route data 120 allows the autonomous vehicles 502 to travel to the new terminal site 206 more reliably and efficiently than is currently possible.
After the vehicle 132 reaches the location of the terminal site 206, equipment 106 from the mobile terminal system 100 is used to establish the terminal site 206. For example, the site setup kit 128 may be used to establish the new terminal site 206, as described above with respect to FIG. 1. The location markers 130 may be deployed to designate different areas within the space of the terminal site 206, including landing pad(s) and/or launch pad(s) (see landing pad/launch pad 310 of FIG. 3). Sensors 312, 316, 320 may be deployed within the terminal site 206 to help effectively and reliably direct movement of autonomous vehicles 502 within the site (see also FIG. 3). Other equipment 106 (such as lights, security cameras, tents or other shade structures, chairs, space heaters or coolers, fans, portable restroom facilities, etc.) may also be deployed at the new terminal site 206.
As an autonomous vehicle 502 approaches, the control subsystem 102 provides landing instructions 116 to the autonomous vehicle 502. As described above, and further with respect to FIGS. 3 and 4 below, the landing instructions 116 indicate the movements the inbound autonomous vehicle 502 can perform to reach a landing pad at the terminal site 206. When the autonomous vehicle 502 enters the terminal site 206, a portable device 126 may receive an alert or other notification from the control subsystem 102 about actions to be taken to improve the safety and efficiency of operations at the terminal site. For example, the portable device 126 may receive an alert indicating an inbound autonomous vehicle, in preparation for inspecting and unloading the autonomous vehicle 502.
The secure data store 122 may be used to offload data from the inbound autonomous vehicle 502 once it is in place on a landing pad at the terminal site 206. The autonomous vehicle maintenance/repair kit 124 may be used to inspect the autonomous vehicle 502, make any needed repairs, and/or calibrate its sensors (e.g., of the sensor subsystem 544 shown in FIG. 5). When the autonomous vehicle 502 is ready to leave the terminal site 206, launch instructions 118 are provided that instruct the autonomous vehicle 502 to perform movements exiting the launch pad and leaving the terminal site 206. If the path is not clear, the portable device 126 may receive an alert indicating that the launch pad is not clear for the movement and may indicate appropriate actions to efficiently clear the area around the launch pad (see alert 340 and area 326 of FIG. 3).
At some point, the mobile terminal system 100 may receive a request to establish another new terminal site 216 (e.g., based on fleet management data 114 indicating a need for increased support for the fleet of autonomous vehicles 502). The vehicle 132 of the mobile terminal system 100 may travel along route 214 to the location of the new terminal site 216. In some cases, equipment and/or computing resources for operating the control subsystem 102 may remain at the terminal site 206 so that it can continue to operate. As the vehicle 132 travels to the new terminal site 216, route data 120 is collected, as described above with respect to route 204. An example terminal site 216 is shown in FIG. 3 and described in more detail below.
At some point, the mobile terminal system 100 may receive a request to generate updated route data 120 for one or more of the routes 204, 214. For example, after a predetermined time interval (e.g., a particular number of days or weeks), or if a problem with autonomous vehicle 502 travel has been detected, the vehicle 132 of the mobile terminal system 100 may travel along the routes 204, 214 to generate new route data 120 for them. The new route data 120 may capture any changes to the routes 204, 214, for example, changes in traffic, obstacles, closed lanes, and changes in the GPS coordinates of the routes 204, 214 (e.g., due to detours or road construction).
At some point, the mobile terminal system 100 may receive a request for roaming, or off-terminal, maintenance for an autonomous vehicle 502 located near or along a route 204, 214. For example, after establishing the terminal sites 202, 206, 216, the mobile terminal system 100 may receive a request for maintenance along an autonomous vehicle route 204, 214. The vehicle 132 of the mobile terminal system 100 may then travel (e.g., be driven) to the autonomous vehicle 502 in need of repair, and the repair may be performed. The mobile terminal system 100 may then help relaunch the repaired autonomous vehicle 502. Further details of performing off-terminal maintenance and relaunching an autonomous vehicle 502 are described below with respect to FIGS. 12A, 12B, and 13.
The fleet management system 208 shown in FIG. 2 tracks and manages both the deployment of the mobile terminal sites 202, 206, 216 and the movement of the autonomous vehicles 502. The fleet management system 208 generally manages and tracks movement of the fleet of autonomous vehicles 502 and of the mobile terminal system 100 within the area of the autonomous vehicle fleet system 200. For example, the fleet management system 208 can detect when autonomous vehicle traffic in the area of the autonomous vehicle fleet system 200 exceeds a threshold level and initiate establishment of a rapidly deployed terminal site 202, 206, 216. The fleet management system 208 helps provide visibility into the status, arrival times, and departure times of the autonomous vehicles 502 of the fleet. This and similar information may be used to improve the scheduling of fleet movements and the deployment of the terminal sites 202, 206, 216. Information from the fleet management system 208 may be made visible (e.g., to individuals at the terminal sites 202, 206, 216 and/or other locations) to support terminal operations.
The example fleet management system 208 includes a processor 210, a memory 212, and a communication interface 218. The processor 210 includes one or more processors, each of which may be any electronic circuit, including, but not limited to, a state machine, one or more central processing unit (CPU) chips, a logic unit, a core (e.g., of a multi-core processor), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or a digital signal processor (DSP). The processor 210 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination of the preceding. The processor 210 is communicatively coupled to, and in signal communication with, the memory 212 and the communication interface 218. The one or more processors are configured to process data and may be implemented in hardware and/or software. For example, the processor 210 may have an 8-bit, 16-bit, 32-bit, 64-bit, or any other suitable architecture. The processor 210 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from the memory 212 and executes them by directing the coordinated operations of the ALU, registers, and other components.
The memory 212 is operable to store fleet management data 114, route data 120, and/or any other data, instructions, logic, rules, or code operable to perform the functions of the fleet management system 208. Memory 212 includes one or more magnetic disks, tape drives, or solid state drives, and may serve as an overflow data storage device to store programs as they are selected for execution, as well as store instructions and data that are read during program execution. Memory 212 may be volatile or nonvolatile and may include Read Only Memory (ROM), random Access Memory (RAM), ternary Content Addressable Memory (TCAM), dynamic Random Access Memory (DRAM), and Static Random Access Memory (SRAM).
Communication interface 218 is configured to communicate data between fleet management system 208 and other devices, systems, or domain(s), such as autonomous vehicle 502 and mobile terminal system 100. Communication interface 218 is an electronic circuit configured to enable communication between devices. For example, communication interface 218 may be a network interface including a cellular communication transceiver, a WiFi interface, a Local Area Network (LAN) interface, a Wide Area Network (WAN) interface, a modem, a switch, and/or a router. The processor 210 is configured to send and receive data using the communication interface 218. Communication interface 218 may be configured to use any suitable type of communication protocol. Communication interface 218 communicates fleet management data 114 and route data 120.
In some cases, the landing instructions 116 and/or the launch instructions 118 may be communicated through the fleet management system 208. For example, the control subsystem 102 of the mobile terminal system 100 may provide the landing instructions 116 and/or launch instructions 118 to the fleet management system 208, which in turn sends them to the appropriate autonomous vehicle 502. This approach allows the landing instructions 116 and/or launch instructions 118 to reach an autonomous vehicle 502 that may be outside direct communication range of the mobile terminal system 100.
In some cases, the mobile terminal system 100 may improve communication between the autonomous vehicles 502 of the fleet and the fleet management system 208 responsible for managing at least a portion of their operations. For example, if autonomous vehicles 502 are temporarily unable to communicate with the fleet management system 208, fleet management data 114 from one or more autonomous vehicles 502 located near the mobile terminal system 100 may be provided to the mobile terminal system 100, which then communicates the fleet management data 114 to the fleet management system 208. In this manner, when direct communication between an autonomous vehicle and the fleet management system is slow or unavailable (e.g., if the autonomous vehicle 502 is within communication range of the mobile terminal system 100 but outside communication range of the fleet management system 208), the mobile terminal system 100 can provide a supplemental communication path between the autonomous vehicle 502 and the fleet management system 208.
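A minimal sketch of this supplemental communication path, assuming message delivery reduces to choosing between a direct link and a relay through the mobile terminal system 100; the send functions are hypothetical stand-ins.

```python
# Sketch: prefer the direct AV <-> fleet management link; fall back to
# relaying through the mobile terminal system when it is unavailable.
def deliver(message: dict, direct_ok: bool, send_direct, send_via_terminal):
    if direct_ok:
        send_direct(message)        # normal path: AV <-> fleet management 208
    else:
        send_via_terminal(message)  # relay through mobile terminal system 100

deliver({"type": "fleet_data"}, direct_ok=False,
        send_direct=lambda m: print("direct", m),
        send_via_terminal=lambda m: print("relayed", m))
```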
Example terminal site
FIG. 3 illustrates the example mobile terminal sites 202, 206, 216 of FIG. 2 in more detail. The example terminal sites 202, 206, 216 of FIG. 3 include a secure area 302 in which the mobile terminal system 100 is located. For example, the location markers 130 of the mobile terminal system 100 may be used to establish the secure area 302, and one or more security cameras may be deployed in or around it. The secure area 302 also includes an operations tent 304, which can act as a staging area for operators working at the terminal sites 202, 206, 216. Equipment 106 (such as computers, chairs, heating/cooling devices, etc.) may be deployed within the operations tent 304. In some cases, all or part of the control subsystem 102 is modular and can be removed from the vehicle 132 of the mobile terminal system 100 and operated from the operations tent 304 or some other location within the terminal sites 202, 206, 216.
In the example of FIG. 3, the terminal sites 202, 206, 216 include a plurality of lights 306 arranged around the site boundaries. These lights 306 may be part of the equipment 106 of the mobile terminal system 100. When deployed as shown in FIG. 3, the lights 306 enhance safety and the sense of security at the terminal sites 202, 206, 216.
The example terminal sites 202, 206, 216 of FIG. 3 are divided into an autonomous vehicle zone 308, in which autonomous vehicles 502 operate primarily autonomously, and a manual zone 328, in which conventional tractor-trailers are handled. The autonomous vehicle zone 308 includes at least one landing pad/launch pad 310. The same space may serve as both landing pad and launch pad 310, or separate spaces may be assigned to each. The location markers 130 may be used to designate the landing pad(s)/launch pad(s) 310 (see FIG. 1 and the corresponding description above). A landing pad/launch pad 310 is sized and shaped to accommodate an autonomous vehicle 502. In this example, an autonomous vehicle 502b that has landed, or is ready to launch, is shown within the landing pad/launch pad 310. Further details of the landing pad/launch pad 310 are described below with respect to FIGS. 8-11.
In the example of FIG. 3, the landing pad/launch pad 310 includes a sensor 312 located on, in, or near the landing pad/launch pad 310. As described further below with respect to example operation of the landing pad/launch pad 310, the sensor 312 may detect movement and/or obstructions within or near the landing pad/launch pad 310. For example, the sensor 312 may provide information indicating whether the landing pad/launch pad 310 is currently occupied (e.g., by a vehicle, person, animal, or other object) and/or whether the area 326 surrounding the landing pad/launch pad 310 is free of obstacles (e.g., vehicles, people, animals, or other objects).
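For illustration, the occupancy/clearance test driven by sensor(s) 312 might be sketched as follows, assuming the sensor reports reduce to lists of detected objects for the pad and for the surrounding area 326; the function and labels are assumptions for the example.

```python
# Sketch: the pad must be empty, and area 326 must be free of vehicles,
# people, animals, or other objects, before it is reported clear.
def pad_status(pad_detections: list, area_detections: list) -> str:
    if pad_detections:
        return "occupied"
    if area_detections:
        return "pad free, area 326 obstructed"
    return "clear"

print(pad_status([], ["person"]))  # pad free, area 326 obstructed
```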
One or more inbound routes 314a-314c that an inbound autonomous vehicle 502a can travel to reach the landing pad/launch pad 310 may be designated (e.g., using the location markers 130 of FIG. 1). Sensors 316 may be located on, in, or near the routes 314a-314c to detect traffic along them. This traffic information may be used by the control subsystem 102 to determine the most efficient route 314a-314c for the inbound autonomous vehicle 502a to travel to reach the landing pad/launch pad 310. The landing instruction 116 may indicate the route 314a-314c. For example, if an obstacle (e.g., a parked vehicle) is present on the first route 314a, the landing instructions 116 may cause the inbound autonomous vehicle 502a to travel along the second route 314b or the third route 314c to reach the landing pad/launch pad 310. When the inbound autonomous vehicle 502a executes the landing instructions 116, it follows the modified route 314b, 314c. The landing instructions 116 may be updated periodically to capture changes in traffic (e.g., based on information from the sensors 316) and/or occupancy of the landing pad/launch pad 310 (e.g., based on information from the sensor(s) 312), so that the inbound autonomous vehicle 502a reliably reaches an available landing pad/launch pad 310 via a route 314a-314c with little or no traffic, obstruction, or delay.
Similar to the inbound routes 314a-314c described above, one or more outbound routes 318a-318c that the outbound autonomous vehicle 502b can travel after exiting the landing stage/launch pad 310 may be designated (e.g., using the location markers 130 of fig. 1). Sensors 320 may be located on, in, or near the routes 318a-318c to detect traffic along them. This traffic information may be used by the control subsystem 102 to determine the more efficient route 318a-318c for the outbound autonomous vehicle 502b to travel away from the landing stage/launch pad 310 and reach the roads corresponding to the routes 204, 214 of fig. 2. The launch instructions 118 may indicate the routes 318a-318c. For example, if there is an obstacle (e.g., a parked vehicle) in the first route 318a, the launch instructions 118 may cause the outbound autonomous vehicle 502b to travel along the second route 318b or the third route 318c to reach the roads of the routes 204, 214. When the autonomous vehicle 502b executes the launch instructions 118, the autonomous vehicle 502b follows the modified route 318b, 318c. The launch instructions 118 may be updated periodically to capture changes in traffic (e.g., based on information from the sensors 320) and/or occupancy in the area 326 around the landing stage/launch pad 310 (e.g., based on information from the sensor(s) 312), such that the outbound autonomous vehicle 502b reliably exits the landing stage/launch pad 310 and travels along the routes 318a-318c with little or no traffic or obstruction.
The manual zone 328 of the terminal sites 202, 206, 216 facilitates the arrival 332 and departure 334 of vehicles that are not traveling autonomously. A gate tent 330 can be provided in the manual zone 328 using the equipment 106 (see fig. 1). The gate tent 330 provides space for individuals responsible for monitoring and approving the arrival 332 and departure 334 of conventional, non-autonomous vehicles. For the inbound autonomous vehicle 502a and the outbound autonomous vehicle 502b, such management and record keeping is performed by the control subsystem 102, further improving the overall efficiency of autonomous vehicle operation. The manual zone 328 of the terminal is where the autonomous vehicle can operate in manual mode during detachment/attachment of the trailer, whether upon landing or prior to launch. The autonomous vehicle 502 may also operate in manual mode (e.g., driven by a driver) in the manual zone 328, e.g., so that minor repairs to physical/mechanical parts of the autonomous vehicle 502 can be performed.
The terminal sites 202, 206, 216 may include a separate staging site 336 and discharge site 338. The staging site 336 and the discharge site 338 may be designated using the location markers 130 from the mobile terminal system 100 (see fig. 1). The staging site 336 may be an area in which vehicle inspection and maintenance are performed prior to departure of the autonomous vehicle 502b. For example, the outbound autonomous vehicle 502b and/or its trailer may be inspected at the staging site 336 and returned to the launch pad 310 prior to the autonomous departure of the outbound autonomous vehicle 502b. Information about the inspection may be entered into an electronic inspection report 342 (e.g., using the portable device 126 from the mobile terminal system 100) and provided to the control subsystem 102. The control subsystem 102, in turn, may provide the inspection report 342 to the fleet management system 208 (see fig. 2) and/or other suitable parties that need this information. This ease of processing the inspection report 342 may improve the accuracy and availability of inspection information and improve the throughput of inspected vehicles in the terminal sites 202, 206, 216. The discharge site 338 may be a location where items carried by a vehicle (e.g., the inbound autonomous vehicle 502a) are removed from the vehicle. For example, the discharge site 338 may be near a location where the transported items are needed or are to be stored. A third party may be contacted to refuel the autonomous vehicle 502 in the terminal sites 202, 206, 216 (e.g., in the landing stage/launch pad 310, the staging site 336, or the discharge site 338).
In an example operation of the landing stage 310, a landing request 322 for the inbound autonomous vehicle 502a is received. The landing request 322 indicates that the autonomous vehicle 502a is inbound to the terminal sites 202, 206, 216. The landing request 322 may be a request for permission for the inbound autonomous vehicle 502a to dock at the landing stage 310 of the terminal sites 202, 206, 216. The landing request 322 may indicate an expected arrival time of the autonomous vehicle 502a and/or provide information regarding the items transported by the autonomous vehicle 502a, the operator of the autonomous vehicle 502a, and the like. The landing request 322 may be sent when the autonomous vehicle 502a is within a threshold distance from the terminal sites 202, 206, 216 and/or when the inbound autonomous vehicle 502a is traveling along a known route 204, 214 to the terminal sites 202, 206, 216.
After receiving the landing request 322, the mobile terminal system 100 (e.g., the control subsystem 102) may determine a landing stage 310 capable of accommodating the inbound autonomous vehicle 502a. For example, the control subsystem 102 may determine that the landing stage 310 is unoccupied or otherwise free of obstructions or other vehicles. Information from the sensor 312 may be used to determine an available landing stage 310. The control subsystem 102 may determine a landing stage 310 that is positioned near other resources required by the inbound autonomous vehicle 502a. For example, if the landing request 322 indicates that certain maintenance is required or that the autonomous vehicle 502a is transporting items, a landing stage 310 near maintenance resources and/or an unloading facility may be selected for the inbound autonomous vehicle 502a. If no landing stage 310 is available, the control subsystem 102 may initiate an activity to clear a landing stage 310 for the inbound autonomous vehicle 502a by sending an alert 340 (e.g., to the portable device 126 of a technician working in the terminal sites 202, 206, 216). The alert 340 may instruct a technician or other individual to clear the landing stage 310. The control subsystem 102 may also determine the inbound route 314a-314c that the inbound autonomous vehicle 502a travels to reach the landing stage 310. For example, movement information or traffic information from the sensor(s) 316 may be used to select the route 314a-314c that is most efficient for the inbound autonomous vehicle 502a to reach the selected landing stage 310.
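As a minimal sketch of the stage-selection logic described above (all identifiers are hypothetical; the occupancy flag stands in for data from the sensor(s) 312):

```python
from dataclasses import dataclass

@dataclass
class LandingStage:
    stage_id: str
    occupied: bool                             # derived from the sensor(s) 312
    nearby_resources: frozenset = frozenset()  # e.g., {"maintenance", "unloading"}

def select_landing_stage(stages, required=frozenset()):
    """Prefer an unoccupied stage near every resource named in the landing
    request; fall back to any unoccupied stage; None means every stage is
    occupied, which would trigger an alert 340 to clear one."""
    unoccupied = [s for s in stages if not s.occupied]
    for stage in unoccupied:
        if required <= stage.nearby_resources:
            return stage
    return unoccupied[0] if unoccupied else None

stages = [LandingStage("310-1", occupied=True),
          LandingStage("310-2", occupied=False,
                       nearby_resources=frozenset({"maintenance"}))]
assert select_landing_stage(stages, frozenset({"maintenance"})).stage_id == "310-2"
```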
Landing instructions 116 are then provided to the inbound autonomous vehicle 502a. The landing instructions 116 may indicate the landing stage 310 in which the inbound autonomous vehicle 502a is to dock and the route 314a-314c that the autonomous vehicle 502a is to travel to reach the landing stage 310. In other words, the landing instructions 116 may indicate the movements that the inbound autonomous vehicle 502a may perform to reach the landing stage 310. For example, the landing instructions 116, when executed by a control computer (see figs. 5 and 7) of the inbound autonomous vehicle 502a, direct at least a portion of the inbound autonomous vehicle 502a to operate and/or move to reach the landing stage 310. The landing instructions 116 may include a time when the inbound autonomous vehicle 502a may enter the terminal sites 202, 206, 216; a route 314a-314c within the terminal sites 202, 206, 216 along which the autonomous vehicle 502a moves to reach the landing stage 310 upon entering the terminal sites 202, 206, 216; a location of the landing stage 310 within the terminal sites 202, 206, 216 (e.g., GPS coordinates or other geographic coordinates of the landing stage 310); and/or an identifier of the landing stage 310.
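The enumerated contents of the landing instructions 116 suggest a simple payload. The following sketch assumes a JSON wire format and hypothetical field names that are not specified in the disclosure:

```python
import json
import time

def build_landing_instructions(stage, route, entry_time=None):
    """Assemble a landing-instruction payload of the kind enumerated above:
    an entry time, an in-terminal route, and the stage location/identifier."""
    return {
        "type": "landing_instructions",            # instructions 116
        "entry_time": entry_time or time.time(),   # when the vehicle may enter
        "route_id": route["route_id"],             # e.g., "314b"
        "waypoints": route["waypoints"],           # in-terminal path to the stage
        "stage_id": stage["stage_id"],             # e.g., "310-2"
        "stage_coordinates": stage["lat_lon"],     # GPS coordinates of the stage
    }

message = build_landing_instructions(
    stage={"stage_id": "310-2", "lat_lon": (33.448, -112.074)},
    route={"route_id": "314b",
           "waypoints": [(33.446, -112.074), (33.448, -112.074)]},
)
print(json.dumps(message))  # serialized for transmission to the vehicle
```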
The landing instructions 116 may be updated as needed or from time to time to account for traffic changes along the routes 314a-314c and/or occupancy changes at the landing stage(s) 310. For example, the control subsystem 102 may receive sensor data from the movement or traffic sensor(s) 316 that indicates the amount of traffic within the terminal sites 202, 206, 216. The sensor data may be used to determine updated landing instructions 116 which, when executed by the control system of the autonomous vehicle 502a, cause the autonomous vehicle 502a to reach the landing stage 310 while avoiding traffic (e.g., by following an alternative route 314a-314c that has less traffic than the initially assigned route 314a-314c). As another example, the control subsystem 102 may receive sensor data from the sensor(s) 312 around the landing stage 310 indicating that the landing stage 310 is now occupied. Updated landing instructions 116 may then be determined and provided to the autonomous vehicle 502a which, when executed by the control system of the autonomous vehicle 502a, prevent the autonomous vehicle 502a from entering the landing stage 310 while the landing stage 310 is occupied. For example, the autonomous vehicle 502a may be held until the landing stage 310 is free, or sent to another landing stage 310 (if one is available).
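One way to realize this periodic re-evaluation is a polling loop of the following kind; the callables are hypothetical stand-ins for the sensor feeds 312/316 and the uplink to the vehicle:

```python
import time

def monitor_inbound(vehicle_id, arrived, stage_occupied, best_route, send, poll_s=10):
    """Periodically reissue landing instructions 116 while the vehicle is
    inbound: hold it if the stage becomes occupied (sensor(s) 312), reroute
    it when traffic shifts (sensors 316)."""
    last_route = None
    while not arrived():
        if stage_occupied():
            send(vehicle_id, {"action": "hold", "reason": "stage occupied"})
            last_route = None           # force a fresh route once clear
        else:
            route = best_route()
            if route != last_route:     # only transmit when the route changes
                send(vehicle_id, {"action": "proceed", "route_id": route})
                last_route = route
        time.sleep(poll_s)
```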
At about the time the landing instructions 116 are determined and/or sent, the control subsystem 102 may initiate activities to prepare for the arrival of the inbound autonomous vehicle 502a by providing an alert 340 to the technician's portable device 126, the alert 340 indicating actions to prepare for maintenance, inspection, unloading, etc. of the inbound autonomous vehicle 502a. Further, the control subsystem 102 may determine that the inbound autonomous vehicle 502a has arrived at the landing stage 310 (e.g., by receiving confirmation of a landing, by determining that the autonomous vehicle 502a is in the landing stage 310 based on the location of the autonomous vehicle 502a, using data from the sensor(s) 312, etc.) and provide an alert 340 (e.g., to the portable device 126) to initiate post-landing activities. For example, the alert 340 may instruct a technician to move the autonomous vehicle 502a from the landing stage 310 to the staging site 336 or the discharge site 338 to perform other tasks.
In an example operation of the launch pad 310, a launch request 324 is received for the outbound autonomous vehicle 502b. The launch request 324 may be sent when the outbound autonomous vehicle 502b has completed all pre-trip checks and is ready to begin moving back to the roads corresponding to the routes 204, 214. The outbound autonomous vehicle 502b may be the same vehicle as the inbound autonomous vehicle 502a or a different vehicle.
After receiving the launch request 324, the control subsystem 102 determines whether the outbound autonomous vehicle 502b can exit the launch pad 310. For example, the control subsystem 102 may determine whether the autonomous vehicle 502b and the area 326 around the launch pad 310 are clear. This determination may be facilitated by a sensor of the outbound autonomous vehicle 502b (e.g., a sensor among the sensors 546 of fig. 5) and/or a sensor 312 associated with the launch pad 310. The control subsystem 102 may determine that the area 326 surrounding the launch pad 310 is unoccupied by determining that nothing in the area 326 obstructs the outbound autonomous vehicle 502b from moving out of the launch pad 310. If the area 326 around the launch pad 310 is not clear, the control subsystem 102 may initiate an action to remove the obstruction or otherwise clear the area 326 (e.g., by sending an alert to the technician's portable device 126). The control subsystem 102 may also determine the outbound route 318a-318c that the outbound autonomous vehicle 502b travels to move toward the road (e.g., the roads of the routes 204, 214). For example, movement information or traffic information from the sensor(s) 320 may be used to select the route 318a-318c that is most efficient for the outbound autonomous vehicle 502b to reach the road with little or no delay.
The launch instructions 118 are then sent to the outbound autonomous vehicle 502b. The launch instructions 118 indicate whether the outbound autonomous vehicle 502b may exit the launch pad 310 and the route 318a-318c that the autonomous vehicle 502b is to travel to exit the terminal sites 202, 206, 216. In other words, the launch instructions 118 may indicate the movements that the outbound autonomous vehicle 502b may perform to exit the launch pad 310. For example, the launch instructions 118, when executed by the control system of the autonomous vehicle 502b, may direct at least a portion of the outbound autonomous vehicle 502b to operate or move to leave the launch pad 310 and reach a road (e.g., a road corresponding to the routes 204, 214). The launch instructions 118 may include the time at which the outbound autonomous vehicle 502b may depart from the launch pad 310 and/or the route 318a-318c within the terminal sites 202, 206, 216 that the outbound autonomous vehicle 502b is to travel to depart from the launch pad 310.
The launch instructions 118 may be updated as needed or from time to time to account for traffic changes along the routes 318a-318c and/or occupancy changes in the area 326 surrounding the launch pad 310. For example, the control subsystem 102 may receive sensor data from the sensor(s) 312 indicating that the area 326 around the launch pad 310 is now occupied and provide updated launch instructions 118 which, when executed by the control system of the outbound autonomous vehicle 502b, prevent the outbound autonomous vehicle 502b from departing the launch pad 310 while the area 326 is occupied. As another example, the control subsystem 102 may receive sensor data from the movement or traffic sensors 320 indicating the amount of traffic within the terminal sites 202, 206, 216 (e.g., along a given route 318a-318c) and determine updated launch instructions 118 which, when executed by the control system of the outbound autonomous vehicle 502b, cause the outbound autonomous vehicle 502b to move away from the launch pad 310 along a route 318a-318c that avoids traffic. For example, the updated launch instructions 118 may indicate an alternative route 318a-318c for the outbound autonomous vehicle 502b to travel away from the launch pad 310.
Example operation of a mobile terminal system
Fig. 4 illustrates an example process 400 for operating the mobile terminal system 100 of the present disclosure. The process 400 generally facilitates improved operation of the autonomous vehicles 502 by increasing the efficiency and reliability of autonomous vehicle movement within the terminal sites 202, 206, 216. The process 400 may begin at step 402, where the mobile terminal system 100 maps a route to the terminal sites 202, 206, 216. For example, a GPS unit among the sensors 104 of the mobile terminal system 100 may generate or collect route data 120 for the routes 204, 214 that the autonomous vehicles 502 in the fleet can travel to reach the terminal sites 202, 206, 216. As described above with respect to figs. 1 and 2, the route data 120 may include data collected by the sensors 104, such as geographic coordinates, road condition information, travel obstacles, traffic, and the like. At step 404, the route data 120 is made available to the autonomous vehicles 502 in the fleet. The route data 120 may be communicated to the autonomous vehicles 502 and/or provided to the fleet management system 208, which allows the autonomous vehicles 502 to access the route data 120 when needed.
At step 406, the terminal sites 202, 206, 216 are set up using the equipment 106 of the mobile terminal system 100. The layout of the terminal sites 202, 206, 216 is described above with respect to fig. 3. During setup of the terminal sites 202, 206, 216, the landing stage and/or launch pad 310 is established in the terminal sites 202, 206, 216. After the terminal sites 202, 206, 216 are set up or established, the control subsystem 102 determines, at step 408, whether there is an inbound autonomous vehicle 502 (see the inbound autonomous vehicle 502a of fig. 3 and corresponding description above). For example, the control subsystem 102 may determine whether a landing request 322 has been received. When an autonomous vehicle 502 is inbound, the control subsystem 102 proceeds to step 410.
At step 410, the control subsystem 102 determines whether there is an unoccupied landing stage 310 and a preferred route 314a-314c for reaching the landing stage 310 (see the example operations above with respect to the landing stage 310 of fig. 3). If this is not the case, the control subsystem 102 may proceed to step 412 and identify another space for the inbound autonomous vehicle 502 to land (e.g., a different landing stage 310), or may send an alert 340 to clear a landing stage 310 for the inbound autonomous vehicle 502. Once the landing stage 310 and the route 314a-314c are determined at step 410, the control subsystem 102 proceeds to step 414.
At step 414, the landing instructions 116 are provided to the inbound autonomous vehicle 502. The landing instructions 116 may indicate the landing stage 310 in which the autonomous vehicle 502 is to dock and the route 314a-314c that the autonomous vehicle 502 is to travel to reach the landing stage 310. At step 416, the control subsystem 102 may send an alert 340 (e.g., to a technician's portable device 126 in the terminal sites 202, 206, 216) to initiate or prepare for post-landing activities (such as unloading the autonomous vehicle 502, inspecting the autonomous vehicle 502, weighing the autonomous vehicle 502, etc.).
At step 418, the control subsystem 102 determines whether a launch request 324 has been received from an autonomous vehicle 502 in the launch pad 310 (see the outbound autonomous vehicle 502b of fig. 3). Upon receipt of the launch request 324, the control subsystem 102 proceeds to step 420 and determines whether the area 326 is clear (e.g., unoccupied and unobstructed) and whether there is a preferred route 318a-318c for exiting the terminal sites 202, 206, 216 from the launch pad 310. If this is not the case, the control subsystem 102 may proceed to step 422 to send an alert 340 to clear the area 326 around the launch pad 310 and/or determine an alternative route 318a-318c for the autonomous vehicle 502 to travel to exit the terminal sites 202, 206, 216.
Once the area 326 is clear and the preferred outbound route 318a-318c is determined, the control subsystem 102 proceeds to step 424 and provides the launch instructions 118 to the autonomous vehicle 502. The launch instructions 118 may indicate that the autonomous vehicle 502 may exit the launch pad 310 and the route 318a-318c that the autonomous vehicle 502 is to travel to exit the terminal sites 202, 206, 216.
At step 426, the control subsystem 102 may determine whether a new terminal site 202, 206, 216 needs to be established. For example, the fleet management data 114 may indicate that an additional terminal site 202, 206, 216 is needed to support movement of the autonomous vehicles 502 in a given area. The control subsystem 102 may determine this need, or the fleet management system 208 may provide instructions indicating the need. As another example, the control subsystem 102 may determine that a short-term terminal site 202, 206, 216 should be established to support a proof-of-concept route or temporary route 204, 214. The proof-of-concept or temporary route 204, 214 may be needed because an increase in traffic is detected in the area of the short-term terminal site 202, 206, 216 and/or because support for the fleet of autonomous vehicles 502 is expected to be needed only for a short time (e.g., a week or less) from the current time. If a new terminal site 202, 206, 216 needs to be established, the control subsystem 102 may return to step 402 to restart the process 400 for the new terminal site 202, 206, 216.
At step 428, the control subsystem 102 may determine whether to remap the routes 204, 214 to one or more of the terminal sites 202, 206, 216. For example, the routes 204, 214 may be remapped after a predefined time interval (e.g., days, weeks, etc.). If the routes 204, 214 should be remapped, the control subsystem 102 proceeds to step 430 and remaps the routes 204, 214. For example, the vehicle 132 may travel along the routes 204, 214 and collect updated route data 120 to account for any changes in the routes 204, 214.
At step 432, the control subsystem 102 determines whether a request for off-terminal (e.g., roaming) maintenance has been received. If such a request is received, the control subsystem 102 may provide confirmation that support will arrive, at step 434. The vehicle 132 may travel to the off-terminal maintenance location, and the equipment 106 may be used to repair and/or recalibrate the autonomous vehicle 502 at that location. The mobile terminal system 100 may assist in relaunching the repaired autonomous vehicle 502, for example, as described below with respect to figs. 12A and 13.
Example autonomous vehicle and autonomous vehicle operation
Fig. 5 illustrates a block diagram of an example vehicle ecosystem 500 in which autonomous driving operations may be determined. As shown in fig. 5, the autonomous vehicle 502 may be a semi-trailer truck. The vehicle ecosystem 500 includes several systems and components that can generate and/or deliver one or more sources of information/data and related services to an onboard control computer 550, which may be located in the autonomous vehicle 502. The onboard control computer 550 may be in data communication with a plurality of vehicle subsystems 540, all of which may reside in the autonomous vehicle 502. A vehicle subsystem interface 560 is provided to facilitate data communication between the onboard control computer 550 and the plurality of vehicle subsystems 540. In some embodiments, the vehicle subsystem interface 560 may include a controller area network (CAN) controller to communicate with devices in the vehicle subsystems 540.
The autonomous vehicle 502 may include various vehicle subsystems that support its operation. The vehicle subsystems may include an emergency stop button 504, a vehicle drive subsystem 542, a vehicle sensor subsystem 544, and/or a vehicle control subsystem 548. The components or devices of the vehicle drive subsystem 542, the vehicle sensor subsystem 544, and the vehicle control subsystem 548 shown in fig. 5 are examples. The autonomous vehicle 502 may be configured as shown or in any other configuration.
The emergency stop button 504 may include a physical button configured to turn off or deactivate the autonomous functions of the autonomous vehicle 502 when activated. The emergency stop button 504 is in signal communication with the plurality of vehicle subsystems 540 and the onboard control computer 550. The emergency stop button 504 may be activated by any suitable method, such as by pressing, pulling, sliding, toggling, using a key, etc. When activated, the emergency stop button 504 may initiate a fail-safe sequence to disable the autonomous functions of the autonomous vehicle 502. In this process, activating the emergency stop button 504 disconnects the vehicle drive subsystem 542, the vehicle sensor subsystem 544, and the vehicle control subsystem 548 from the onboard control computer 550. In other words, when the emergency stop button 504 is activated, it cuts off power to the autonomous systems of the autonomous vehicle 502. In one embodiment, when the emergency stop button 504 is activated, the engine/motor 542a may be turned off, the brake unit 548b may be applied, and the hazard lamps may be turned on. Upon activation, the emergency stop button 504 may override all relevant start sequence functions of the autonomous vehicle 502.
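The fail-safe sequence described above might be sketched as follows; the subsystem classes and method names are illustrative assumptions, not the actual vehicle interfaces:

```python
class Subsystem:
    def __init__(self, name):
        self.name, self.connected = name, True

    def disconnect_from(self, computer):
        self.connected = False
        print(f"{self.name}: disconnected from {computer}")

class VehicleDrive(Subsystem):
    def engine_off(self):
        print("engine/motor 542a: off")

class VehicleControl(Subsystem):
    def apply_brakes(self):
        print("brake unit 548b: applied")

    def hazard_lamps_on(self):
        print("hazard lamps: on")

def emergency_stop(drive, sensors, control, computer="onboard control computer 550"):
    """Fail-safe sequence: sever the autonomy subsystems from the control
    computer, then bring the truck to a safe, conspicuous stop."""
    for subsystem in (drive, sensors, control):
        subsystem.disconnect_from(computer)   # cut the autonomy data paths
    drive.engine_off()                        # engine/motor 542a turned off
    control.apply_brakes()                    # brake unit 548b applied
    control.hazard_lamps_on()                 # warn nearby traffic

emergency_stop(VehicleDrive("vehicle drive subsystem 542"),
               Subsystem("vehicle sensor subsystem 544"),
               VehicleControl("vehicle control subsystem 548"))
```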
Vehicle drive subsystem 542 may include components operable to provide powered movement of autonomous vehicle 502. In an example embodiment, the vehicle drive subsystem 542 may include an engine/motor 542a, wheels/tires 542b, a transmission 542c, an electrical subsystem 542d, and a power supply 542e.
The vehicle sensor subsystem 544 may include a plurality of sensors 546 configured to sense information about the environment or condition of the autonomous vehicle 502. The vehicle sensor subsystem 544 may include one or more cameras 546a or image capture devices, a radar unit 546b, one or more temperature sensors 546c, a wireless communication unit 546d (e.g., a cellular communication transceiver), an Inertial Measurement Unit (IMU) 546e, a laser rangefinder/LiDAR unit 546f, a Global Positioning System (GPS) transceiver 546g, and/or a wiper control system 546h. The vehicle sensor subsystem 544 may also include sensors configured to monitor the internal systems of the autonomous vehicle 502 (e.g., an O2 monitor, a fuel gauge, an engine oil temperature sensor, etc.).
The IMU 546e may include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense changes in the position and orientation of the autonomous vehicle 502 based on inertial acceleration. The GPS transceiver 546g may be any sensor configured to estimate the geographic location of the autonomous vehicle 502. To this end, the GPS transceiver 546g may include a receiver/transmitter operable to provide information regarding the location of the autonomous vehicle 502 relative to the earth. The radar unit 546b may represent a system that utilizes radio signals to sense objects within the local environment of the autonomous vehicle 502. In some embodiments, in addition to sensing objects, the radar unit 546b may be configured to sense the speed and heading of objects proximate the autonomous vehicle 502. The laser rangefinder or LiDAR unit 546f may be any sensor configured to use lasers to sense objects in the environment in which the autonomous vehicle 502 is located. The camera 546a may include one or more devices configured to capture a plurality of images of the environment of the autonomous vehicle 502. The camera 546a may be a still image camera or a motion video camera.
The vehicle control subsystem 548 may be configured to control the operation of the autonomous vehicle 502 and its components. Accordingly, the vehicle control subsystem 548 may include various elements such as a throttle and gear 548a, a brake unit 548b, a navigation unit 548c, a steering system 548d, and/or an autonomous control unit 548e. The throttle 548a may be configured to control, for example, the operating speed of the engine and, in turn, the speed of the autonomous vehicle 502. The gear 548a may be configured to control the gear selection of the transmission. The brake unit 548b may include any combination of mechanisms configured to slow the autonomous vehicle 502. The brake unit 548b may use friction to slow the wheels in a standard manner. The brake unit 548b may include an anti-lock braking system (ABS) capable of preventing the brakes from locking when they are applied. The navigation unit 548c may be any system configured to determine a driving path or route for the autonomous vehicle 502. The navigation unit 548c may additionally be configured to dynamically update the driving path while the autonomous vehicle 502 is in operation. In some embodiments, the navigation unit 548c may be configured to combine data from the GPS transceiver 546g and one or more predetermined maps to determine the driving path of the autonomous vehicle 502 (e.g., along the routes 204, 214, 314a-314c, 318a-318c of figs. 2 and 3). The steering system 548d may represent any combination of mechanisms operable to adjust the heading of the autonomous vehicle 502 in either an autonomous mode or a driver-controlled mode.
Autonomous control unit 548e may represent a control system configured to identify, evaluate, and avoid or otherwise traverse potential obstacles or obstructions in the environment of autonomous vehicle 502. In general, autonomous control unit 548e may be configured to control autonomous vehicle 502 to operate without a driver or to provide driver assistance in controlling autonomous vehicle 502. In some embodiments, autonomous control unit 548e may be configured to combine data from GPS transceiver 546g, radar 546b, liDAR unit 546f, camera 546a, and/or other vehicle subsystems to determine the driving path or trajectory of autonomous vehicle 502.
Several or all functions of the autonomous vehicle 502 may be controlled by the onboard control computer 550. The onboard control computer 550 may include at least one data processor 570 (which may include at least one microprocessor) that executes processing instructions 580 stored in a non-transitory computer-readable medium, such as the data storage device 590 or memory. The onboard control computer 550 may also represent a plurality of computing devices that may be used to control various components or subsystems of the autonomous vehicle 502 in a distributed manner. In some embodiments, the data storage device 590 may contain processing instructions 580 (e.g., program logic) executable by the data processor 570 to perform various methods and/or functions of the autonomous vehicle 502, including the methods and/or functions described above with respect to figs. 1-4.
The data storage device 590 may also contain additional instructions, including instructions to transmit data to, receive data from, interact with, or control one or more of the vehicle drive subsystem 542, the vehicle sensor subsystem 544, and the vehicle control subsystem 548. The onboard control computer 550 may be configured to include the data processor 570 and the data storage device 590. The onboard control computer 550 may control the functions of the autonomous vehicle 502 based on inputs received from various vehicle subsystems (e.g., the vehicle drive subsystem 542, the vehicle sensor subsystem 544, and the vehicle control subsystem 548).
Fig. 6 illustrates an exemplary system 600 for providing accurate autonomous driving operations. The system 600 includes several modules that can operate in the onboard control computer 550, as shown in fig. 5. The onboard control computer 550 includes a sensor fusion module 602, shown in the upper left corner of fig. 6, which may perform at least four image or signal processing operations. The sensor fusion module 602 may obtain images from cameras located on the autonomous vehicle 502 to perform image segmentation 604 to detect the presence of moving objects (e.g., other vehicles, pedestrians, etc.) and/or static obstacles (e.g., stop signs, speed bumps, terrain, etc.) located around the autonomous vehicle 502. The sensor fusion module 602 may obtain LiDAR point cloud data items from LiDAR sensors located on the autonomous vehicle 502 to perform LiDAR segmentation 606 to detect the presence of objects and/or obstacles located around the autonomous vehicle 502.
The sensor fusion module 602 may perform instance segmentation 608 on the image and/or point cloud data items to identify contours (e.g., boxes) around objects and/or obstacles located around the autonomous vehicle 502. The sensor fusion module 602 may perform temporal fusion 610, in which objects and/or obstacles in one image frame and/or point cloud data item are matched with objects and/or obstacles in one or more images or frames received subsequently in time.
The sensor fusion module 602 can fuse objects and/or obstacles in images obtained from the cameras with those in point cloud data items obtained from the LiDAR sensors. For example, the sensor fusion module 602 may determine, based on the locations of two cameras, that an image from one camera containing half of a vehicle in front of the autonomous vehicle 502 shows the same vehicle captured by the other camera. The sensor fusion module 602 sends the fused object information to the inference module 646 and the fused obstacle information to the occupancy grid module 660. The onboard control computer includes the occupancy grid module 660, which may retrieve landmarks from a map database 658 stored in the onboard control computer. The occupancy grid module 660 may determine drivable areas and/or obstacles from the fused obstacles obtained from the sensor fusion module 602 and the landmarks stored in the map database 658. For example, the occupancy grid module 660 may determine that a drivable area includes a speed bump obstacle.
Below the sensor fusion module 602, the onboard control computer 550 includes a LiDAR-based object detection module 612, which can perform object detection 616 based on point cloud data items obtained from the LiDAR sensors 614 located on the autonomous vehicle 502. The object detection 616 technique may provide the location of an object (e.g., in the form of 3D world coordinates) from a point cloud data item. Below the LiDAR-based object detection module 612, the onboard control computer includes an image-based object detection module 618, which may perform object detection 624 based on images obtained from a camera 620 located on the autonomous vehicle 502. The object detection 624 technique may employ deep machine learning to provide the location of an object (e.g., in the form of 3D world coordinates) from the images provided by the camera 620.
The radar 656 on the autonomous vehicle 502 may scan an area in front of the autonomous vehicle 502 or an area toward which the autonomous vehicle 502 is driving. The radar data is sent to the sensor fusion module 602, which can use the radar data to correlate the objects and/or obstacles detected by the radar 656 with the objects and/or obstacles detected from both the LiDAR point cloud data items and the camera images. The radar data is also sent to the inference module 646, which may process the radar data to track objects via the object tracking module 648, as described further below.
The onboard control computer includes an inference module 646, which receives the locations of objects from the point cloud data items, the locations of objects from the images, and the fused object locations from the sensor fusion module 602. The inference module 646 also receives the radar data, with which the inference module 646, via the object tracking module 648, may track objects from one point cloud data item and one image obtained at one instant in time to another (or the next) point cloud data item and image obtained at a subsequent instant in time.
The inference module 646 may perform object attribute estimation 650 to estimate one or more attributes of objects detected in an image or point cloud data item. The one or more attributes of an object may include its type (e.g., pedestrian, car, truck, etc.). The inference module 646 may perform behavior prediction 652 to estimate or predict the motion pattern of an object detected in an image and/or point cloud. The behavior prediction 652 may be performed to detect object locations in a set of images received at different points in time (e.g., sequential images) or in a set of point cloud data items received at different points in time (e.g., sequential point cloud data items). In some embodiments, the behavior prediction 652 may be performed for each image received from a camera and/or each point cloud data item received from a LiDAR sensor. In some embodiments, the inference module 646 may reduce computational load by performing the behavior prediction 652 on every other image received from the camera or every other point cloud data item received from the LiDAR sensor, or by performing the behavior prediction 652 after every predetermined number of images received from the camera or point cloud data items received from the LiDAR sensor (e.g., after every two images or after every three point cloud data items).
The behavior prediction 652 feature may determine the speed and direction of objects surrounding the autonomous vehicle 502 from the radar data, and the speed and direction information may be used to predict or determine the motion pattern of an object. A motion pattern may include predicted trajectory information for the object over a predetermined length of time in the future after an image is received from the camera. Based on the predicted motion pattern, the inference module 646 may assign a motion pattern context label to the object (e.g., "located at coordinates (x, y)", "stopped", "traveling at 50 mph", "accelerating", or "decelerating"). The context label may describe the motion pattern of the object. The inference module 646 sends the one or more object attributes (e.g., the type of the object) and the motion pattern context label to the planning module 662. The inference module 646 may perform the environmental analysis 654 using any information acquired by the system 600 and any number and combination of its components.
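As an illustrative sketch of assigning the motion pattern context labels described above (the speed thresholds are assumptions, not values from the disclosure):

```python
def motion_context_label(speed_mph, prior_speed_mph, x, y):
    """Map radar-derived speed estimates to a context label of the kind
    described above; 0.5 mph and 2.0 mph are illustrative thresholds."""
    if speed_mph < 0.5:
        return f"stopped at coordinates ({x}, {y})"
    if speed_mph > prior_speed_mph + 2.0:
        return "accelerating"
    if speed_mph < prior_speed_mph - 2.0:
        return "decelerating"
    return f"traveling at {speed_mph:.0f} mph"

assert motion_context_label(0.0, 0.0, 10, 4).startswith("stopped")
assert motion_context_label(50.0, 49.5, 0, 0) == "traveling at 50 mph"
```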
The onboard control computer includes a planning module 662, which receives the object attributes and motion pattern context labels from the inference module 646, the drivable areas and/or obstacles from the occupancy grid module 660, and the vehicle position and pose information from the fusion positioning module 626 (described further below).
The planning module 662 may execute the navigation planning 664 to determine a set of trajectories on which the autonomous vehicle 502 may drive. The set of trajectories may be determined based on the drivable area information, the one or more object attributes of the objects, the motion pattern context labels of the objects, and the locations of the obstacles. In some embodiments, the navigation planning 664 may include identifying an area beside the road where the autonomous vehicle 502 can safely park in an emergency. The planning module 662 may include behavior decision-making 666 to determine driving actions (e.g., steering, braking, throttle) in response to determining changing conditions on the road (e.g., a traffic light turning yellow, or the autonomous vehicle 502 being in an unsafe driving condition because another vehicle is driving in front of the autonomous vehicle 502 within a predetermined safe distance of the location of the autonomous vehicle 502). The planning module 662 performs trajectory generation 668 and selects a trajectory from the set of trajectories determined by the navigation planning operation 664. The planning module 662 sends the selected trajectory information to the control module 670.
The onboard control computer includes a control module 670, which receives the proposed trajectory from the planning module 662 and the position and pose of the autonomous vehicle 502 from the fusion positioning module 626. The control module 670 includes a system identifier 672. The control module 670 may perform model-based trajectory refinement 674 to refine the proposed trajectory. For example, the control module 670 may apply filtering (e.g., a Kalman filter) to smooth the proposed trajectory data and/or minimize noise. The control module 670 may perform robust control 676 by determining, based on the refined proposed trajectory information and the current position and/or pose of the autonomous vehicle 502, the amount of brake pressure to apply, the steering angle, the throttle amount to control vehicle speed, and/or the transmission gear. The control module 670 may send the determined brake pressure, steering angle, throttle amount, and/or transmission gear to one or more devices in the autonomous vehicle 502 to control and facilitate accurate driving operations of the autonomous vehicle 502.
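For example, a minimal one-dimensional Kalman filter of the kind the control module 670 might apply to a single trajectory coordinate could look as follows; the noise variances `q` and `r` are illustrative assumptions:

```python
def kalman_smooth(measurements, q=1e-3, r=0.25):
    """Minimal 1-D Kalman filter (constant-state model) illustrating the
    kind of filtering that can smooth a proposed trajectory coordinate and
    suppress noise. q = process noise variance, r = measurement variance."""
    x, p = measurements[0], 1.0   # state estimate and its variance
    smoothed = [x]
    for z in measurements[1:]:
        p += q                    # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update toward the new measurement z
        p *= (1 - k)              # variance shrinks after the update
        smoothed.append(x)
    return smoothed

noisy = [0.0, 0.9, 2.2, 2.8, 4.1, 5.0]   # noisy lateral offsets along a path
print(kalman_smooth(noisy))               # smoother sequence, same trend
```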
The deep machine learning, image-based object detection 624 performed by the image-based object detection module 618 may also be used to detect landmarks on the road (e.g., stop signs, speed bumps, etc.). The onboard control computer includes a fusion positioning module 626, which obtains the landmarks detected from the images, the landmarks obtained from a map database 636 stored on the onboard control computer, the landmarks detected from the point cloud data items by the LiDAR-based object detection module 612, the speed and displacement from an odometer sensor 644, and the estimated location of the autonomous vehicle 502 from the GPS/IMU sensors 638 (i.e., the GPS sensor 640 and the IMU sensor 642) located on or in the autonomous vehicle 502. Based on this information, the fusion positioning module 626 can perform a positioning operation 628 to determine the location of the autonomous vehicle 502, which can be sent to the planning module 662 and the control module 670.
The fusion positioning module 626 may estimate the pose 630 of the autonomous vehicle 502 based on the GPS and/or IMU sensors 638. The pose of the autonomous vehicle 502 may be sent to the planning module 662 and the control module 670. The fusion positioning module 626 can also estimate the status (e.g., location, possible angle of movement) of the trailer unit based on information (e.g., angular velocity and/or linear velocity) provided by, for example, the IMU sensor 642 (e.g., trailer status estimation 634). The fusion positioning module 626 may also verify the map content 632.
Fig. 7 illustrates an exemplary block diagram of the onboard control computer 550 included in the autonomous vehicle 502. The onboard control computer 550 includes at least one processor 704 and a memory 702 having instructions stored thereon (e.g., the landing instructions 116, the launch instructions 118, and the processing instructions 580 shown in figs. 1, 3, 5, and 6). The instructions, when executed by the processor 704, configure the onboard control computer 550 and/or its various modules to perform the operations described in figs. 1-6. The transmitter 706 transmits or sends information or data to one or more devices in the autonomous vehicle 502. For example, the transmitter 706 may send instructions to one or more motors of the steering wheel to steer the autonomous vehicle 502. The receiver 708 receives information or data transmitted or sent by one or more devices. For example, the receiver 708 receives the current speed from an odometer sensor or the current transmission gear from the transmission. The transmitter 706 and the receiver 708 are also configured to communicate with the plurality of vehicle subsystems 540 and the onboard control computer 550 described above in figs. 5 and 6.
Example launch pad and launch pad operation
Fig. 8 illustrates an example launch pad 800 in more detail. The launch pad 800 is an example of the launch pad 310 of fig. 3. The launch pad 800 includes a predefined zone or space (e.g., within the terminal sites 202, 206, 216 shown in fig. 3) sized and shaped to accommodate the autonomous vehicle 502, and a set of sensors 802a-802f around the perimeter of the launch pad 800 or within the launch pad 800. The sensors 802a-802f are examples of the sensor 312 of fig. 3. The launch pad 800 may be sized and shaped to fit the tractor unit of the autonomous vehicle 502 and an attached trailer. As an example, the physical extent of the launch pad 800 may be defined, at least in part, by the sensors 802a-802f located around or within the launch pad 800. In some embodiments, the launch pad 800 comprises a physical pad (e.g., a concrete pad). In some embodiments, the launch pad 800 includes physical markers (e.g., painted lines) or location markers 130 from the equipment 106 around one or more edges or the perimeter of the launch pad 800.
The sensors 802a-802f of the launch pad 800 include any sensor capable of detecting objects, movements, and/or sounds that may be associated with the presence of obstructions 806, 808 within the zone of the launch pad 800. For example, the sensors 802 may include cameras, LiDAR sensors, motion sensors, infrared sensors, and the like. The launch pad 800 generally includes sensors 802a-802d at each corner of the launch pad 800 (i.e., at each corner of the example rectangular launch pad 800 shown in fig. 8). In some embodiments, the launch pad 800 may include additional sensors 802e and/or 802f at intermediate locations (e.g., along the length of the launch pad 800) to provide views for detecting obstructions 806, 808 in areas of the launch pad 800 that are farther from the sensors 802a-802d (e.g., areas near the center of the launch pad 800 that are not visible due to the presence of the autonomous vehicle 502).
One or more of the sensors 802a-802f may be positioned at various heights relative to the ground (e.g., by attaching the sensors 802a-802f to a support structure, such as a pole). Positioning the sensors 802a-802f above the ground may provide improved detection of obstacles 806, 808 located above the ground (such as objects attached to one side of the autonomous vehicle 502, animals on or around the autonomous vehicle 502, etc.). In some embodiments, the sensors 802a-802f are located at multiple heights relative to the ground. For example, one or more of the sensors 802a-802f shown in fig. 8 may represent a ground-level sensor, a mid-level sensor, and/or a high-level sensor. For example, a ground-level sensor 802 may be located at or near ground level such that it can detect obstacles 806, 808 within a field of view covering an area at or near the ground (e.g., from ground level to a few feet above the ground). A mid-level sensor 802 may be located at an intermediate height relative to the ground (e.g., at a height near the midpoint between the ground and the top of the autonomous vehicle 502) such that it has a field of view encompassing an area near the middle of the autonomous vehicle 502 (e.g., from near the ground to near the top of the autonomous vehicle 502). A high-level sensor 802 may be placed above the mid-level sensors, for example, to detect obstacles 806, 808 at greater heights relative to the ground and/or to provide a more top-down view of portions of the launch pad 800.
In some embodiments, the launch pad 800 includes one or more additional sensors 804a-804d on or within the surface of the launch pad 800. The sensors 804a-804d are examples of the sensor 312 of fig. 3. For example, the sensors 804a-804d may be configured to provide a view under the autonomous vehicle 502 in the launch pad 800. Like the sensors 802a-802f, the sensors 804a-804d may include any suitable type of sensor for detecting the obstacles 806, 808. For example, the sensors 804a-804d may include cameras, LiDAR sensors, motion sensors, infrared sensors, and the like. The sensors 804a-804d may particularly facilitate the detection of an obstacle, such as the obstacle 808, near the center of the launch pad 800 and/or below an autonomous vehicle 502 parked on the launch pad 800. In some cases, such an obstacle 808 might not be detected by the other sensors 802a-802f. While the example launch pad 800 of fig. 8 shows six sensors 802a-802f and four sensors 804a-804d, it should be appreciated that the launch pad 800 may include any suitable number, combination, and placement of sensors 802a-802f and/or 804a-804d.
The sensors 802a-802f and 804a-804d of the launch pad 800 are in signal communication with the control subsystem 102. As further described with respect to the example operations below and the method 900 of fig. 9, the sensors 802a-802f, 804a-804d typically communicate launch pad signals 830 to the control subsystem 102. The control subsystem 102 receives these signals 830 and uses them to determine whether an obstacle 806, 808 is detected within the zone of the launch pad 800. The presence of an obstacle 806, 808 generally indicates that it is unsafe for the autonomous vehicle 502 to begin moving. As an example, if the sensors 802a-802f, 804a-804d are cameras, the signals 830 may include video and/or photographs of the portion of the launch pad 800 seen by the sensors 802a-802f, 804a-804d (i.e., the portion of the launch pad 800 within the field of view of the camera). The control subsystem 102 uses obstacle detection instructions 836 to determine whether an obstacle is detected in the video. For example, the obstacle detection instructions 836 may include code for implementing an object detection routine on the images corresponding to the video frames. If an unexpected object is detected (e.g., an object that is not known to be part of the autonomous vehicle 502), the control subsystem 102 determines that an obstacle 806, 808 is detected in the launch pad. The obstacle detection instructions 836 may similarly include code for implementing methods of detecting an obstacle based on LiDAR data (e.g., based on detection of an unexpected object in or around the launch pad 800), motion sensor data (e.g., based on detection of unexpected motion in or around the launch pad 800), sound (e.g., detection of unexpected sound near the launch pad 800), infrared data (e.g., based on detection of an unexpected object in an infrared image), and the like. The obstacle detection instructions 836 may be implemented using the various modules described above with respect to the detection of objects and obstacles by the autonomous vehicle 502 (see fig. 6 and corresponding description above).
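Purely as an illustrative sketch, routing each launch pad signal 830 to a sensor-appropriate detection routine could be expressed as follows; the dispatch-table design and all identifiers are assumptions rather than the claimed implementation:

```python
def obstacle_detected(pad_signals, detectors):
    """Route each launch pad signal 830 to a detection routine appropriate
    to its sensor type (the role played by the obstacle detection
    instructions 836); report the first sensor whose routine fires."""
    for signal in pad_signals:
        detect = detectors.get(signal["sensor_type"])
        if detect is not None and detect(signal["data"]):
            return True, signal["sensor_id"]
    return False, None

detectors = {
    # Camera frames flagged by an upstream object-detection routine.
    "camera": lambda frames: any(f["unexpected_object"] for f in frames),
    # Any reported motion event counts (persistence is handled separately).
    "motion": lambda events: len(events) > 0,
}
signals = [
    {"sensor_id": "802a", "sensor_type": "camera",
     "data": [{"unexpected_object": False}]},
    {"sensor_id": "804b", "sensor_type": "motion", "data": ["movement"]},
]
assert obstacle_detected(signals, detectors) == (True, "804b")
```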
The control subsystem 102 also receives signals 832 from the autonomous vehicle 502. The control subsystem 102 generally uses these signals 832 to determine whether the area 814 in front of the autonomous vehicle 502 (e.g., the zone or region 814 defined at least in part by the field of view of the sensors of the vehicle sensor subsystem 544) is free of obstacles 810, 812. These autonomous vehicle signals 832 may be signals from the vehicle sensor subsystem 544 of the autonomous vehicle 502 and/or communications from the onboard control computer 550 of the autonomous vehicle 502. For example, the signals 832 may be a feed of images, LiDAR data, etc., obtained by the vehicle sensor subsystem 544 of the autonomous vehicle. In this case, the control subsystem 102 may use the obstacle detection instructions 836 to determine whether obstacles 810, 812 are detected in the area 814 in front of the autonomous vehicle 502. In other cases, the autonomous vehicle signals 832 may include an indication of whether the onboard control computer 550 has detected an obstacle 810, 812 in front of the autonomous vehicle 502 (see fig. 6 and corresponding description above). If the control subsystem 102 determines, based on the launch pad signals 830, that the launch pad 800 is clear of obstacles 806, 808 and, based on the signals 832, that the area 814 in front of the autonomous vehicle 502 is clear of obstacles 810, 812, the control subsystem 102 communicates launch instructions 118 that include permission 816 for the autonomous vehicle 502 to begin moving out of the launch pad 800. The control subsystem 102 may also identify the outbound route 318a-318c that the autonomous vehicle 502 follows to exit the terminal sites 202, 206, 216 and travel along its route 204, 214. For example, an outbound route 318a-318c may be selected that leads to a preferred starting point of the autonomous vehicle's route 204, 214 and/or that accounts for other traffic in the terminal.
In an example operation of the launch pad 800, the control subsystem 102 may receive a request for the autonomous vehicle 502 to depart from the launch pad 800. In response to the request, the control subsystem 102 determines, based at least in part on the received launch pad sensor signals 830 (i.e., the data included in the signals 830), whether the launch pad 800 is free of obstacles that would obstruct departure from the launch pad 800. For example, if the sensors 802a-802f and/or 804a-804d include cameras, the launch pad signals 830 may include images and/or video. In this case, the control subsystem 102 may employ obstacle detection instructions 836 that include rules for detecting objects in the images and/or video and determining whether the detected objects correspond to obstacles 806, 808. For example, one or more predetermined object detection methods (e.g., employing neural networks or machine learning) may be used to detect objects and determine whether the detected objects correspond to obstacles 806, 808. Signals from infrared sensors 802a-802f and/or 804a-804d may be similarly evaluated to detect portions of an infrared image having thermal-energy signatures associated with the presence of animals and/or humans within the zone of the launch pad 800.
As another example, if the sensors 802a-802f, 804a-804d include LiDAR sensors, the launch pad signals 830 may include distance measurements. In this case, the control subsystem 102 may employ obstacle detection instructions 836 that include rules for detecting obstacles 806, 808 based on characteristics of and/or changes in the distance measurements. For example, a change in the distance measured by a LiDAR sensor may indicate the presence of an obstacle 806, 808. For example, each LiDAR sensor may be calibrated to provide an initial distance measurement when the launch pad 800 is known to be free of obstructions 806, 808. If the distance reported by a given LiDAR sensor changes from this initial value, an obstacle 806, 808 may be detected.
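A minimal sketch of this baseline-comparison rule, assuming an illustrative tolerance value:

```python
def lidar_obstacle(baseline_m, current_m, tolerance_m=0.2):
    """Compare each LiDAR distance measurement against the value calibrated
    when the pad was known to be clear; a return that shortens by more than
    the tolerance suggests something has entered the field of view."""
    return any(base - cur > tolerance_m
               for base, cur in zip(baseline_m, current_m))

baseline = [12.0, 11.8, 12.1, 12.0]   # calibrated with the pad known clear
current  = [12.0, 11.8,  6.3, 12.0]   # beam 3 now returns at 6.3 m
assert lidar_obstacle(baseline, current)
```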
As yet another example, if the sensors 802a-802f and/or 804a-804d include motion sensors, the launch pad signals 830 may include motion data for the launch pad 800. In this case, the control subsystem 102 may employ obstacle detection instructions 836 that include rules for detecting obstacles 806, 808 based on detected movement. For example, movement detected within the zone of the launch pad 800 may be caused by the presence of an animal or person within the zone. Thus, if motion is detected within the zone of the launch pad 800, the control subsystem 102 may determine that an obstacle 806 or 808 is detected within the zone. In some cases, the detected movement may need to persist for at least a threshold period of time (e.g., 15 seconds or more) before an obstacle 806, 808 is detected based on motion, to reduce or eliminate false-positive detections caused by wind and/or other transient events (e.g., an animal, person, or vehicle passing through and immediately exiting the zone of the launch pad 800).
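The persistence requirement described above amounts to debouncing the motion events; a sketch, assuming an illustrative gap tolerance alongside the 15-second threshold from the text:

```python
def persistent_motion(event_times, threshold_s=15.0, max_gap_s=2.0):
    """Declare an obstacle only if motion events persist for at least
    `threshold_s` seconds without a gap longer than `max_gap_s`, filtering
    transients such as wind or an animal passing straight through."""
    if not event_times:
        return False
    start = prev = event_times[0]
    for t in event_times[1:]:
        if t - prev > max_gap_s:
            start = t                # run broken; restart the clock
        prev = t
        if prev - start >= threshold_s:
            return True
    return False

# 20 s of continuous motion events sampled once per second -> obstacle.
assert persistent_motion([float(t) for t in range(21)])
# A 3 s flurry -> transient, ignored.
assert not persistent_motion([0.0, 1.0, 2.0, 3.0])
```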
As a further example, if the sensors 802a-802f and/or 804a-804d include microphones for recording sound in or around the launch pad 800, the launch pad signals 830 may include such recordings. In this case, the control subsystem 102 may employ obstacle detection instructions 836 that include rules for detecting obstacles 806, 808 based on characteristics of the recorded sound. For example, sounds corresponding to a person speaking, a vehicle operating or undergoing maintenance, or an animal making a distinctive sound may be evidence that an obstacle 806, 808 is within the zone of the launch pad 800.
Although some examples of the detection of obstacles 806, 808 are described above, it should be appreciated that any other suitable obstacle detection method may be used by the control subsystem 102. In some embodiments, the control subsystem may use two or more types of sensor data to determine whether an obstacle 806, 808 is detected (e.g., by combining camera images and LiDAR data, as described with respect to the sensor fusion module 602 of fig. 6). For example, the obstacles 806, 808 may be detected using the methods and/or modules described for the detection of objects and obstacles by the autonomous vehicle 502 (see fig. 6 and corresponding description above). In other words, the obstacle detection instructions 836 may include instructions, rules, and/or code for implementing any of the modules described above with respect to fig. 6.
The control subsystem 102 also determines, based at least in part on the received autonomous vehicle signals 832, whether the area 814 in front of the autonomous vehicle 502 is free of obstacles 810, 812 that would prevent the autonomous vehicle 502 from moving away from the launch pad 800. For example, the obstacles 810, 812 may be detected in the area 814 in front of the autonomous vehicle 502 using the same or similar methods as described above for detecting the obstacles 806, 808.
If it is determined that the launch pad 800 is free of obstacles 806, 808 that would obstruct the autonomous vehicle 502 from exiting the launch pad 800 and that the area 814 in front of the autonomous vehicle 502 is free of obstacles 810, 812 that would prevent the autonomous vehicle 502 from moving away from the launch pad 800, the control subsystem 102 sends instructions 118 that include permission 816 for the autonomous vehicle 502 to begin driving autonomously. Conversely, upon determining that the launch pad 800 is not free of obstacles 806, 808 that would prevent the autonomous vehicle 502 from exiting the launch pad 800 and/or that the area 814 in front of the autonomous vehicle 502 is not free of obstacles 810, 812 that would prevent the autonomous vehicle 502 from moving away from the launch pad 800, the control subsystem 102 sends instructions 118 that include a rejection 818 of permission to begin driving autonomously.
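Combining the two clearance determinations into the launch instructions 118 might be sketched as follows (the reference numerals follow the text; everything else is illustrative):

```python
def launch_decision(pad_clear, area_814_clear, outbound_route=None):
    """Issue launch instructions 118 carrying permission 816 when both the
    launch pad 800 and the area 814 are clear, or rejection 818 otherwise."""
    if pad_clear and area_814_clear:
        return {"instructions": 118, "grant": "permission 816",
                "route_id": outbound_route}
    return {"instructions": 118, "grant": "rejection 818", "route_id": None}

assert launch_decision(True, True, "318b")["grant"] == "permission 816"
assert launch_decision(True, False)["grant"] == "rejection 818"
```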
FIG. 9 illustrates an example method 900 of using the launch pad 800 of fig. 8. Method 900 may be implemented by the launch pad 800 and the control subsystem 102. Method 900 may begin at step 902, where the control subsystem 102 receives a request for the autonomous vehicle 502 to launch from the launch pad 800. The request to launch from the launch pad 800 may be generated automatically or in response to a manual input. For example, the request may be provided automatically whenever the autonomous vehicle 502 is present in the launch pad 800. As another example, the autonomous vehicle 502 may submit a request to exit the launch pad 800 upon determining that movement along the routes 204, 214 should begin. As yet another example, an individual (e.g., an operator of the autonomous vehicle 502 and/or an administrator of the terminals 202, 206, 216) may provide a request to initiate movement of the autonomous vehicle 502.
At step 904, the control subsystem 102 receives the autonomous vehicle signal 832 from the autonomous vehicle 502. As described above, the autonomous vehicle signal 832 may include an indication of whether the onboard control computer 550 has detected an obstacle 810, 812 in front of the autonomous vehicle 502 and/or sensor data from one or more sensors of the vehicle sensor subsystem 544. At step 906, the control subsystem 102 receives the launch pad signal 830. As described above, the launch pad signal 830 generally includes data from the launch pad sensors 802a-802f, 804a-804d. The launch pad signal 830 may include one or more streams of image data, video data, distance measurement data (e.g., from LiDAR sensors), motion data, infrared data, and the like.
At step 908, the control subsystem 102 determines, based on the received autonomous vehicle signal 832 and launch pad signal 830, whether both the launch pad 800 and the zone 814 in front of the autonomous vehicle 502 are free of obstacles 806, 808, 810, 812. For example, the control subsystem 102 may determine, based on the launch pad signal 830, whether an obstacle 806, 808 is detected within the zone of the launch pad 800 during or after completion of the preparation or pre-trip procedure of the autonomous vehicle 502. For example, the control subsystem 102 uses the obstacle detection instructions 836 to determine whether an obstacle 806, 808 is detected based on images, video, motion data, LiDAR data, infrared images, and/or sound recordings included in the launch pad signal 830. Examples of detection of obstacles 806, 808 in the zone of the launch pad 800 are described above with respect to fig. 8. The obstacle detection instructions 836 generally include code for implementing a method of detecting obstacles 806, 808 based on image data, video data, LiDAR data, motion sensor data, sound, infrared data, and the like. The control subsystem 102 also determines, based on the autonomous vehicle signal 832, whether an obstacle 810, 812 is detected within the zone 814 in front of the autonomous vehicle 502. As described above, the obstacles 810, 812 may be detected by the onboard control computer 550 and/or by the control subsystem 102 (i.e., similarly to the detection of the obstacles 806, 808 described above).
If an obstacle 806, 808 is detected within the zone of the launch pad 800 and/or an obstacle 810, 812 is detected in front of the autonomous vehicle 502, the control subsystem 102 determines at step 908 that the autonomous vehicle 502 is not free to begin moving from the launch pad 800, and the control subsystem 102 proceeds to step 910. At step 910, the control subsystem 102 determines whether the launch pad 800 and the area 814 in front of the autonomous vehicle have remained blocked by the obstacles 806, 808, 810, 812 for a threshold period of time (e.g., 15 minutes or any other suitable period of time). If the threshold time has not been reached at step 910, the control subsystem 102 continues to receive the autonomous vehicle signal 832 and the launch pad signal 830 and returns to step 908 to determine whether the launch pad 800 is clear for the departure of the autonomous vehicle 502. Otherwise, if the threshold time has been reached, the control subsystem 102 may proceed to step 912, in which instructions are provided to inspect the launch pad 800 (e.g., to remove the detected obstacle(s) 806, 808, 810, 812). For example, the control subsystem 102 may have detected a particular obstacle 808 in a particular portion of the launch pad 800 for at least the threshold period of time. In response, the control subsystem 102 may provide instructions to an administrator of the terminals 202, 206, 216 to inspect that particular portion of the launch pad 800 (e.g., the area where the obstacle 808 is detected). If a response is received (e.g., from the administrator of the terminals 202, 206, 216) indicating that the portion of the launch pad 800 has become free of the particular obstacle 808 or never contained the obstacle 808, the control subsystem 102 may determine that the launch pad 800 is clear for the departure of the autonomous vehicle 502. In some embodiments, the control subsystem 102 may flag any sensors associated with detecting the obstacle 808 (such as the sensors 802f and/or 804b-804d) to indicate that some verification or maintenance of these sensors 802f and/or 804b-804d is appropriate (e.g., if the detected obstacle 808 is found not to be present in the launch pad 800).
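Steps 908-912 amount to polling the clearance check until a timeout triggers escalation. The sketch below is one hypothetical way to express that loop; the callables, the polling interval, and the 15-minute default are assumptions, not the disclosed control flow.

```python
import time

def await_clear_pad(is_pad_clear, notify_inspection,
                    timeout_s: float = 15 * 60, poll_s: float = 5.0) -> bool:
    """Poll pad clearance; escalate to a manual inspection after a timeout.

    is_pad_clear: callable returning True when no obstacle is currently detected.
    notify_inspection: callable that asks terminal staff to inspect the pad.
    Returns True once the pad is clear, False if escalation was required.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if is_pad_clear():
            return True  # step 908: pad and forward area clear
        time.sleep(poll_s)
    # step 912: threshold reached; request inspection of the blocked pad
    notify_inspection(f"launch pad blocked for {timeout_s:.0f} s; please inspect")
    return False
```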
If no obstacle 806, 808 is detected within the zone of the launch pad 800 and no obstacle 810, 812 is detected in front of the autonomous vehicle 502, the control subsystem 102 determines at step 908 that the autonomous vehicle 502 is free to begin moving from the launch pad 800, and the control subsystem 102 may proceed to step 914. At step 914, the control subsystem 102 determines whether no obstacle 806, 808, 810, 812 has been detected for at least a predefined period of time (e.g., at least one minute or more). If the launch pad 800 is determined to have been free of obstacles 806, 808, 810, 812 for at least the predefined period of time, the control subsystem 102 proceeds to step 916. Otherwise, the control subsystem 102 continues to receive the autonomous vehicle signal 832 and the launch pad signal 830 to determine whether the launch pad 800 remains free of obstacles 806, 808, 810, 812 for at least the predefined period of time.
At step 916, the control subsystem 102 may determine the appropriate outbound lane 318a-318c in which the autonomous vehicle 502 should travel in order to begin moving along the routes 204, 214 (e.g., traveling from the terminals 202, 206, 216 to the road). An outbound lane 318a-318c may initially be selected because it provides a preferred starting point along the route 204, 214 and/or based on local traffic within the terminals 202, 206, 216. For example, the first lane 318a may be selected because the lane 318a leads to a preferred road for beginning to move along the route 204, 214 and/or experiences less traffic within the terminals 202, 206, 216. However, if an obstacle 812 is detected in the first outbound lane 318a, as shown in fig. 8, the control subsystem 102 may determine an alternate outbound lane 318b or 318c in which the autonomous vehicle 502 should travel. For example, the control subsystem 102 may instruct the autonomous vehicle 502 to travel along the outbound lane 318c instead of 318b because the lane 318c leads to a preferred starting point of the route 204, 214, or because the lane 318c is known to have less traffic within the terminals 202, 206, 216. The autonomous vehicle 502 may also or alternatively determine and initiate its own lane adjustments as needed to facilitate safe autonomous driving from the launch pad 800 to the road for beginning to move along the routes 204, 214. At step 918, the control subsystem 102 provides the instruction 118 with the permission 816 to begin autonomous driving. Autonomous driving of the autonomous vehicle 502 is described in more detail above with respect to figs. 5-7.
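The lane fallback described above is essentially a preference-ordered search over unblocked lanes. A minimal sketch, assuming a precomputed preference order (the data shapes are illustrative):

```python
def choose_outbound_lane(lanes: set[str], blocked: set[str],
                         preferred_order: list[str]) -> str | None:
    """Pick the first unblocked outbound lane in preference order.

    preferred_order ranks lanes by route suitability and terminal traffic,
    mirroring the selection of lane 318a before falling back to 318b/318c.
    """
    for lane in preferred_order:
        if lane in lanes and lane not in blocked:
            return lane
    return None  # no outbound lane currently usable

# Lane 318a is blocked by obstacle 812, so the next-preferred 318c is chosen:
print(choose_outbound_lane({"318a", "318b", "318c"}, blocked={"318a"},
                           preferred_order=["318a", "318c", "318b"]))  # 318c
```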
Example landing pad and landing pad operation
Fig. 10 shows example landing pads 1000a, 1000b, corresponding to the landing pad 310 of fig. 3, in more detail. The example landing pads 1000a, 1000b shown in fig. 10 include predefined zones or spaces sized and shaped to accommodate the autonomous vehicle 502 and a set of sensors 1002a-1002f (e.g., within the terminals 202, 206, 216 shown in fig. 3) around the perimeter of the landing pads 1000a, 1000b or within the landing pads 1000a, 1000b. As an example, the physical extent of each landing pad 1000a, 1000b may be defined at least in part by the respective sensors 1002a-1002f located around or within the landing pad 1000a, 1000b. The sensors 1002a-1002f are examples of the sensor 312 of fig. 3. In some embodiments, the landing pads 1000a, 1000b comprise a physical pad (e.g., a concrete pad). In some embodiments, the landing pads 1000a, 1000b include physical markers (e.g., painted lines) or position markers 130 around one or more edges or the perimeter of the landing pads 1000a, 1000b. The landing pads 1000a, 1000b generally facilitate safe and efficient reception of inbound autonomous vehicles 502. In addition to facilitating identification of a landing pad 1000a, 1000b that is clear of obstructions for receiving the inbound autonomous vehicle 502, the landing pads 1000a, 1000b facilitate guiding the inbound autonomous vehicle to an area within the terminals 202, 206, 216 suitable for the type of cargo carried by the autonomous vehicle 502 or the carrier operating the autonomous vehicle 502. The landing pads 1000a, 1000b may also help improve the registration of inbound cargo and the tracking of the location of that cargo within the terminals 202, 206, 216.
The sensors 1002a-1002f of the landing pads 1000a, 1000b may be the same as or similar to the sensors 802a-802f described above with respect to the example launch pad 800 of fig. 8. For example, the sensors 1002a-1002f may include any sensor capable of detecting objects, movement, sound, etc., which may be used to determine the presence of obstacles 1006, 1008 within the zone of the landing pads 1000a, 1000b. For example, the sensors 1002a-1002f may include cameras, LiDAR sensors, motion sensors, infrared sensors, and the like. Furthermore, each sensor 1002a-1002f shown in fig. 10 may correspond to one or more sensors located at different heights relative to the ground, for example, to provide views of different portions of the space above the ground within the zone of the landing pads 1000a, 1000b, as described above with respect to the sensors 802a-802f of fig. 8. In some embodiments, the landing pads 1000a, 1000b may include one or more additional sensors 1004a-1004d on or within the surface of the landing pads 1000a, 1000b. The sensors 1004a-1004d are other examples of the sensor 312 of fig. 3. These optional sensors 1004a-1004d may be the same as or similar to the sensors 804a-804d described above with respect to fig. 8. While the example of fig. 10 shows six sensors 1002a-1002f and four sensors 1004a-1004d, it should be appreciated that the landing pads 1000a, 1000b may include any suitable number, combination, and placement of sensors (i.e., more or fewer than the sensors 1002a-1002f or 1004a-1004d shown in fig. 10).
The sensors 1002a-1002f and 1004a-1004d of the landing pads 1000a, 1000b are in signal communication with the control subsystem 102. As further described with respect to the example operations below and the method 1100 of fig. 11, the sensors 1002a-1002f and 1004a-1004d generally communicate signals 1030a, 1030b (i.e., the signal 1030a for the first landing pad 1000a and the signal 1030b for the second landing pad 1000b) to the control subsystem 102. While the autonomous vehicle 502 is traveling toward the terminals 202, 206, 216, the autonomous vehicle 502 may communicate to the control subsystem 102 that a landing pad 1000a, 1000b will soon be needed to receive the autonomous vehicle 502. For example, when the autonomous vehicle 502 comes within a threshold distance of the terminals 202, 206, 216 (e.g., within ten miles of the terminals 202, 206, 216), the autonomous vehicle 502 may request a landing pad assignment. In response to such a request, the control subsystem 102 determines, based on the received landing pad signals 1030a, 1030b (i.e., the sensor data included in the signals 1030a, 1030b), a landing pad 1000a, 1000b that is free of obstacles 1006, 1008 that would obstruct receipt of the inbound autonomous vehicle 502. As an example, if the sensors 1002a-1002f or 1004a-1004d of a landing pad 1000a, 1000b are cameras, the signal 1030a may include video of the portion of the landing pad 1000a or 1000b seen by the sensors 1002a-1002f or 1004a-1004d (i.e., the portion within the field of view of the camera). The control subsystem 102 uses the obstacle detection instructions 1034 to determine whether an obstacle is detected in the video, similarly to that described above with respect to the example launch pad 800 of fig. 8. The obstacle detection instructions 1034 may similarly include code for implementing a method of detecting an obstacle based on LiDAR data (e.g., based on detection of an unexpected object in or around the landing pads 1000a, 1000b), motion sensor data (e.g., based on detection of unexpected motion in or around the landing pads 1000a, 1000b), sound (e.g., detection of an unexpected sound near the landing pads 1000a, 1000b), infrared data (e.g., based on detection of an unexpected object in an infrared image), and the like.
If it is determined that the first landing pad 1000a is not clear of the obstacles 1006, 1008 and that the second landing pad 1000b is free of obstacles, as in the example of fig. 10, the control subsystem 102 provides the autonomous vehicle 502 with a landing instruction 116 that includes an indication of the identification 1012 of the second landing pad 1000b. The instruction 116 may also include an identification 1014 of the appropriate inbound lane 314 in which the autonomous vehicle 502 should travel in order to reach the assigned landing pad 1000b. If the autonomous vehicle 502 detects an obstacle 1010 while traveling toward the assigned landing pad 1000a, 1000b, the autonomous vehicle 502 may move into a different inbound lane 314a, 314b. In the example shown in fig. 10, the autonomous vehicle 502 detects the obstacle 1010 in the inbound lane 314a and moves into the inbound lane 314b. The autonomous vehicle 502 may communicate with the control subsystem 102 to ensure that the alternate inbound lane 314b leads to the assigned landing pad 1000b, and if the alternate inbound lane 314b does not lead to the assigned landing pad 1000b, the control subsystem 102 may identify a different landing pad 1000a, 1000b for the autonomous vehicle 502.
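The assignment logic can be sketched as a scan over per-pad clearance flags derived from the signals 1030a, 1030b. The dataclass and dictionary shapes below are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class LandingInstruction:
    pad_id: str        # corresponds to identification 1012
    inbound_lane: str  # corresponds to identification 1014

def assign_landing_pad(pad_clearance: dict[str, bool],
                       lane_for_pad: dict[str, str]) -> LandingInstruction | None:
    """Assign the first landing pad whose sensors report no obstacle.

    pad_clearance maps pad id -> clear/not-clear, derived from the landing
    pad signals; lane_for_pad maps pad id -> the inbound lane leading to it.
    """
    for pad_id, clear in pad_clearance.items():
        if clear:
            return LandingInstruction(pad_id, lane_for_pad[pad_id])
    return None  # no pad available; terminal staff must clear one first

# Pad 1000a is blocked, so pad 1000b (reached via lane 314b) is assigned:
print(assign_landing_pad({"1000a": False, "1000b": True},
                         {"1000a": "314a", "1000b": "314b"}))
```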
In an example operation of the landing pads of fig. 10, the control subsystem 102 receives a request for a landing pad assignment for an inbound autonomous vehicle 502 that is scheduled to reach the terminals 202, 206, 216 soon (e.g., within the next approximately 15 minutes). After receiving the request, the control subsystem 102 receives the landing pad sensor signals 1030a, 1030b. The control subsystem 102 uses the landing pad sensor signals 1030a, 1030b to identify a landing pad 1000a, 1000b that is free of obstacles 1006, 1008. For example, if the sensors 1002a-1002f and/or 1004a-1004d include cameras, the landing pad sensor signals 1030a, 1030b may include images or video. In this case, the control subsystem 102 may employ obstacle detection instructions 1034 that include rules for detecting objects in the images or video and determining whether the detected objects correspond to obstacles 1006, 1008. For example, one or more predetermined object detection methods (e.g., employing neural networks or machine learning methods) may be used to detect objects and determine whether the detected objects correspond to obstacles 1006, 1008. Signals from infrared sensors 1002a-1002f and/or 1004a-1004d may be evaluated similarly, to detect portions of infrared images having thermal-energy characteristics associated with the presence of animals and/or humans within the zone of the landing pads 1000a, 1000b.
As another example, if the sensors 1002a-1002f or 1004a-1004d include LiDAR sensors, the landing pad sensor signals 1030a, 1030b may include distance measurements. In this case, the control subsystem 102 may employ obstacle detection instructions 1034 that include rules for detecting obstacles 1006, 1008 based on characteristics of and/or changes in the distance measurements. For example, a change in the distance measured by a LiDAR sensor may indicate the presence of an obstacle 1006, 1008. For example, each LiDAR sensor may be calibrated to provide an initial distance measurement when the landing pad 1000a, 1000b is known to be clear of obstacles 1006, 1008. If the distance reported by a given LiDAR sensor changes from this initial value, an obstacle 1006, 1008 may be detected.
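The baseline-comparison idea can be expressed compactly; the per-beam representation and the tolerance value are assumptions of this sketch.

```python
def lidar_obstacle(current_ranges_m: list[float], baseline_ranges_m: list[float],
                   tolerance_m: float = 0.3) -> bool:
    """Flag an obstacle if any LiDAR return deviates from its calibrated baseline.

    Each index corresponds to one sensor (or one beam) whose baseline was
    recorded while the landing pad was known to be empty.
    """
    return any(abs(cur - base) > tolerance_m
               for cur, base in zip(current_ranges_m, baseline_ranges_m))

# Beam 2 now returns 4.2 m instead of its calibrated 18.0 m -> obstacle:
print(lidar_obstacle([24.9, 25.1, 4.2], [25.0, 25.0, 18.0]))  # True
```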
As yet another example, if the sensors 1002a-1002f and/or 1004a-1004d include motion sensors, the landing pad signals 1030a, 1030b may include motion data for the landing pads 1000a, 1000b. In this case, the control subsystem 102 may employ obstacle detection instructions 1034 that include rules for detecting obstacles 1006, 1008 based on detected movement. For example, movement detected within the zone of a landing pad 1000a, 1000b may be caused by the presence of an animal or person within that zone. Thus, if motion is detected within the zone of the landing pads 1000a, 1000b, the control subsystem 102 may determine that an obstacle 1006, 1008 is detected within the zone of the landing pads 1000a, 1000b. In some cases, the detected movement may need to persist for at least a threshold period of time (e.g., 15 seconds or more) before an obstacle 1006, 1008 is detected based on motion, in order to reduce or eliminate false-positive detections of obstacles 1006, 1008 caused by wind and/or other transient events (e.g., an animal, person, or vehicle passing through and immediately exiting the zone of the landing pad 1000a, 1000b).
As a further example, if the sensors 1002a-1002f and/or 1004a-1004d include microphones for recording sound in or around the landing pads 1000a, 1000b, the landing pad signals 1030a, 1030b may include such recordings. In this case, the control subsystem 102 may employ obstacle detection instructions 1034 that include rules for detecting obstacles 1006, 1008 based on characteristics of the recorded sound. For example, a sound corresponding to a person speaking, a vehicle operating or undergoing maintenance, or an animal emitting a distinctive sound may be evidence that an obstacle 1006, 1008 is within the zone of the landing pads 1000a, 1000b. Although some examples of detection of obstacles 1006, 1008 are described above, it should be appreciated that any other suitable obstacle detection method may be used by the control subsystem 102. For example, the obstacles 1006, 1008 may be detected using the methods and/or modules described with respect to the detection of objects and obstacles by the autonomous vehicle 502 (see fig. 6 and the corresponding description above).
If no suitable landing pad 1000a, 1000b is identified, the control subsystem 102 may instruct individuals at the terminals 202, 206, 216 to clear obstacles from an appropriate landing pad 1000a, 1000b, and that landing pad 1000a, 1000b may then be assigned to the inbound autonomous vehicle 502 (e.g., after the control subsystem 102 verifies that the landing pad 1000a, 1000b is now clear of obstacles). In addition to assigning the landing pad 1000a, 1000b to which the autonomous vehicle 502 should navigate and dock, the control subsystem 102 may also provide the identification 1014 of the appropriate inbound lane 314a, 314b to travel through the terminals 202, 206, 216 to safely reach the assigned landing pad 1000a, 1000b.
When the autonomous vehicle 502 enters the terminals 202, 206, 216 and begins traveling along its assigned lane 314a, 314b, the autonomous vehicle 502 may detect an obstacle 1010 in its path. In response, the autonomous vehicle 502 may request that a new inbound lane 314a, 314b be assigned to it in order to reach the assigned landing pad 1000a, 1000b. Alternatively, the autonomous vehicle 502 may automatically move into a different inbound lane 314a, 314b (e.g., into the idle inbound lane 314b shown in the example of fig. 10) and travel along that new lane 314a, 314b. The autonomous vehicle 502 may communicate with the control subsystem 102 to verify that the new inbound lane 314a, 314b can be used to reach the assigned landing pad 1000a, 1000b. If the new inbound lane 314a, 314b does not reach the assigned landing pad 1000a, 1000b, the control subsystem 102 may determine a new landing pad 1000a, 1000b to assign to the autonomous vehicle 502 (i.e., a landing pad 1000a, 1000b that can be reached from the new lane 314a, 314b) or assign a new lane to the autonomous vehicle 502 (i.e., such that the autonomous vehicle 502 can navigate to the newly assigned lane 314a, 314b to reach the appropriate assigned landing pad 1000a, 1000b).
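The reconciliation between a vehicle-initiated lane change and the pad assignment can be sketched as follows; the reachability map and the resolution order (keep the lane, else reassign the pad, else reassign the lane) are assumptions for illustration.

```python
def handle_lane_change(new_lane: str, assigned_pad: str,
                       pads_reachable_from: dict[str, set[str]],
                       pad_clearance: dict[str, bool]) -> tuple[str, str]:
    """Reconcile a vehicle-initiated lane change with its pad assignment.

    If the new lane still reaches the assigned pad, keep both; otherwise try
    a clear pad reachable from the new lane; otherwise fall back to a lane
    that reaches the originally assigned pad. Returns (lane, pad).
    """
    if assigned_pad in pads_reachable_from.get(new_lane, set()):
        return new_lane, assigned_pad
    for pad in pads_reachable_from.get(new_lane, set()):
        if pad_clearance.get(pad, False):
            return new_lane, pad  # reassign a pad reachable from the new lane
    for lane, pads in pads_reachable_from.items():
        if assigned_pad in pads:
            return lane, assigned_pad  # reassign a lane that reaches the pad
    raise RuntimeError("no usable lane/pad combination")

reach = {"314a": {"1000a"}, "314b": {"1000b"}}
clear = {"1000a": False, "1000b": True}
print(handle_lane_change("314b", "1000a", reach, clear))  # ('314b', '1000b')
```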
Fig. 11 illustrates an example method 1100 of using the landing pads 1000a, 1000b of fig. 10. Method 1100 may be implemented by the landing pads 1000a, 1000b, the autonomous vehicle 502, and the control subsystem 102. Method 1100 may begin at step 1102, where the control subsystem 102 receives a request for the assignment of a landing pad 1000a, 1000b capable of receiving an inbound autonomous vehicle 502. The request may include the expected arrival time of the autonomous vehicle 502 and other information about the autonomous vehicle 502, such as the size of the autonomous vehicle 502 (e.g., such that the assigned landing pad 1000a, 1000b is of an appropriate size) and the type of cargo being transported by the autonomous vehicle 502 (e.g., such that the autonomous vehicle 502 is directed to a landing pad 1000a, 1000b adapted to receive such cargo).
At step 1104, the control subsystem 102 receives the landing pad signals 1030a, 1030b. As described above, the landing pad signals 1030a, 1030b generally include data from the landing pad sensors 1002a-1002f or 1004a-1004d. The landing pad signals 1030a, 1030b may include one or more streams of image data, video data, distance measurement data (e.g., from LiDAR sensors), motion data, infrared data, and the like.
At step 1106, the control subsystem 102 identifies a landing pad 1000a, 1000b that is free of obstacles 1006, 1008 that would obstruct receipt of the inbound autonomous vehicle 502. For example, the control subsystem 102 may determine whether an obstacle 1006, 1008 is detected within the zone of a landing pad 1000a, 1000b based on the landing pad signals 1030a, 1030b. For example, the control subsystem 102 may use the obstacle detection instructions 1034 to determine whether an obstacle 1006, 1008 is detected based on images, video, motion data, LiDAR data, infrared images, and/or sound recordings included in the landing pad signals 1030a, 1030b. Examples of detection of obstacles 1006, 1008 in the zone of the landing pads 1000a, 1000b are described above with respect to fig. 10. In some embodiments, the control subsystem 102 also determines the inbound lane 314a, 314b in which the inbound autonomous vehicle 502 should travel in order to reach the landing pad 1000a, 1000b determined to be free of obstacles 1006, 1008. For example, the lane 314a, 314b may be selected based on its proximity to the road from which the autonomous vehicle 502 is expected to enter the terminals 202, 206, 216, known traffic within the terminals 202, 206, 216, and/or the type of cargo being transported by the inbound autonomous vehicle 502. In some embodiments, before proceeding to step 1108, the control subsystem 102 may first determine that the landing pad 1000a, 1000b has been free of obstacles 1006, 1008 for at least a threshold period of time (e.g., 15 minutes or any other suitable period of time).
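The request metadata (arrival time, vehicle size, cargo type) naturally becomes a filter over candidate pads. A small sketch, with the pad attribute keys invented for illustration:

```python
def pad_suitable(pad: dict, vehicle_length_m: float, cargo_type: str) -> bool:
    """Filter pads by clearance, vehicle size, and cargo handling.

    `pad` is assumed to carry keys: "clear", "length_m", and "cargo_types".
    """
    return (pad["clear"]
            and pad["length_m"] >= vehicle_length_m
            and cargo_type in pad["cargo_types"])

pads = {
    "1000a": {"length_m": 25, "cargo_types": {"dry"}, "clear": False},
    "1000b": {"length_m": 30, "cargo_types": {"dry", "refrigerated"}, "clear": True},
}
print([pid for pid, spec in pads.items()
       if pad_suitable(spec, vehicle_length_m=22, cargo_type="refrigerated")])
# -> ['1000b']
```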
At step 1108, the control subsystem 102 provides the landing instruction 116 to the inbound autonomous vehicle 502. As described above, the landing instruction 116 may include an indication of the identification 1012 of the landing pad 1000a, 1000b identified at step 1106. The instruction 116 may also include the identification 1014 of the appropriate inbound lane 314a, 314b in which the autonomous vehicle 502 should travel in order to reach the assigned landing pad 1000a, 1000b.
At step 1110, the control subsystem 102 determines whether the autonomous vehicle 502 has entered the terminals 202, 206, 216. If the autonomous vehicle has not entered the terminals 202, 206, 216, the control subsystem 102 may proceed to step 1112 to verify that the assigned landing pad 1000a, 1000b remains clear of obstacles 1006, 1008. For example, the control subsystem 102 may determine whether an obstacle 1006, 1008 is detected, as described above with respect to step 1106. If an obstacle is detected at step 1112, the control subsystem 102 may proceed to step 1114 to check whether any other landing pad 1000a, 1000b is available.
If no landing pad 1000a, 1000b is available at step 1114, the control subsystem 102 may proceed to step 1116, where the control subsystem 102 sends an instruction to clear a landing pad 1000a, 1000b to receive the inbound autonomous vehicle 502. For example, the control subsystem 102 may have detected a particular obstacle 1006, 1008 in a particular portion of a landing pad 1000a, 1000b for at least a threshold period of time. In response, the control subsystem 102 may provide instructions to an administrator of the terminals 202, 206, 216 to inspect that particular portion of the landing pad 1000a, 1000b (e.g., the area where the obstacle 1006, 1008 is detected). If the control subsystem 102 receives a response (e.g., from the administrator of the terminals 202, 206, 216) indicating that the portion of the landing pad 1000a, 1000b has become free of the particular obstacle 1006, 1008, the control subsystem 102 may determine that the landing pad 1000a, 1000b is available to receive the inbound autonomous vehicle 502. In some embodiments, the control subsystem 102 may flag any sensors associated with detecting the obstacles 1006, 1008 (such as the sensors 1002a-1002f and/or 1004a-1004d) to indicate that some inspection or maintenance of these sensors 1002a-1002f and/or 1004a-1004d may be appropriate (e.g., if the detected obstacles 1006, 1008 are not actually present in the zone of the landing pad 1000a, 1000b, such that the sensors 1002a-1002f and/or 1004a-1004d may be faulty). The control subsystem 102 then generally returns to step 1106, described above, to identify a landing pad 1000a, 1000b to be assigned to the inbound autonomous vehicle 502.
If the control subsystem 102 determines at step 1110 that the autonomous vehicle 502 has entered the terminals 202, 206, 216, the control subsystem 102 may continue to monitor the signal 220 received from the autonomous vehicle 502 in case a different landing pad 1000a, 1000b and/or inbound lane 314a, 314b should be assigned to the autonomous vehicle 502 for some reason, as illustrated by example steps 1118, 1120, 1122, and 1124. At step 1118, the control subsystem 102 determines whether the inbound lane 314a, 314b assigned to the autonomous vehicle 502 is blocked by an obstacle 1010. For example, the autonomous vehicle 502 may detect the obstacle 1010 using the vehicle sensor subsystem 544 and the onboard control computer 550, and communicate the detected obstacle 1010 to the control subsystem 102. If such a communication is received, the control subsystem 102 may determine a new landing pad 1000a, 1000b at step 1122 (e.g., as described above with respect to step 1106) and provide a new landing instruction 116 to the autonomous vehicle 502 at step 1124 before the autonomous vehicle 502 is permitted to dock at the assigned landing pad 1000a, 1000b at step 1120. For example, at step 1118, the control subsystem 102 may receive an indication that the autonomous vehicle 502 has detected the obstacle 1010 and moved from the initial inbound lane 314a to the alternate new inbound lane 314b. The control subsystem 102 may check that the alternate lane 314b leads to the assigned landing pad 1000a, 1000b. If the alternate lane 314b does not lead to the assigned landing pad 1000a, 1000b, the control subsystem 102 may determine a new landing pad 1000a, 1000b that is accessible from the alternate lane 314b, or determine a different inbound lane 314a, 314b that can be used to reach the assigned landing pad 1000a, 1000b.
Example re-launch of an autonomous vehicle after off-terminal maintenance
Fig. 12A illustrates an example of a mobile autonomous vehicle re-launch system 1200, which may be included in the mobile terminal system 100 to facilitate re-launching the autonomous vehicle 502 after off-terminal maintenance. The re-launch system 1200 includes a portable device 1202 (see, e.g., the portable device 126 of fig. 1), the control subsystem 102, and the autonomous vehicle 502. The re-launch system 1200 generally facilitates restarting movement of the autonomous vehicle 502 along its routes 204, 214 after a stop. For example, if the autonomous vehicle 502 is stopped for maintenance or any other reason, one or more users 1204 at the location of the stopped autonomous vehicle 502 may operate the portable device 1202 to confirm that at least a first portion 1206 of the zone or space around the autonomous vehicle 502 is free of obstacles 1216a, 1216b that would impede safe movement of the autonomous vehicle 502. In one embodiment, the user 1204 is a mechanic, repair person, service technician, inspector, emergency responder, or other suitable individual who facilitates re-launching the autonomous vehicle 502 after it has stopped, e.g., along the routes 204, 214. The vehicle sensor subsystem 544 of the autonomous vehicle 502 and/or the onboard control computer 550 may provide information 1222 that includes autonomous vehicle sensor data and/or another confirmation indicating whether at least a second portion 1208 of the zone around the stopped autonomous vehicle 502 is free of the obstacle 1216c. If the control subsystem 102 determines that both the portion 1206 and the portion 1208 of the zone around the stopped autonomous vehicle 502 are free of the obstacles 1216a-1216c, the control subsystem 102 may provide the stopped autonomous vehicle 502 with permission 1224 to begin moving again (e.g., by moving back onto the road 1226 to travel along the routes 204, 214 of fig. 2).
The portable device 1202 may be any mobile or portable device (e.g., a mobile telephone, computer, etc.). The portable device 1202 generally includes a user interface operable to receive user input. The user input may include a confirmation 1218 provided by the user 1204 after the user 1204 verifies that the portion 1206 of the zone around the autonomous vehicle 502 is clear of the obstacles 1216a, 1216b. The portable device 1202 may include a camera or other suitable sensor for obtaining images and/or video 1220, which may be provided to the control subsystem 102. As described further below with respect to fig. 13, the control subsystem 102 (or the portable device 1202 itself) may determine whether an obstacle 1216a, 1216b is detected in the images and/or video 1220 (e.g., using the obstacle detection instructions 836, 1034 described above with respect to figs. 8-11). Example components of the portable device 1202 are shown in fig. 12B and described further below.
In some embodiments, the user 1204 visually inspects the portion 1206 of the zone around the autonomous vehicle 502 to determine whether any obstacles 1216a, 1216b are present. If the user 1204 does not detect an obstacle 1216a, 1216b, the user 1204 may input a confirmation 1218 that the zone portion 1206 is free of the obstacles 1216a, 1216b, and the portable device 1202 may send the confirmation 1218 to the control subsystem 102. In embodiments where the portable device 1202 includes a camera, the user 1204 may move the portable device 1202 around the zone portion 1206 to obtain images and/or video of the zone portion 1206. For example, images and/or video 1220 for various fields of view 1212a-1212f may be obtained such that the images and/or video 1220 cover at least the zone portion 1206. For example, the user 1204 may move around the vehicle and capture images and/or video 1220 at the locations 1210a-1210f shown with an "X" in fig. 12A, such that the camera of the portable device 1202 (e.g., the camera 1258 of the example portable device 1202 shown in fig. 12B) captures images and/or video 1220 of the different fields of view 1212a-1212f. In particular embodiments, the user 1204 moves around the vehicle and captures the images and/or video 1220 at the locations 1210a-1210f within a predetermined period of time that is short enough for the captured images and/or video 1220 to reliably indicate whether the zone portion 1206 is clear and whether it is safe to re-launch the autonomous vehicle 502.
In some embodiments, a portion of the autonomous vehicle 502 (e.g., a trailer attached to the autonomous vehicle 502) may include visual markers 1214a-1214f positioned to facilitate the capture by the user 1204 of images and/or video 1220 that cover at least the zone portion 1206. The user 1204 may position the portable device 1202 such that images and/or video 1220 capturing each of the markers 1214a-1214f are obtained. The markers 1214a-1214f may include bar codes that can be interpreted by the control subsystem 102 in the received images and/or video 1220. Thus, the markers 1214a-1214f may ensure that the images and/or video 1220 provided from the portable device 1202 include views adequate to establish that the portion 1206 of the zone around the autonomous vehicle 502 is free of the obstacles 1216a, 1216b. The markers 1214a-1214f may also be used to identify the autonomous vehicle 502 being re-launched via the re-launch system 1200, such that the control subsystem 102 can efficiently identify the stopped autonomous vehicle 502 and maintain a record of its re-launches.
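One plausible way to use the markers is as a completeness check on the submitted footage: the control subsystem (or the portable device) verifies that every expected marker was decoded before accepting the capture. The marker-id set and the upstream decoding step are assumptions of this sketch.

```python
REQUIRED_MARKERS = {"1214a", "1214b", "1214c", "1214d", "1214e", "1214f"}

def coverage_complete(decoded_marker_ids: set[str]) -> bool:
    """Check that the submitted images/video captured every trailer marker.

    decoded_marker_ids: barcode ids decoded from the received footage by an
    upstream reader (assumed; the decoding itself is not shown here).
    """
    missing = REQUIRED_MARKERS - decoded_marker_ids
    if missing:
        print("re-capture needed for markers:", sorted(missing))
    return not missing

print(coverage_complete({"1214a", "1214b", "1214c", "1214d", "1214e"}))
# prints the missing marker 1214f and returns False
```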
In embodiments involving the provision of images and/or video 1220 from the portable device 1202, the control subsystem 102 receives the images and/or video 1220 and uses the obstacle detection instructions 1230 to determine whether an obstacle 1216a, 1216b is detected in the images and/or video 1220. Examples of the detection of obstacles such as the obstacles 1216a, 1216b are described above with respect to figs. 8-11, and the same or similar methods may be used to detect the obstacles 1216a, 1216b. For example, the control subsystem 102 may use obstacle detection instructions 1230 that include rules for detecting objects in the images and/or video 1220 and determining whether the detected objects correspond to obstacles 1216a, 1216b. For example, one or more predetermined object detection methods (e.g., employing neural networks or machine learning methods) may be used to detect objects and determine whether the detected objects correspond to the obstacles 1216a, 1216b.
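A generic shape for that image-based check is sketched below. The detector is passed in as a callable so that no particular model or library API is implied; the label set and confidence threshold are assumptions.

```python
from typing import Callable

Detection = tuple[str, float]  # (label, confidence)

def obstacles_in_footage(frames, detector: Callable[[object], list[Detection]],
                         obstacle_labels=("person", "animal", "vehicle", "debris"),
                         min_confidence: float = 0.6) -> list[Detection]:
    """Run an object detector over each frame and keep likely obstacle hits.

    `detector` stands in for any pretrained image-detection model (e.g., a
    neural network); its exact interface is an assumption of this sketch.
    """
    hits = []
    for frame in frames:
        for label, confidence in detector(frame):
            if label in obstacle_labels and confidence >= min_confidence:
                hits.append((label, confidence))
    return hits

# Toy detector that "finds" debris beside the trailer in frame 1 only:
def fake_detector(frame):
    return [("debris", 0.9)] if frame == 1 else []

print(obstacles_in_footage([0, 1, 2], fake_detector))  # [('debris', 0.9)]
```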
The control subsystem 102 also receives information 1222 from the autonomous vehicle 502, the information 1222 including sensor data and/or an indication of whether an obstacle 1216c was detected in the portion 1208 of the zone around the autonomous vehicle 502 (see fig. 6 and the corresponding description above regarding the detection of objects and obstacles by the autonomous vehicle 502). The portion 1208 of the zone around the autonomous vehicle 502 generally includes the area in front of the autonomous vehicle 502 (e.g., in view of one or more sensors of the vehicle sensor subsystem 544 of the autonomous vehicle 502). In some cases, the onboard control computer 550 may determine whether an obstacle 1216c is detected in the zone portion 1208 and provide this information 1222 to the control subsystem 102. In other cases, the autonomous vehicle 502 may provide the information 1222 as data from the vehicle sensor subsystem 544 of the autonomous vehicle 502. In this case, the control subsystem 102 may use the obstacle detection instructions 1230 to determine whether an obstacle 1216c is detected in the zone portion 1208, as described above with respect to the detection of the obstacles 1216a, 1216b and figs. 8-11.
In an example operation of the mobile autonomous vehicle re-launch system 1200, the autonomous vehicle 502 stops on one side of the road 1226 for maintenance (e.g., to repair or replace a flat tire, etc.). A service technician (e.g., user 1204) arrives at the location of the stopped autonomous vehicle 502 and performs the required maintenance. After maintenance is complete, the autonomous vehicle 502 may be ready to return to the road 1226 and continue moving along the routes 204, 214. However, the autonomous vehicle 502 alone may not be able to ensure that there are no obstacles along each side of and behind the autonomous vehicle 502. For example, the vehicle sensor subsystem 544 may not provide a view covering the portion 1206 of the space around the autonomous vehicle 502, where the example obstacle 1216a is located near one side of the trailer of the autonomous vehicle 502 and the obstacle 1216b is below the trailer attached to the autonomous vehicle 502. To ensure that the autonomous vehicle 502 safely returns to the road 1226, the service technician (user 1204) may operate the portable device 1202 to assist in re-launching the stopped autonomous vehicle 502 along its routes 204, 214.
In some cases, the service technician (user 1204) may visually inspect at least the portion 1206 of the zone around the stopped autonomous vehicle 502 to determine whether it is free of obstacles 1216a, 1216b that would obstruct safe movement of the autonomous vehicle 502. If the service technician (user 1204) determines that at least the portion 1206 of the zone around the autonomous vehicle 502 is clear of the obstacles 1216a, 1216b, the service technician (user 1204) may operate the device 1202 to provide the control subsystem 102 with a confirmation 1218 that the zone portion 1206 is clear of the obstacles 1216a, 1216b. Upon receiving the confirmation 1218, the control subsystem 102 uses the information 1222 provided by the autonomous vehicle 502 to determine whether the portion 1208 of the zone around the stopped autonomous vehicle 502 is also free of the obstacle 1216c. If both zone portions 1206 and 1208 are free of the obstacles 1216a-1216c, the control subsystem 102 provides the permission 1224 for the autonomous vehicle 502 to begin moving onto the road 1226. Otherwise, if either zone portion 1206 or 1208 is not clear of the obstacles 1216a-1216c, the permission 1224 is not provided.
In other cases, instead of or in addition to the confirmation 1218, the service technician (user 1204) may use the portable device 1202 to capture images and/or video 1220 of the zone portion 1206. These images and/or video 1220 may be provided to the control subsystem 102 to determine whether the portion 1206 of the zone around the stopped autonomous vehicle 502 is free of the obstacles 1216a, 1216b. For example, in the case where images 1220 are provided to the control subsystem 102, the service technician (user 1204) may move around the autonomous vehicle 502 and capture images 1220 of the autonomous vehicle 502 and the area around it from different perspectives (e.g., at the different locations 1210a-1210f shown in fig. 12A). In some embodiments, the service technician (user 1204) may use the device 1202 to capture images 1220 that include the markers 1214a-1214f, such that the images 1220 include views in which any obstacles 1216a, 1216b present in the fields of view 1212a-1212f would appear. As another example, where video 1220 is provided to the control subsystem 102, the service technician (user 1204) may move around the autonomous vehicle 502 to capture video 1220 of the autonomous vehicle 502 and the area around it from different perspectives (e.g., video 1220 captured as the service technician moves between the different locations 1210a-1210f shown in fig. 12A). The control subsystem 102 uses the obstacle detection instructions 1230 to detect any obstacles 1216a, 1216b present in the images and/or video 1220.
Upon determining that no obstacle 1216a, 1216b is detected in the images and/or video 1220, the control subsystem 102 uses the information 1222 provided by the autonomous vehicle 502 to determine whether the portion 1208 of the zone around the stopped autonomous vehicle 502 is also free of the obstacle 1216c, as described above. If both zone portions 1206 and 1208 are free of the obstacles 1216a-1216c, the control subsystem 102 provides the permission 1224 for the autonomous vehicle 502 to begin moving onto the road 1226. Otherwise, if either zone portion 1206 or 1208 is not clear of all of the obstacles 1216a-1216c, the permission 1224 is not provided.
Fig. 12B illustrates an embodiment of the portable device 1202 of fig. 12A. The portable device 1202 includes a processor 1252, a memory 1254, a network interface 1256, and a camera 1258. The portable device 1202 may be configured as shown or in any other suitable configuration.
The processor 1252 includes one or more processors operably coupled to the memory 1254. The processor 1252 is any electronic circuitry including, but not limited to, state machines, one or more Central Processing Unit (CPU) chips, logic units, cores (e.g., a multi-core processor), Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), or Digital Signal Processors (DSPs). The processor 1252 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination of the preceding. The processor 1252 is communicatively coupled to, and in signal communication with, the memory 1254 and the network interface 1256. The one or more processors are configured to process data and may be implemented in hardware or software. For example, the processor 1252 may be an 8-bit, 16-bit, 32-bit, 64-bit, or any other suitable architecture. The processor 1252 may include an Arithmetic Logic Unit (ALU) that performs arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from memory and executes them by directing the coordinated operations of the ALU, the registers, and other components. The one or more processors are configured to implement various instructions. For example, the one or more processors are configured to execute instructions to implement the functions disclosed herein, such as some or all of the functions described with respect to figs. 12A and 13. In some embodiments, the functions described herein are implemented using logic units, FPGAs, ASICs, DSPs, or any other suitable hardware or electronic circuitry.
The memory 1254 is operable to store any of the information described above with respect to fig. 12A, along with any other data, instructions, logic, rules, or code operable, when executed by the processor 1252, to implement the function(s) described herein. The memory 1254 includes one or more disks, tape drives, or solid-state drives, and may be used as an overflow data storage device to store programs when such programs are selected for execution and to store instructions and data that are read during program execution. The memory 1254 may be volatile or non-volatile and may include Read-Only Memory (ROM), Random-Access Memory (RAM), Ternary Content-Addressable Memory (TCAM), Dynamic Random-Access Memory (DRAM), and Static Random-Access Memory (SRAM).
The network interface 1256 is configured to enable wired and/or wireless communication. The network interface 1256 is configured to communicate data between the portable device 1202 and other network devices, systems, or domain(s). For example, network interface 1256 may include a WiFi interface, a Local Area Network (LAN) interface, a Wide Area Network (WAN) interface, a modem, a switch, or a router. The processor 1252 is configured to send and receive data using the network interface 1256. The network interface 1256 may be configured to use any suitable type of communication protocol.
The camera 1258 is configured to obtain the images and/or video 1220. In general, the camera 1258 may be any type of camera. For example, the camera 1258 may include one or more sensors, an aperture, one or more lenses, and a shutter. The camera 1258 is in communication with the processor 1252, which controls the operation of the camera 1258 (e.g., the opening/closing of the shutter, etc.). Data from the sensor(s) of the camera 1258 may be provided to the processor 1252 and stored in the memory 1254 in an appropriate image or video format for use by the control subsystem 102.
FIG. 13 illustrates an example method 1300 of operating the mobile autonomous vehicle re-launch system 1200 of fig. 12A. Method 1300 may be implemented by the portable device 1202, the control subsystem 102, and/or the autonomous vehicle 502. Method 1300 may begin at step 1302, where the control subsystem 102 receives a request to re-launch the autonomous vehicle 502. For example, the user 1204 (e.g., a service technician, as described above with respect to the example of fig. 12A) may provide an indication that maintenance and any appropriate testing of the stopped autonomous vehicle 502 have been completed.
At step 1304, the control subsystem 102 receives the confirmation 1218 that the zone portion 1206 is clear and/or receives the images and/or video 1220 of the zone portion 1206, as described above with respect to fig. 12A. In some embodiments, receipt of the confirmation 1218 and/or the images and/or video 1220 serves as the request to permit re-launch of the autonomous vehicle 502 (i.e., such that no separate request is received at step 1302).
At step 1306, the control subsystem 102 receives the information 1222 from the autonomous vehicle 502. The information 1222 may include a confirmation that the onboard control computer 550 did not detect an obstacle 1216c in the zone portion 1208 and/or sensor data from the vehicle sensor subsystem 544.
At step 1308, the control subsystem 102 determines whether the zone around the autonomous vehicle 502 is free of obstacles 1216a-1216c that would obstruct safe movement of the autonomous vehicle 502. For example, as described above with respect to fig. 12A, if it is determined that both the zone portion 1206 and the zone portion 1208 around the stopped autonomous vehicle 502 are free of obstacles 1216a-1216c, the control subsystem 102 determines that the zone around the autonomous vehicle 502 is clear for movement of the autonomous vehicle 502. Otherwise, if the control subsystem 102 determines that at least one of the zone portions 1206 or 1208 around the stopped autonomous vehicle 502 is not clear of the obstacles 1216a-1216c, the control subsystem 102 determines that the zone around the autonomous vehicle 502 is not clear for movement of the autonomous vehicle 502.
If the zones 1206, 1208 around the stopped autonomous vehicle 502 are determined to be clear for safe movement of the stopped autonomous vehicle 502, the control subsystem 102 proceeds to step 1310, where the control subsystem 102 provides the permission 1224 for the autonomous vehicle 502 to begin moving. Otherwise, if the zones 1206, 1208 around the stopped autonomous vehicle 502 are determined not to be clear for safe movement of the stopped autonomous vehicle 502, the control subsystem 102 may prevent the stopped autonomous vehicle 502 from starting to move. The control subsystem 102 may further proceed to step 1312 to determine whether the stopped autonomous vehicle 502 has been prevented from moving for at least a threshold time. If so, the control subsystem 102 may provide an alert at step 1314 to take further action to clear the zone around the autonomous vehicle 502 (e.g., by removing one or more of the obstacles 1216a-1216c or requesting other action from the user 1204).
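Steps 1308-1314 can be summarized as a gate over the two clearance inputs plus a timeout alert. The sketch below is illustrative; the function name, the alert callable, and the 10-minute threshold are assumptions rather than values from the disclosure.

```python
def relaunch_gate(confirmation_1218: bool, vehicle_zone_clear: bool,
                  blocked_since: float | None, now: float,
                  alert, threshold_s: float = 10 * 60) -> bool:
    """Grant re-launch permission 1224 only when both zone portions are clear.

    blocked_since: time at which the vehicle first became blocked, or None.
    alert: callable used to request further action once the threshold passes.
    """
    if confirmation_1218 and vehicle_zone_clear:
        return True  # both portion 1206 and portion 1208 are clear
    if blocked_since is not None and now - blocked_since >= threshold_s:
        alert("vehicle blocked beyond threshold; clear zone or assist on site")
    return False

# Blocked for 11 minutes: permission withheld and an alert is raised.
print(relaunch_gate(True, False, blocked_since=0.0, now=660.0,
                    alert=print))  # prints the alert, returns False
```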
While several embodiments have been provided in the present disclosure, it should be understood that the disclosed systems and methods may be embodied in many other specific forms without departing from the spirit or scope of the present disclosure. The present examples are to be considered as illustrative and not restrictive, and the intention is not to be limited to the details given herein. For example, the various elements or components may be combined or integrated in another system, or certain features may be omitted or not implemented.
Furthermore, techniques, systems, subsystems, and methods described and illustrated in the various embodiments as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of the present disclosure. Other items shown or discussed as coupled or directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component, whether electrically, mechanically, or otherwise. Other examples of changes, substitutions, and alterations are ascertainable by one skilled in the art and could be made without departing from the spirit and scope disclosed herein.
To assist the patent office and any readers of any patent issued on this application in interpreting the claims appended hereto, the applicant notes that, unless the words "means for" or "step for" are explicitly used in a particular claim, the applicant does not intend any of the appended claims to invoke 35 U.S.C. § 112(f) as it exists on the date of filing hereof.
Embodiments of the present disclosure may be described in terms of the following clauses, the features of which may be combined in any reasonable manner.
Clause 1: a system, comprising:
A fleet of autonomous vehicles, each autonomous vehicle in the fleet configured to autonomously move along a predetermined route;
A manually operated vehicle storing equipment configured to establish a short-term terminal site operable to support movement of a fleet of autonomous vehicles, wherein the equipment stored in the manually operated vehicle comprises one or more of: a secure data storage medium, an autonomous vehicle repair kit, a sensor repair and calibration tool, and a terminal site setup kit including a location profile configured to establish an area within a physical space as part of a short-term terminal site; and
A control subsystem comprising a hardware processor configured to provide instructions to the fleet of autonomous vehicles, the instructions including the location of the short-term terminal site established using the terminal site setup kit.
Clause 2: the system of clause 1, wherein the established short-term terminal site facilitates one or more of: inspection of the autonomous vehicle, maintenance of the autonomous vehicle, sensor calibration of the autonomous vehicle, sensor cleaning of the autonomous vehicle, and unloading of items transported by the autonomous vehicle.
Clause 3: the system of clause 1, wherein the hardware processor is further configured to:
receive an inspection report associated with an inspection of an autonomous vehicle in the fleet at the short-term terminal site; and

provide the inspection report to an autonomous vehicle management system.
Clause 4: the system of clause 1, wherein the hardware processor is further configured to determine that a short-term end-station should be established to provide support for the concept-verification route or the temporary route.
Clause 5: the system of clause 4, wherein the hardware processor is further configured to: it is determined that a concept-verifying route or a temporary route is required after at least one of an increase in traffic volume in an area of a short-term terminal station and a need for a fleet supporting autonomous vehicles is detected within a week from a current time.
Clause 6: the system of clause 1, wherein:
the manually operated vehicle further includes a set of sensors including a Global Positioning System (GPS) transceiver operable to determine route data indicative of geographic coordinates of a route traveled by the fleet of autonomous vehicles to reach the short-term terminal site; and
The hardware processor is further configured to provide geographic coordinates to at least one autonomous vehicle in the fleet of autonomous vehicles.
Clause 7: the system of clause 1, wherein the hardware processor is further configured to receive a request for off-terminal maintenance at another location, wherein after receiving the request, the manually operated vehicle is moved to the location with the autonomous vehicle repair kit.
Clause 8: a method, comprising:
storing in the manually operated vehicle equipment configured to establish a short-term terminal site operable to support movement of a fleet of autonomous vehicles, wherein the equipment stored in the manually operated vehicle comprises one or more of: a secure data storage medium, an autonomous vehicle repair kit, a sensor repair and calibration tool, and a terminal site setup kit including a location profile configured to establish an area within a physical space as part of a short-term terminal site; and
providing, via a control subsystem associated with the manually operated vehicle, instructions to the fleet of autonomous vehicles, the instructions including the location of the short-term terminal site established using the terminal site setup kit.
Clause 9: the method of clause 8, wherein the established short-term terminal site facilitates one or more of: inspection of the autonomous vehicle, maintenance of the autonomous vehicle, sensor calibration of the autonomous vehicle, sensor cleaning of the autonomous vehicle, and unloading of items transported by the autonomous vehicle.
Clause 10: the method of clause 8, further comprising:
receiving an inspection report associated with an inspection of an autonomous vehicle in the fleet at the short-term terminal site; and

providing the inspection report to an autonomous vehicle management system.
Clause 11: the method of clause 8, further comprising determining that a short-term end-point site should be established to provide support for the concept-verified route or the temporary route.
Clause 12: the method of clause 11, further comprising: it is determined that a concept-verifying route or a temporary route is required after at least one of an increase in traffic volume in an area of a short-term terminal station and a need for a fleet supporting autonomous vehicles is detected within a week from a current time.
Clause 13: the method of clause 8, further comprising:
determining route data indicative of geographic coordinates of a route traveled by the fleet of autonomous vehicles to reach the short-term terminal site; and

providing the geographic coordinates to at least one autonomous vehicle in the fleet of autonomous vehicles.
Clause 14: the method of clause 8, further comprising: a request for off-terminal maintenance at another location is received, wherein after receiving the request, the manually operated vehicle is moved to a location with an autonomous vehicle repair kit.
Clause 15: a system, comprising:
A manually operated vehicle storing equipment configured to establish a short-term terminal site operable to support movement of a fleet of autonomous vehicles, wherein the equipment stored in the manually operated vehicle comprises one or more of: a secure data storage medium, an autonomous vehicle repair kit, a sensor repair and calibration tool, and a terminal site setup kit including a location profile configured to establish an area within a physical space as part of a short-term terminal site; and
A control subsystem comprising a hardware processor configured to provide instructions to the fleet of autonomous vehicles, the instructions including the location of the short-term terminal site established using the terminal site setup kit.
Clause 16: the system of clause 15, wherein the established short-term terminal site facilitates one or more of: inspection of the autonomous vehicle, maintenance of the autonomous vehicle, sensor calibration of the autonomous vehicle, sensor cleaning of the autonomous vehicle, and unloading of items transported by the autonomous vehicle.
Clause 17: the system of clause 15, wherein the hardware processor is further configured to:
receive an inspection report associated with an inspection of an autonomous vehicle in the fleet at the short-term terminal site; and

provide the inspection report to an autonomous vehicle management system.
Clause 18: the system of clause 15, wherein the hardware processor is further configured to determine that a short-term end-station should be established to provide support for the concept-verification route or the temporary route.
Clause 19: the system of clause 18, wherein the hardware processor is further configured to: determine that the proof-of-concept route or the temporary route is needed after detecting at least one of: an increase in traffic volume in an area of the short-term terminal site, and a need to support the fleet of autonomous vehicles within a week of the current time.
Clause 20: the system of clause 15, wherein:
the manually operated vehicle further includes a set of sensors including a Global Positioning System (GPS) transceiver operable to determine route data indicative of geographic coordinates of a route traveled by the fleet of autonomous vehicles to reach the short-term terminal site; and
The hardware processor is further configured to provide the geographic coordinates to at least one autonomous vehicle in the fleet of autonomous vehicles.
Clause 21: the system of clause 15, wherein the hardware processor is further configured to receive a request for off-terminal maintenance at another location, wherein, after receiving the request, the manually operated vehicle is moved to the other location with the autonomous vehicle repair kit.
Clause 22: a mobile terminal system for operating a fleet of autonomous vehicles, the mobile terminal system comprising:
A location profile configured to establish a terminal site within the physical space when deployed, wherein the established terminal site includes at least one landing stage sized and shaped to accommodate autonomous vehicles in the fleet; and
A control subsystem comprising a hardware processor configured to:
determine that an autonomous vehicle in the fleet is inbound to the established terminal site;
after determining that the autonomous vehicle is inbound to the established terminal site, determine a landing instruction indicating a landing stage at which the inbound autonomous vehicle is to stop and a route that the inbound autonomous vehicle is to travel in order to reach the landing stage; and
Provide the landing instruction to the inbound autonomous vehicle, wherein the landing instruction causes the inbound autonomous vehicle to travel along the route to the landing stage.
Clause 23: the mobile terminal system of clause 22, wherein:
The control subsystem further includes a set of sensors including a Global Positioning System (GPS) transceiver operable to determine route data indicative of an external route traveled by autonomous vehicles in the fleet to reach the established terminal site; and
The hardware processor is further configured to provide route data to at least one autonomous vehicle in the fleet.
Clause 24: the mobile terminal system of clause 23, wherein:
The hardware processor is further configured to receive an instruction indicating that a new terminal site needs to be established;
the global positioning system is operable to determine new route data indicative of a new route traveled by autonomous vehicles in the fleet in order to reach the new terminal site; and
The hardware processor is further configured to provide new route data to at least one autonomous vehicle in the fleet.
Clause 25: the mobile terminal system of clause 22, wherein the hardware processor is further configured to determine that the inbound autonomous vehicle is inbound to the established terminal site by receiving a landing request from the inbound autonomous vehicle, the landing request comprising a request to grant the inbound autonomous vehicle permission to dock at the landing stage of the established terminal site.
Clause 26: the mobile terminal system of clause 22, wherein the hardware processor is further configured to determine that the inbound autonomous vehicle is on a route to the established terminal site when one or both of the following are satisfied: (i) the inbound autonomous vehicle is within a threshold distance of the established terminal site, and (ii) the inbound autonomous vehicle is traveling along a known route to the established terminal site.
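As a non-limiting illustration of the two-part test in clause 26, the sketch below evaluates condition (i) with a great-circle distance against a threshold and condition (ii) with membership in a set of known routes. The function names and the 5 km default threshold are assumptions of this example; the disclosure does not fix a particular distance.

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

def is_inbound(vehicle_pos, route_id, site_pos, known_routes, threshold_m=5000.0):
    # Clause 26(i): within a threshold distance of the established site.
    within_threshold = haversine_m(*vehicle_pos, *site_pos) <= threshold_m
    # Clause 26(ii): traveling along a known route to the established site.
    on_known_route = route_id in known_routes
    return within_threshold or on_known_route
```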
Clause 27: the mobile terminal system of clause 22, wherein the landing instruction comprises at least one of: a time at which the inbound autonomous vehicle is able to enter the established terminal site, a route the inbound autonomous vehicle travels within the established terminal site in order to reach the landing stage, a location of the landing stage within the established terminal site, and an identifier of the landing stage.
Clause 28: the mobile terminal system of clause 23, wherein:
The system further includes one or more movement sensors deployed within the established terminal site; and
The hardware processor is further configured to:
Receive movement data from the one or more movement sensors indicating an amount of traffic within the established terminal site;
After receiving the movement data, determine an updated landing instruction that causes the inbound autonomous vehicle to avoid the traffic while traveling to the landing stage; and
Provide the updated landing instruction to the inbound autonomous vehicle.
Clause 29: the mobile terminal system of clause 23, wherein:
The system further includes one or more occupancy sensors disposed in, on, or near the landing stage to determine whether the landing stage is occupied; and
The hardware processor is further configured to:
receive occupancy data from the one or more occupancy sensors indicating that the landing stage is occupied;
After receiving the occupancy data, determine an updated landing instruction that prevents the inbound autonomous vehicle from entering the landing stage while the landing stage is occupied; and
Provide the updated landing instruction to the inbound autonomous vehicle.
Clause 30: the mobile terminal system of clause 22, wherein the hardware processor is further configured to:
Determine that the inbound autonomous vehicle has arrived at the landing stage; and
After determining that the inbound autonomous vehicle has arrived at the landing stage, provide an alert to initiate a post-landing activity.
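Read together, clauses 27-30 describe a pad-assignment step gated by occupancy data and movement data. The following is a minimal sketch of that step, assuming landing stages are keyed by identifier and internal routes are lists of named segments; those data shapes are illustrative assumptions, not claimed structure.

```python
from typing import Optional

def determine_landing_instruction(
    pad_occupancy: dict,      # occupancy sensors, cf. clause 29
    pad_routes: dict,         # internal route per landing stage, cf. clause 27
    congested_segments: set,  # movement sensors, cf. clause 28
) -> Optional[dict]:
    """Pick a free landing stage reachable without congested segments."""
    for pad_id, route in pad_routes.items():
        if pad_occupancy.get(pad_id, True):
            continue  # clause 29: never direct an AV to an occupied stage
        if any(seg in congested_segments for seg in route):
            continue  # clause 28: avoid traffic reported inside the site
        return {"landing_stage": pad_id, "route": route}
    return None  # no admissible stage: hold the inbound AV

# Example: stage B is free and its route avoids congested segment s1.
instruction = determine_landing_instruction(
    pad_occupancy={"A": True, "B": False},
    pad_routes={"A": ["s1", "s2"], "B": ["s3"]},
    congested_segments={"s1"},
)
assert instruction == {"landing_stage": "B", "route": ["s3"]}
```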
Clause 31: a mobile terminal system for operating a fleet of autonomous vehicles, the mobile terminal system comprising:
a location profile configured, when deployed, to establish a terminal site within the physical space, wherein the established terminal site includes a launch pad sized and shaped to accommodate autonomous vehicles in the fleet; and
A control subsystem comprising a hardware processor communicatively coupled to a memory and configured to:
Determine that an autonomous vehicle in the fleet is requesting to depart from the launch pad;
after determining that the autonomous vehicle is requesting to depart from the launch pad, determine a launch instruction indicating whether the autonomous vehicle is able to exit the launch pad and a route the autonomous vehicle is to travel after exiting the launch pad; and
Provide the launch instruction to the autonomous vehicle, wherein the launch instruction causes the autonomous vehicle to exit the launch pad and move along the route.
Clause 32: the mobile terminal system of clause 31, wherein:
the control subsystem further includes a set of sensors including a Global Positioning System (GPS) transceiver operable to determine route data indicative of an external route traveled by autonomous vehicles in the fleet to reach the established terminal site; and
The hardware processor is further configured to provide route data to at least one autonomous vehicle in the fleet.
Clause 33: the mobile terminal system of clause 32, wherein:
The hardware processor is further configured to receive an instruction indicating that a new terminal site needs to be established;
the global positioning system is operable to determine new route data indicative of a new route traveled by autonomous vehicles in the fleet in order to reach the new terminal site; and
The hardware processor is further configured to provide new route data to at least one autonomous vehicle in the fleet.
Clause 34: the mobile terminal system of clause 31, wherein the launch instruction comprises at least one of: a time at which the autonomous vehicle is able to depart from the launch pad, and a route the autonomous vehicle travels within the established terminal site in order to depart from the launch pad.
Clause 35: the mobile terminal system of clause 31, wherein the hardware processor is further configured to determine that an area surrounding the launch pad is unoccupied prior to providing the launch instruction.
Clause 36: the mobile terminal system of clause 35, wherein the hardware processor is further configured to: determine that the area surrounding the launch pad is unoccupied by using sensor data to determine that the area surrounding the launch pad is free of an object, animal, or person that would prevent the autonomous vehicle from moving out of the launch pad.
Clause 37: the mobile terminal system of clause 35, wherein:
The system further includes one or more sensors disposed in, on, or near the launch pad to determine whether the area surrounding the launch pad is unoccupied; and
The hardware processor is further configured to:
receive sensor data from the one or more sensors indicating that the area surrounding the launch pad is occupied;
upon receiving the sensor data, determine an updated launch instruction that prevents the autonomous vehicle from departing from the launch pad while the area surrounding the launch pad is occupied; and
Provide the updated launch instruction to the autonomous vehicle.
Clause 38: the mobile terminal system of clause 35, wherein:
The system further includes one or more movement sensors deployed within the established terminal site; and
The hardware processor is further configured to:
Receive sensor data from the one or more movement sensors indicative of traffic within the established terminal site;
upon receiving the sensor data, determine an updated launch instruction that causes the autonomous vehicle to move away from the launch pad while avoiding the traffic; and
Provide the updated launch instruction to the autonomous vehicle.
Clause 39: the mobile terminal system of clause 35, wherein the updated launch instruction indicates an alternative route for the autonomous vehicle to travel after exiting the launch pad.
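A minimal sketch of the launch gating in clauses 35-39, assuming sensor output arrives as simple detection labels: departure is withheld while the area around the launch pad is occupied (clauses 36-37), and an alternative route is substituted when traffic is reported on the primary route (clauses 38-39). The labels and dictionary keys are assumptions of this example.

```python
BLOCKING_DETECTIONS = {"object", "animal", "person"}  # cf. clause 36

def determine_launch_instruction(detections, route, congested_segments,
                                 alternative_route=None):
    """Return a go/hold decision for an AV requesting departure."""
    if any(d in BLOCKING_DETECTIONS for d in detections):
        # Clause 37: hold while the area around the launch pad is occupied.
        return {"depart": False, "reason": "launch pad area occupied"}
    if any(seg in congested_segments for seg in route):
        # Clauses 38-39: substitute an alternative route that avoids traffic.
        if alternative_route and not any(
                seg in congested_segments for seg in alternative_route):
            return {"depart": True, "route": alternative_route}
        return {"depart": False, "reason": "no uncongested route"}
    return {"depart": True, "route": route}

# Example: the primary route crosses congested segment s1, so the
# uncongested alternative is issued instead.
assert determine_launch_instruction([], ["s1"], {"s1"}, ["s4"]) == {
    "depart": True, "route": ["s4"]}
```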
Clause 40: a mobile terminal system for operating a fleet of autonomous vehicles, the mobile terminal system comprising:
A location profile that, when deployed, is configured to establish a terminal site within a physical space, wherein the established terminal site comprises:
at least one landing stage sized and shaped to accommodate an inbound autonomous vehicle of a fleet of vehicles; and
At least one launch pad sized and shaped to receive outbound autonomous vehicles of a fleet of vehicles; and
A control subsystem comprising a hardware processor configured to:
Determine that an inbound autonomous vehicle is inbound to the established terminal site;
after determining that the inbound autonomous vehicle is inbound to the established terminal site, determine a landing instruction indicating a landing stage at which the inbound autonomous vehicle is to stop and a route that the inbound autonomous vehicle is to travel in order to reach the landing stage; provide the landing instruction to the inbound autonomous vehicle, wherein the landing instruction causes the inbound autonomous vehicle to move to the landing stage;
determine that an outbound autonomous vehicle is requesting to depart from the launch pad;
after determining that the outbound autonomous vehicle is requesting to depart from the launch pad, determine a launch instruction indicating whether the outbound autonomous vehicle is able to exit the launch pad and a route the outbound autonomous vehicle is to travel after exiting the launch pad; and
Provide the launch instruction to the outbound autonomous vehicle, wherein the launch instruction causes the outbound autonomous vehicle to exit the launch pad and move along the route.
Clause 41: the mobile terminal system of clause 40, wherein:
the system further includes one or more movement sensors deployed within the established terminal site; and
The hardware processor is further configured to:
Receive sensor data from the one or more movement sensors indicative of traffic within the established terminal site; and
After receiving the sensor data:
determine an updated landing instruction that causes the inbound autonomous vehicle to avoid the traffic while traveling to the landing stage by following a first alternative route to the landing stage;
Provide the updated landing instruction to the inbound autonomous vehicle;
Determine an updated launch instruction that causes the outbound autonomous vehicle to depart from the launch pad and avoid the traffic by following a second alternative route away from the launch pad; and
Provide the updated launch instruction to the outbound autonomous vehicle.
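Clause 41 fans a single traffic report out to both an updated landing instruction and an updated launch instruction. A sketch under the same illustrative assumptions as the previous examples:

```python
def reroute_on_traffic(congested, landing_routes, launch_routes):
    """Pick the first landing route and the first launch route that
    avoid every congested segment (cf. the first and second
    alternative routes of clause 41)."""
    def first_clear(routes):
        return next(
            (r for r in routes if not any(seg in congested for seg in r)),
            None)
    return {
        "updated_landing_route": first_clear(landing_routes),
        "updated_launch_route": first_clear(launch_routes),
    }

updates = reroute_on_traffic(
    congested={"s1"},
    landing_routes=[["s1", "s2"], ["s3"]],
    launch_routes=[["s1"], ["s4", "s5"]],
)
assert updates == {"updated_landing_route": ["s3"],
                   "updated_launch_route": ["s4", "s5"]}
```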
Clause 42: the system of any of clauses 1-7, wherein the processor is further configured to perform one or more operations of the method of any of clauses 8-14.
Clause 43: an apparatus comprising means for performing the method of any of clauses 8-14.
Clause 44: the system of any of clauses 1-7, 15-21, 22-30, 31-39, or 40-41.
Clause 45: a method, comprising:
Establishing a terminal site within a physical space, wherein the established terminal site includes at least one landing stage sized and shaped to accommodate autonomous vehicles in a fleet of vehicles;
determining that an autonomous vehicle in the fleet is inbound to the established terminal site;
After determining that the autonomous vehicle is inbound to the established terminal site, determining a landing instruction indicating a landing stage at which the inbound autonomous vehicle is to stop and a route that the inbound autonomous vehicle is to travel in order to reach the landing stage; and
Providing the landing instruction to the inbound autonomous vehicle, wherein the landing instruction causes the inbound autonomous vehicle to travel along the route to the landing stage.
Clause 46: the method of clause 45, further comprising:
determining route data indicating an external route traveled by autonomous vehicles in the fleet in order to reach the established terminal site; and
Providing the route data to at least one autonomous vehicle in the fleet.
Clause 47: the method of clause 46, further comprising:
receiving an instruction indicating that a new terminal site needs to be established;
determining new route data indicating a new route for autonomous vehicles in the fleet to travel in order to reach the new terminal site; and
Providing the new route data to at least one autonomous vehicle in the fleet.
Clause 48: the method of clause 45, further comprising:
Determining that the inbound autonomous vehicle is inbound to the established terminal site by receiving a landing request from the inbound autonomous vehicle, the landing request including a request to grant the inbound autonomous vehicle permission to dock at the landing stage of the established terminal site.
Clause 49: the method of clause 45, further comprising:
Determining that the inbound autonomous vehicle is on a route to the established terminal site when one or both of the following are satisfied: (i) the inbound autonomous vehicle is within a threshold distance of the established terminal site, and (ii) the inbound autonomous vehicle is traveling along a known route to the established terminal site.
Clause 50: the method of clause 45, wherein the landing instruction comprises at least one of: a time at which the inbound autonomous vehicle is able to enter the established terminal site, a route the inbound autonomous vehicle travels within the established terminal site in order to reach the landing stage, a location of the landing stage within the established terminal site, and an identifier of the landing stage.
Clause 51: the method of clause 45, further comprising:
Receiving movement data from one or more movement sensors indicating an amount of traffic within the established terminal site;
After receiving the movement data, determining an updated landing instruction that causes the inbound autonomous vehicle to avoid the traffic while traveling to the landing stage; and
Providing the updated landing instruction to the inbound autonomous vehicle.
Clause 52: the method of clause 45, further comprising:
receiving occupancy data from one or more occupancy sensors indicating that the landing stage is occupied;
After receiving the occupancy data, determining an updated landing instruction that prevents the inbound autonomous vehicle from entering the landing stage while the landing stage is occupied; and
Providing the updated landing instruction to the inbound autonomous vehicle.
Clause 53: the method of clause 45, further comprising:
Determining that the inbound autonomous vehicle has arrived at the landing stage; and
After determining that the inbound autonomous vehicle has arrived at the landing stage, providing an alert to initiate a post-landing activity.
Clause 54: a non-transitory computer-readable medium storing instructions that, when executed by a processor, cause the processor to:
Determine that an autonomous vehicle in a fleet is inbound to an established terminal site within a physical space, wherein the established terminal site includes at least one landing stage sized and shaped to accommodate the autonomous vehicle in the fleet;
after determining that the autonomous vehicle is inbound to the established terminal site, determine a landing instruction indicating a landing stage at which the inbound autonomous vehicle is to stop and a route that the inbound autonomous vehicle is to travel in order to reach the landing stage; and
Provide the landing instruction to the inbound autonomous vehicle, wherein the landing instruction causes the inbound autonomous vehicle to travel along the route to the landing stage.
Clause 55: the non-transitory computer readable medium of clause 54, wherein the processor is further configured to:
receive occupancy data from one or more occupancy sensors indicating that the landing stage is occupied;
After receiving the occupancy data, determine an updated landing instruction that prevents the inbound autonomous vehicle from entering the landing stage while the landing stage is occupied; and
Provide the updated landing instruction to the inbound autonomous vehicle.
Clause 56: the system of any of clauses 22-30, wherein the processor is further configured to perform one or more operations of the method of any of clauses 45-53.
Clause 57: the system of any of clauses 22-30, wherein the processor is further configured to perform one or more operations of any of clauses 54-55.
Clause 58: an apparatus comprising means for performing the method of any of clauses 45-53.
Clause 59: an apparatus comprising means for performing the method of any of clauses 54-55.
Clause 60: the non-transitory computer-readable medium of any one of clauses 54-55, storing instructions that, when executed by a processor, cause the processor to perform one or more operations of the method of any one of clauses 45-53.
Clause 61: the system of any of clauses 1-7, 15-21, wherein the processor is further configured to perform one or more operations of the method of any of clauses 8-14.
Clause 62: the system of any of clauses 1-7, wherein the processor is further configured to perform one or more operations of any of clauses 15-18.
Clause 63: an apparatus comprising means for performing the method of any of clauses 8-14.
Clause 64: an apparatus comprising means for performing the method of any of clauses 1-7, 15-21.
Clause 65: a method comprising one or more operations according to any one of clauses 1-7, 15-21.
Clause 66: a mobile terminal system according to any combination of clauses 22-41.
Claims (25)
1. A mobile terminal system (100) for operating a fleet of autonomous vehicles (502), the mobile terminal system (100) comprising:
A location profile (130) configured to establish a terminal site (202, 206, 216) within a physical space when deployed, wherein the established terminal site (202, 206, 216) comprises at least one landing stage (310), the at least one landing stage (310) being sized and shaped to accommodate autonomous vehicles (502) in the fleet of vehicles; and
A control subsystem (102) comprising a hardware processor (108), the hardware processor (108) configured to:
Determine that an autonomous vehicle (502) in the fleet is inbound to the established terminal site (202, 206, 216);
After determining that the autonomous vehicle (502) is inbound to the established terminal site (202, 206, 216), determine a landing instruction (116), the landing instruction (116) indicating a landing stage (310) at which the inbound autonomous vehicle (502) is to park and a route along which the inbound autonomous vehicle (502) is to travel to reach the landing stage (310); and
Provide the landing instruction (116) to the inbound autonomous vehicle (502), wherein the landing instruction (116) causes the inbound autonomous vehicle (502) to travel to the landing stage (310) along the route.
2. The mobile terminal system (100) of claim 1, wherein:
The control subsystem (102) further comprises a set of sensors (104), the set of sensors (104) comprising a Global Positioning System (GPS) transceiver operable to determine route data (120), the route data (120) indicating an external route for the autonomous vehicles (502) in the fleet to travel along to reach the established terminal site (202, 206, 216); and
The hardware processor (108) is further configured to provide the route data (120) to at least one autonomous vehicle (502) in the fleet.
3. The mobile terminal system (100) of claim 2, wherein:
The hardware processor (108) is further configured to receive an instruction indicating that a new terminal site (202, 206, 216) needs to be established;
the global positioning system is operable to determine new route data (120), the new route data (120) indicating a new route for the autonomous vehicles (502) in the fleet to travel along to reach the new terminal site (202, 206, 216); and
The hardware processor (108) is further configured to provide the new route data (120) to the at least one autonomous vehicle (502) in the fleet.
4. The mobile terminal system (100) of claim 1, wherein the hardware processor (108) is further configured to determine that an inbound autonomous vehicle (502) is inbound to the established terminal site (202, 206, 216) by receiving a landing request (322) from the inbound autonomous vehicle (502), the landing request (322) comprising a request to grant the inbound autonomous vehicle (502) permission to dock at the landing stage (310) of the established terminal site (202, 206, 216).
5. The mobile terminal system (100) of claim 1, wherein the hardware processor (108) is further configured to determine that the inbound autonomous vehicle (502) is on a route to the established terminal site (202, 206, 216) when one or both of the following are satisfied: (i) the inbound autonomous vehicle (502) is within a threshold distance of the established terminal site (202, 206, 216), and (ii) the inbound autonomous vehicle (502) is traveling along a known route (204, 214) to the established terminal site (202, 206, 216).
6. The mobile terminal system (100) of claim 1, wherein the landing instruction (116) includes at least one of: a time at which the inbound autonomous vehicle (502) is able to enter the established terminal site (202, 206, 216), a route within the established terminal site (202, 206, 216) along which the inbound autonomous vehicle (502) is to travel to reach the landing stage (310), a position of the landing stage (310) within the established terminal site (202, 206, 216), and an identifier of the landing stage (310).
7. The mobile terminal system (100) of claim 1, wherein:
the system (100) further comprises one or more movement sensors (316) deployed within the established terminal site (202, 206, 216); and
The hardware processor (108) is further configured to:
receive movement data from the one or more movement sensors (316) indicative of an amount of traffic within the established terminal site (202, 206, 216);
After receiving the movement data, determine an updated landing instruction (116), the updated landing instruction (116) causing the inbound autonomous vehicle (502) to avoid the traffic while traveling to the landing stage (310); and
Provide the updated landing instruction (116) to the inbound autonomous vehicle (502).
8. The mobile terminal system (100) of claim 1, wherein:
The system (100) further includes one or more occupancy sensors (320) disposed in, on, or near the landing stage (310) to determine whether the landing stage (310) is occupied; and
The hardware processor (108) is further configured to:
receive occupancy data from the one or more occupancy sensors (320) indicating that the landing stage (310) is occupied;
After receiving the occupancy data, determine an updated landing instruction (116), the updated landing instruction (116) preventing the inbound autonomous vehicle (502) from entering the landing stage (310) while the landing stage (310) is occupied; and
Provide the updated landing instruction (116) to the inbound autonomous vehicle (502).
9. The mobile terminal system (100) of claim 1, wherein the hardware processor (108) is further configured to:
Determine that the inbound autonomous vehicle (502) has arrived at the landing stage (310); and
After determining that the inbound autonomous vehicle (502) has arrived at the landing stage (310), provide an alert (340) to initiate a post-landing activity.
10. A method (400) comprising:
establishing (406) a terminal site (202, 206, 216) within a physical space, wherein the established terminal site (202, 206, 216) comprises at least one landing stage (310), the at least one landing stage (310) being sized and shaped to accommodate autonomous vehicles (502) in a fleet of vehicles;
determining (408) that an autonomous vehicle (502) in the fleet is inbound to the established terminal site (202, 206, 216);
After determining that the autonomous vehicle (502) is inbound to the established terminal site (202, 206, 216), determining (410) a landing instruction (116), the landing instruction (116) indicating a landing stage (310) at which the inbound autonomous vehicle (502) is to park and a route along which the inbound autonomous vehicle (502) is to travel to reach the landing stage (310); and
Providing the landing instruction (116) to the inbound autonomous vehicle (502), wherein the landing instruction (116) causes the inbound autonomous vehicle (502) to travel to the landing stage (310) along the route.
11. The method (400) of claim 10, further comprising:
determining route data (120), the route data (120) indicating an external route for an autonomous vehicle (502) in the fleet to travel along to reach the established terminal site (202, 206, 216); and
Providing the route data (120) to at least one autonomous vehicle (502) in the fleet.
12. The method (400) of claim 11, further comprising:
receiving an instruction indicating that a new terminal site (202, 206, 216) needs to be established;
determining new route data (120), the new route data (120) indicating a new route (204, 214) for the autonomous vehicle (502) in the fleet to travel along to reach the new terminal site (202, 206, 216); and
Providing the new route data (120) to the at least one autonomous vehicle (502) in the fleet.
13. The method (400) of claim 10, further comprising:
Determining that the inbound autonomous vehicle (502) is inbound to the established terminal site (202, 206, 216) by receiving a landing request (322) from the inbound autonomous vehicle (502), the landing request (322) comprising a request to grant the inbound autonomous vehicle (502) permission to dock at the landing stage (310) of the established terminal site (202, 206, 216).
14. The method (400) of claim 10, further comprising:
Determining that the inbound autonomous vehicle (502) is on a route to the established terminal site (202, 206, 216) when one or both of the following are satisfied: (i) the inbound autonomous vehicle (502) is within a threshold distance of the established terminal site (202, 206, 216), and (ii) the inbound autonomous vehicle (502) is traveling along a known route (204, 214) to the established terminal site (202, 206, 216).
15. The method (400) of claim 10, wherein the landing instruction (116) includes at least one of: a time at which the inbound autonomous vehicle (502) is able to enter the established terminal site (202, 206, 216), a route within the established terminal site (202, 206, 216) along which the inbound autonomous vehicle (502) is to travel to reach the landing stage (310), a position of the landing stage (310) within the established terminal site (202, 206, 216), and an identifier of the landing stage (310).
16. The method (400) of claim 10, further comprising:
Receiving movement data from one or more movement sensors (316) indicative of an amount of traffic within the established terminal site (202, 206, 216);
After receiving the movement data, determining an updated landing instruction (116), the updated landing instruction (116) causing the inbound autonomous vehicle (502) to avoid the traffic while traveling to the landing stage (310); and
Providing the updated landing instruction (116) to the inbound autonomous vehicle (502).
17. The method (400) of claim 10, further comprising:
receiving occupancy data from one or more occupancy sensors (320) indicating that the landing stage (310) is occupied;
After receiving the occupancy data, determining an updated landing instruction (116), the updated landing instruction (116) preventing the inbound autonomous vehicle (502) from entering the landing stage (310) while the landing stage (310) is occupied; and
Providing the updated landing instruction (116) to the inbound autonomous vehicle (502).
18. The method (400) of claim 10, further comprising:
Determining that the inbound autonomous vehicle (502) has arrived at the landing stage (310); and
After determining that the inbound autonomous vehicle (502) has arrived at the landing stage (310), providing an alert (340) to initiate a post-landing activity.
19. A non-transitory computer readable medium (110), the non-transitory computer readable medium (110) storing instructions that, when executed by a processor (108), cause the processor (108) to:
Determine that an autonomous vehicle (502) in a fleet is inbound to an established terminal site (202, 206, 216) within a physical space, wherein the established terminal site (202, 206, 216) includes at least one landing stage (310), the at least one landing stage (310) being sized and shaped to accommodate the autonomous vehicle (502) in the fleet;
After determining that the autonomous vehicle (502) is inbound to the established terminal site (202, 206, 216), determine a landing instruction (116), the landing instruction (116) indicating a landing stage (310) at which the inbound autonomous vehicle (502) is to park and a route along which the inbound autonomous vehicle (502) is to travel to reach the landing stage (310); and
Provide the landing instruction (116) to the inbound autonomous vehicle (502), wherein the landing instruction (116) causes the inbound autonomous vehicle (502) to travel to the landing stage (310) along the route.
20. The non-transitory computer readable medium (110) of claim 19, wherein the processor (108) is further configured to:
receive occupancy data from one or more occupancy sensors (320) indicating that the landing stage (310) is occupied;
After receiving the occupancy data, determine an updated landing instruction (116), the updated landing instruction (116) preventing the inbound autonomous vehicle (502) from entering the landing stage (310) while the landing stage (310) is occupied; and
Provide the updated landing instruction (116) to the inbound autonomous vehicle (502).
21. The system (100) according to any one of claims 1-9, wherein the hardware processor (108) is further configured to perform one or more operations of the method (400) according to any one of claims 10-18.
22. The system (100) of any of claims 1-9, wherein the hardware processor (108) is further configured to perform one or more operations of any of claims 19-20.
23. An apparatus comprising means for performing the method (400) according to any of claims 10-18.
24. An apparatus comprising means for performing one or more operations of any of claims 19-20.
25. The non-transitory computer-readable medium (110) according to any one of claims 19-20, the non-transitory computer-readable medium (110) storing instructions that, when executed by the processor (108), cause the processor (108) to perform one or more operations of the method (400) according to any one of claims 10-18.
Applications Claiming Priority (4)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| US202163265734P | 2021-12-20 | 2021-12-20 | |
| US 63/265,734 | 2021-12-20 | | |
| US 63/265,728 | 2021-12-20 | | |
| PCT/US2022/081949 (WO2023122546A1) | 2021-12-20 | 2022-12-19 | Mobile terminal system for autonomous vehicles |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN118742868A | 2024-10-01 |
Family

ID: 92851602

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202280091743.7A (published as CN118742868A, pending) | Mobile terminal system for autonomous vehicle | 2021-12-20 | 2022-12-19 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN | CN118742868A (en) |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |