
CN118235061A - Compact LiDAR system for detecting objects in blind spot areas - Google Patents

Compact LiDAR system for detecting objects in blind spot areas

Info

Publication number
CN118235061A
CN118235061A (application CN202280073052.4A)
Authority
CN
China
Prior art keywords
light, lidar, fov, lidar system, scanning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280073052.4A
Other languages
Chinese (zh)
Inventor
李宇锋
王浩森
孟庆麟
李义民
鲍君威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taida Intelligent American Co ltd
Original Assignee
Taida Intelligent American Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/975,539 external-priority patent/US20230136272A1/en
Application filed by Taida Intelligent American Co ltd filed Critical Taida Intelligent American Co ltd
Priority claimed from PCT/US2022/048294 external-priority patent/WO2023076635A1/en
Publication of CN118235061A
Legal status: Pending

Classifications

    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/4815 Constructional features of transmitters alone, using multiple transmitters
    • G01S7/4817 Constructional features relating to scanning
    • G08G1/163 Anti-collision systems: decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • G01S17/10 Systems determining position data of a target, for measuring distance only, using transmission of interrupted, pulse-modulated waves
    • G01S2013/9315 Radar or analogous anti-collision systems for land vehicles: monitoring blind spots
    • G06N3/0464 Convolutional networks [CNN, ConvNet]

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

A light detection and ranging (LiDAR) system (800) for detecting objects in a blind spot region is provided. The system includes a housing (810) and a scanning-based LiDAR assembly disposed in the housing. The scanning-based LiDAR assembly includes a first light source (871), a multi-faceted polygon (820), a window (830), a collimating lens (860), a combiner (850), an opening (852), a collection lens (840), a light detector (881), a laser circuit board (871), and a detector circuit board (880). The first light source (871) provides a plurality of light beams. The multi-faceted polygon (820) is rotatable to scan the plurality of light beams to illuminate a first field of view (FOV). The multi-faceted polygon (820) and the first light source (871) are vertically stacked, preferably with the first light source (871) on top of the multi-faceted polygon (820). The collimating lens (860) is optically coupled to the first light source (871) and collimates the plurality of light beams (890). One or more collection lenses (840) collect return light (895) generated based on the illumination of the first FOV. The light detector (881) receives the collected return light (895). The polygon mirror (820) has four wedge-shaped facets with variable tilt angles. Because detecting objects in a vehicle's blind spot region requires both a longer detection range and a larger vertical FOV, the LiDAR system (800) preferably further includes a non-scanning-based flash LiDAR assembly operating at a different wavelength, also packaged in the housing (810) to maintain a compact design.

Description

Compact LiDAR system for detecting objects in blind spot areas
Cross Reference to Related Applications
The present application claims priority from U.S. provisional patent application serial No. 63/273,802, entitled "compact Lidar system for detecting objects in blind spot areas," filed on October 29, 2021, and U.S. provisional patent application serial No. 63/292,404, entitled "compact Lidar system for detecting objects in blind spot areas," filed on December 21, 2021. The present application also claims priority from U.S. patent application Ser. No. 17/975,539, entitled "compact Lidar System for detecting objects in blind spot areas," and U.S. patent application Ser. No. 17/975,543, entitled "compact Lidar System for detecting objects in blind spot areas," both filed in October 2022. The contents of all four of the above applications are hereby incorporated by reference in their entirety for all purposes. In addition, the present application is related to a co-pending PCT application entitled "compact Lidar System for detecting objects in blind spot areas," filed on October 28, 2022 (attorney docket No. 10325-2004740).
Technical Field
The present disclosure relates generally to optical scanning, and more particularly, to a compact light detection and ranging (LiDAR) system for detecting objects in blind spot areas.
Background
Light detection and ranging (LiDAR) systems use light pulses to create an image or point cloud of the external environment. A typical LiDAR system includes a light source, a transmitter, a light steering system, and a light detector. The light source generates a light beam that, when transmitted from the LiDAR system, is directed in a particular direction by the light steering system. When the transmitted beam is scattered by an object, a portion of the scattered light returns to the LiDAR system as a return light pulse. The light detector detects the return light pulse. Using the difference between the time at which the return light pulse is detected and the time at which the corresponding light pulse was transmitted, the LiDAR system can use the speed of light to determine the distance to the object. The light steering system may direct light beams along different paths to allow the LiDAR system to scan the surrounding environment and produce an image or point cloud. LiDAR systems may also use techniques other than time-of-flight and scanning to measure the surrounding environment.
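To make the time-of-flight calculation above concrete, the following minimal Python sketch (illustrative only and not part of the patent disclosure; the function name and timing values are assumptions) converts a measured round-trip time into a distance:

    C = 299_792_458.0  # speed of light in m/s

    def time_of_flight_range(t_transmit_s: float, t_detect_s: float) -> float:
        """Distance to the object in meters, from transmit/detect timestamps."""
        round_trip_s = t_detect_s - t_transmit_s
        # Light travels to the object and back, so halve the round-trip path.
        return C * round_trip_s / 2.0

    # Example: a return pulse detected 667 ns after transmission
    # corresponds to an object roughly 100 meters away.
    print(time_of_flight_range(0.0, 667e-9))  # ~100.0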
Disclosure of Invention
Embodiments discussed herein relate to LiDAR systems and methods that can detect objects located in a blind spot region of a vehicle. Unlike a vehicle's main LiDAR system, which is typically mounted on the roof, a LiDAR system targeting blind spot areas is typically mounted on the side or rear of the vehicle. For example, a LiDAR system capable of detecting objects in a blind spot area may be mounted in a vehicle's side view mirror compartment, in a side view mirror support arm, or on a bumper, fender, side panel, or body of the vehicle. Because it must fit into these smaller spaces rather than the open space on the roof, a LiDAR system for detecting objects in blind spot areas must be compact.
In addition, objects located in the blind spot area may be either remote or nearby. Detection of remote objects requires that the LiDAR system have a longer detection range in the horizontal field of view ("FOV"), but may not require a large vertical FOV. Detection of nearby objects requires a greater vertical FOV of the LiDAR system, but may not require a long detection range. The embodiments discussed herein enable detection of both remote objects and nearby objects in one LiDAR system while maintaining a compact design of the system.
In one embodiment, a LiDAR system is provided for use with a vehicle to detect objects in a blind spot area. The LiDAR system includes a housing and a scanning-based LiDAR assembly disposed in the housing. The scanning-based LiDAR assembly includes a first light source configured to provide a plurality of light beams. The scanning-based LiDAR assembly also includes a multi-faceted polygon (multi-face polygon) rotatable to scan the plurality of light beams to illuminate the first FOV. The multi-faceted polygon and the first light source are vertically stacked. The scanning-based LiDAR assembly further includes one or more collimating lenses optically coupled to the first light source. Further, the collimating lens is configured to collimate the plurality of light beams provided by the first light source. The scanning-based LiDAR assembly further includes one or more collection lenses configured to collect return light generated based on the illumination of the first FOV. The scanning-based LiDAR assembly also includes a light detector configured to receive the collected return light.
In one embodiment, a method for detecting an object in a blind spot region is provided. The method includes providing, by a first light source, a plurality of light beams. The method further includes scanning, by a multi-faceted polygon, the plurality of light beams to illuminate a first FOV, where the multi-faceted polygon is rotatable and disposed below the first light source. The method further includes collimating, by one or more collimating lenses optically coupled to the first light source, the plurality of light beams provided by the first light source. The method further includes collecting, by one or more receiving lenses, return light generated based on the illumination of the first FOV. The method further includes directing, by a combining mirror disposed between the collimating lenses and the receiving lenses, both the plurality of light beams provided by the first light source and the collected return light. The method further includes receiving, by a light detector, the collected return light.
Drawings
The application may best be understood by referring to the following description of embodiments taken in conjunction with the accompanying drawings, in which like parts are referenced by like numerals.
FIG. 1 illustrates one or more exemplary LiDAR systems disposed or included in a motor vehicle.
FIG. 2 is a block diagram illustrating interactions between an exemplary LiDAR system and a plurality of other systems including a vehicle perception and planning system.
FIG. 3 is a block diagram illustrating an exemplary LiDAR system.
Fig. 4 is a block diagram illustrating an exemplary fiber-based laser source.
FIGS. 5A-5C illustrate an exemplary LiDAR system that uses pulsed signals to measure distance to objects disposed in a field of view (FOV).
FIG. 6 is a block diagram illustrating an exemplary device for implementing the systems, devices, and methods in various embodiments.
Fig. 7A illustrates a driver's horizontal blind spot areas on a road from a top view.
Fig. 7B illustrates a driver's vertical blind spot area from a perspective view.
FIG. 8 illustrates a LiDAR system capable of detecting objects in blind spot areas, according to one embodiment.
FIG. 9 illustrates a vertical FOV of a LiDAR system capable of detecting objects in a blind spot region from a perspective view, according to one embodiment.
FIG. 10 is a block diagram of a LiDAR system capable of detecting objects in blind spot areas, according to one embodiment.
FIG. 11 illustrates a scan-based LiDAR assembly having a flat configuration.
FIG. 12A illustrates the FOV of a scan-based LiDAR assembly having a flat configuration.
FIG. 12B illustrates the FOV of a scan-based LiDAR assembly with a stacked configuration.
FIG. 13 illustrates a cross-sectional view of a scanning-based LiDAR assembly having a stacked configuration, according to one embodiment.
FIG. 14A illustrates a perspective view of a variable angle multi-faceted polygon ("VAMFP") according to one embodiment.
FIG. 14B illustrates a side view of each facet of a variable angle multi-faceted polygon in accordance with one embodiment.
FIG. 14C illustrates a LiDAR system FOV formed by combining strips from the plurality of facets of the VAMFP, according to one embodiment.
Fig. 15A illustrates a Vertical Cavity Surface Emitting Laser (VCSEL) chip with a 1×8 emitter array.
Fig. 15B illustrates an array of six VCSEL chips, each having a 1×8 array of emitter regions.
FIG. 16A illustrates a configuration of a single collimating lens of a scanning-based LiDAR assembly.
FIG. 16B illustrates a configuration of a collimating lens group of a scanning-based LiDAR assembly.
FIG. 17A illustrates a configuration of a single receive lens of a scanning-based LiDAR assembly.
FIG. 17B illustrates a configuration of a receive lens set of a scanning-based LiDAR assembly.
Fig. 18 is a flowchart illustrating a method for detecting an object in a blind spot region.
Detailed Description
The following description sets forth numerous specific details, such as specific configurations, parameters, examples, etc., in order to provide a more thorough understanding of the present invention. It should be recognized, however, that such description is not intended as a limitation on the scope of the present invention, but is instead intended to provide a better description of the exemplary embodiments.
Throughout the specification and claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise:
The phrase "in one embodiment" as used herein does not necessarily refer to the same embodiment, although it may. Accordingly, as described below, various embodiments of the present disclosure may be readily combined without departing from the scope or spirit of the present invention.
As used herein, the term "or" is an inclusive "or" and is equivalent to the term "and/or" unless the context clearly dictates otherwise.
The term "based on" is not exclusive and allows for being based on additional factors not described unless the context clearly dictates otherwise.
As used herein, unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements coupled to each other are in contact with each other) and indirect coupling (in which at least one additional element is located between the two elements). Accordingly, the terms "coupled to" and "coupled with" are used synonymously. Within the context of a network environment in which two or more components or devices are able to exchange data, the terms "coupled to" and "coupled with" are also used to mean "communicatively coupled with," possibly via one or more intermediary devices.
Although the following description uses the terms "first," "second," etc. to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, the first wavelength may be referred to as the second wavelength, and similarly, the second wavelength may be referred to as the first wavelength, without departing from the scope of the various described examples. Both the first wavelength and the second wavelength may be wavelengths, and in some cases may be separate and distinct wavelengths.
In addition, throughout the specification, the meaning of "a," "an," and "the" includes plural references, and the meaning of "in" includes "in" and "on."
While some of the various embodiments presented herein constitute a single combination of inventive elements, it should be understood that the inventive subject matter is considered to include all possible combinations of the disclosed elements. Thus, if one embodiment comprises elements A, B, and C and another embodiment comprises elements B and D, then the inventive subject matter is also considered to include the other remaining combinations of A, B, C, or D, even if not explicitly discussed herein. Further, the transitional term "comprising" means having as parts or members, or being those parts or members. As used herein, the transitional term "comprising" is inclusive or open-ended and does not exclude additional, unrecited elements or method steps.
Throughout the following disclosure, numerous references may be made to servers, services, interfaces, engines, modules, clients, peers, portals, platforms, or other systems formed from computing devices. It should be appreciated that the use of such terms is considered to represent one or more computing devices having at least one processor (e.g., ASIC, FPGA, PLD, DSP, x86, ARM, RISC-V, ColdFire, GPU, multi-core processor, etc.) configured to execute software instructions stored on a computer-readable, tangible, non-transitory medium (e.g., hard drive, solid state drive, RAM, flash memory, ROM, etc.). For example, a server may comprise one or more computers that operate as a web server, database server, or other type of computer server in a manner that fulfills the described roles, responsibilities, or functions. It should be further appreciated that the disclosed computer-based algorithms, processes, methods, or other types of instruction sets may be embodied as a computer program product comprising a non-transitory tangible computer-readable medium storing instructions that cause a processor to execute the disclosed steps. The various servers, systems, databases, or interfaces may exchange data using standardized protocols or algorithms, possibly based on HTTP, HTTPS, AES, public-private key exchanges, web services APIs, known financial transaction protocols, or other electronic information exchange methods. Data exchange may be conducted over a packet-switched network, a circuit-switched network, the Internet, LAN, WAN, VPN, or other type of network.
As used in the description herein and in the claims that follow, when a system, engine, server, device, module, or other computing element is described as being configured to perform or execute a function on data in memory, the meaning of "configured to" or "programmed to" is defined as one or more processors or cores of the computing element being programmed by a set of software instructions stored in the memory of the computing element to perform a set of functions on target data or data objects stored in memory.
It should be noted that any language pointing to a computer should be construed to include any suitable combination of computing devices or network platforms, including servers, interfaces, systems, databases, proxies, peers, engines, controllers, modules, or other types of computing devices operating alone or in concert. It should be appreciated that the computing device includes a processor configured to execute software instructions stored on a tangible, non-transitory computer-readable storage medium (e.g., hard drive, FPGA, PLA, solid state drive, RAM, flash memory, ROM, etc.). The software instructions configure or program the computing device to provide roles, responsibilities, or other functions as discussed below with respect to the disclosed apparatus. Further, the disclosed techniques may be embodied as a computer program product comprising a non-transitory computer-readable medium storing software instructions that cause a processor to perform the disclosed steps associated with the implementation of a computer-based algorithm, process, method, or other instruction. In some embodiments, the various servers, systems, databases, or interfaces exchange data using standardized protocols or algorithms, possibly based on HTTP, HTTPS, AES, public-private key exchanges, web service APIs, known financial transaction protocols, or other electronic information exchange methods. Data exchange among devices may be through a packet-switched network, the internet, LAN, WAN, VPN or other type of packet-switched network, circuit-switched network, cell-switched network, or other type of network.
In this disclosure, when discussing the vertical angle of the FOV of a LiDAR system, zero degrees refers to the direction pointing from the LiDAR system parallel to the ground, i.e., the direction of a horizontal line drawn from the LiDAR system. Ninety degrees refers to the direction pointing from the LiDAR system perpendicularly toward the ground, i.e., the direction of a gravity (plumb) line drawn from the LiDAR system. Negative degrees refer to directions above the horizon, i.e., the angle between the horizontal line and a direction pointing upward from the LiDAR system.
In some embodiments, a forward-facing LiDAR system mounted on top of a vehicle needs to detect distant objects in the horizontal direction. This is because, when the vehicle moves forward, objects in front of the vehicle (such as other cars, pedestrians crossing the road, or traffic signs and signals) are critical to the safe driving of the vehicle. These objects may be located far away (e.g., several blocks away), but the vehicle should still be able to detect them to make correct driving decisions. However, such a LiDAR system may not need a large vertical detection range, because objects located within about 50° to 90° of the vertical FOV may be the windshield and hood of the vehicle. An example LiDAR system mounted at the front of the vehicle roof and facing forward may have a 120° horizontal FOV and a 30° vertical FOV. Such a system, while having a smaller FOV, can detect objects at a far distance (e.g., beyond 100 meters).
The aforementioned LiDAR system has blind spot areas, i.e., areas outside its FOV, including areas on both sides of the vehicle and behind the vehicle. These blind spot areas are critical for safe driving when the vehicle is, for example, turning, changing lanes, reversing, or stopping. Thus, in some embodiments, one or more separate LiDAR systems are needed to detect objects in the blind spot areas. These objects are sometimes very close to the vehicle, such as a curb, a fire hydrant on the curb, or a child playing behind the vehicle. Detecting objects at close distances requires a large vertical FOV. An example LiDAR system configured to detect blind spot regions may have a larger FOV (compared to the example LiDAR system in the previous paragraph) of 120° horizontally and 70° vertically. In addition, it may also be desirable to detect objects more than 100 meters away when the vehicle turns, such as a fast-approaching vehicle attempting to run a red light on the other side of the intersection. Thus, to assist with turning, lane changing, and the like, a LiDAR system may need to detect objects located both nearby and far away. Such a LiDAR system therefore needs both a long detection range and a large vertical FOV.
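The geometry behind this requirement can be illustrated with a short back-of-the-envelope Python sketch (an illustration under an assumed 1-meter mounting height, not part of the patent disclosure), using the vertical-angle convention defined above (0° horizontal, 90° straight down):

    import math

    def downward_angle_deg(mount_height_m: float, distance_m: float) -> float:
        """Vertical angle (0 = horizontal, 90 = straight down) to a ground-level point."""
        return math.degrees(math.atan2(mount_height_m, distance_m))

    # For a sensor assumed to be mounted 1 meter above the ground:
    print(downward_angle_deg(1.0, 100.0))  # ~0.6 deg: distant objects need little vertical FOV
    print(downward_angle_deg(1.0, 1.0))    # 45.0 deg: nearby objects need a wide vertical FOV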
The present disclosure describes systems and methods for detecting nearby objects using a large FOV and more distant objects using a smaller FOV, all while maintaining a compact size such that the LiDAR system can fit into, for example, a side view mirror or side panel of a vehicle.
Embodiments of the present invention are described below. In various embodiments of the present invention, one embodiment of a LiDAR system includes a housing and a scanning-based LiDAR assembly disposed in the housing. The scanning-based LiDAR assembly includes a first light source configured to provide a plurality of light beams. The scanning-based LiDAR assembly further includes a multi-faceted polygon rotatable to scan the plurality of light beams to illuminate the first FOV. The multi-faceted polygon and the first light source are vertically stacked. The scanning-based LiDAR assembly further includes one or more collimating lenses optically coupled to the first light source. Further, the collimating lens is configured to collimate the plurality of light beams provided by the first light source. The scanning-based LiDAR assembly further includes one or more collection lenses configured to collect return light generated based on the illumination of the first FOV. The scanning-based LiDAR assembly also includes a light detector configured to receive the collected return light.
FIG. 1 illustrates one or more exemplary LiDAR systems 110 disposed or included in a motor vehicle 100. The motor vehicle 100 may be a vehicle having any level of automation. For example, the motor vehicle 100 may be a partially automated vehicle, a highly automated vehicle, a fully automated vehicle, or a driverless vehicle. A partially automated vehicle may perform some driving functions without human driver intervention. For example, a partially automated vehicle may perform blind spot monitoring, lane keeping and/or lane changing operations, automatic emergency braking, intelligent cruising, and/or traffic following, among others. Certain operations of a partially automated vehicle may be limited to specific applications or driving scenarios (e.g., limited to highway driving only). A highly automated vehicle may generally perform all the operations of a partially automated vehicle, but with fewer limitations. A highly automated vehicle can also detect its own limits in operating the vehicle and require the driver to take over control of the vehicle when necessary. A fully automated vehicle can perform all vehicle operations without driver intervention, but can also detect its own limits and require the driver to take over when necessary. A driverless vehicle can operate on its own without any driver intervention.
In a typical configuration, the motor vehicle 100 includes one or more LiDAR systems 110 and 120A-F. Each of the LiDAR systems 110 and 120A-F may be a scanning-based LiDAR system and/or a non-scanning LiDAR system (e.g., a flash LiDAR). A scanning-based LiDAR system scans one or more light beams in one or more directions (e.g., horizontal and vertical directions) to detect objects in a field of view (FOV). A non-scanning-based LiDAR system transmits laser light to illuminate the FOV without scanning. For example, flash LiDAR is a type of non-scanning-based LiDAR system. A flash LiDAR can emit laser light with a single pulse of light, illuminating the FOV all at once.
LiDAR systems are often the primary sensors of at least partially automated vehicles. In one embodiment, as shown in FIG. 1, a motor vehicle 100 may include a single LiDAR system 110 (e.g., without LiDAR systems 120A-F) disposed at the highest location of the vehicle (e.g., at the roof of the vehicle). Locating the LiDAR system 110 at the roof of the vehicle facilitates a 360-degree scan around the vehicle 100. In some other embodiments, the motor vehicle 100 may include multiple LiDAR systems, including two or more of systems 110 and/or 120A-F. As shown in FIG. 1, in one embodiment, multiple LiDAR systems 110 and/or 120A-F are attached to the vehicle 100 at different locations. For example, LiDAR system 120A is attached to the vehicle 100 at the front right corner; LiDAR system 120B is attached to the vehicle 100 at the front center; LiDAR system 120C is attached to the vehicle 100 at the front left corner; LiDAR system 120D is attached to the vehicle 100 at the right-side rearview mirror; LiDAR system 120E is attached to the vehicle 100 at the left-side rearview mirror; and/or LiDAR system 120F is attached to the vehicle 100 at the rear center. In some embodiments, LiDAR systems 110 and 120A-F are stand-alone LiDAR systems with their respective laser sources, control electronics, transmitters, receivers, and/or steering mechanisms. In other embodiments, some of the LiDAR systems 110 and 120A-F may share one or more components, thereby forming a distributed sensor system. In one example, optical fibers are used to deliver laser light from a centralized laser source to all LiDAR systems. It should be appreciated that one or more LiDAR systems may be distributed and attached to a vehicle in any desired manner, and FIG. 1 illustrates only one embodiment. As another example, LiDAR systems 120D and 120E may be attached to the B-pillars of the vehicle 100 instead of the rearview mirrors. As another example, LiDAR system 120B may be attached to the windshield of the vehicle 100 instead of the front bumper.
FIG. 2 is a block diagram 200 illustrating interactions between on-board LiDAR system(s) 210 and a plurality of other systems, including a vehicle perception and planning system 220. The LiDAR system(s) 210 may be mounted on or integrated into a vehicle. The LiDAR system(s) 210 include sensor(s) that scan laser light across the surrounding environment to measure the distance, angle, and/or velocity of objects. Based on the scattered light that returns to the LiDAR system(s) 210, the system(s) can generate sensor data (e.g., image data or 3D point cloud data) representative of the perceived external environment.
The LiDAR system(s) 210 may include one or more of short-range LiDAR sensors, medium-range LiDAR sensors, and long-range LiDAR sensors. A short-range LiDAR sensor measures objects located up to about 20 to 40 meters from the LiDAR sensor. Short-range LiDAR sensors may be used, for example, to monitor nearby moving objects (e.g., pedestrians crossing a road in a school zone), for parking assistance applications, and the like. A medium-range LiDAR sensor measures objects located up to about 100 to 150 meters from the LiDAR sensor. Medium-range LiDAR sensors may be used, for example, to monitor road intersections, to assist in merging onto or exiting a highway, and the like. A long-range LiDAR sensor measures objects located up to about 150 to 300 meters away. Long-range LiDAR sensors are typically used when the vehicle is traveling at high speed (e.g., on a highway), such that the vehicle's control system may have only a few seconds (e.g., 6 to 8 seconds) to respond to any conditions detected by the LiDAR sensor. As shown in FIG. 2, in one embodiment, LiDAR sensor data may be provided to a vehicle perception and planning system 220 via a communication path 213 for further processing and control of vehicle operation. The communication path 213 may be any wired or wireless communication link that can transmit data.
Still referring to FIG. 2, in some embodiments, other on-board sensor(s) 230 are used to provide additional sensor data alone or with LiDAR system(s) 210. Other in-vehicle sensors 230 may include, for example, one or more cameras 232, one or more radars 234, one or more ultrasonic sensors 236, and/or other sensor(s) 238. The camera(s) 232 may take images and/or video of the environment external to the vehicle. The camera(s) 232 may capture High Definition (HD) video having, for example, millions of pixels in each frame. Cameras produce monochrome or color images and video. For some cases, color information may be important in interpreting the data (e.g., interpreting an image of a traffic light). Color information may not be available from other sensors, such as LiDAR or radar sensors. The camera(s) 232 may include one or more of a narrow focal length camera, a wide focal length camera, a side camera, an infrared camera, a fisheye camera, and the like. The image and/or video data generated by the camera(s) 232 may also be provided to the vehicle perception and planning system 220 via the communication path 233 for further processing and control of vehicle operation. Communication path 233 may be any wired or wireless communication link that may transmit data.
Other in-vehicle sensor(s) 230 may also include radar sensor(s) 234. The radar sensor(s) 234 use radio waves to determine the distance, angle, and speed of objects. The radar sensor(s) 234 generate electromagnetic waves in the radio or microwave spectrum. The electromagnetic waves are reflected by an object, and some of the reflected waves return to the radar sensor, thereby providing information about the object's position and velocity. The radar sensor(s) 234 may include one or more of short-range radar(s), medium-range radar(s), and long-range radar(s). Short-range radar measures objects located about 0.1 to 30 meters from the radar. Short-range radar is useful for detecting objects located near the vehicle, such as other vehicles, buildings, walls, pedestrians, cyclists, etc. Short-range radar may be used to detect blind spots, assist lane changes, provide rear-end collision warnings, assist parking, provide emergency braking, and the like. Medium-range radar measures objects located about 30 to 80 meters from the radar. Long-range radar measures objects located at about 80 to 200 meters. Medium-range and/or long-range radar may be useful for, for example, traffic tracking, adaptive cruise control, and/or automatic braking on arterial roads. Sensor data generated by the radar sensor(s) 234 may also be provided to the vehicle perception and planning system 220 via communication path 233 for further processing and control of vehicle operation.
Other in-vehicle sensor(s) 230 may also include ultrasonic sensor(s) 236. The ultrasonic sensor(s) 236 use sound waves or pulses to measure objects located outside the vehicle. The acoustic waves generated by the ultrasonic sensor(s) 236 are transmitted to the surrounding environment. At least some of the transmitted waves are reflected by the object and return to the ultrasonic sensor(s) 236. Based on the return signal, the distance of the object can be calculated. The ultrasonic sensor(s) 236 may be useful, for example, to check blind spots, identify parking spots, provide lane change assistance in traffic, and the like. Sensor data generated by ultrasonic sensor(s) 236 may also be provided to vehicle perception and planning system 220 via communication path 233 for further processing and control of vehicle operation.
In some embodiments, one or more other sensor(s) 238 may be attached to the vehicle and may also generate sensor data. The other sensor(s) 238 may include, for example, a Global Positioning System (GPS), an Inertial Measurement Unit (IMU), and the like. Sensor data generated by the other sensor(s) 238 may also be provided to the vehicle perception and planning system 220 via communication path 233 for further processing and control of vehicle operation. It should be appreciated that the communication path 233 may include one or more communication links for transferring data between the various sensor(s) 230 and the vehicle perception and planning system 220.
In some embodiments, as shown in FIG. 2, sensor data from the other in-vehicle sensor(s) 230 may be provided to the on-board LiDAR system(s) 210 via communication path 231. The LiDAR system(s) 210 may process the sensor data from the other onboard sensor(s) 230. For example, sensor data from the camera(s) 232, radar sensor(s) 234, ultrasonic sensor(s) 236, and/or other sensor(s) 238 may be correlated or fused with sensor data generated by the LiDAR system(s) 210, thereby at least partially offloading the sensor fusion process performed by the vehicle perception and planning system 220. It should be appreciated that other configurations may also be implemented for transmitting and processing sensor data from the various sensors (e.g., the data may be transmitted to a cloud service for processing, and the processing results may then be transmitted back to the vehicle perception and planning system 220).
Still referring to FIG. 2, in some embodiments, sensors on other vehicle(s) 250 are used to provide additional sensor data alone or in conjunction with LiDAR system(s) 210. For example, two or more nearby vehicles may have their respective LiDAR sensor(s), camera(s), radar sensor(s), ultrasonic sensor(s), and so forth. Nearby vehicles may communicate with each other and share sensor data. The communication between vehicles is also referred to as V2V (vehicle to vehicle) communication. For example, as shown in fig. 2, sensor data generated by other vehicle(s) 250 may be communicated to vehicle perception and planning system 220 and/or on-board LiDAR system(s) 210 via communication path 253 and/or communication path 251, respectively. Communication paths 253 and 251 may be any wired or wireless communication link that may transmit data.
Sharing sensor data facilitates better perception of the environment external to the vehicle. For example, the first vehicle may not sense a pedestrian behind the second vehicle but approaching the first vehicle. The second vehicle may share sensor data related to this pedestrian with the first vehicle so that the first vehicle may have additional reaction time to avoid collisions with pedestrians. In some embodiments, similar to the data generated by the sensor(s) 230, the data generated by the sensors on the other vehicle(s) 250 may be correlated or fused with the sensor data generated by the LiDAR system(s) 210, thereby at least partially offloading the sensor fusion process performed by the vehicle perception and planning system 220.
In some embodiments, the intelligent infrastructure system(s) 240 are used to provide sensor data alone or together with the LiDAR system(s) 210. Some infrastructure may be configured to communicate with vehicles to convey information, and vice versa. Communication between a vehicle and infrastructure is commonly referred to as V2I (vehicle-to-infrastructure) communication. For example, the intelligent infrastructure system(s) 240 may include intelligent traffic lights that can communicate their status to approaching vehicles with messages such as "turning yellow in 5 seconds." The intelligent infrastructure system(s) 240 may also include their own LiDAR systems installed near an intersection so that they can communicate traffic monitoring information to vehicles. For example, a vehicle turning left at an intersection may not have sufficient sensing capability because some of its own sensors may be blocked by traffic in the opposite direction. In such a case, the sensors of the intelligent infrastructure system(s) 240 can provide useful, and sometimes important, data to the left-turning vehicle. Such data may include, for example, traffic conditions, information about objects in the direction in which the vehicle is turning, traffic light status and predictions, and the like. Sensor data generated by the intelligent infrastructure system(s) 240 may be provided to the vehicle perception and planning system 220 and/or the on-board LiDAR system(s) 210 via communication path 243 and/or communication path 241, respectively. Communication paths 243 and/or 241 may include any wired or wireless communication links that can communicate data. For example, sensor data from the intelligent infrastructure system(s) 240 may be transmitted to the LiDAR system(s) 210 and correlated or fused with sensor data generated by the LiDAR system(s) 210, thereby at least partially offloading the sensor fusion process performed by the vehicle perception and planning system 220. The V2V and V2I communications described above are examples of vehicle-to-x (V2X) communications, where "x" represents any other device, system, sensor, infrastructure, or the like that can share data with a vehicle.
Still referring to FIG. 2, via the various communication paths, the vehicle perception and planning system 220 receives sensor data from one or more of the LiDAR system(s) 210, other onboard sensor(s) 230, other vehicle(s) 250, and/or intelligent infrastructure system(s) 240. In some embodiments, different types of sensor data are correlated and/or integrated by a sensor fusion subsystem 222. For example, the sensor fusion subsystem 222 may generate a 360-degree model using multiple images or videos captured by multiple cameras disposed at different positions of the vehicle. The sensor fusion subsystem 222 obtains sensor data from different types of sensors and uses the combined data to perceive the environment more accurately. For example, an onboard camera 232 may not be able to capture a clear image because it is directly facing the sun or a light source (e.g., the headlights of another vehicle at night). The LiDAR system 210 may not be affected as much, so the sensor fusion subsystem 222 can combine the sensor data provided by both the camera 232 and the LiDAR system 210 and use the sensor data provided by the LiDAR system 210 to compensate for the unclear image captured by the camera 232. As another example, the radar sensor 234 may perform better than the camera 232 or the LiDAR system 210 in rainy or foggy weather. Accordingly, the sensor fusion subsystem 222 may use sensor data provided by the radar sensor 234 to compensate for the sensor data provided by the camera 232 or the LiDAR system 210.
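As a toy illustration of this compensation idea (not the patent's method; the confidence weighting and all values are assumptions), a fusion step may weight each sensor's estimate of the same quantity by a confidence score, so that a degraded sensor contributes less:

    def fuse_estimates(estimates):
        """Weighted mean of (value, confidence) pairs, confidence in [0, 1]."""
        total_weight = sum(w for _, w in estimates)
        return sum(v * w for v, w in estimates) / total_weight

    # A LiDAR range estimate at high confidence fused with a sun-blinded
    # camera's estimate at low confidence:
    fused = fuse_estimates([(25.3, 0.9), (31.0, 0.2)])
    print(fused)  # ~26.3 m, dominated by the more reliable LiDAR estimate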
In other examples, sensor data generated by the other onboard sensor(s) 230 may have a lower resolution (e.g., radar sensor data) and thus may need to be correlated and confirmed by the LiDAR system(s) 210, which typically have a higher resolution. For example, a manhole cover may be detected by the radar sensor 234 as an object toward which the vehicle is approaching. Due to the low resolution of the radar sensor 234, the vehicle perception and planning system 220 may not be able to determine whether the object is an obstacle that the vehicle needs to avoid. The high-resolution sensor data generated by the LiDAR system(s) 210 can thus be used to correlate with and confirm that the object is a manhole cover and poses no harm to the vehicle.
The vehicle perception and planning system 220 further includes an object classifier 223. Using raw sensor data and/or the correlated/fused data provided by the sensor fusion subsystem 222, the object classifier 223 can detect and classify objects and estimate their positions. In some embodiments, the object classifier 223 may use machine-learning-based techniques to detect and classify objects. Examples of such machine-learning-based techniques include algorithms such as region-based convolutional neural networks (R-CNN), Fast R-CNN, Faster R-CNN, histogram of oriented gradients (HOG), region-based fully convolutional networks (R-FCN), single shot detectors (SSD), spatial pyramid pooling (SPP-net), and/or You Only Look Once (YOLO).
The vehicle perception and planning system 220 further includes a road detection subsystem 224. The road detection subsystem 224 locates roads and identifies objects and/or markers on the roads. For example, based on raw or fused sensor data provided by radar sensor(s) 234, camera(s) 232, and/or LiDAR system(s) 210, road detection subsystem 224 may construct a 3D model of the road based on machine learning techniques (e.g., pattern recognition algorithms for identifying lanes). Using a 3D model of the road, the road detection subsystem 224 may identify objects (e.g., obstacles or debris on the road) and/or markers on the road (e.g., lane lines, turn markers, crosswalk markers, etc.).
The vehicle perception and planning system 220 further includes a positioning and vehicle pose subsystem 225. Based on raw or fused sensor data, the positioning and vehicle pose subsystem 225 can determine the position and pose of the vehicle. For example, using sensor data from the LiDAR system(s) 210 and camera(s) 232 and/or GPS data, the positioning and vehicle pose subsystem 225 can determine the exact position of the vehicle on the road and the vehicle's six degrees of freedom (e.g., whether the vehicle is moving forward or backward, up or down, left or right). In some embodiments, high-definition (HD) maps are used for vehicle positioning. HD maps can provide very detailed three-dimensional computerized maps that pinpoint the vehicle's location. For example, using an HD map, the positioning and vehicle pose subsystem 225 can accurately determine the vehicle's current location (e.g., which lane of the road the vehicle is currently in, and how close it is to the curb or the sidewalk) and predict the vehicle's future locations.
The vehicle perception and planning system 220 further includes an obstacle predictor 226. The objects identified by the object classifier 223 may be stationary (e.g., light poles, road signs) or dynamic (e.g., a moving pedestrian, a bicycle, another car). For moving objects, predicting their path of movement or future position can be important for avoiding collisions. The obstacle predictor 226 can predict an obstacle's trajectory and/or warn the driver or the vehicle planning subsystem 228 of a potential collision. For example, if there is a high likelihood that the obstacle's trajectory intersects the vehicle's current path of movement, the obstacle predictor 226 can generate such a warning. The obstacle predictor 226 can use various techniques to make such predictions. Such techniques include, for example, constant velocity or acceleration models, constant turn rate and velocity/acceleration models, Kalman-filter-based and extended-Kalman-filter-based models, recurrent neural network (RNN)-based models, long short-term memory (LSTM)-based neural network models, encoder-decoder RNN models, and the like.
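As a minimal sketch of the simplest technique named above, the constant velocity model (the state layout and the example values are assumptions, not from the patent), future positions can be obtained by repeatedly applying a linear state-transition matrix:

    import numpy as np

    def predict_positions(state, dt, steps):
        """Propagate a constant-velocity state [x, y, vx, vy]; return future (x, y)."""
        F = np.array([[1.0, 0.0, dt, 0.0],
                      [0.0, 1.0, 0.0, dt],
                      [0.0, 0.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0, 1.0]])
        s = np.asarray(state, dtype=float)
        positions = []
        for _ in range(steps):
            s = F @ s  # one constant-velocity step forward
            positions.append(s[:2].copy())
        return np.array(positions)

    # A pedestrian 10 m ahead and 2 m to the side, walking at 1 m/s toward the
    # vehicle's path, predicted over the next 2 seconds at 10 Hz:
    path = predict_positions([10.0, 2.0, 0.0, -1.0], dt=0.1, steps=20)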
Still referring to FIG. 2, in some embodiments, the vehicle perception and planning system 220 further includes a vehicle planning subsystem 228. The vehicle planning subsystem 228 may include a route planner, a driving behavior planner, and a movement planner. The route planner may plan a route of the vehicle based on current location data of the vehicle, target location data, traffic information, and the like. The driving behavior planner uses the obstacle predictions provided by the obstacle predictor 226 to adjust timing and planned movement based on how other objects may move. The movement planner determines the specific operations that the vehicle needs to follow. The planning results are then communicated to a vehicle control system 280 via a vehicle interface 270. Communication may be performed through communication paths 223 and 271, including any wired or wireless communication links over which data may be communicated.
The vehicle control system 280 controls the steering mechanism, throttle, brakes, etc. of the vehicle to operate the vehicle according to the planned route and movement. The vehicle perception and planning system 220 may further include a user interface 260 that provides a user (e.g., driver) with access to the vehicle control system 280, for example, to maneuver or take over control of the vehicle as necessary. The user interface 260 may communicate with the vehicle perception and planning system 220, for example, to obtain and display raw or fused sensor data, identified objects, the position/pose of the vehicle, and the like. These displayed data may help the user to better operate the vehicle. The user interface 260 may communicate with the vehicle awareness and planning system 220 and/or the vehicle control system 280 via communication paths 221 and 261, respectively, including any wired or wireless communication links over which data may be communicated. It should be appreciated that the various systems, sensors, communication links, and interfaces in fig. 2 may be configured in any desired manner and are not limited to the configuration shown in fig. 2.
FIG. 3 is a block diagram illustrating an exemplary LiDAR system 300. The LiDAR system 300 may be used to implement the LiDAR systems 110, 120A-F, and/or 210 shown in FIGS. 1 and 2. In one embodiment, the LiDAR system 300 includes a laser source 310, an emitter 320, an optical receiver and light detector 330, a steering system 340, and control circuitry 350. These components are coupled together using communication paths 312, 314, 322, 332, 342, 352, and 362. These communication paths include communication links (wired or wireless, bidirectional or unidirectional) among the various LiDAR system components, but need not be physical components themselves. Although the communication paths may be implemented by one or more wires, buses, or optical fibers, they may also be wireless channels or free-space optical paths such that no physical communication medium is present. For example, in one embodiment of the LiDAR system 300, communication path 314 between the laser source 310 and the emitter 320 may be implemented using one or more optical fibers. Communication paths 332 and 352 may represent optical paths implemented using free-space optics and/or optical fibers. And communication paths 312, 322, 342, and 362 may be implemented using one or more wires that carry electrical signals. The communication paths may also include more than one of the above types of communication media (e.g., they may include an optical fiber and free-space optics, or one or more optical fibers and one or more wires).
The LiDAR system 300 may also include other components not depicted in FIG. 3, such as a power bus, a power source, an LED indicator, a switch, and the like. Additionally, other communication connections between components may exist, such as a direct connection between the laser source 310 and the optical receiver and light detector 330 to provide a reference signal, so that the time from the transmission of a light pulse until the detection of a return light pulse can be accurately measured.
The laser source 310 outputs laser light for illuminating an object in a field of view (FOV). The laser source 310 may be, for example, a semiconductor-based laser (e.g., a diode laser) and/or a fiber-based laser. The semiconductor-based laser may be, for example, an edge-emitting laser (EEL), a Vertical Cavity Surface Emitting Laser (VCSEL), or the like. An optical fiber-based laser is one in which the active gain medium is an optical fiber doped with rare earth elements such as erbium, ytterbium, neodymium, dysprosium, praseodymium, thulium, and/or holmium. In some embodiments, the fiber laser is based on a double-clad fiber, where the gain medium forms the core of the fiber surrounded by two cladding layers. Double-clad optical fibers allow pumping of the core with a high power beam, thereby enabling the laser source to be a high power fiber laser source.
In some embodiments, the laser source 310 includes a master oscillator (also referred to as a seed laser) and a power amplifier (a MOPA configuration). The power amplifier amplifies the output power of the seed laser. The power amplifier may be a fiber amplifier, a bulk amplifier, or a semiconductor optical amplifier. The seed laser may be a diode laser (e.g., a Fabry-Perot cavity laser or a distributed feedback laser), a solid-state laser, or a tunable external-cavity diode laser. In some embodiments, the laser source 310 may be an optically pumped microchip laser. Microchip lasers are alignment-free monolithic solid-state lasers in which the laser crystal is in direct contact with the end mirrors of the laser resonator. A microchip laser is typically pumped (directly or through an optical fiber) with a laser diode to obtain the desired output power. A microchip laser may be based on a neodymium-doped yttrium aluminum garnet (Y3Al5O12) laser crystal (i.e., Nd:YAG) or a neodymium-doped vanadate (i.e., Nd:YVO4) laser crystal.
Fig. 4 is a block diagram illustrating an exemplary fiber-based laser source 400 having a seed laser and one or more pumps (e.g., laser diodes) for pumping to a desired output power. The fiber-based laser source 400 is an example of the laser source 310 depicted in FIG. 3. In some embodiments, the fiber-based laser source 400 includes a seed laser 402 that generates initial optical pulses at one or more wavelengths (e.g., 1550 nm), which are provided to a wavelength division multiplexer (WDM) 404 via an optical fiber 403. The fiber-based laser source 400 further includes a pump 406 for providing laser power (e.g., at a different wavelength, such as 980 nm) to the WDM 404 via an optical fiber 405. The WDM 404 multiplexes the optical pulses provided by the seed laser 402 and the laser power provided by the pump 406 onto a single optical fiber 407. The output of the WDM 404 can then be provided to one or more pre-amplifier(s) 408 via the optical fiber 407. The pre-amplifier(s) 408 may be optical amplifier(s) that amplify the optical signal (e.g., with a gain of about 20 to 30 dB). In some embodiments, the pre-amplifier(s) 408 are low-noise amplifiers. The pre-amplifier(s) 408 output to a combiner 410 via an optical fiber 409. The combiner 410 combines the output laser light of the pre-amplifier(s) 408 with the laser power provided by a pump 412 via an optical fiber 411. The combiner 410 can combine optical signals having the same wavelength or different wavelengths. One example of a combiner is a WDM. The combiner 410 provides the combined pulses to a booster amplifier 414, which produces the output light pulses. The booster amplifier 414 provides further amplification of the optical signal. The output light pulses can then be transmitted to the emitter 320 and/or the steering mechanism 340 (shown in FIG. 3). It should be appreciated that FIG. 4 illustrates one exemplary configuration of the fiber-based laser source 400. The laser source 400 may have many other configurations using different combinations of one or more of the components shown in FIG. 4 and/or other components not shown in FIG. 4 (e.g., power supplies, lenses, filters, splitters, combiners, etc.).
In some variations, the fiber-based laser source 400 may be controlled (e.g., via the control circuitry 350) to produce pulses of different amplitudes based on the gain profile of the fiber used in the fiber-based laser source 400. The communication path 312 couples the fiber-based laser source 400 to the control circuitry 350 (shown in fig. 3) such that components of the fiber-based laser source 400 may be controlled by or otherwise communicate with the control circuitry 350. Alternatively, the fiber-based laser source 400 may include its own dedicated controller. In that case, instead of the control circuitry 350 communicating directly with the components of the fiber-based laser source 400, the dedicated controller of the fiber-based laser source 400 communicates with the control circuitry 350 and controls and/or communicates with the components of the fiber-based laser source 400. The fiber-based laser source 400 may also include other components not shown, such as one or more power connectors, power sources, and/or power lines.
Referring to FIG. 3, typical operating wavelengths of laser source 310 include, for example, about 850 nm, about 905 nm, about 940 nm, about 1064 nm, and about 1550 nm. The upper limit of the maximum usable laser power is set by U.S. FDA (U.S. Food and Drug Administration) regulations. The optical power limit at 1550 nm is much higher than the optical power limits for the other aforementioned wavelengths. Further, at 1550 nm, the optical power loss in optical fiber is low. These characteristics make the 1550 nm wavelength advantageous for long-range LiDAR applications. The amount of optical power output from the laser source 310 may be characterized by its peak power, average power, and pulse energy. Peak power is the ratio of pulse energy to pulse width (e.g., full width at half maximum, or FWHM). Thus, for a fixed amount of pulse energy, a smaller pulse width provides a larger peak power. The pulse width may be in the range of nanoseconds or picoseconds. Average power is the product of the pulse energy and the Pulse Repetition Rate (PRR). As described in more detail below, the PRR represents the frequency of the pulsed laser, and it typically determines the maximum unambiguous range that the LiDAR system can measure. The laser source 310 may be configured to pulse at a high PRR to meet a desired number of data points in a point cloud generated by the LiDAR system. The laser source 310 may also be configured to pulse at a medium or low PRR to meet a desired maximum detection distance. Wall Plug Efficiency (WPE), which measures how efficiently electrical input power is converted to optical output power, is another key metric when assessing overall power consumption and laser efficiency. For example, as shown in fig. 1, multiple LiDAR systems may be attached to a vehicle, which may be an electric vehicle or a vehicle that otherwise has a limited fuel or battery power supply. Thus, high WPE and intelligent use of laser power are often among the important considerations when selecting and configuring the laser source 310 and/or designing a laser delivery system for an in-vehicle LiDAR application.
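The power relationships above can be made concrete with a short sketch; the pulse energy, pulse width, and PRR below are assumed example values, not parameters from this disclosure:

    # Peak power = pulse energy / pulse width (FWHM);
    # average power = pulse energy * pulse repetition rate (PRR).
    pulse_energy_j = 2e-6    # assumed 2 uJ per pulse
    pulse_width_s = 5e-9     # assumed 5 ns FWHM
    prr_hz = 500e3           # assumed 500 kHz repetition rate
    peak_power_w = pulse_energy_j / pulse_width_s   # 400 W peak
    average_power_w = pulse_energy_j * prr_hz       # 1 W average
    print(peak_power_w, average_power_w)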
It should be appreciated that the above description provides a non-limiting example of laser source 310. Laser source 310 may be configured to include many other types of light sources (e.g., laser diodes, short cavity fiber lasers, solid state lasers, and/or tunable external cavity diode lasers) configured to generate one or more optical signals at various wavelengths. In some examples, light source 310 includes amplifiers (e.g., preamplifiers and/or booster amplifiers) that may be doped fiber amplifiers, solid-state amplifiers, and/or semiconductor optical amplifiers. The amplifier is configured to receive the optical signals and amplify the optical signals with a desired gain.
Referring back to FIG. 3, liDAR system 300 further includes a transmitter 320. The laser source 310 provides laser light (e.g., in the form of a laser beam) to the emitter 320. The laser light provided by the laser source 310 may be an amplified laser light having a predetermined or controlled wavelength, pulse repetition rate, and/or power level. The transmitter 320 receives the laser light from the laser source 310 and transmits the laser light to the steering mechanism 340 with low divergence. In some embodiments, the emitter 320 may include, for example, optical components (e.g., lenses, optical fibers, mirrors, etc.) for transmitting the laser beam to a field of view (FOV) either directly or via the steering mechanism 340. Although fig. 3 illustrates the transmitter 320 and steering mechanism 340 as separate components, in some embodiments they may be combined or integrated into one system. Steering mechanism 340 is described in more detail below.
The laser beam provided by the laser source 310 may diverge as it travels to the emitter 320. Accordingly, the emitter 320 often includes a collimating lens configured to collect the diverging laser beam and produce a more parallel optical beam with reduced or minimal divergence. The collimated optical beam may then be further directed by various optics, such as mirrors and lenses. The collimating lens may be, for example, a single plano-convex lens or a lens group. The collimating lens may be configured to achieve any desired properties, such as beam diameter, divergence, numerical aperture, focal length, and the like. The beam propagation ratio or beam quality factor (also known as the M² factor) is used to measure the quality of the laser beam. In many LiDAR applications, it is important that the transmitted laser beam have good beam quality. The M² factor represents the degree of variation of the beam relative to an ideal Gaussian beam. Thus, the M² factor reflects how well a collimated laser beam can be focused onto a small spot, or how well a divergent laser beam can be collimated. The laser source 310 and/or the emitter 320 may therefore be configured to meet, for example, scan resolution requirements while maintaining a desired M² factor.
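For reference, the standard Gaussian-beam relation below shows how the M² factor degrades the achievable collimation. This is general beam optics, not a formula taken from this disclosure, and the wavelength and waist are assumed example values:

    import math

    # Far-field divergence half-angle of a laser beam:
    # theta = M^2 * wavelength / (pi * beam_waist_radius).
    wavelength_m = 1.55e-6   # assumed 1550 nm operating wavelength
    waist_m = 1.0e-3         # assumed 1 mm beam waist radius
    for m2 in (1.0, 1.3, 2.0):
        theta_mrad = m2 * wavelength_m / (math.pi * waist_m) * 1e3
        print(f"M^2 = {m2}: divergence = {theta_mrad:.2f} mrad")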
One or more light beams provided by the emitter 320 are scanned into the FOV by the steering mechanism 340. Steering mechanism 340 scans the beams in multiple dimensions (e.g., in both the horizontal and vertical dimensions) to facilitate the LiDAR system 300 mapping the environment by generating a 3D point cloud. The steering mechanism 340 will be described in more detail below. The laser light scanned into the FOV may be scattered or reflected by objects in the FOV. At least a portion of the scattered or reflected light returns to the LiDAR system 300. Fig. 3 further illustrates an optical receiver and photodetector 330 configured to receive the return light. The optical receiver and photodetector 330 includes an optical receiver configured to collect the return light from the FOV. The optical receiver may include optics (e.g., lenses, fibers, mirrors, etc.) for receiving, redirecting, focusing, amplifying, and/or filtering the return light from the FOV. For example, the optical receiver often includes a collection lens (e.g., a single plano-convex lens or a lens group) to collect the return light and/or focus the collected return light onto a photodetector.
The photodetector detects the return light focused by the optical receiver and generates a current and/or voltage signal proportional to the incident intensity of the return light. Based on such current and/or voltage signals, depth information of the object in the FOV may be derived. One exemplary method for deriving such depth information is based on direct TOF (time of flight), which is described in more detail below. The photodetector may be characterized by its detection sensitivity, quantum efficiency, detector bandwidth, linearity, signal-to-noise ratio (SNR), overload resistance, interference immunity, and the like. The photodetector may be configured or customized to have any desired characteristics, depending on the application. For example, the optical receiver and photodetector 330 may be configured such that the photodetector has a large dynamic range while having good linearity. Photodetector linearity indicates the ability of the detector to maintain a linear relationship between the input optical signal power and the output of the detector. A detector with good linearity can maintain a linear relationship over a large dynamic range of input optical signals.
The structure of the photodetector and/or the detector's material system may be configured or customized to achieve the desired detector characteristics. Various detector structures may be used for the photodetector. For example, the photodetector structure may be a PIN-based structure, which has an undoped intrinsic semiconductor region (i.e., an "i" region) between a p-type semiconductor region and an n-type semiconductor region. Other photodetector structures include, for example, APD (avalanche photodiode) based structures, PMT (photomultiplier tube) based structures, SiPM (silicon photomultiplier) based structures, SPAD (single photon avalanche diode) based structures, and/or quantum wire based structures. For the material system used in the photodetector, Si-, InGaAs-, and/or Si/Ge-based materials may be used. It should be appreciated that many other detector structures and/or material systems may be used in the optical receiver and photodetector 330.
A photodetector (e.g., an APD-based detector) may have an internal gain such that the input signal is amplified when the output signal is generated. However, because of the photodetector's internal gain, noise may also be amplified. Common types of noise include signal shot noise, dark current shot noise, thermal noise, and amplifier noise. In some embodiments, optical receiver and photodetector 330 may include a pre-amplifier that is a Low Noise Amplifier (LNA). In some embodiments, the pre-amplifier may also include a transimpedance amplifier (TIA), which converts the current signal to a voltage signal. For linear detector systems, the input-equivalent noise or Noise Equivalent Power (NEP) measures the sensitivity of the photodetector to weak signals, and can therefore be used as an indicator of overall system performance. For example, the NEP of the photodetector specifies the power of the weakest signal that can be detected, which in turn determines the maximum range of the LiDAR system. It should be appreciated that a variety of photodetector optimization techniques may be used to meet the requirements of LiDAR system 300. Such optimization techniques may include selecting different detector structures and materials, and/or implementing signal processing techniques (e.g., filtering, noise reduction, amplification, etc.). For example, in addition to, or instead of, direct detection of a return signal (e.g., using TOF), coherent detection may also be used for the photodetector. Coherent detection allows the amplitude and phase information of the received light to be detected by interfering the received light with a local oscillator. Coherent detection can improve detection sensitivity and noise immunity.
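As an illustration of how NEP bounds sensitivity, the sketch below converts an assumed NEP and receiver bandwidth into a minimum detectable optical power. The numeric values are assumptions; the square-root-of-bandwidth scaling is the standard definition of NEP:

    import math

    # Minimum detectable power = NEP * sqrt(detection bandwidth).
    nep_w_per_sqrt_hz = 1e-13   # assumed 100 fW/sqrt(Hz)
    bandwidth_hz = 100e6        # assumed 100 MHz receiver bandwidth
    min_detectable_w = nep_w_per_sqrt_hz * math.sqrt(bandwidth_hz)
    print(f"weakest detectable signal = {min_detectable_w * 1e9:.0f} nW")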
FIG. 3 further illustrates that LiDAR system 300 includes a steering mechanism 340. As described above, the steering mechanism 340 directs the beam from the emitter 320 to scan the FOV in multiple dimensions. A steering mechanism is sometimes referred to as a raster mechanism or a scanning mechanism. Scanning the light beam in multiple directions (e.g., in both the horizontal and vertical directions) facilitates a LiDAR system mapping the environment by generating an image or a 3D point cloud. The steering mechanism may be based on mechanical scanning and/or solid-state scanning. Mechanical scanning uses a rotating mirror to steer the laser beam, or physically rotates the LiDAR transmitter and receiver (collectively referred to as a transceiver) to scan the laser beam. Solid-state scanning directs the laser beam to various locations through the FOV without mechanically moving any macroscopic component, such as the transceiver. Solid-state scanning mechanisms include, for example, optical phased array based steering and flash LiDAR based steering. In some embodiments, because a solid-state scanning mechanism does not physically move macroscopic components, the steering performed by the solid-state scanning mechanism may be referred to as effective steering. A LiDAR system that uses solid-state scanning may also be referred to as a non-mechanically scanned or simply non-scanning LiDAR system (a flash LiDAR system is an exemplary non-scanning LiDAR system).
Steering mechanism 340 may be used with a transceiver (e.g., emitter 320 and optical receiver and photodetector 330) to scan the FOV and generate an image or a 3D point cloud. As an example, to implement steering mechanism 340, a two-dimensional mechanical scanner may be used with a single-point transceiver or several single-point transceivers. A single-point transceiver transmits a single light beam or a small number of light beams (e.g., 2 to 8 beams) to the steering mechanism. Two-dimensional mechanical steering mechanisms include, for example, polygon mirror(s), oscillating mirror(s), rotating prism(s), rotating tilt mirror(s), or a combination thereof. In some embodiments, steering mechanism 340 may include non-mechanical steering mechanism(s), such as solid-state steering mechanism(s). For example, steering mechanism 340 may be based on tuning the wavelength of the laser in combination with a refractive effect, and/or based on a reconfigurable grating/phased array. In some embodiments, the steering mechanism 340 may implement two-dimensional scanning using a single scanning device, or two-dimensional scanning using a combination of two devices.
As another example, to implement steering mechanism 340, a one-dimensional mechanical scanner may be used with a single point transceiver array or a large number of single point transceivers. In particular, the transceiver array may be mounted on a rotating platform to achieve a 360 degree horizontal field of view. Alternatively, the static transceiver array may be combined with a one-dimensional mechanical scanner. The one-dimensional mechanical scanner includes polygon mirror(s), oscillating mirror(s), rotating prism(s), rotating tilting mirror(s) for obtaining a forward looking horizontal field of view. Steering mechanisms using mechanical scanners can provide robustness and reliability in mass production for automotive applications.
As another example, to implement steering mechanism 340, a two-dimensional transceiver may be used to directly generate a scanned image or a 3D point cloud. In some embodiments, stitching or micro-shifting methods may be used to increase the resolution of the scanned image or scanned field of view. For example, using a two-dimensional transceiver, signals generated in one direction (e.g., the horizontal direction) and signals generated in another direction (e.g., the vertical direction) may be integrated, interleaved, and/or matched to generate a higher or full resolution image or 3D point cloud representing the scanned FOV.
Some embodiments of the steering mechanism 340 include one or more optical redirecting elements (e.g., mirrors or lenses) that direct (e.g., by rotation, vibration, or guidance) the return light signals along the receive path to direct the return light signals to the optical receiver and light detector 330. The optical redirection element that directs the optical signal along the transmit and receive paths may be the same component (e.g., shared), separate components (e.g., dedicated), and/or a combination of shared and separate components. This means that in some cases the transmission and reception paths are different, although they may partially overlap (or in some cases substantially overlap).
Still referring to FIG. 3, LiDAR system 300 further includes control circuitry 350. Control circuitry 350 may be configured and/or programmed to control various parts of LiDAR system 300 and/or to perform signal processing. In a typical system, control circuitry 350 may be configured and/or programmed to perform one or more control operations, including, for example, controlling laser source 310 to obtain a desired laser pulse timing, repetition rate, and power; controlling steering mechanism 340 (e.g., controlling speed, direction, and/or other parameters) to scan the FOV and maintain pixel registration/alignment; controlling the optical receiver and photodetector 330 (e.g., controlling sensitivity, noise reduction, filtering, and/or other parameters) so that it operates in an optimal state; and monitoring overall system health/functional safety status.
Control circuitry 350 may also be configured and/or programmed to perform signal processing on the raw data generated by optical receiver and photodetector 330 to derive range and reflectivity information, and to perform data packaging and communication with vehicle perception and planning system 220 (shown in fig. 2). For example, the control circuitry 350 determines the time taken from transmitting a light pulse to receiving the corresponding return light pulse; determines when no return light pulse has been received for a transmitted light pulse; determines the direction (e.g., horizontal and/or vertical information) of a transmitted/return light pulse; determines an estimated range in a particular direction; and/or determines any other type of data relevant to LiDAR system 300.
LiDAR system 300 may be disposed in a vehicle that may operate in many different environments, including hot or cold weather, rough road conditions that may cause intense vibration, high or low humidity, dusty areas, and the like. Thus, in some embodiments, the optical and/or electronic components of LiDAR system 300 (e.g., the emitter 320, the optical receiver and photodetector 330, and the optics in steering mechanism 340) are arranged or configured in such a manner as to maintain long-term mechanical and optical stability. For example, components in LiDAR system 300 may be fixed and sealed so that they can operate under all conditions that a vehicle may encounter. As an example, a moisture-resistant coating and/or hermetic seal may be applied to the optical components of emitter 320, optical receiver and photodetector 330, and steering mechanism 340 (as well as other components susceptible to moisture). As another example, housing(s), enclosure(s), and/or windows may be used in LiDAR system 300 to provide desired characteristics such as hardness, ingress protection (IP) rating, self-cleaning capability, chemical and impact resistance, and the like. In addition, efficient and economical methods for assembling LiDAR system 300 can be used to meet LiDAR operational requirements while maintaining low cost.
It will be appreciated by those of ordinary skill in the art that fig. 3 and the above description are for illustrative purposes only, and that a LiDAR system may include other functional units, blocks, or segments, and may include variations or combinations of the functional units, blocks, or segments described above. For example, LiDAR system 300 may also include other components not depicted in fig. 3, such as a power bus, power supply, LED indicators, switches, and the like. Additionally, other connections among the components may exist, such as a direct connection between light source 310 and optical receiver and photodetector 330, so that photodetector 330 can accurately measure the time from when light source 310 transmits a light pulse until photodetector 330 detects a return light pulse.
These components shown in fig. 3 are coupled together using communication paths 312, 314, 322, 332, 342, 352, and 362. These communication paths represent communications (bi-directional or uni-directional) among the various LiDAR system components, but need not be the physical components themselves. Although the communication path may be implemented by one or more wires, buses, or optical fibers, the communication path may also be a wireless channel or an open air optical path such that no physical communication medium exists. For example, in one exemplary LiDAR system, communications path 314 includes one or more optical fibers; communication path 352 represents an optical path; and communication paths 312, 322, 342, and 362 are all wires carrying electrical signals. The communication paths may also include more than one of the above types of communication media (e.g., they may include optical fibers and optical paths, or one or more optical fibers and one or more wires).
As described above, some LiDAR systems use the time of flight (TOF) of an optical signal (e.g., a light pulse) to determine the distance to an object in the light path. For example, referring to FIG. 5A, an exemplary LiDAR system 500 includes a laser light source (e.g., a fiber laser), a steering system (e.g., a system of one or more moving mirrors), and a light detector (e.g., a photon detector with one or more optics). LiDAR system 500 may be implemented using, for example, LiDAR system 300 described above. LiDAR system 500 transmits a light pulse 502 along light path 504, as determined by the steering system of LiDAR system 500. In the depicted example, the light pulse 502 generated by the laser light source is a short pulse of laser light. Further, the signal steering system of LiDAR system 500 is a pulsed-signal steering system. However, it should be appreciated that LiDAR systems can operate by generating, transmitting, and detecting non-pulsed light signals, and can use techniques other than time of flight to derive the distance to objects in the surrounding environment. For example, some LiDAR systems use frequency modulated continuous waves (i.e., "FMCW"). It should be further appreciated that any of the techniques described herein with respect to time-of-flight based systems that use pulsed signals may also be applicable to LiDAR systems that do not use one or both of these techniques.
Referring back to FIG. 5A (e.g., illustrating a time-of-flight LiDAR system using light pulses), when light pulse 502 reaches object 506, light pulse 502 is scattered or reflected to generate return light pulse 508. The return light pulse 508 may return to the system 500 along an optical path 510. The time from when transmitted light pulse 502 leaves LiDAR system 500 to when return light pulse 508 returns to reach LiDAR system 500 may be measured (e.g., by a processor or other electronic device within the LiDAR system, such as control circuitry 350). This time of flight, combined with knowledge of the speed of light, can be used to determine the range/distance from the LiDAR system 500 to the portion of the object 506 from which the light pulse 502 was scattered or reflected.
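A minimal sketch of this computation follows; the division by two accounts for the round trip out to the object and back:

    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def tof_range_m(round_trip_time_s: float) -> float:
        """Range from a direct time-of-flight measurement."""
        return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

    # A 1 microsecond round trip corresponds to roughly 150 m.
    print(f"{tof_range_m(1e-6):.0f} m")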
As depicted in FIG. 5B, LiDAR system 500 scans the external environment by directing a number of light pulses (e.g., by directing light pulses 502, 522, 526, 530 along light paths 504, 524, 528, 532, respectively). As depicted in FIG. 5C, LiDAR system 500 receives return light pulses 508, 542, 548 (which correspond to transmitted light pulses 502, 522, 530, respectively). The return light pulses 508, 542, and 548 are generated by one of the objects 506 and 514 scattering or reflecting the transmitted light pulses. Return light pulses 508, 542, and 548 can return to LiDAR system 500 along light paths 510, 544, and 546, respectively. Based on the direction of the transmitted light pulses (as determined by LiDAR system 500) and the calculated distance from LiDAR system 500 to the object portions (e.g., portions of objects 506 and 514) that scattered or reflected the light pulses, the external environment within the detectable range (e.g., including the field of view between paths 504 and 532) can be accurately mapped or plotted (e.g., by generating a 3D point cloud or image).
If no corresponding light pulse is received for a particular transmitted light pulse, it may be determined that there is no object within the detectable range of LiDAR system 500 (e.g., the object is beyond the maximum scanning distance of LiDAR system 500). For example, in fig. 5B, the light pulse 526 may not have a corresponding return light pulse (as illustrated in fig. 5C) because the light pulse 526 may not generate scattering events along its transmission path 528 within a predetermined detection range. LiDAR system 500 or an external system (e.g., a cloud system or service) in communication with LiDAR system 500 may interpret the lack of a return light pulse as no object being disposed along light path 528 within a detectable range of LiDAR system 500.
In fig. 5B, the light pulses 502, 522, 526, and 530 may be transmitted in any order, serially, in parallel, or based on other timing relative to each other. Additionally, although fig. 5B depicts the transmitted light pulses as being directed in one dimension or plane (e.g., paper plane), the LiDAR system 500 may direct the transmitted light pulses along other dimension(s) or plane(s). For example, liDAR system 500 may also direct the transmitted light pulses in a dimension or plane perpendicular to the dimension or plane shown in FIG. 5B, thereby forming a 2-dimensional transmission of light pulses. Such 2-dimensional transmission of the light pulses may be point-by-point, line-by-line, simultaneous or in some other way. A point cloud or image from a 1-dimensional transmission of light pulses (e.g., a single horizontal line) may generate 2-dimensional data (e.g., (1) data from the horizontal transmission direction and (2) range or distance to an object). Similarly, a point cloud or image from 2-dimensional transmission of light pulses may generate 3-dimensional data (e.g., (1) data from a horizontal transmission direction, (2) data from a vertical transmission direction, and (3) range or distance to an object). In general, liDAR systems that perform n-dimensional transmission of light pulses generate (n+1) -dimensional data. This is because the LiDAR system can measure the depth of an object or range/distance to an object, which provides an additional dimension of data. Thus, a 2D scan by a LiDAR system may generate a 3D point cloud that is used to map the external environment of the LiDAR system.
The density of the point cloud refers to the number of measurements (data points) per area performed by the LiDAR system. The point cloud density is related to the LiDAR scanning resolution. Generally, at least for a region of interest (ROI), a greater point cloud density, and thus a higher resolution, is desired. The point density in the point cloud or image generated by the LiDAR system is equal to the number of pulses divided by the field of view. In some embodiments, the field of view may be fixed. Thus, in order to increase the density of points generated by a set of transmit-receive optics (or transceiver optics), a LiDAR system may need to generate pulses more frequently. In other words, a light source having a higher Pulse Repetition Rate (PRR) is required. On the other hand, by generating and transmitting pulses more frequently, the furthest distance that a LiDAR system can detect may be limited. For example, if a return signal from a distant object is received after the system transmits the next pulse, the return signal may be detected in a different order than the order in which the corresponding signals were transmitted, thereby causing ambiguity if the system is unable to properly correlate the return signal with the transmitted signal.
For illustration, consider an exemplary LiDAR system that can transmit laser pulses at a repetition rate between 500 kHz and 1 MHz. Based on the time it takes for a pulse to return to the LiDAR system, and to avoid a return pulse from one transmitted pulse being confused with the return from a consecutive pulse in conventional LiDAR designs, the furthest distance that the LiDAR system can detect is 300 meters at 500 kHz and 150 meters at 1 MHz. The point density of a LiDAR system with a repetition rate of 500 kHz is half that of a LiDAR system with a repetition rate of 1 MHz. Thus, this example shows that, if the system cannot correctly correlate return signals that arrive out of order, increasing the repetition rate from 500 kHz to 1 MHz (and thus increasing the point density of the system) reduces the detection range of the system. Various techniques are used to mitigate the tradeoff between higher PRR and limited detection range. For example, multiple wavelengths may be used to detect objects within different ranges. Optical and/or signal processing techniques are also used to correlate the transmitted optical signals with the return optical signals.
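The sketch below reproduces the range figures in this example and shows the linear point-density scaling; the 10 Hz frame rate and 120° by 30° FOV used for the density estimate are assumptions for illustration:

    SPEED_OF_LIGHT_M_S = 299_792_458.0

    # Maximum unambiguous range: the next pulse should not be fired
    # before the previous pulse's return arrives, so R_max = c / (2 * PRR).
    for prr_hz in (500e3, 1e6):
        r_max_m = SPEED_OF_LIGHT_M_S / (2.0 * prr_hz)
        print(f"PRR {prr_hz / 1e3:.0f} kHz -> R_max = {r_max_m:.0f} m")

    # Point density for an assumed 10 Hz frame rate over a 120 x 30 degree FOV:
    points_per_frame = 1e6 / 10                # 100,000 points per frame at 1 MHz
    density = points_per_frame / (120 * 30)    # roughly 28 points per square degree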
The various systems, apparatus, and methods described herein may be implemented using digital electronic circuitry, or using one or more computers using well known computer processors, memory units, storage devices, computer software, and other components. Generally, a computer includes a processor for executing instructions and one or more memories for storing instructions and data. The computer may also include or be coupled to one or more mass storage devices, such as one or more magnetic disks, internal hard disks and removable magnetic disks, magneto-optical disks, and the like.
The various systems, apparatus, and methods described herein may be implemented using a computer operating in a client-server relationship. Typically, in such systems, the client computers are located remotely from the server computer and interact via a network. The client-server relationship may be defined and controlled by computer programs running on the respective client and server computers. Examples of client computers may include desktop computers, workstations, portable computers, cellular smartphones, tablet computers, or other types of computing devices.
The various systems, apparatus, and methods described herein may be implemented using a computer program product tangibly embodied in an information carrier (e.g., in a non-transitory machine-readable storage device) for execution by a programmable processor; and the method processes and steps described herein (including one or more of the steps of fig. 18) may be implemented using one or more computer programs that may be executed by such processors. A computer program is a set of computer program instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result. A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
A high-level block diagram of an exemplary device that may be used to implement the systems, devices, and methods described herein is illustrated in fig. 6. The apparatus 600 includes a processor 610 operatively coupled to a persistent storage 620 and a main memory 630. The processor 610 controls the overall operation of the device 600 by executing computer program instructions defining such operations. The computer program instructions may be stored in persistent storage 620 or other computer-readable medium and loaded into main memory device 630 when execution of the computer program instructions is desired. For example, the processor 610 may be used to implement one or more of the components and systems described herein, such as the control circuitry 350 (shown in fig. 3), the vehicle perception and planning system 220 (shown in fig. 2), and the vehicle control system 280 (shown in fig. 2). Accordingly, the method steps of fig. 18 may be defined by computer program instructions stored in main memory device 630 and/or persistent storage device 620 and controlled by processor 610 executing the computer program instructions. For example, the computer program instructions may be embodied as computer executable code programmed by one skilled in the art to perform algorithms defined by the method steps described herein. Thus, by executing computer program instructions, the processor 610 executes the algorithm defined by the method of FIG. 18. The apparatus 600 also includes one or more network interfaces 680 for communicating with other devices via a network. The device 600 may also include one or more input/output devices 690 that enable a user to interact with the device 600 (e.g., display, keyboard, mouse, speakers, keys, etc.).
Processor 610 may include both general purpose and special purpose microprocessors, and may be the only processor or one of multiple processors of device 600. The processor 610 may include one or more Central Processing Units (CPUs) and one or more Graphics Processing Units (GPUs), which may, for example, operate separately from and/or perform multitasking with the one or more CPUs to speed up processing, such as for the various image processing applications described herein. Processor 610, persistent storage 620, and/or main memory 630 may include or be supplemented by, or incorporated in, one or more application-specific integrated circuits (ASICs) and/or one or more field-programmable gate arrays (FPGAs).
Persistent storage 620 and main memory 630 each include tangible, non-transitory computer-readable storage media. Persistent storage 620 and main memory device 630 may each include high-speed random access memory, such as Dynamic Random Access Memory (DRAM), static Random Access Memory (SRAM), double data rate synchronous dynamic random access memory (DDR RAM), or other random access solid state memory devices, and may include non-volatile memory, such as one or more magnetic disk storage devices, such as internal hard disks and removable disks, magneto-optical disk storage devices, flash memory devices, semiconductor memory devices, such as erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM), digital versatile disc read-only memory (DVD-ROM) disks, or other non-volatile solid state memory devices.
The input/output device 690 may include peripherals such as printers, scanners, display screens, and the like. For example, input/output devices 690 may include a display device (such as a Cathode Ray Tube (CRT), plasma, or Liquid Crystal Display (LCD) monitor), a keyboard, and a pointing device (such as a mouse or trackball by which a user may provide input to device 600) for displaying information to the user.
Any or all of the functions of the systems and devices discussed herein may be performed by the processor 610 and/or incorporated into a device or system (such as the LiDAR system 300). Further, liDAR system 300 and/or device 600 may utilize one or more neural networks or other deep learning techniques performed by processor 610 or other systems or devices discussed herein.
Those skilled in the art will recognize that an actual computer or implementation of a computer system may have other structures and may also contain other components, and that fig. 6 is a high-level representation of some of the components of such a computer for illustrative purposes.
As used in this disclosure, a "blind spot" may include, but is not limited to, and may be different from, a "blind spot" as the term is used colloquially, which essentially means a "driver blind spot". Driver blind spots are of two types. The first type refers to areas of the road that are outside the driver's field of view and cannot be seen by looking at the rear view mirror and the side view mirrors. This type is referred to as a driver horizontal blind spot. The second type refers to an area whose view is blocked by a structure of the vehicle, such as a pillar or door of the vehicle. This type is referred to as a driver vertical blind spot. Figs. 7A and 7B below illustrate driver blind spots using an example in which a human driver is located at the front-left position of the vehicle interior. It should be appreciated that the driver may also be located at a front-right position within the vehicle. In some embodiments, a human driver may not be present inside the vehicle at all. Therefore, as discussed in more detail below, a blind spot may be defined relative to any particular location within the vehicle, relative to any LiDAR system mounted to or integrated with the vehicle, and/or relative to other components of the vehicle (e.g., rearview mirrors, cameras, radar sensors, etc.). Further, while the illustrations in this disclosure take automobiles or sport utility vehicles as examples, it should be understood that any other type of vehicle may have blind spots, including a boat, aircraft, train, truck, bus, drone, or any device for carrying or transporting something.
Fig. 7A illustrates driver horizontal blind spot areas on a road from a top view. The vehicle 700 has a rear view mirror 702 and two side view mirrors 704. When driving the vehicle 700 on the road, the driver can see objects within the rear-view range 720 by looking at the rear view mirror 702, and objects within the side-view ranges 730 on both sides of the vehicle by looking at the two side view mirrors 704. Due primarily to the orientation and size of the side view mirrors, the side-view ranges are not wide enough to cover all areas on the sides of the vehicle. Thus, without performing a shoulder check, the driver cannot see the driver horizontal blind spot areas 740 on either side of the vehicle. These blind spot areas may be large enough to hide another vehicle (shown as vehicles 780 on both sides of vehicle 700, not to scale), a cyclist, or a pedestrian from the driver's view.
Fig. 7B illustrates a driver vertical blind spot area from a perspective view. A driver (not shown) sitting in the left driver seat cannot see an object in the driver vertical blind spot area 760, because the driver's view is blocked by the side door and the object appears in neither the rear view mirror 702 nor the side view mirrors 704. The object in the driver vertical blind spot area 760 may be, for example, a parked motorcycle or a child playing beside the vehicle. Unless electronic devices (such as a LiDAR system) are used to assist detection, the driver has no effective way, other than remaining vigilant, to thoroughly check for objects hidden in the driver vertical blind spot area 760. Checking the driver vertical blind spot area 760 is important for safe operation of the vehicle, especially when the vehicle is turning or parking.
As used in this disclosure, a "blind spot" refers to one or more areas that are outside the FOV of a particular LiDAR system of a vehicle (such as the vehicle's main LiDAR system). An exemplary primary LiDAR system is shown in FIG. 1 as LiDAR system 110. For example, for a forward-facing primary LiDAR system mounted on top of a vehicle (which has a 120° horizontal FOV and a 30° vertical FOV), the blind spots of the primary LiDAR system cover the remaining 240° of the horizontal FOV and 60° of the vertical FOV (assuming the primary LiDAR system focuses only on 0° to 90° in the vertical FOV). Thus, in some embodiments, the blind spot area of a LiDAR system mounted on a vehicle may cover a significantly larger area than the driver "blind spot" areas described above.
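Using the example numbers in this paragraph, the blind-spot coverage is simply the complement of the primary LiDAR's FOV:

    # Blind spot of the primary LiDAR = complement of its FOV.
    main_h_deg, main_v_deg = 120, 30    # primary-LiDAR FOV from the example
    blind_h_deg = 360 - main_h_deg      # 240 degrees of horizontal blind spot
    blind_v_deg = 90 - main_v_deg       # 60 degrees vertical (0-90 degree span assumed)
    print(blind_h_deg, blind_v_deg)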
In this disclosure, a "stacked configuration" refers to a LiDAR system in which the laser source and the polygon mirror are vertically stacked with respect to each other. A "flat configuration" (or planar configuration) refers to a LiDAR system in which the laser source is placed to the side of the polygon mirror. The stacked configuration can reduce the horizontal asymmetry of the system's field of view (described in more detail below) caused by the flat configuration.
FIG. 8 illustrates a LiDAR system capable of detecting objects in blind spot areas, according to one embodiment. LiDAR system 800 has a stacked configuration for its scanning-based LiDAR assembly. The LiDAR system includes a housing 810 and a scanning-based LiDAR assembly disposed in the housing 810. The scanning-based LiDAR assembly includes a laser source 871, a polygon mirror 820, a window 830, a collimating lens 860, a combiner mirror 850, an opening 852, a receiving lens 840, a light detector 881, a laser circuit board 870, and a detector circuit board 880.
In fig. 8, the polygon mirror 820 has four reflective facets. For each reflective facet, the angle between the facet and the top planar surface is less than 90 degrees. Thus, the facets are "wedge-shaped". This also means that the cross section of the polygon mirror 820 may have a trapezoidal shape. In other embodiments, the number of reflective facets of the polygon mirror may be different than four. The laser source 871 and the polygon mirror 820 are vertically stacked with respect to each other. Specifically, the laser source 871 is positioned on top of the polygon mirror 820. In other embodiments, the laser source 871 may be placed under the polygon mirror 820.
In some embodiments, a laser source 871 on the laser circuit board 870 generates one or more channels of outgoing laser light (in the form of multiple laser beams). The laser beams are directed to a collimating lens 860, which collimates the outgoing beams. One of the outgoing beams is depicted as beam 890. The combiner mirror 850 has one or more openings. Opening 852 allows the outgoing beam 890 to pass through the mirror. Opening 852 may be a cut-out. In other embodiments, opening 852 may be a lens, an optic with an anti-reflective coating, or anything else that allows the outgoing light beam to pass through. The reflective surface of combiner mirror 850 (on the side opposite the laser source 871) redirects return light 895 to the light detector 881 on detector circuit board 880. In one embodiment, opening 852 is located at the center of combiner mirror 850. In other embodiments, the opening 852 may be located in other, non-central portions of the combiner mirror. In yet other embodiments, the opening of the combiner mirror is configured to pass the collected return light to the light detector, and the remainder of the combiner mirror is configured to redirect the multiple light beams of the laser source.
Still referring to fig. 8, the collimated beam is then directed through an opening 852 of the combiner mirror 850 to the polygon mirror 820. In other embodiments, the polygon mirror 820 can have a number of facets other than 4. For example, the polygon mirror 820 can have 3 facets, 5 facets, 6 facets, and so on. The outgoing beam is reflected by one facet of the polygon 820 and directed through a window 830 to illuminate the field of view.
If an object is present in the field of view, the return light is scattered by the object and directed back through window 830 to the facets of polygon 820. One such return light is depicted as return light 895. The return light 895 then travels back to the combiner 850, which directs the return light to the receiving lens 840. The receiving lens 840 focuses the return light to a small spot size. The return light is detected by a light detector 881 on the detector circuit board 880. The light detector 881 may have one or more sensor arrays, where each sensor array has one or more sensor units.
As explained above, in order to detect objects in the blind spot region, the LiDAR system needs both a long detection range and a large vertical FOV. FIG. 9 illustrates, from a perspective view, the vertical FOV of a LiDAR system capable of detecting objects in a blind spot region, according to one embodiment. LiDAR system 900 includes a scanning-based LiDAR assembly 910 and a non-scanning-based LiDAR assembly 920, both enclosed in a single housing 901 to maintain a compact design. In other embodiments, the scanning-based LiDAR assembly 910 and the non-scanning-based LiDAR assembly 920 may have different housings. Window 930 facilitates light transmission to and from assemblies 910 and 920. The scanning-based LiDAR assembly 910 may cover a detection range of, for example, 100 meters, 150 meters, 200 meters, or more, with a 120° horizontal FOV and a relatively small 30° vertical FOV. The non-scanning-based LiDAR assembly 920 may cover a shorter detection range of, for example, 10 meters, 20 meters, 30 meters, or more, but with a 120° horizontal FOV and a larger 70° vertical FOV.
In some embodiments, the scanning-based LiDAR assembly 910 includes a rotating polygon mirror having a plurality of reflective facets. The plurality of facets may have different facet angles, with each facet covering a smaller vertical angle range. The scanning-based LiDAR assembly 910 further includes a transceiver assembly having a plurality of channels. The scanning-based LiDAR assembly may have a one-dimensional sensor array with typical pixel counts of, for example, 1×16, 1×32, 1×64, 1×128, etc. The non-scanning-based LiDAR assembly 920 may include a fixed laser source for illumination and a fixed detector array for detecting return light scattered by near-field objects.
In some embodiments, non-scanning-based LiDAR assembly 920 may be a flash LiDAR system. A flash LiDAR system may have a two-dimensional sensor array with a typical resolution of 320×240 pixels. A flash LiDAR system has a laser source that emits a divergent, two-dimensional planar laser field whose angular range is sufficient to illuminate objects in the FOV with a single pulse. The receiving optics likewise capture the return light in two dimensions. A flash LiDAR system has no moving parts and a higher signal-to-noise ratio; it detects objects at shorter distances but can have a significantly larger vertical FOV than a scanning-based LiDAR system.
In some embodiments, the laser sources of the scanning-based LiDAR assembly 910 and the non-scanning-based LiDAR assembly 920 are configured to generate laser beams at different wavelengths. In one embodiment, the scanning-based LiDAR component 910 generates a laser beam at 905 nm. The non-scanning based LiDAR assembly 920 generates laser light at 940 nm.
The scanning-based LiDAR assembly 910 and the non-scanning-based LiDAR assembly 920 may be adjusted such that their vertical FOVs overlap. Still referring to fig. 9, window 905 facilitates light transmission to and from assemblies 910 and 920. The vertical FOV of scanning-based LiDAR assembly 910 is depicted by region 950. The angular range 951 of the vertical FOV 950 is from -10° to 20° in this example. Within this vertical range, the scanning-based LiDAR assembly 910 can detect distant objects 980 located beyond 100 meters, out to a maximum detection range of 200 meters.
The vertical FOV of non-scanning-based LiDAR assembly 920 is depicted by region 960. The angular range 952 of the vertical FOV 960 is from 15° to 90° in this example. Within this vertical range, non-scanning-based LiDAR assembly 920 may detect near objects 990 up to a maximum detection point 965. In this embodiment, the vertical FOVs of scanning-based LiDAR assembly 910 and non-scanning-based LiDAR assembly 920 overlap by 5° (depicted by region 970), resulting in an overall vertical FOV of LiDAR system 900 of -10° to 90°.
In other embodiments, the vertical FOVs of the two assemblies 910 and 920 do not overlap but are continuous with each other, such that they cover the entire vertical FOV of the LiDAR system 900, which is -10° to 90°. For example, the vertical FOV 950 may have an angular range of -10° to 20°, and the vertical FOV 960 may have an angular range of 20° to 90°. In yet another embodiment, the non-overlapping vertical FOVs of the two assemblies 910 and 920 may not cover the entire vertical FOV (-10° to 90°) of the system 900, i.e., they leave a gap in the vertical FOV of the system 900. For example, the vertical FOV 950 may have an angular range of -10° to 20°, and the vertical FOV 960 may have an angular range of 30° to 90°, leaving a 10° gap between FOV 950 and FOV 960.
In addition, the range of the vertical FOV of the LiDAR system 900 is not limited to 0° to 90°. As explained above, a negative angle in the vertical FOV means that the vertical FOV extends above the horizon, where the horizon is the line drawn horizontally from the LiDAR system. Furthermore, the vertical FOV may cover vertical angles exceeding 90°. A vertical angle of over 90° is useful when the LiDAR system is mounted on a structure protruding from the vehicle body, such as a side view mirror or a support arm for a side view mirror. Referring back to FIG. 7B, a LiDAR system capable of detecting objects in a blind spot area (not shown) may be mounted on the outer edge of the side view mirror 704. A vertical FOV range exceeding 90° covers the region between the plumb (vertical) line dropped from the outer edge of the mirror 704 (not shown in the figure) and the right side of the vehicle body.
Referring back to FIG. 9, an exemplary vertical FOV 960 of the non-scanning-based LiDAR assembly 920 ranges from 15° to 90°, with line 961 depicting the 90° line and line 962 depicting the 15° line. Lines 961 and 962 intersect the ground 966 at points 963 and 964, respectively. In order to detect close objects within a large vertical FOV, the assembly 920 needs to be aimed downward. Thus, the maximum distance that the assembly 920 can detect is either the detection capability of the assembly 920 (which may be up to 30 meters) or the distance from assembly 920 to point 964, whichever is greater. The distance from 920 to 964 can be calculated by dividing the vertical distance between 920 and 963 by the cosine of angle 952. For example, if the angle 952 is 70°, and assuming the vehicle side view mirror on which the assembly 920 is mounted is 1.5 meters above the ground, the furthest distance that the non-scanning-based LiDAR assembly 920 can detect is 1.5 m / cos(70°) ≈ 4.4 m.
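The worked example above follows from simple right-triangle geometry, as in the sketch below (the 1.5 m mounting height and the 70° angle are the example values from this paragraph):

    import math

    # Slant distance to the ground for a sensor at height h whose ray
    # makes angle `a` with the vertical: h / cos(a).
    mount_height_m = 1.5
    angle_from_vertical_deg = 70.0   # angle 952 in the example
    slant_m = mount_height_m / math.cos(math.radians(angle_from_vertical_deg))
    print(f"{slant_m:.1f} m")   # roughly 4.4 m, matching the text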
Still referring to FIG. 9, the scanning-based LiDAR assembly 910 and the non-scanning-based LiDAR assembly 920 are enclosed in a housing 901. The height of the housing 901 may be equal to or less than about 50 mm. The housing 901 may be mounted in a vehicle side view mirror compartment, a side view mirror support structure, or a bumper, fender, or side panel of the vehicle. In some examples, the housing 901 is the side view mirror compartment of the vehicle or a side panel of the vehicle. A window 930 on the housing 901 facilitates transmission of the outgoing light from, and of the return light to, the two assemblies 910 and 920. In some embodiments, the housing 901 may have two or more windows, with at least one window located in front of the scanning-based LiDAR assembly 910 and at least one other window located in front of the non-scanning-based LiDAR assembly 920. Window 905 is positioned at the front of housing 901. In other embodiments, one or more windows may face different directions and may be located on the top, bottom, front, back, or sides of the housing 901.
It should be appreciated that the assemblies 910 and 920 may take any relative position with respect to each other, and that they may be positioned at any location within the housing 901. In the example shown in fig. 9, assembly 920 is positioned within assembly 910. In other embodiments, assemblies 910 and 920 may be side by side or one above the other, or assembly 910 may be located inside assembly 920. In some embodiments, either assembly 910 or assembly 920 may be located at the left, middle, right, top, bottom, front, or rear of housing 901.
FIG. 10 is a block diagram of a LiDAR system capable of detecting objects in blind spot areas, according to one embodiment. LiDAR system 1000 includes two assemblies, namely a scanning-based LiDAR assembly 1010 and a non-scanning-based LiDAR assembly 1020. The scanning-based LiDAR assembly 1010 includes a laser array 1008 and a laser driver 1010 on the transmit side, and a detector array 1002, an amplifier 1004, and an A/D converter 1006 on the receive side. The laser driver 1010 is controlled by control circuitry 1031, which has control functions similar to those of control circuitry 350 of fig. 3. The control circuitry 1031 may be implemented with a field programmable gate array ("FPGA") and/or a system on a chip ("SOC"). The laser array 1008 is driven by the laser driver 1010 and may have laser emitter arrays of 1×8, 2×4, 1×16, 6×8, etc. The laser array 1008 and the laser driver 1010 perform the functions of the laser source 310 in fig. 3.
On the receive side of the scanning-based LiDAR assembly 1010, the detector array 1002 receives the returned scattered light and may be an array of 1×8, 2×4, 1×16, 1×64, etc. In some embodiments, the configuration of the detector array 1002 matches the configuration of the laser array 1008. For example, if laser array 1008 has 4 arrays of 1×8 emitters, detector array 1002 will also have 4 arrays of 1×8 detectors. In other embodiments, the configurations of the detector array and the laser array may be different. The output of the detector array 1002 is an analog signal of the return light pulses, which is amplified by amplifier 1004 and passed to an analog-to-digital (A/D) converter 1006. The output of the A/D converter 1006 is a digital signal of the return light pulses and is forwarded to the control circuitry 1031 for processing.
Still referring to FIG. 10, in some embodiments, the non-scanning-based LiDAR assembly 1020 includes a two-dimensional (2-D) laser emitter 1020 and a laser driver 1022 on the transmit side, and a two-dimensional (2-D) detector array 1024 and a detector conditioning circuit 1026 on the receive side. In one embodiment, the assembly 1020 may be a flash LiDAR system. The laser driver 1022 is controlled by control circuitry 1031 to drive the 2-D laser emitter 1020 to generate 2-D planar rays. In one embodiment, the 2-D laser emitter 1020 may generate a laser field with a typical resolution of 320×240 pixels. On the receive side, the returned scattered light is detected by the 2-D detector array 1024. In some embodiments, the 2-D detector array 1024 has the same resolution as the 2-D laser emitter 1020. In other embodiments, the 2-D detector array 1024 may have a different resolution than the 2-D laser emitter 1020. The optical signals detected by the 2-D detector array 1024 are sent to the detector conditioning circuit 1026 for timing and signal conditioning. The output of detector conditioning circuit 1026 is a digital signal of the return light pulses and is forwarded to control circuitry 1031 for processing.
LiDAR system 1000 also includes a steering mechanism 1032 that functions similarly to steering mechanism 340 in FIG. 3. In some embodiments, steering mechanism 1032 includes a motor driver 1033, a motor 1035, and an encoder 1037. The motor driver 1033 is controlled by the control circuitry 1031 and causes the motor 1035 to rotate in accordance with the rotational speed set by the control circuitry 1031. The motor 1035 is attached to a multi-faceted polygon mirror (described in detail below). Thus, rotation of the motor 1035 will cause the polygon mirror to rotate in the same direction and at the same rotational speed. The encoder 1037 measures the actual rotational speed of the motor 1035 and provides the actual rotational speed of the motor back to the control circuitry 1031 as a feedback signal 1039. The control circuitry 1031 may adjust its control of the motor driver 1033 based on the feedback signal 1039 so that fine adjustments may be made to the rotational speed of the motor 1035.
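As an illustration of the feedback path described above, the sketch below applies a simple proportional correction to the motor command based on the encoder reading. This is a hypothetical control law for illustration only, not the disclosure's actual implementation, and the function name, gain, and rpm values are invented:

    def adjust_motor_command(command_rpm: float, setpoint_rpm: float,
                             measured_rpm: float, kp: float = 0.1) -> float:
        """Hypothetical proportional trim of the motor-driver command."""
        error_rpm = setpoint_rpm - measured_rpm   # from the encoder feedback
        return command_rpm + kp * error_rpm

    # Example: the encoder reports 5940 rpm against a 6000 rpm setpoint.
    command = adjust_motor_command(6000.0, setpoint_rpm=6000.0, measured_rpm=5940.0)
    print(command)   # 6006.0: the command is nudged upward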
The detector array 1002 of the scanning-based LiDAR assembly 1010 is configured to generate signals representing the FOV mapping of the scanning-based assembly. The 2-D detector array 1024 of the non-scanning-based LiDAR assembly 1020 is configured to generate signals representing the FOV mapping of the non-scanning-based assembly. As previously discussed, the LiDAR system 1000 may or may not have overlapping vertical FOVs from the two assemblies 1010 and 1020. Without vertical overlap, the vertical FOV of the scanning-based LiDAR assembly 1010 may be from -10° to 20°, and the vertical FOV of the non-scanning-based LiDAR assembly 1020 may be from 20° to 100°. To generate a complete point cloud covering the data points of both assemblies, the data points from the two assemblies are combined in control circuitry 1031 to generate a unified point cloud. When the FOVs overlap, control circuitry 1031 may select the overlapping data points generated by one assembly and discard the data points generated by the other assembly for the same FOV. In some embodiments, control circuitry 1031 may combine the overlapping data points generated by the two assemblies to produce a higher-quality point cloud.
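One possible way to combine the two point streams is sketched below. This is an illustrative merge policy (keep the scanning assembly's points in the overlap band and discard the other assembly's), not the disclosure's actual algorithm, and the point layout and band limits are assumptions:

    def merge_point_clouds(scan_points, flash_points,
                           overlap_lo_deg=15.0, overlap_hi_deg=20.0):
        """Each point is assumed to be a tuple (vertical_angle_deg, x, y, z)."""
        merged = list(scan_points)   # keep all scanning-assembly points
        for point in flash_points:
            # Discard flash points whose vertical angle falls inside the
            # overlap band already covered by the scanning assembly.
            if not (overlap_lo_deg <= point[0] <= overlap_hi_deg):
                merged.append(point)
        return merged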
As described above, a scanning-based LiDAR assembly with the stacked configuration illustrated in FIG. 8 can reduce the horizontal asymmetry of the system's field of view. FOV horizontal asymmetry can occur if the laser source is placed at the side of the polygon mirror in a scan-based LiDAR assembly (a "flat" configuration). In the flat configuration, the outgoing light is directed onto the reflective surfaces of the polygon mirror from the side. FIG. 11 illustrates a scan-based LiDAR assembly having a flat configuration. The scanning-based LiDAR assembly 1100 includes a laser source 1171, a polygon mirror 1120, a collimating lens 1160, a combiner mirror 1150, an opening 1152, a receiving lens 1140, a light detector 1181, a laser circuit board 1170, and a detector circuit board 1180.
The scan-based LiDAR assembly 1100 in FIG. 11 is very similar to the scan-based LiDAR assembly in FIG. 8, which has a stacked configuration: in FIG. 8 the light source is placed on top of the polygon mirror, while in FIG. 11 the light source is placed at the side of the polygon mirror. A laser source 1171 on the laser circuit board 1170 emits laser light. The outgoing light beam (e.g., 1190) is directed to a collimating lens 1160, which collimates it. The collimated outgoing beam is then directed through an opening 1152 of the combiner mirror 1150 and onto one of the reflective facets of the polygon mirror 1120. Similar to the polygon mirror 820 of FIG. 8, the polygon mirror 1120 has four wedge-shaped reflective facets. The outgoing beam is directed to the field of view by one of the four reflective facets of the polygon mirror 1120. Return light (e.g., 1195) scattered back by objects in the FOV reaches the same facet of the polygon mirror 1120 and travels back to the combiner mirror 1150, which guides it to the receiving lens 1140. The receiving lens 1140 focuses the return light to a small spot size, which is detected by a light detector 1181 on the detector circuit board 1180.
The flat configuration of FIG. 11 may cause FOV horizontal asymmetry. The polygon mirror 1120 rotates about an axis 1121. When the polygon mirror 1120 rotates clockwise, each facet that reflects the outgoing beam first moves toward the laser beam and then away from it. When the facet moves toward the laser beam (forming one end of the horizontal FOV), the angle of incidence of the light is small, so the outgoing beam can cover a larger vertical angle range. When the facet moves away from the incident light (toward the other end of the horizontal FOV), the vertical reflection angle changes less, narrowing the vertical FOV range. FOV horizontal asymmetry may or may not be desirable depending on the application of the LiDAR system. For example, asymmetry may be undesirable if the LiDAR system is configured to primarily scan the forward direction of the vehicle's field of view. However, if the LiDAR system is configured to scan a lateral direction of the field of view (e.g., when the vehicle turns), asymmetry may be acceptable or even desirable.
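A rough numerical sketch of this geometry follows. It assumes a horizontal incoming beam, a polygon spinning about a vertical axis, ideal mirror reflection, and illustrative wedge angles and sweep limits, none of which are taken from this disclosure. It shows that the vertical span contributed by the facet tilts scales with the cosine of the facet's rotation away from head-on incidence, so a side-mounted laser sees a wide vertical span at one end of the horizontal sweep and a compressed span at the other.

import math

def band_elevation(tilt_deg, phi_deg):
    # Elevation (deg) of a horizontal beam reflected by a facet tilted
    # tilt_deg from vertical, with the facet rotated phi_deg away from
    # head-on incidence about a vertical axis: r_z = sin(2*tilt)*cos(phi).
    t, phi = math.radians(tilt_deg), math.radians(phi_deg)
    return math.degrees(math.asin(math.sin(2 * t) * math.cos(phi)))

tilts = [5.0, 10.0, 15.0, 20.0]   # assumed wedge angles, one per facet
# A side-mounted laser makes the sweep asymmetric about head-on
# incidence; assume the horizontal FOV maps to phi in [-5, 115] degrees.
for phi in (-5.0, 115.0):
    elevations = [band_elevation(t, phi) for t in tilts]
    span = max(elevations) - min(elevations)
    print(f"phi = {phi:+6.1f} deg -> combined vertical span {span:4.1f} deg")
# Prints roughly 29.9 deg at one sweep end and 11.6 deg at the other.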
FIG. 12A illustrates the FOV of a scan-based LiDAR assembly having a flat configuration. The scanning-based LiDAR assembly 1100 with the flat configuration includes an emitter 1171 that emits 4 outgoing laser channels. The FOV of each laser channel is shown as 1201, 1202, 1203, and 1204 in FIG. 12A. The ideal FOV of 130° horizontal by 70° vertical is shown as rectangle 1205. Due to the flat configuration (i.e., the emitter is located at the side of the polygon mirror), the vertical coverage is compressed toward one end of the horizontal FOV as the polygon facet moves away from the emitter. Thus, at the left end of the FOV, the combined vertical coverage of all four channels is only 40° to 45°, while at the right end it may reach 80° to 90°. The difference in vertical coverage between the two ends of the FOV is about 40° to 45°, which is half of the vertical coverage at the right end. Some equalization can be achieved by fine-tuning the facet angles and position of the polygon mirror. However, if the emitter is located at the side of the polygon mirror, a large asymmetry across the horizontal FOV may still exist.
FIG. 12B illustrates the FOV of a scan-based LiDAR assembly with a stacked configuration. The scanning-based LiDAR assembly 800 with the stacked configuration includes an emitter 871 that emits 4 outgoing laser channels. The FOV of each laser channel is shown as 1205, 1206, 1207, and 1208 in FIG. 12B. The ideal FOV of 120° horizontal by 70° vertical is shown as rectangle 1209. In the stacked configuration, the emitter is positioned vertically with respect to the polygon mirror. When the polygon mirror rotates, the reflective facets do not move toward or away from the laser beam as they do in the flat configuration, so the FOV horizontal asymmetry effect of the flat configuration can be reduced or avoided.
As illustrated in FIG. 12B, the scanning-based LiDAR assembly 800 with the stacked configuration shows a symmetrical scanning pattern in the horizontal direction. Referring back to FIG. 8, in the stacked configuration the emitter 871 is located on top of the polygon mirror 820. This configuration leaves enough space under the polygon mirror 820 to achieve a large vertical FOV in the downward direction. The stacked configuration also allows vertical scanning beams of up to +15° to be directed out to the field of view unobstructed. With optimization of the polygon facet angles, the stacked configuration may achieve a vertical FOV coverage of 70° or greater and a horizontal FOV coverage of 120° or greater.
FIG. 13 illustrates a cross-sectional view of a scanning-based LiDAR assembly having a stacked configuration, according to one embodiment. The scanning-based LiDAR assembly 1300 having a stacked configuration includes a polygon 1320, a receiving lens 1340, a combiner mirror 1350, a collimating lens 1360, a laser circuit board 1370, and a detector circuit board 1380.
In some embodiments, a laser source 1371 on the laser circuit board 1370 generates one or more channels of outgoing laser light (in the form of multiple laser beams). The laser beams are directed to a collimating lens 1360, which collimates them. One of the outgoing beams is depicted as beam 1390. The combiner mirror 1350 has one or more openings. Opening 1352 allows the outgoing light beam 1390 to pass through the mirror. Opening 1352 may be a cutout. In other embodiments, opening 1352 may be a lens, an optic with an anti-reflective coating, or anything else that allows the outgoing light beam to pass through. The reflective surface of the combiner mirror 1350 (on the side opposite the laser source 1371) redirects the return light to a light detector 1381 on a detector circuit board 1380. In one embodiment, the opening 1352 is located at the center of the combiner mirror 1350. In other embodiments, the opening 1352 may be located in other, non-central portions of the combiner mirror. In yet other embodiments, the opening of the combiner mirror is configured to pass the collected return light to the light detector, and the remainder of the combiner mirror is configured to redirect the plurality of light beams from the laser source.
Still referring to FIG. 13, the collimated light beam (e.g., beam 1390) is directed through the opening 1352 of the combiner mirror 1350 and is then redirected to the polygon mirror 1320. In other embodiments, the outgoing beam from the laser source 1371 may be redirected by one or more intermediate mirrors before reaching the polygon mirror 1320. In some embodiments, the polygon mirror 1320 may have multiple facets, e.g., 3 facets, 4 facets, 5 facets, 6 facets, and so on. The outgoing beam is reflected by the facets of the polygon mirror 1320, which rotates about an axis 1321. As the polygon mirror 1320 rotates, each of the plurality of facets in turn reflects the outgoing beams and directs them through the window 1330 to illuminate the field of view.
If an object is present in the field of view, the return light is scattered by the object and directed back through the window 1330 to the facets of the polygon mirror 1320. One such return light beam is depicted as 1395. The return light then travels back to fold mirror 850, which directs it to the reflective surface of the combiner mirror 1350. The combiner mirror 1350 then directs the return light to a receiving lens 1340, which focuses it to a small spot size. The return light is then directed to and detected by a detector array 1381 on the detector circuit board 1380.
In some embodiments, the multi-faceted polygon mirror 1320 is a variable angle multi-faceted polygon (VAMFP). FIG. 14A illustrates a perspective view of a variable angle multi-faceted polygon, according to one embodiment. FIG. 14B illustrates a side view of each facet of a variable angle multi-faceted polygon, according to one embodiment. FIG. 14C illustrates a LiDAR system FOV with the combined bands produced by the plurality of facets of the VAMFP, according to one embodiment. The VAMFP is described in more detail in U.S. non-provisional patent application No. 16/837,429, entitled "Variable Angle Polygon For Use With A LiDAR System," filed April 1, 2020, the contents of which are incorporated herein by reference in their entirety for all purposes.
Returning to FIG. 14A, the variable angle multi-faceted polygon 1400 rotates about an axis 1410. VAMFP 1400 may include 4 reflective surfaces (facets). As discussed herein, each facet may be referenced by its index (i.e., facets 0, 1, 2, and 3) or by its reference number (i.e., facets 1420, 1421, 1422, and 1423, respectively). Light source 1430 (which is similar to laser source 1371 in FIG. 13) generates a plurality of laser beams 1430a-1430c. Beams 1430a-1430c are aimed toward one of the 4 facets of VAMFP 1400 by a collimating lens or lens group (not shown). As VAMFP 1400 rotates about axis 1410, light source 1430 faces each of facets 1420, 1421, 1422, and 1423 in repeated succession. The beams redirected by each facet are depicted as beams 1430ax-1430cx, where x is the index number of the facet reflecting the beams. For example, as illustrated in FIG. 14A, the individual beams 1430a-1430c redirected by facet 3 (or facet 1423) are depicted as 1430a3, 1430b3, and 1430c3. As illustrated in FIG. 14B, the beams redirected by facet 0 (or facet 1420) are depicted as 1430a0, 1430b0, and 1430c0.
FIG. 14B illustrates side views of facet 1420 (upper left sub-figure), facet 1421 (upper right sub-figure), facet 1422 (lower left sub-figure), and facet 1423 (lower right sub-figure). Each of facets 1420, 1421, 1422, and 1423 has its own unique facet angle (shown as θ0-θ3, respectively). The facet angle of a facet is the angle between the facet surface and the top planar surface of the polygon 1400. Facet 1420 corresponds to facet angle θ0, facet 1421 to facet angle θ1, facet 1422 to facet angle θ2, and facet 1423 to facet angle θ3. In one embodiment, the facet angles of the polygon 1400 are all 90 degrees. In other embodiments (such as those shown in FIGS. 14A and 14B), the facet angle of each facet of the polygon 1400 is less than 90 degrees, forming wedge-shaped facets. The cross section of the polygon mirror 1400 may then have a trapezoidal shape. FIG. 14B shows that the individual beams 1430a-1430c are redirected differently by the different facets 1420-1423.
The facet angle of each facet corresponds to a vertical extent of the scan, and the vertical extent of at least one facet differs from that of the other facets. FIG. 14C shows an illustrative LiDAR system FOV 1470 having four non-overlapping bands 1480-1483, each band corresponding to a separate FOV produced by one of facets 1420-1423 and its respective facet angle θ0-θ3. The FOV 1470 also shows the redirected beams 1430a0-1430c0, 1430a1-1430c1, 1430a2-1430c2, and 1430a3-1430c3 in the respective bands 1480-1483. Each of the bands 1480-1483 spans the entire horizontal axis of the FOV 1470 and occupies a subset of its vertical axis. The facet angles θ0-θ3 may be selected such that the bands 1480-1483 cover the entire FOV of the LiDAR system and are contiguous, adjoining one another. In other embodiments, the bands may be discontinuous, leaving gaps between them. In still other embodiments, two or more bands may overlap each other.
Each facet angle may differ from the others, and the differences between facet angles may be constant or variable. In some embodiments, the facet angles are 2.5 to 5 degrees apart, such that the total vertical range of the scan is about 20 to 40 degrees. For example, in one embodiment, the facet angles are 4 degrees apart: θ0 is 60°, θ1 is 64°, θ2 is 68°, and θ3 is 72°. In other embodiments, the facet angles are 9 degrees apart, resulting in a total vertical scan range of about 72 degrees.
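The arithmetic behind these ranges can be stated compactly: a mirror deflects a beam by twice any change in its angle, so facets spaced s degrees apart produce bands about 2s degrees tall, and n facets together cover roughly n x 2s degrees. The short sketch below reproduces the figures quoted above.

def total_vertical_range(n_facets, spacing_deg):
    # Each facet's band is 2 * spacing_deg tall (mirror angle doubling),
    # and the n bands tile one above another.
    return n_facets * 2 * spacing_deg

for spacing in (2.5, 5.0, 9.0):
    print(f"4 facets spaced {spacing} deg apart -> "
          f"{total_vertical_range(4, spacing):.0f} deg total vertical range")
# Prints 20, 40, and 72 degrees, matching the ranges quoted above.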
It should be understood that the use of four facets and three beams for VAMFP 1400 in FIG. 14C is merely illustrative. A VAMFP may have any number of facets, and any number of beams may be used.
FIG. 15A illustrates a vertical cavity surface emitting laser (VCSEL) chip with a 1 x 8 emitter array. VCSEL chip 1510 can be used as laser sources 871, 1171, and 1371 depicted in various embodiments of the present disclosure. A VCSEL is a type of semiconductor laser diode that emits its laser beam perpendicular to the chip's top surface. Because a VCSEL emits from the top surface of the chip, it can be tested at the wafer level before being diced into individual devices, which reduces manufacturing cost. With a larger output aperture, a VCSEL produces an output beam with a lower divergence angle and potentially high coupling efficiency into an optical fiber.
The VCSEL chip 1510 has a 1 x 8 array of emitter regions arranged in a row (starting from the first emitter region 1514) at the center of the chip. Each emitter region has a plurality of miniature VCSEL emitters (depicted as small circles inside each emitter region). Each emitter region corresponds to a laser channel and can be individually turned on and off; when an emitter region is switched, all the miniature VCSEL emitters in that region turn on or off together. An emitter region may be connected to one or more electrodes, and an electrode may control one emitter region or a group of them by switching the region(s) on or off. The electrodes may be of several different types, e.g., anode, cathode, etc. All emitter regions on a VCSEL chip may share a common electrode; in other embodiments, multiple emitter regions on a VCSEL chip may be connected to more than one electrode. Each electrode may have one or more pads. As illustrated in FIG. 15A, the emitter region 1514 is connected to two pads 1512 and 1513 of the same electrode. In other embodiments, an emitter region may be connected to only one pad, or to more than two pads.
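The per-region channel control described above can be modeled with a short sketch. The class below is purely illustrative: the 1 x 8 layout matches chip 1510, but the class, its methods, and the region indexing are assumptions, not an API of any actual driver.

class VCSELChip:
    def __init__(self, n_regions=8):
        self.enabled = [False] * n_regions   # one flag per emitter region

    def set_region(self, index, on):
        # Switching a region turns every miniature emitter in it on or
        # off together, as described above.
        self.enabled[index] = on

    def active_channels(self):
        return [i for i, on in enumerate(self.enabled) if on]

chip = VCSELChip()
chip.set_region(0, True)       # e.g., the first region (1514 in FIG. 15A)
chip.set_region(3, True)
print(chip.active_channels())  # [0, 3]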
FIG. 15B illustrates an array of six VCSEL chips, each having a 1 x 8 array of emitter regions. The chip array 1530 has 6 VCSEL chips 1510, providing a total of 48 (i.e., 6 x 8) laser channels. Chip array 1530 may be used as laser sources 871, 1171, and 1371 depicted in various embodiments of the present disclosure. The individual VCSEL chips in the chip array 1530 can be staggered in different ways to achieve different outgoing beam configurations.
It should be understood that the use of a 1 x 8 emitter array in VCSEL chip 1510 and six VCSEL chips in chip array 1530 is merely illustrative. A VCSEL chip can have an emitter-region array with any number of rows and columns, e.g., 1 x 8, 2 x 4, 1 x 16, etc. In addition, any number of VCSEL chips can be used to form a VCSEL chip array of any number of rows and columns, and the VCSEL chips may be staggered in any layout within the chip array. For example, the chip array 1530 may have 4, 8, 12, 16, or any other number of VCSEL chips staggered in any layout. In some embodiments, the total number of VCSEL emitter regions of the LiDAR system's laser source is substantially equal to the total number of sensor units in the sensor array of the LiDAR system's light detector. In other embodiments, the two totals may be substantially different.
The number of emitter regions per VCSEL chip and the number of VCSEL chips in the chip array directly determine the vertical FOV coverage of each facet of the polygon mirror. Referring to FIGS. 14A-14C and 15B, and taking chip array 1530 as an example, the 48 laser channels of chip array 1530 are directed simultaneously onto one of the facets of polygon 1400. Assuming an angular resolution of 0.4° is to be achieved, the vertical angle range of each facet (1420-1423) will be 0.4° x 48 = 19.2°. As explained above, each facet produces its respective vertical FOV (corresponding to bands 1480-1483 in FIG. 14C). Thus, each of the 4 bands in FIG. 14C covers a vertical angle of 19.2°, resulting in a total vertical coverage of the entire FOV 1470 of 19.2° x 4 = 76.8°.
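The same coverage arithmetic, restated as a sketch using the numbers given in this paragraph (48 channels, 0.4° resolution, four facets):

def facet_vertical_range(n_channels, resolution_deg):
    # Channels hitting a facet simultaneously, times the per-channel
    # angular resolution, give that facet's vertical coverage.
    return n_channels * resolution_deg

channels = 6 * 8                             # chip array 1530: six 1 x 8 chips
per_facet = facet_vertical_range(channels, 0.4)
print(f"{per_facet:.1f} deg per facet")      # 19.2 degrees
print(f"{per_facet * 4:.1f} deg total FOV")  # 76.8 degrees over 4 bands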
FIG. 16A illustrates a single-collimating-lens configuration of a scanning-based LiDAR assembly. The collimating lens 1620 may be used as collimating lenses 860, 1160, and 1360 depicted in various embodiments of the present disclosure. The collimating lens 1620 may be a meniscus lens having two convex, cylindrical or substantially cylindrical optical surfaces, one on either side of the lens. Four laser emitters 1601-1604 are shown as the laser source of the LiDAR system. The laser emitters 1601-1604 emit four channels of outgoing laser beams 1611-1614, respectively, which are collimated by the collimating lens 1620. The use of a meniscus lens can extend the horizontal FOV of a LiDAR system beyond 120°.
FIG. 16B illustrates a collimating lens group configuration of a scanning-based LiDAR assembly. Better collimation of more laser channels can be achieved using a group of collimating lenses than with the single collimating lens of FIG. 16A. The collimating lens group 1600 includes, for example, two concave lenses 1630 and 1650 and two meniscus lenses 1640 and 1660. The collimating lens group 1600 may be used in place of the collimating lenses 860, 1160, and 1360 depicted in the various embodiments of the present disclosure. Three laser emitters 1605-1607 are shown as the laser source of the LiDAR system. The laser emitters 1605-1607 emit three channels of outgoing laser beams 1615-1617, respectively. After passing through the collimating lens group 1600, the outgoing laser beams 1615-1617 are substantially collimated when they reach the facets of the polygon mirror 1680.
It should be understood that the depiction of 4 laser channels in FIG. 16A and 3 laser channels in FIG. 16B is merely illustrative. The single collimating lens 1620 in FIG. 16A and the collimating lens group 1600 in FIG. 16B may accommodate any number of laser channels. Further, the lens group shown in FIG. 16B may be modified to include a greater or lesser number of lenses, different types of lenses, and/or a different ordering of lenses.
FIG. 17A illustrates a single-receiving-lens configuration of a scanning-based LiDAR assembly. The receiving lens 1720 may be used as receiving lenses 840, 1140, and 1340 depicted in various embodiments of the disclosure. The receiving lens 1720 is a meniscus lens with two convex, cylindrical or substantially cylindrical optical surfaces, one on either side of the lens. The four channels of return light 1711-1714, when passing through the receiving lens 1720, are focused by it to small spots 1701-1704, respectively. The return light is then detected by corresponding light detectors located at the small spots 1701-1704 on the detector circuit board 1780.
FIG. 17B illustrates a receiving lens group configuration of a scanning-based LiDAR assembly. Better focusing of more laser channels can be achieved using a group of receiving lenses than with the single receiving lens of FIG. 17A. Receiving lens group 1700 includes lenses 1730, 1740, 1750, and 1760, and may be used in place of the receiving lenses 840, 1140, and 1340 depicted in various embodiments of the disclosure. When passing through the receiving lens group 1700, the three channels of return light 1715-1717 are focused by the lens group to spots 1705-1707, respectively. The return light is then detected by corresponding light detectors located at the small spots 1705-1707 on the detector circuit board 1780.
It should be understood that the depiction of 4 laser channels in FIG. 17A and 3 laser channels in FIG. 17B is merely illustrative. The single receiving lens 1720 in FIG. 17A and the receiving lens group 1700 in FIG. 17B may accommodate any number of laser channels. Further, the lens group shown in FIG. 17B may be modified to include a greater or lesser number of lenses, different types of lenses, and/or a different ordering of lenses.
FIG. 18 is a flowchart illustrating a method for detecting an object in a blind spot region. In some embodiments, the method 1800 may be performed by the LiDAR system 800 in FIG. 8. The method 1800 includes steps 1810 to 1860. At step 1810, a first light source provides a plurality of light beams. At step 1820, one or more collimating lenses (e.g., the collimating lenses described above in FIGS. 8 and 13) collimate the plurality of light beams provided by the first light source; the one or more collimating lenses are optically coupled to the first light source. At step 1830, a multi-faceted polygon (e.g., the polygon mirrors described above in FIGS. 8 and 13) scans the plurality of light beams to illuminate a first FOV; the multi-faceted polygon is rotatable and disposed below the first light source. At step 1840, one or more receiving lenses collect return light generated based on the illumination of the first FOV. At step 1850, a combiner mirror, disposed between the collimating lens and the receiving lens, directs both the plurality of light beams provided by the first light source and the collected return light. At step 1860, a light detector receives the collected return light.
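For reference, the steps of method 1800 can be restated as the schematic Python skeleton below. Every object and method name is an illustrative placeholder rather than an API from this disclosure, and the skeleton is meant to show the ordering of the steps, not to be executed as-is.

def method_1800(light_source, collimating_lenses, polygon, receiving_lenses,
                combiner_mirror, light_detector):
    beams = light_source.provide_beams()            # step 1810
    beams = collimating_lenses.collimate(beams)     # step 1820
    beams = combiner_mirror.pass_through(beams)     # step 1850, transmit path
    polygon.scan(beams)                             # step 1830: illuminate first FOV
    returns = receiving_lenses.collect()            # step 1840
    returns = combiner_mirror.redirect(returns)     # step 1850, receive path
    light_detector.receive(returns)                 # step 1860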
The foregoing description is to be understood as being in every respect illustrative and exemplary, but not restrictive, and the scope of the invention disclosed herein is to be determined not from the description, but from the claims, as interpreted according to the full breadth permitted by the patent laws. It will be understood that the embodiments shown and described herein are merely illustrative of the principles of this invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention. Various other combinations of features may be implemented by those skilled in the art without departing from the scope and spirit of the invention.

Claims (27)

1. A light detection and ranging (LiDAR) system for detecting objects in a blind spot region, comprising:
a housing; and
a scanning-based LiDAR assembly disposed in the housing, the scanning-based LiDAR assembly comprising:
a first light source configured to provide a plurality of light beams,
a multi-faceted polygon rotatable to scan the plurality of light beams to illuminate a first field of view (FOV), the multi-faceted polygon and the first light source being vertically stacked,
one or more collimating lenses optically coupled to the first light source, the collimating lenses configured to collimate the plurality of light beams provided by the first light source,
one or more collection lenses configured to collect return light generated based on the illumination of the first FOV, and
a light detector configured to receive the collected return light.
2. The LiDAR system of claim 1, wherein the first light source is vertically stacked on top of the multi-faceted polygon.
3. The LiDAR system of any of claims 1 to 2, wherein the first light source is positioned to emit the plurality of light beams in a direction toward the multi-faceted polygon.
4. The LiDAR system of any of claims 1 to 3, further comprising:
a non-scanning based LiDAR assembly disposed in the housing, the non-scanning based LiDAR assembly configured to transmit laser light to illuminate a second FOV without scanning.
5. The LiDAR system of any of claims 1 to 4, wherein the scan-based LiDAR assembly further comprises: a combiner mirror disposed between the one or more collimating lenses and the one or more collecting lenses.
6. The LiDAR system of any of claims 1 to 5, wherein the multi-faceted polygon is a variable-angle multi-faceted polygon (VAMFP), the VAMFP comprising a plurality of facets, each facet having a facet angle, the facet angle of each facet corresponding to a vertical range of scanning, wherein the vertical range of at least one facet is different from the vertical ranges of the other facets.
7. The LiDAR system of claim 6, wherein the VAMFP comprises four facets whose facet angles are about 9 degrees apart, wherein the facet angles of the four facets are configured such that a total vertical range of scanning across all four facets is about 72 degrees.
8. The LiDAR system of any of claims 6 to 7, wherein the plurality of vertical ranges of all of the facets are non-overlapping vertical ranges.
9. The LiDAR system of any of claims 6 to 8, wherein at least two vertical ranges of the plurality of facets are overlapping vertical ranges.
10. The LiDAR system of claim 4, wherein the non-scanning based LiDAR assembly comprises a flash LiDAR device configured to illuminate the second FOV simultaneously with a single pulse of light.
11. The LiDAR system of any of claims 4 and 10, wherein the first light source comprises a first laser source configured to provide the plurality of light beams at a first wavelength; wherein the non-scanning based LiDAR assembly includes a second laser source configured to provide laser light at a second wavelength, the second wavelength being different from the first wavelength.
12. The LiDAR system of any of claims 1 to 11, wherein the first light source comprises a plurality of vertical cavity surface emitting laser (VCSEL) arrays, each VCSEL array having a plurality of VCSEL emission regions.
13. The LiDAR system of claim 12, wherein the light detector comprises a plurality of sensor arrays, each sensor array having a plurality of sensor units, wherein a total number of the sensor units is substantially equal to a total number of the VCSEL emission regions.
14. The LiDAR system of claim 5, wherein the combiner mirror comprises:
a first portion configured to allow the plurality of light beams from the first light source to pass through; and
a second portion configured to redirect the collected return light to the light detector.
15. The LiDAR system of claim 14, wherein the first portion comprises a cutout opening.
16. The LiDAR system of any of claims 14 to 15, wherein the first portion is a central portion of the combiner mirror and the second portion is a portion of the combiner mirror other than the central portion.
17. The LiDAR system of claim 5, wherein the combiner mirror comprises:
a first portion configured to allow the collected return light to pass through to the light detector; and
a second portion configured to redirect the plurality of light beams from the first light source.
18. The LiDAR system of any of claims 1 to 17, wherein the housing further comprises:
one or more windows mounted to or integrated with the housing, wherein the one or more windows are configured to facilitate scanning of the plurality of light beams by the scanning-based LiDAR assembly to illuminate the first FOV.
19. The LiDAR system of any of claims 4 and 10 to 11, wherein the housing comprises:
one or more windows mounted to or integrated with the housing, wherein the one or more windows are configured to:
facilitate passage of the plurality of light beams scanned by the scanning-based LiDAR assembly to illuminate the first FOV, and
facilitate passage of the laser light transmitted by the non-scanning based LiDAR assembly to illuminate the second FOV.
20. The LiDAR system of any of claims 4, 10 to 11, and 19, wherein the non-scanning based LiDAR assembly is configured to transmit divergent laser light having an angular extent sufficient to illuminate the entire second FOV in a single pulse.
21. The LiDAR system of any of claims 4, 10-11, and 19-20, wherein the scan-based LiDAR assembly comprises a first sensor array configured to generate a signal representing a mapping of the first FOV; and
wherein the non-scanning based LiDAR assembly comprises a second sensor array configured to generate a signal representative of a mapping of the second FOV.
22. The LiDAR system of claim 21, further comprising processing circuitry configured to generate a unified point cloud representing both the first FOV and the second FOV based on the signal representing the mapping of the first FOV and the signal representing the mapping of the second FOV, wherein the first FOV and the second FOV at least partially overlap.
23. The LiDAR system of any of claims 1 to 22, wherein a height of the LiDAR system is equal to or less than about 50 mm, or the LiDAR system is configured to be mountable in at least one of: a vehicle side view mirror, a support structure of a vehicle side view mirror, or a vehicle fender.
24. A method performed by a light detection and ranging (LiDAR) system for detecting objects in a blind spot region, the method comprising:
providing, by a first light source, a plurality of light beams;
collimating, by one or more collimating lenses optically coupled to the first light source, the plurality of light beams provided by the first light source;
scanning, by a multi-faceted polygon, the plurality of light beams to illuminate a first field of view (FOV), the multi-faceted polygon being rotatable and disposed below the first light source;
collecting, by one or more receiving lenses, return light generated based on the illumination of the first FOV;
directing, by a combiner mirror disposed between the collimating lenses and the receiving lenses, both the plurality of light beams provided by the first light source and the collected return light; and
receiving, by a light detector, the collected return light.
25. The method of claim 24, further comprising:
simultaneously illuminating a second FOV by transmitting laser light, without scanning, from a non-scanning based LiDAR assembly.
26. The method of claim 25, further comprising:
generating, by a first sensor array of the light detector, a signal representative of a mapping of the first FOV; and
generating, by a second sensor array of the non-scanning based LiDAR assembly, signals representing a mapping of the second FOV.
27. The method of claim 26, further comprising: generating, by processing circuitry, a unified point cloud representing both the first FOV and the second FOV based on the signals representing the mapping of the first FOV and the signals representing the mapping of the second FOV, wherein the first FOV and the second FOV at least partially overlap.

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
US202163273802P 2021-10-29 2021-10-29
US63/273,802 2021-10-29
US202163292404P 2021-12-21 2021-12-21
US63/292,404 2021-12-21
US17/975,539 2022-10-27
US17/975,539 US20230136272A1 (en) 2021-10-29 2022-10-27 Compact lidar systems for detecting objects in blind-spot areas
US17/975,543 US20230138819A1 (en) 2021-10-29 2022-10-27 Compact lidar systems for detecting objects in blind-spot areas
US17/975,543 2022-10-27
PCT/US2022/048294 WO2023076635A1 (en) 2021-10-29 2022-10-28 Compact lidar systems for detecting objects in blind-spot areas

Publications (1)

Publication Number Publication Date
CN118235061A 2024-06-21

Family

ID=91539579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280073052.4A Pending CN118235061A (en) 2021-10-29 2022-10-28 Compact LiDAR system for detecting objects in blind spot areas

Country Status (2)

Country Link
EP (1) EP4423532A1 (en)
CN (1) CN118235061A (en)

Also Published As

Publication number Publication date
EP4423532A1 (en) 2024-09-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination