
WO2020139385A1 - Systems and methods for vehicle identification - Google Patents

Systems and methods for vehicle identification

Info

Publication number
WO2020139385A1
WO2020139385A1 PCT/US2018/067988
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
individual
vehicles
identifications
information
Prior art date
Application number
PCT/US2018/067988
Other languages
French (fr)
Inventor
Xiaoyong Yi
Liwei Ren
Jiang Zhang
Original Assignee
Didi Research America, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Didi Research America, Llc filed Critical Didi Research America, Llc
Priority to PCT/US2018/067988 priority Critical patent/WO2020139385A1/en
Priority to CN201880100552.6A priority patent/CN113366548A/en
Publication of WO2020139385A1 publication Critical patent/WO2020139385A1/en

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/584 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G 1/0112 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/12 Detection or prevention of fraud
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/60 Context-dependent security
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]

Definitions

  • the disclosure relates generally to vehicle identification.
  • ALPR automated license plate reader
  • An autonomous vehicle may be equipped with a set of sensors configured to generate output signals conveying information about the surroundings of the autonomous vehicle.
  • a sensor may include an image sensor configured to generate output signals conveying image information defining images of a surrounding environment. The images may be used to identify vehicles present in the environment and their locations.
  • the autonomous vehicle may be part of a fleet of autonomous vehicles individually equipped with such sensors.
  • the output of the sensors and/or information derived from the output of the sensors from the fleet of autonomous vehicles may facilitate a crowd-sourced technique for vehicle identification. For example, information derived from different autonomous vehicles may be compared to determine one or a combination of vehicle identity profiles, speed of travel, direction of travel, or trajectory.
  • One aspect of the present disclosure is directed to a method for vehicle identification.
  • the method may comprise: obtaining, from a set of autonomous vehicles, a set of vehicle identification information, individual vehicle identification information being obtained from an individual autonomous vehicle and conveying identifications of one or more vehicles and locations of the one or more vehicles; and determining, from the set of vehicle identification information, vehicle context information for individual vehicles of the one or more vehicles, the vehicle context information for the individual vehicles describing a context of the individual vehicles, the context including one or a combination of a speed of travel, a direction of travel, an identity profile, or a trajectory.
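The overall flow of the method above can be illustrated with a minimal Python sketch. The disclosure does not prescribe any data structures; the `Observation` and `VehicleContext` shapes, the field names, and the use of a license plate reading as the grouping key are all illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class Observation:
    """One vehicle identification reported by one autonomous vehicle."""
    plate: str        # full or partial license plate reading
    lat: float        # observed location of the identified vehicle
    lon: float
    timestamp: float  # seconds since epoch

@dataclass
class VehicleContext:
    """Context determined for an individual vehicle."""
    speed_kmh: Optional[float] = None
    direction: Optional[str] = None
    trajectory: List[Tuple[float, float]] = field(default_factory=list)

def determine_context(observations: List[Observation]) -> Dict[str, VehicleContext]:
    """Group observations believed to be of the same vehicle, then
    build a per-vehicle context (here, just a time-ordered trajectory)."""
    by_vehicle: Dict[str, List[Observation]] = {}
    for obs in observations:
        by_vehicle.setdefault(obs.plate, []).append(obs)
    contexts: Dict[str, VehicleContext] = {}
    for plate, group in by_vehicle.items():
        group.sort(key=lambda o: o.timestamp)
        contexts[plate] = VehicleContext(
            trajectory=[(o.lat, o.lon) for o in group])
    return contexts
```

In practice the grouping key would come from the matching logic described later in the disclosure (including partial and complementary matches), not from exact plate equality.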
  • the system may comprise one or more processors and a memory storing instructions.
  • the instructions when executed by the one or more processors, may cause the system to perform: obtaining, from a set of autonomous vehicles, a set of vehicle identification information, individual vehicle identification information being obtained from an individual autonomous vehicle and conveying identifications of one or more vehicles and locations of the one or more vehicles; and determining, from the set of vehicle identification information, vehicle context information for individual vehicles of the one or more vehicles, the vehicle context information for the individual vehicles describing a context of the individual vehicles, the context including one or a combination of a speed of travel, a direction of travel, an identity profile, or a trajectory.
  • the identifications may include one or a combination of a license plate number, one or more vehicle colors, a vehicle make, a vehicle model, or a unique marking.
  • individual vehicle identification information may include the identifications of the one or more vehicles.
  • identifications of the one or more vehicles may be derived from the vehicle identification information.
  • individual vehicle identification information may include one or a combination of image information or video information.
  • the identifications of the one or more vehicles may be derived from the image information and/or video information through one or more image and/or video processing techniques.
  • determining the context may be based on comparing individual ones of the identifications and the locations of the individual vehicles to other ones of the identifications and the locations of the individual vehicles.
  • comparing the individual ones of the identifications of the individual vehicles to the other ones of the identifications of the individual vehicles may facilitate determining that multiple ones of the identifications are for a same vehicle based on matches between the individual ones of the identifications.
  • the identity profile of an individual vehicle may represent an identity of the individual vehicle as a whole.
  • the identity profile may be determined by combining multiple ones of the identifications determined to be for the same vehicle.
  • the trajectory may include a path followed by a vehicle.
  • the system may further perform tracking of a vehicle based on the set of vehicle identification information and/or vehicle context information.
  • FIG. 1 illustrates an example environment for vehicle identification, in accordance with various embodiments of the disclosure.
  • FIG. 2 illustrates an example flow chart of vehicle identification, in accordance with various embodiments of the disclosure.
  • FIG. 3 illustrates a block diagram of an example computer system in which any of the embodiments described herein may be implemented.
  • One or more techniques presented herein may perform vehicle identification using a fleet of autonomous vehicles.
  • the information collected from the autonomous vehicles may improve vehicle identification due to the distribution of the autonomous vehicles in an environment and due to the amount of information that may be retrieved from the autonomous vehicles providing a dense dataset through which vehicles may be identified.
  • FIG. 1 illustrates an example system 100 for vehicle identification, in accordance with various embodiments.
  • the example system 100 may include one or a combination of a computing system 102, an autonomous vehicle 116, or one or more other autonomous vehicles 122.
  • the autonomous vehicle 116 may include one or more processors and memory (e.g., permanent memory, temporary memory).
  • the processor(s) may be configured to perform various operations by interpreting machine-readable instructions stored in the memory.
  • the autonomous vehicle 116 may include other computing resources.
  • the autonomous vehicle 116 may have access (e.g., via one or more connections, via one or more networks 110) to other computing resources or other entities participating in the system 100.
  • the autonomous vehicle 116 may include one or a combination of an identification component 118 or a set of sensors 120.
  • the autonomous vehicle 116 may include other components.
  • the set of sensors 120 may include one or more sensors configured to generate output signals conveying vehicle identification information or other information.
  • the vehicle identification information may convey identifications, locations, or a combination of identifications and locations of one or more vehicles present in an environment surrounding autonomous vehicle 116.
  • the identifications of the one or more vehicles may include one or a combination of a license plate number, one or more vehicle colors, a vehicle make (e.g., manufacturer), a vehicle model, or a unique marking.
  • a license plate number may be comprised of one or a combination of alphanumeric characters or symbols.
  • a unique marking may refer to one or a combination of decals, writing, damage, or other marking upon a vehicle.
  • the identifications may be partial identifications.
  • a partial identification may include one or a combination of a part of a license plate number (less than all alphanumeric characters or symbols making up the license plate number), some of the colors of the vehicle (if the vehicle is multi-colored), or a make identification without a model.
  • the set of sensors 120 may include an image sensor, a set of image sensors, a location sensor, a set of location sensors, or a combination of image sensors, location sensors, and other sensors.
  • An image sensor may be configured to generate output signals conveying image information and/or video information.
  • the image information may define visual content in the form of one or more images.
  • the video information may define visual content in the form of a sequence of images.
  • Individual images may be defined by pixels and/or other information. Pixels may be characterized by one or a combination of pixel location, pixel color, or pixel transparency.
  • An image sensor may include one or more of a charge-coupled device sensor, an active pixel sensor, a complementary metal-oxide-semiconductor sensor, an N-type metal-oxide-semiconductor sensor, and/or other image sensors.
  • the identifications of the one or more vehicles may be derived from the image information and/or video information through one or more image and/or video processing techniques.
  • Such techniques may include one or a combination of computer vision, Speeded Up Robust Features (SURF), Scale-invariant Feature Transform (SIFT), Oriented FAST and rotated BRIEF (ORB), deep learning (of neural networks), or Optical Character Recognition (OCR).
  • a location sensor may be configured to generate output signals conveying location information.
  • Location information derived from output signals of a location sensor may define one or a combination of a location of autonomous vehicle 116, an elevation of autonomous vehicle 116, a timestamp when a location was obtained, or other measurements.
  • a location sensor may include one or a combination of a GPS, an altimeter, or a pressure sensor.
  • individual vehicle identification information may include one or a combination of image information, video information, or other information.
  • the identification component 118 may determine the identifications of the one or more vehicles from one or a combination of the image information or the video information through one or more of the image or video processing techniques described herein. In some implementations, the identifications may be included in the vehicle identification information, or the vehicle identification information may include one or a combination of the image information or the video information from which the identifications may be derived.
  • the identification component 118 may communicate the vehicle identification information to the computing system 102 via one or more networks 110.
  • the one or more networks 110 may include the Internet or other networks.
  • the computing system 102 may obtain other vehicle identification information from other autonomous vehicle(s) 122. Accordingly, the computing system 102 may obtain a set of vehicle identification information.
  • the computing system 102 may include one or more processors and memory (e.g., permanent memory, temporary memory).
  • the processor(s) may be configured to perform various operations by interpreting machine-readable instructions.
  • the computing system 102 may include other computing resources.
  • the computing system 102 may have access (e.g., via one or more connections, via one or more networks 110) to other computing resources or other entities participating in the system 100.
  • the computing system 102 may include one or a combination of an identification component 104, a context component 106, or a tracking component 108. While the computing system 102 is shown in FIG. 1 as a single entity, this is merely for ease of reference and is not meant to be limiting. One or more components or one or more functionalities of the computing system 102 described herein may be implemented in a single computing device or multiple computing devices. In some embodiments, one or more components or one or more functionalities of the computing system 102 described herein may be implemented in one or more networks 110, one or more endpoints, one or more servers, or one or more clouds.
  • the identification component 104 may obtain, from a set of autonomous vehicles, a set of vehicle identification information.
  • the identification component 104 may obtain vehicle identification information from autonomous vehicle 116 and individual ones of one or more other autonomous vehicles 122.
  • the individual vehicle identification information obtained from an individual autonomous vehicle may include identifications, locations, or combinations of the identifications and locations of one or more vehicles.
  • the vehicle identification information obtained by identification component 104 may include the identifications of vehicles.
  • autonomous vehicle 116 may determine the identifications of vehicles via identification component 118 and communicate the identifications to computing system 102.
  • the identification component 104 may determine the identifications of vehicles from the vehicle identification information.
  • the identification component 104 may obtain vehicle identification information including one or a combination of image information or video information.
  • the identification component 104 may determine the identifications of the one or more vehicles from one or a combination of the image information or the video information through one or more of the image or video processing techniques described herein.
  • the context component 106 may determine, from the set of vehicle identification information, vehicle context information for individual vehicles of the one or more vehicles.
  • the context component 106 may determine vehicle context information for autonomous vehicle 116 based on the vehicle identification information obtained from autonomous vehicle 116 and/or other vehicle identification information from other autonomous vehicles.
  • the vehicle context information for individual vehicles may describe a context of the individual vehicles.
  • the context of the individual vehicles may describe circumstances specific to the individual vehicles.
  • the context may include one or a combination of a speed of travel, a direction of travel, a trajectory, or an identity profile.
  • determining the context may be based on comparing individual ones of the identifications and the locations of individual vehicles to other ones of the identifications and the locations of the individual vehicles.
  • the comparisons of individual ones of the identifications of the individual vehicles to the other ones of the identifications of the individual vehicles may facilitate determining that multiple ones of the identifications are for the same vehicle. For example, based on the comparisons, it may be determined that multiple ones of the identifications (obtained from the same or different autonomous vehicles) match. A match may convey a logical inference that the multiple identifications are for the same vehicle. In some implementations, matching may mean that the identifications are the same or complementary.
  • being the same match may mean the identifications are within a threshold degree of sameness.
  • two (or more) identifications may be determined to match if they depict the same series of alphanumeric characters or symbols or depict the same 90% of the series of alphanumeric characters or symbols.
  • two (or more) identifications may be determined to match if they depict colors within a threshold range as observed on a color scale.
  • For vehicle make identifications, two (or more) identifications may be determined to match if they depict the same vehicle make.
  • For vehicle model identifications, two (or more) identifications may be determined to match if they depict the same vehicle model.
  • For unique marking identifications, two (or more) identifications may be determined to match if they depict the same visually distinct unique marking in the same location on the vehicle.
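The "threshold degree of sameness" idea above can be sketched for license plate readings. This is a hypothetical simplification that assumes equal-length readings compared position by position; the disclosure does not specify a comparison algorithm, and the 0.9 default only mirrors the 90% example in the text:

```python
def plates_match(a: str, b: str, threshold: float = 0.9) -> bool:
    """Return True when two equal-length plate readings agree on at
    least `threshold` of their character positions."""
    if len(a) != len(b) or not a:
        return False
    same = sum(1 for x, y in zip(a, b) if x == y)
    return same / len(a) >= threshold
```

A production system would likely use an alignment-tolerant measure (e.g., edit distance) so that a dropped or inserted character does not defeat the match.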
  • being a complementary match may mean that multiple identifications may be partial identifications which, when combined, form a complete identification.
  • the partial identifications may include depiction of different parts of the vehicle. The partial identifications may also include depiction of one or more overlapping parts of the vehicle.
  • For license plate identifications, one identification may depict part of a series of alphanumeric characters or symbols and another identification may depict another part of the series of alphanumeric characters or symbols.
  • the identifications may be combined to depict the series of alphanumeric characters or symbols defining the license plate number as a whole.
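Combining complementary partial license plate readings into a whole, as described above, might look like the following sketch. It assumes the two partial readings are already ordered (front part first) and that any shared characters overlap at the boundary; both assumptions are illustrative, not from the disclosure:

```python
def merge_partial_plates(front: str, back: str) -> str:
    """Combine two partial plate readings, e.g. when one camera saw the
    leading characters and another saw the trailing ones. Tries the
    longest boundary overlap first; falls back to plain concatenation."""
    for k in range(min(len(front), len(back)), 0, -1):
        if front.endswith(back[:k]):
            return front + back[k:]
    return front + back
```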
  • For color identifications, one identification may depict a part of a vehicle having a color and another identification may depict another part of the vehicle having the same color.
  • the identifications may be combined to depict the vehicle as a whole having the color uniform throughout.
  • one identification may depict part of a unique marking and another identification may depict another part of the unique marking.
  • the identifications may be combined to depict the unique marking as a whole. In some implementations, combining identifications may be accomplished through stitching images and/or video together.
  • stitching may include operations such as one or a combination of feature point detection, image registration, alignment, or composing.
  • Feature point detection may be accomplished through techniques such as SIFT and SURF.
  • Image registration may involve matching features in a set of images.
  • a method for image registration may include Random Sample Consensus (RANSAC) or other techniques.
  • Alignment may include transforming an image to match a view point of another image.
  • Composing may comprise combining the aligned images so that they appear as a single shot of the vehicle.
  • deep learning (of neural networks) based approaches may also be used.
  • the identity profile of an individual vehicle may represent an identity of the individual vehicle as a whole.
  • the identity of the vehicle as a whole may comprise a representation of more than one identification which may have been obtained for the given vehicle. That is, the identity profile may be determined by combining multiple ones of the identifications (obtained from one or more autonomous vehicles) determined to be for the same vehicle.
  • the identity profile may be made through the stitching techniques described herein which may result in one or more images which depict more than one identification.
  • an identity profile of a vehicle may include an image or series of images from which two or more of a license plate number, a color, a make, a model, or a unique marking of the vehicle may be identifiable.
  • determining one or a combination of the speed of travel, the direction of travel, or the trajectory of a vehicle may be based on comparing the individual locations associated with the multiple ones of the identifications determined to be for the same vehicle.
  • a speed of travel may specify that a vehicle was traveling at 110 kilometers per hour.
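Deriving a speed of travel from two timestamped locations, as described above, can be sketched with the standard haversine great-circle formula. The function name and the use of GPS latitude/longitude fixes are assumptions for illustration:

```python
import math

def speed_kmh(lat1: float, lon1: float, t1: float,
              lat2: float, lon2: float, t2: float) -> float:
    """Estimate speed between two timestamped GPS fixes (times in
    seconds) using the haversine great-circle distance."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    dist_km = 2 * r * math.asin(math.sqrt(a))
    hours = abs(t2 - t1) / 3600.0
    return dist_km / hours
```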
  • a direction of travel may be represented by cardinal directions.
  • the cardinal directions may include north, south, east, and west.
  • Determining a direction of travel may be accomplished by one or a combination of: comparing locations of identifications of the vehicle; determining which locations occurred before other ones of the locations; determining that the vehicle is traveling from a first location to a second location; determining a pointing direction from the first location to the second location; or associating the pointing direction with a cardinal direction.
  • a direction of travel may specify that a vehicle was traveling north.
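Associating the pointing direction between two locations with a cardinal direction, as outlined above, can be sketched with the standard forward-azimuth formula. This is an illustrative simplification that snaps the bearing to the nearest of the four cardinal directions named in the text:

```python
import math

def cardinal_direction(lat1: float, lon1: float,
                       lat2: float, lon2: float) -> str:
    """Map the pointing direction from the first fix to the second onto
    the nearest cardinal direction (north, east, south, or west)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    # Forward azimuth (initial bearing), 0 deg = north, clockwise.
    y = math.sin(dl) * math.cos(p2)
    x = (math.cos(p1) * math.sin(p2)
         - math.sin(p1) * math.cos(p2) * math.cos(dl))
    bearing = math.degrees(math.atan2(y, x)) % 360
    return ["north", "east", "south", "west"][round(bearing / 90) % 4]
```

A finer compass rose (e.g., eight points including northeast) would only change the final lookup table.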
  • a trajectory of a vehicle may include a path followed by the vehicle.
  • the trajectory may be specified with respect to one or a combination of named roads, highways, freeways, intersections, neighborhoods, or cities.
  • locations may be referenced within a map of an environment including information about one or a combination of named roads, highways, freeways, intersections, neighborhoods, or cities.
  • a mapping service may be accessed and used to cross reference the determined locations with the information conveyed in the map.
  • a mapping service may include Google® Maps.
  • a trajectory may specify that a vehicle traveled for two miles on Main St., turned left, and proceeded for six blocks on 1st St., etc.
  • how a vehicle’s trajectory changes (or doesn’t change) over time may reflect a travel pattern of the vehicle.
  • a travel pattern may include a common trajectory which appears more than once.
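The notion of a travel pattern as a common trajectory appearing more than once can be sketched directly. Representing a trajectory as a tuple of named road segments is an assumption for illustration:

```python
from collections import Counter
from typing import List, Tuple

def common_trajectories(
        trajectories: List[Tuple[str, ...]]) -> List[Tuple[str, ...]]:
    """Return the trajectories that appear more than once across the
    observed trips, i.e. candidate travel patterns."""
    counts = Counter(trajectories)
    return [traj for traj, n in counts.items() if n > 1]
```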
  • the tracking component 108 may be configured to actively look for one or more vehicles using the set of vehicle identification information, vehicle context information, or a combination of vehicle identification information and vehicle context information.
  • tracking component 108 may obtain identification(s) of a vehicle, for example, through user input by a user desiring to locate the vehicle.
  • the tracking component 108 may monitor the set of vehicle identification information obtained from a fleet of autonomous vehicles and the vehicle context information determined therefrom.
  • the tracking component 108 may perform such monitoring while looking for a match between the user-provided identification(s) and the identification conveyed by the set of vehicle identification information.
  • the tracking component 108 may, in response to finding a match, provide the vehicle context information for the matched vehicle to the user through one or more user interfaces.
  • the provided vehicle context information may include one or a combination of a speed of travel, a direction of travel, or a trajectory of the vehicle to allow the user to track the vehicle.
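The monitoring-and-matching behavior described above might be sketched as follows. The record shape (a dict with "plate" and "context" keys) and the exact-match comparison are illustrative assumptions; in practice the threshold matching described earlier could be substituted:

```python
from typing import Dict, Iterable, Iterator

def find_matches(target_plate: str,
                 identification_stream: Iterable[Dict]) -> Iterator[Dict]:
    """Scan incoming vehicle identification records for a user-supplied
    plate, yielding the context attached to each matching record."""
    for record in identification_stream:
        if record["plate"] == target_plate:
            yield record["context"]
```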
  • autonomous vehicles may exchange notifications of vehicle identification information, requests for tracking, or a combination of requests and notifications.
  • autonomous vehicle 116 may send requests via identification component 118 to other autonomous vehicles 122 to identify a particular vehicle.
  • the autonomous vehicle 116 may obtain requests via identification component 118 from other autonomous vehicles 122 to identify a particular vehicle.
  • the autonomous vehicle 116 may notify, via identification component 118, other vehicles about an identified vehicle (e.g., by sending vehicle identification information).
  • the autonomous vehicle 116 may obtain notifications, via identification component 118, from other vehicles about an identified vehicle (e.g., by receiving vehicle identification information).
  • the requests and notifications may be sent to computer system 102, which in turn may forward the requests or notifications to autonomous vehicles nearby a requesting or notifying vehicle’s GPS location, or may be sent directly from autonomous vehicle 116 to one or more nearby autonomous vehicles.
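Selecting the autonomous vehicles nearby a requesting vehicle's GPS location, as described above, might be sketched with a simple radius filter. The 2 km default radius and the equirectangular distance approximation (adequate at city scale) are assumptions for illustration:

```python
import math
from typing import Dict, List, Tuple

def nearby_vehicles(fleet_positions: Dict[str, Tuple[float, float]],
                    origin: Tuple[float, float],
                    radius_km: float = 2.0) -> List[str]:
    """Return ids of fleet vehicles within `radius_km` of the
    requesting vehicle's (lat, lon) fix."""
    lat0, lon0 = origin
    out = []
    for vid, (lat, lon) in fleet_positions.items():
        # Equirectangular approximation: scale longitude by cos(mean lat).
        dx = math.radians(lon - lon0) * math.cos(math.radians((lat + lat0) / 2))
        dy = math.radians(lat - lat0)
        if 6371.0 * math.hypot(dx, dy) <= radius_km:
            out.append(vid)
    return out
```

The computing system 102 could apply such a filter before forwarding a request or notification, rather than broadcasting it to the entire fleet.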
  • FIG. 2 illustrates an example flow chart 200 for vehicle identification, in accordance with various embodiments of the disclosure.
  • a set of vehicle identification information may be obtained from a set of autonomous vehicles.
  • the individual vehicle identification information may be obtained from an individual autonomous vehicle.
  • the individual vehicle identification information may convey identifications of one or more vehicles and locations of the one or more vehicles.
  • vehicle context information for individual vehicles may be determined from the set of vehicle identification information.
  • the vehicle context information for the individual vehicles may describe a context of the individual vehicles.
  • FIG. 3 is a block diagram that illustrates a computer system 300 upon which any of the embodiments described herein may be implemented.
  • the computer system 300 includes a bus 302 or other communication mechanism for communicating information, and one or more hardware processors 304 coupled with bus 302 for processing information.
  • Hardware processors 304 may be, for example, one or more general purpose microprocessors.
  • the computer system 300 also includes a main memory 306, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 302 for storing information and instructions to be executed by processor(s) 304.
  • Main memory 306 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor(s) 304. Such instructions, when stored in storage media accessible to processor(s) 304, render computer system 300 into a special-purpose machine that is customized to perform the operations specified in the instructions.
  • Main memory 306 may include non-volatile media and/or volatile media. Non-volatile media may include, for example, optical or magnetic disks. Volatile media may include dynamic memory.
  • Common forms of media may include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a DRAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.
  • the computer system 300 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 300 to be a special-purpose machine. According to one
  • the techniques herein are performed by computer system 300 in response to processor(s) 304 executing one or more sequences of one or more instructions contained in main memory 306. Such instructions may be read into main memory 306 from another storage medium, such as storage device 308. Execution of the sequences of instructions contained in main memory 306 causes processor(s) 304 to perform the process steps described herein. For example, the process/method shown in FIG. 2 and described in connection with this figure can be implemented by computer program instructions stored in main memory 306. When these instructions are executed by processor(s) 304, they may perform the steps as shown in FIG. 2 and described above. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
  • the computer system 300 also includes a communication interface 310 coupled to bus 302.
  • Communication interface 310 provides a two-way data communication coupling to one or more network links that are connected to one or more networks.
  • communication interface 310 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN).
  • Wireless links may also be implemented.
  • the performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines.
  • the processors or processor- implemented engines may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented engines may be distributed across a number of geographic locations.
  • components may constitute either software components (e.g., code embodied on a machine-readable medium) or hardware components (e.g., a tangible unit capable of performing certain operations which may be configured or arranged in a certain physical manner).
  • components of the computing system 102 and autonomous vehicle 116 may be described as performing or configured for performing an operation, when the components may comprise instructions which may program or configure the computing system 102 and autonomous vehicle 116 to perform the operation.


Abstract

Systems and methods for vehicle identification are described herein. A set of vehicle identification information may be obtained from a set of autonomous vehicles. Individual vehicle identification information may convey identifications of one or more vehicles and locations of the one or more vehicles. Vehicle context information for individual vehicles may be determined from the set of vehicle identification information. The vehicle context information for the individual vehicles may describe a context of the individual vehicles. The context may include one or a combination of a speed of travel, a direction of travel, a trajectory, or an identity profile.

Description

SYSTEMS AND METHODS FOR VEHICLE IDENTIFICATION
TECHNICAL FIELD
[0001] The disclosure relates generally to vehicle identification.
BACKGROUND
[0002] Some technologies, such as automated license plate reader (ALPR) systems, have been used to automatically capture data such as license plate numbers of vehicles that come into view, location, date, time, and/or photographs of the vehicle. This data may be uploaded to a central repository for use with many applications. An application may include law enforcement use to find out where a vehicle has been in the past, to determine whether a specific vehicle was at the scene of a crime, or to discover travel patterns that may reveal further criminal activity. Another application may include use with a “hotlist” of identifications of stolen vehicles. Law enforcement may load the hotlist into an ALPR system to actively look for those stolen vehicles and vehicles relevant to criminal activity.
SUMMARY
[0003] One or more implementations of the systems and methods relate to vehicle identification using autonomous vehicles. An autonomous vehicle may be equipped with a set of sensors configured to generate output signals conveying information about the surroundings of the autonomous vehicle. For example, a sensor may include an image sensor configured to generate output signals conveying image information defining images of a surrounding environment. The images may be used to identify vehicles present in the environment and their locations. The autonomous vehicle may be part of a fleet of autonomous vehicles individually equipped with such sensors. The output of the sensors and/or information derived from the output of the sensors from the fleet of autonomous vehicles may facilitate a crowd-sourced technique for vehicle identification. For example, information derived from different autonomous vehicles may be compared to determine one or a combination of vehicle identity profiles, speed of travel, direction of travel, or trajectory.
[0004] One aspect of the present disclosure is directed to a method for vehicle identification. The method may comprise: obtaining, from a set of autonomous vehicles, a set of vehicle identification information, individual vehicle identification information being obtained from an individual autonomous vehicle and conveying identifications of one or more vehicles and locations of the one or more vehicles; and determining, from the set of vehicle identification information, vehicle context information for individual vehicles of the one or more vehicles, the vehicle context information for the individual vehicles describing a context of the individual vehicles, the context including one or a combination of a speed of travel, a direction of travel, an identity profile, or a trajectory.
[0005] Another aspect of the present disclosure is directed to a system for vehicle identification. The system may comprise one or more processors and a memory storing instructions. The instructions, when executed by the one or more processors, may cause the system to perform: obtaining, from a set of autonomous vehicles, a set of vehicle identification information, individual vehicle identification information being obtained from an individual autonomous vehicle and conveying identifications of one or more vehicles and locations of the one or more vehicles; and determining, from the set of vehicle identification information, vehicle context information for individual vehicles of the one or more vehicles, the vehicle context information for the individual vehicles describing a context of the individual vehicles, the context including one or a combination of a speed of travel, a direction of travel, an identity profile, or a trajectory.
[0006] In some embodiments, the identifications may include one or a combination of a license plate number, a color, a make, a model, or a unique marking.
[0007] In some embodiments, individual vehicle identification information may include the identifications of the one or more vehicles.
[0008] In some embodiments, identifications of the one or more vehicles may be derived from the vehicle identification information. By way of non-limiting illustration, individual vehicle identification information may include one or a combination of image information or video information. The identifications of the one or more vehicles may be derived from the image information and/or video information through one or more image and/or video processing techniques.
[0009] In some embodiments, determining the context may be based on comparing individual ones of the identifications and the locations of the individual vehicles to other ones of the identifications and the locations of the individual vehicles.
[0010] In some embodiments, comparing the individual ones of the identifications of the individual vehicles to the other ones of the identifications of the individual vehicles may facilitate determining that multiple ones of the identifications are for a same vehicle based on matches between the individual ones of the identifications.
[0011] In some embodiments, the identity profile of an individual vehicle may represent an identity of the individual vehicle as a whole. The identity profile may be determined by combining multiple ones of the identifications determined to be for the same vehicle.
[0012] In some embodiments, the trajectory may include a path followed by a vehicle.
[0013] In some embodiments, the system may further perform tracking of a vehicle based on the set of vehicle identification information and/or vehicle context information.
[0014] These and other features of the systems, methods, and non-transitory computer readable media disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for purposes of illustration and description only and are not intended as a definition of the limits of the invention. It is to be understood that the foregoing general description and the following detailed description are exemplary and explanatory only, and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] Preferred and non-limiting embodiments of the invention may be more readily understood by referring to the accompanying drawings in which:
[0016] FIG. 1 illustrates an example environment for vehicle identification, in accordance with various embodiments of the disclosure.
[0017] FIG. 2 illustrates an example flow chart of vehicle identification, in accordance with various embodiments of the disclosure.
[0018] FIG. 3 illustrates a block diagram of an example computer system in which any of the embodiments described herein may be implemented.
DETAILED DESCRIPTION
[0019] Specific, non-limiting embodiments of the present invention will now be described with reference to the drawings. It should be understood that particular features and aspects of any embodiment disclosed herein may be used and/or combined with particular features and aspects of any other embodiment disclosed herein. It should also be understood that such embodiments are by way of example and are merely illustrative of a small number of embodiments within the scope of the present invention. Various changes and modifications obvious to one skilled in the art to which the present invention pertains are deemed to be within the spirit, scope and contemplation of the present invention as further defined in the appended claims.
[0020] The approaches disclosed herein improve functioning of computing systems that identify vehicles. One or more techniques presented herein may perform vehicle identification using a fleet of autonomous vehicles. The information collected from the autonomous vehicles may improve vehicle identification due to the distribution of the autonomous vehicles in an environment and due to the amount of information that may be retrieved from the autonomous vehicles providing a dense dataset through which vehicles may be identified.
[0021] FIG. 1 illustrates an example system 100 for vehicle identification, in accordance with various embodiments. The example system 100 may include one or a combination of a computing system 102, an autonomous vehicle 116, or one or more other autonomous vehicles 122.
[0022] It is noted that while some features and functions of the systems and methods presented herein may be directed to the autonomous vehicle 116, this is for illustrative purposes only and not to be considered limiting. For example, it is to be understood that other autonomous vehicle(s) included in the one or more other autonomous vehicles 122 may be configured the same as or similar to autonomous vehicle 116 and may include the same or similar components described herein. The autonomous vehicle 116 and the one or more other autonomous vehicles 122 may represent a set of autonomous vehicles which may be part of a fleet of autonomous vehicles.
[0023] The autonomous vehicle 116 may include one or more processors and memory (e.g., permanent memory, temporary memory). The processor(s) may be configured to perform various operations by interpreting machine-readable instructions stored in the memory. The autonomous vehicle 116 may include other computing resources. The autonomous vehicle 116 may have access (e.g., via one or more connections, via one or more networks 110) to other computing resources or other entities participating in the system 100.
[0024] The autonomous vehicle 116 may include one or a combination of an identification component 118 or a set of sensors 120. The autonomous vehicle 116 may include other components.
[0025] The set of sensors 120 may include one or more sensors configured to generate output signals conveying vehicle identification information or other information. The vehicle identification information may convey identifications, locations, or a combination of identifications and locations of one or more vehicles present in an environment surrounding autonomous vehicle 116. The identifications of the one or more vehicles may include one or a combination of a license plate number, one or more vehicle colors, a vehicle make (e.g., manufacturer), a vehicle model, or a unique marking. A license plate number may be comprised of one or a combination of alphanumeric characters or symbols. A unique marking may refer to one or a combination of decals, writing, damage, or other marking upon a vehicle. In some embodiments, the identifications may be partial identifications. By way of non-limiting illustration, a partial identification may include one or a combination of a part of a license plate number (less than all alphanumeric characters or symbols making up the license plate number), some of the colors of the vehicle (if the vehicle is multi-colored), or a make identification without a model.
[0026] The set of sensors 120 may include an image sensor, a set of image sensors, a location sensor, a set of location sensors, or a combination of image sensors, location sensors, and other sensors. A set of sensors (e.g., set of image sensors) may include one or more sensors (e.g., one or more image sensors).
[0027] An image sensor may be configured to generate output signals conveying image information and/or video information. The image information may define visual content in the form of one or more images. The video information may define visual content in the form of a sequence of images. Individual images may be defined by pixels and/or other information. Pixels may be characterized by one or a combination of pixel location, pixel color, or pixel transparency. An image sensor may include one or more of a charge-coupled device sensor, an active pixel sensor, a complementary metal-oxide-semiconductor sensor, an N-type metal-oxide-semiconductor sensor, and/or other image sensor. In some embodiments, the identifications of the one or more vehicles may be derived from the image information or the video information through one or more image and/or video processing techniques. Such techniques may include one or a combination of computer vision, Speeded Up Robust Features (SURF), Scale-Invariant Feature Transform (SIFT), Oriented FAST and Rotated BRIEF (ORB), deep learning (neural networks), or Optical Character Recognition (OCR).
[0028] In some implementations, a location sensor may be configured to generate output signals conveying location information. Location information derived from output signals of a location sensor may define one or a combination of a location of autonomous vehicle 116, an elevation of autonomous vehicle 116, a timestamp when a location was obtained, or other measurements. A location sensor may include one or a combination of a GPS, an altimeter, or a pressure sensor.
[0029] In some embodiments, individual vehicle identification information may include one or a combination of image information, video information, or identifications derived from the image information or video information. The identification component 118 may determine the identifications of the one or more vehicles from one or a combination of the image information or the video information through one or more of the image or video processing techniques described herein. In some implementations, the identifications may be included in the vehicle identification information, or the vehicle identification information may include one or a combination of the image information or the video information from which the identifications may be derived.
[0030] The identification component 118 may communicate the vehicle identification information to the computing system 102 via one or more networks 110. The one or more networks 110 may include the Internet or other networks. The computing system 102 may obtain other vehicle identification information from other autonomous vehicle(s) 122. Accordingly, the computing system 102 may obtain a set of vehicle identification information.
[0031] The computing system 102 may include one or more processors and memory (e.g., permanent memory, temporary memory). The processor(s) may be configured to perform various operations by interpreting machine-readable instructions stored in the memory. The computing system 102 may include other computing resources. The computing system 102 may have access (e.g., via one or more connections, via one or more networks 110) to other computing resources or other entities participating in the system 100.
[0032] The computing system 102 may include one or a combination of an identification component 104, a context component 106, or a tracking component 108. While the computing system 102 is shown in FIG. 1 as a single entity, this is merely for ease of reference and is not meant to be limiting. One or more components or one or more functionalities of the computing system 102 described herein may be implemented in a single computing device or multiple computing devices. In some embodiments, one or more components or one or more functionalities of the computing system 102 described herein may be implemented in one or more networks 110, one or more endpoints, one or more servers, or one or more clouds.
[0033] The identification component 104 may obtain, from a set of autonomous vehicles, a set of vehicle identification information. By way of non-limiting illustration, the identification component 104 may obtain vehicle identification information from autonomous vehicle 116 and individual ones of one or more other autonomous vehicles 122. The individual vehicle identification information obtained from an individual autonomous vehicle may include identifications, locations, or combinations of the identifications and locations of one or more vehicles.
[0034] In some embodiments, the vehicle identification information obtained by identification component 104 may include the identifications of vehicles. For example, autonomous vehicle 116 may determine the identifications of vehicles via identification component 118 and communicate the identifications to computing system 102.
[0035] In some embodiments, the identification component 104 may determine the identifications of vehicles from the vehicle identification information. By way of non-limiting illustration, the identification component 104 may obtain vehicle identification information including one or a combination of image information or video information. The identification component 104 may determine the identifications of vehicles using one or more image or video-based techniques described herein.
[0036] The context component 106 may determine, from the set of vehicle identification information, vehicle context information for individual vehicles of the one or more vehicles. By way of non-limiting illustration, the context component 106 may determine vehicle context information for autonomous vehicle 116 based on the vehicle identification information obtained from autonomous vehicle 116 and/or other vehicle identification information from other autonomous vehicles.
[0037] In some embodiments, the vehicle context information for an individual vehicle may describe a context of that vehicle. The context of the individual vehicles may describe circumstances specific to the individual vehicles. By way of non-limiting illustration, the context may include one or a combination of a speed of travel, a direction of travel, a trajectory, or an identity profile.
[0038] In some embodiments, determining the context may be based on comparing individual ones of the identifications and the locations of individual vehicles to other ones of the identifications and the locations of the individual vehicles.
[0039] The comparisons of individual ones of the identifications of the individual vehicles to the other ones of the identifications of the individual vehicles may facilitate determining that multiple ones of the identifications are for the same vehicle. For example, based on the comparisons, it may be determined that multiple ones of the identifications (obtained from the same or different autonomous vehicles) match. A match may convey a logical inference that the multiple identifications are for the same vehicle. In some implementations, matching may mean that identifications are the same or complementary.
[0040] In some implementations, a same match may mean the identifications are within a threshold degree of sameness. By way of non-limiting illustration, for license plate identifications, two (or more) identifications may be determined to match if they depict the same series of alphanumeric characters or symbols, or depict the same 90% of the series of alphanumeric characters or symbols. By way of non-limiting illustration, for color identifications, two (or more) identifications may be determined to match if they depict colors within a threshold range as observed on a color scale. By way of non-limiting illustration, for vehicle make identifications, two (or more) identifications may be determined to match if they depict the same make of vehicle. By way of non-limiting illustration, for vehicle model identifications, two (or more) identifications may be determined to match if they depict the same model of vehicle. By way of non-limiting illustration, for unique marking identifications, two (or more) identifications may be determined to match if they depict the same visually distinct unique marking in the same location on the vehicle.
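By way of non-limiting illustration only, a "threshold degree of sameness" for license plate readings could be realized as a character-level similarity test. This sketch is not part of the disclosure; it assumes Python's difflib similarity ratio as one possible sameness measure, with the 0.9 threshold mirroring the 90% example above:

```python
from difflib import SequenceMatcher

def plates_match(a: str, b: str, threshold: float = 0.9) -> bool:
    """Return True if two license plate readings are the same or within a
    threshold degree of sameness (a character-level similarity ratio)."""
    a, b = a.upper().strip(), b.upper().strip()
    if a == b:
        return True
    return SequenceMatcher(None, a, b).ratio() >= threshold
```

For example, two ten-character readings that agree on nine characters reach exactly the 90% threshold and are treated as a match, while readings of unrelated vehicles fall well below it.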
[0041] In some implementations, a complementary match may mean that multiple identifications may be partial identifications which, when combined, form a complete identification. The partial identifications may include depictions of different parts of the vehicle. The partial identifications may include depictions of one or more overlapping parts of the vehicle. By way of non-limiting illustration, for license plate identifications, one identification may depict part of a series of alphanumeric characters or symbols and another identification may depict another part of the series of alphanumeric characters or symbols. The identifications may be combined to depict the series of alphanumeric characters or symbols defining the license plate number as a whole. By way of non-limiting illustration, for color identifications, one identification may depict a part of a vehicle having a color and another identification may depict another part of the vehicle having the same color. The identifications may be combined to depict the vehicle as a whole having the color uniform throughout. By way of non-limiting illustration, for unique marking identifications, one identification may depict part of a unique marking and another identification may depict another part of the unique marking. The identifications may be combined to depict the unique marking as a whole. In some implementations, combining identifications may be accomplished through stitching images and/or video together.
[0042] In some implementations, stitching may include operations such as one or a combination of feature point detection, image registration, alignment, or composing. Feature point detection may be accomplished through techniques such as SIFT and SURF. Image registration may involve matching features in a set of images. A method for image registration may include Random Sample Consensus (RANSAC) or other techniques. Alignment may include transforming an image to match a viewpoint of another image. Composing may comprise the process whereby the images are aligned in such a way that they appear as a single shot of the vehicle. In some embodiments, deep learning (neural network) based approaches may also be used.
[0043] In some implementations, the identity profile of an individual vehicle may represent an identity of the individual vehicle as a whole. The identity of the vehicle as a whole may comprise a representation of more than one identification which may have been obtained for the given vehicle. That is, the identity profile may be determined by combining multiple ones of the identifications (obtained from one or more autonomous vehicles) determined to be for the same vehicle. The identity profile may be constructed through the stitching techniques described herein, which may result in one or more images that depict more than one identification. By way of non-limiting illustration, an identity profile of a vehicle may include an image or series of images from which two or more of a license plate number, a color, a make, a model, or a unique marking of the vehicle may be identifiable.
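Combining multiple identifications into an identity profile can be sketched at the record level. This is a minimal, hypothetical illustration (the field names "plate", "color", "make" are assumptions, and the disclosure also contemplates image-level stitching): each identification fills in fields the others were missing, and conflicting observations are retained rather than discarded:

```python
def build_identity_profile(identifications):
    """Combine multiple identifications judged to be of the same vehicle
    into a single identity profile (a dict of fields). Later readings fill
    in fields earlier readings lacked; conflicting values keep both."""
    profile = {}
    for ident in identifications:
        for field, value in ident.items():
            if field not in profile:
                profile[field] = value
            elif profile[field] != value:
                # Keep every observed value rather than silently dropping one.
                if not isinstance(profile[field], set):
                    profile[field] = {profile[field]}
                profile[field].add(value)
    return profile
```

For example, one autonomous vehicle may capture only the plate while another captures color and make; the combined profile represents the identity of the vehicle as a whole.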
[0044] In some embodiments, determining one or a combination of the speed of travel, the direction of travel, or the trajectory of a vehicle may be based on comparing the individual locations associated with the multiple ones of the identifications determined to be for the same vehicle.
[0045] In some embodiments, a speed of travel may be represented by a distance traveled per unit time. Determining a speed of travel of a vehicle may be accomplished by one or a combination of: comparing locations of identifications of the vehicle, determining the distance between locations, determining the time span(s) between the locations, or dividing the distance by the time span to obtain a speed of travel (e.g., distance per unit time). By way of non-limiting illustration, a speed of travel may specify that a vehicle was traveling at 110 kilometers an hour.
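The speed-of-travel computation above can be sketched with two timestamped GPS sightings. This is an illustrative sketch, not part of the disclosure; it assumes the great-circle (haversine) distance between fixes and UNIX timestamps in seconds:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two GPS fixes."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def speed_kmh(lat1, lon1, t1, lat2, lon2, t2):
    """Speed of travel implied by two timestamped sightings of the same
    vehicle: distance between locations divided by the time span.
    t1 and t2 are UNIX timestamps in seconds; assumes t2 > t1."""
    hours = (t2 - t1) / 3600.0
    return haversine_km(lat1, lon1, lat2, lon2) / hours
```

For instance, two sightings roughly 111 km apart (one degree of longitude at the equator) one hour apart imply a speed of about 111 km/h.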
[0046] In some embodiments, a direction of travel may be represented by cardinal directions. The cardinal directions may include north, south, east, and west. Determining a direction of travel may be accomplished by one or a combination of: comparing locations of identifications of the vehicle, determining which locations occurred before other ones of the locations, determining that the vehicle is traveling from a first location to a second location, determining a pointing direction from the first location to the second location, or associating the pointing direction with a cardinal direction. By way of non-limiting illustration, a direction of travel may specify that a vehicle was traveling north.
[0047] In some embodiments, a trajectory of a vehicle may include a path followed by the vehicle. The trajectory may be specified with respect to one or a combination of named roads, highways, freeways, intersections, neighborhoods, or cities. In some implementations, locations may be referenced within a map of an environment including information about one or a combination of named roads, highways, freeways, intersections, neighborhoods, or cities. By way of non-limiting illustration, a mapping service may be accessed and used to cross-reference the determined locations with the information conveyed in the map. A mapping service may include Google® Maps. By way of non-limiting illustration, a trajectory may specify that a vehicle traveled for two miles on Main St., turned left, and proceeded for six blocks on 1st St., etc. In some implementations, how a vehicle’s trajectory changes (or doesn’t change) over time may reflect a travel pattern of the vehicle. A travel pattern may include a common trajectory which appears more than once.
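Detecting a travel pattern, i.e. a common trajectory that appears more than once, can be sketched once trajectories are represented as sequences of named road segments. This is a minimal illustration under that assumed representation, not part of the disclosure:

```python
from collections import Counter

def recurring_trajectories(trajectories):
    """Find travel patterns: trajectories (sequences of named road
    segments) that the same vehicle follows more than once."""
    counts = Counter(tuple(t) for t in trajectories)
    return [list(t) for t, n in counts.items() if n > 1]
```

A production system would likely also treat near-identical paths (e.g. differing by one block) as the same pattern, which this sketch does not attempt.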
[0048] The tracking component 108 may be configured to actively look for one or more vehicles using the set of vehicle identification information, vehicle context information, or a combination of vehicle identification information and vehicle context information. By way of non-limiting illustration, tracking component 108 may obtain identification(s) of a vehicle, for example through user input by a user desiring to locate the vehicle. The tracking component 108 may monitor the set of vehicle identification information obtained from a fleet of autonomous vehicles and the vehicle context information determined therefrom. The tracking component 108 may perform such monitoring while looking for a match between the user-provided identification(s) and the identifications conveyed by the set of vehicle identification information. The tracking component 108 may, in response to finding a match, provide the vehicle context information for the matched vehicle to the user through one or more user interfaces. By way of non-limiting illustration, the provided vehicle context information may include one or a combination of a speed of travel, a direction of travel, or a trajectory of the vehicle to allow the user to track the vehicle.
[0049] In some embodiments, autonomous vehicles may exchange notifications of vehicle identification information, requests for tracking, or a combination of requests and notifications. For example, autonomous vehicle 116 may send requests via identification component 118 to other autonomous vehicles 122 to identify a particular vehicle. The autonomous vehicle 116 may obtain requests via identification component 118 from other autonomous vehicles 122 to identify a particular vehicle. The autonomous vehicle 116 may notify, via identification component 118, other vehicles about an identified vehicle (e.g., by sending vehicle identification information). The autonomous vehicle 116 may obtain notifications, via identification component 118, from other vehicles about an identified vehicle (e.g., by receiving vehicle identification information). The requests and notifications may be sent to computing system 102, which in turn may forward the requests or notifications to autonomous vehicles near a requesting or notifying vehicle’s GPS location, or may be sent directly from autonomous vehicle 116 to one or more nearby autonomous vehicles.
[0050] FIG. 2 illustrates an example flow chart 200 for vehicle identification, in accordance with various embodiments of the disclosure. At block 202, a set of vehicle identification information may be obtained from a set of autonomous vehicles. The individual vehicle identification information may be obtained from an individual autonomous vehicle. The individual vehicle identification information may convey identifications of one or more vehicles and locations of the one or more vehicles. At block 204, vehicle context information for individual vehicles may be determined from the set of vehicle identification information. The vehicle context information for the individual vehicles may describe a context of the individual vehicles.
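The two blocks of the flow chart can be sketched end to end. This is a simplified, hypothetical rendering only: the report fields ("plate", "lat", "lon", "time") are assumptions, and the plate alone stands in for the fuller matching and context logic described above:

```python
def identify_vehicles(reports):
    """Sketch of the two-step flow: gather identification reports from a
    set of autonomous vehicles (block 202), then group the sightings per
    vehicle and derive simple context (block 204).

    Each report is assumed to be a dict like
    {"plate": "7ABC123", "lat": 37.77, "lon": -122.42, "time": 1546300800}.
    """
    # Block 202: group the obtained identifications by vehicle.
    sightings_by_vehicle = {}
    for report in reports:
        sightings_by_vehicle.setdefault(report["plate"], []).append(report)

    # Block 204: derive per-vehicle context from the grouped sightings.
    context = {}
    for plate, sightings in sightings_by_vehicle.items():
        ordered = sorted(sightings, key=lambda r: r["time"])
        context[plate] = {
            "sightings": len(ordered),
            "first_seen": ordered[0]["time"],
            "last_seen": ordered[-1]["time"],
        }
    return context
```

Speed, direction, trajectory, and identity-profile determination would extend the second step with the location comparisons described above.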
[0051] FIG. 3 is a block diagram that illustrates a computer system 300 upon which any of the embodiments described herein may be implemented. The computer system 300 includes a bus 302 or other communication mechanism for communicating information, and one or more hardware processors 304 coupled with bus 302 for processing information. Hardware processor(s) 304 may be, for example, one or more general purpose microprocessors.
[0052] The computer system 300 also includes a main memory 306, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 302 for storing information and instructions to be executed by processor(s) 304. Main memory 306 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor(s) 304. Such instructions, when stored in storage media accessible to processor(s) 304, render computer system 300 into a special-purpose machine that is customized to perform the operations specified in the instructions. Main memory 306 may include non-volatile media and/or volatile media. Non-volatile media may include, for example, optical or magnetic disks. Volatile media may include dynamic memory. Common forms of media may include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a DRAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.
[0053] The computer system 300 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 300 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 300 in response to processor(s) 304 executing one or more sequences of one or more instructions contained in main memory 306. Such instructions may be read into main memory 306 from another storage medium, such as storage device 308. Execution of the sequences of instructions contained in main memory 306 causes processor(s) 304 to perform the process steps described herein. For example, the process/method shown in FIG. 2 and described in connection with this figure can be implemented by computer program instructions stored in main memory 306. When these instructions are executed by processor(s) 304, they may perform the steps as shown in FIG. 2 and described above. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
[0054] The computer system 300 also includes a communication interface 310 coupled to bus 302. Communication interface 310 provides a two-way data communication coupling to one or more network links that are connected to one or more networks. For example, communication interface 310 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented.
[0055] The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor- implemented engines may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented engines may be distributed across a number of geographic locations.
[0056] Certain embodiments are described herein as including logic or a number of components. Components may constitute either software components (e.g., code embodied on a machine-readable medium) or hardware components (e.g., a tangible unit capable of performing certain operations which may be configured or arranged in a certain physical manner). As used herein, for convenience, components of the computing system 102 and autonomous vehicle 116 may be described as performing or configured for performing an operation, when the components may comprise instructions which may program or configure the computing system 102 and autonomous vehicle 116 to perform the operation.
[0057] While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended, in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that, as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[0058] The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
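As a purely illustrative, non-limiting sketch of the grouping and context-determination steps described above and performed by the process of FIG. 2, the matching of identifications and the derivation of speed, direction of travel, and trajectory could resemble the following (the record fields, function names, and coordinate frame are hypothetical and are not part of the disclosure):

```python
import math
from dataclasses import dataclass

# Hypothetical observation record reported by an autonomous vehicle:
# an identification (here a license plate), a location, and a capture time.
@dataclass
class Observation:
    plate: str   # identification, e.g. derived via image processing
    x: float     # location in an arbitrary planar frame (meters)
    y: float
    t: float     # timestamp (seconds)

def group_by_vehicle(observations):
    """Match identifications: observations sharing the same plate are
    treated as multiple identifications of a same vehicle."""
    groups = {}
    for obs in observations:
        groups.setdefault(obs.plate, []).append(obs)
    for track in groups.values():
        track.sort(key=lambda o: o.t)  # order each vehicle's sightings in time
    return groups

def vehicle_context(track):
    """Derive speed, direction, and trajectory by comparing the locations
    of the identifications determined to be for the same vehicle."""
    trajectory = [(o.x, o.y) for o in track]  # path followed by the vehicle
    if len(track) < 2:
        return {"trajectory": trajectory, "speed": None, "direction": None}
    first, last = track[0], track[-1]
    dx, dy, dt = last.x - first.x, last.y - first.y, last.t - first.t
    speed = math.hypot(dx, dy) / dt if dt > 0 else 0.0   # meters per second
    direction = math.degrees(math.atan2(dy, dx))         # heading in the x-y frame
    return {"trajectory": trajectory, "speed": speed, "direction": direction}
```

In this sketch, observations of the same plate collected from different autonomous vehicles are combined into a single track, and the vehicle's speed of travel and direction of travel are obtained by comparing the locations of those identifications over time.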

Claims

WHAT IS CLAIMED IS:
1. A system for vehicle identification, the system comprising: one or more processors; and
a memory storing instructions that, when executed by the one or more processors, cause the system to perform:
obtaining, from a set of autonomous vehicles, a set of vehicle identification information, individual vehicle identification information being obtained from an individual autonomous vehicle and conveying identifications of one or more vehicles and locations of the one or more vehicles; and
determining, from the set of vehicle identification information, vehicle context information for individual vehicles of the one or more vehicles, the vehicle context information for the individual vehicles describing a context of the individual vehicles, the context including one or a combination of a speed of travel, a direction of travel, or a trajectory.
2. The system of claim 1, wherein the identifications include one or a combination of a license plate number, a color, a make, a model, or a unique marking.
3. The system of claim 1, wherein the individual vehicle identification information includes the identifications of the one or more vehicles.
4. The system of claim 1, wherein the individual vehicle identification information includes one or a combination of image information or video information, and the identifications of the one or more vehicles are derived from the image information and/or the video information through one or more image and/or video processing techniques.
5. The system of claim 1, wherein determining the vehicle context information is based on comparing individual ones of the identifications and the locations of the individual vehicles to other ones of the identifications and the locations of the individual vehicles.
6. The system of claim 5, wherein the comparing the individual ones of the identifications of the individual vehicles to the other ones of the identifications of the individual vehicles facilitates determining that multiple ones of the identifications are for a same vehicle based on matches between the individual ones of the identifications.
7. The system of claim 6, wherein the context further includes an identity profile, the identity profile of an individual vehicle representing an identity of the individual vehicle as a whole, and wherein the identity profile is determined by combining the multiple ones of the identifications determined to be for the same vehicle.
8. The system of claim 6, wherein determining one or a combination of the speed of travel, the direction of travel, or the trajectory is based on comparing the individual locations of the multiple ones of the identifications determined to be for the same vehicle.
9. The system of claim 1, wherein the trajectory includes a path followed by the individual vehicles.
10. The system of claim 1, wherein the system further performs tracking a vehicle based on the set of vehicle identification information and/or the vehicle context information.
11. A method for vehicle identification, the method comprising:
obtaining, from a set of autonomous vehicles, a set of vehicle identification information, individual vehicle identification information being obtained from an individual autonomous vehicle and conveying identifications of one or more vehicles and locations of the one or more vehicles; and
determining, from the set of vehicle identification information, vehicle context information for individual vehicles of the one or more vehicles, the vehicle context information for the individual vehicles describing a context of the individual vehicles, the context including one or a combination of a speed of travel, a direction of travel, or a trajectory.
12. The method of claim 11, wherein the identifications include one or a combination of a license plate number, a color, a make, a model, or a unique marking.
13. The method of claim 11, wherein the individual vehicle identification information includes the identifications of the one or more vehicles.
14. The method of claim 11, wherein the individual vehicle identification information includes one or a combination of image information or video information, and the identifications of the one or more vehicles are derived from the image information and/or the video information through one or more image and/or video processing techniques.
15. The method of claim 11, wherein determining the vehicle context information is based on comparing individual ones of the identifications and the locations of the individual vehicles to other ones of the identifications and the locations of the individual vehicles.
16. The method of claim 15, wherein the comparing the individual ones of the identifications of the individual vehicles to the other ones of the identifications of the individual vehicles facilitates determining that multiple ones of the identifications are for a same vehicle based on matches between the individual ones of the identifications.
17. The method of claim 16, wherein the context further includes an identity profile, the identity profile of an individual vehicle representing an identity of the individual vehicle as a whole, and wherein the identity profile is determined by combining the multiple ones of the identifications determined to be for the same vehicle.
18. The method of claim 16, wherein determining one or a combination of the speed of travel, the direction of travel, or the trajectory is based on comparing the individual locations of the multiple ones of the identifications determined to be for the same vehicle.
19. The method of claim 11, wherein the trajectory includes a path followed by the individual vehicles.
20. The method of claim 11, further comprising tracking a vehicle based on the set of vehicle identification information and/or the vehicle context information.
PCT/US2018/067988 2018-12-28 2018-12-28 Systems and methods for vehicle identification WO2020139385A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/US2018/067988 WO2020139385A1 (en) 2018-12-28 2018-12-28 Systems and methods for vehicle identification
CN201880100552.6A CN113366548A (en) 2018-12-28 2018-12-28 System and method for vehicle identification

Publications (1)

Publication Number Publication Date
WO2020139385A1 (en) 2020-07-02

Family

ID=71127367

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/067988 WO2020139385A1 (en) 2018-12-28 2018-12-28 Systems and methods for vehicle identification

Country Status (2)

Country Link
CN (1) CN113366548A (en)
WO (1) WO2020139385A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110095908A1 (en) * 2009-10-22 2011-04-28 Nadeem Tamer M Mobile sensing for road safety, traffic management, and road maintenance
US20160068156A1 (en) * 2014-09-10 2016-03-10 Volkswagen Ag Modifying autonomous vehicle driving by recognizing vehicle characteristics
CN107481526A (en) * 2017-09-07 2017-12-15 公安部第三研究所 System and method for drive a vehicle lane change detection record and lane change violating the regulations report control
US9952594B1 (en) * 2017-04-07 2018-04-24 TuSimple System and method for traffic data collection using unmanned aerial vehicles (UAVs)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103093516B (en) * 2012-12-25 2016-05-11 北京理工大学 Track of vehicle playback system
CN105632175B (en) * 2016-01-08 2019-03-29 上海微锐智能科技有限公司 Vehicle behavior analysis method and system
CN107085946A (en) * 2017-06-13 2017-08-22 深圳市麦谷科技有限公司 A kind of vehicle positioning method and system based on picture recognition technology
CN208126647U (en) * 2018-01-02 2018-11-20 乌鲁木齐明华智能电子科技有限公司 vehicle management system based on cloud server

Also Published As

Publication number Publication date
CN113366548A (en) 2021-09-07

Legal Events

Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18944998; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18944998; Country of ref document: EP; Kind code of ref document: A1)