CN113496167B - Method and device for determining movement track of target - Google Patents
Method and device for determining movement track of target
- Publication number
- CN113496167B (application CN202010255414.4A)
- Authority
- CN
- China
- Prior art keywords
- target
- targets
- determining
- data
- category
- Prior art date: 2020-04-02
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/047—Probabilistic or stochastic networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
Abstract
The invention discloses a method and a device for determining the motion trajectory of a target, and relates to the field of computer technology. One embodiment of the method comprises the following steps: an identification step: identifying a plurality of targets and determining category information of each target; an acquisition step: acquiring historical trajectory data of the plurality of targets; and a determining step: determining the motion trajectory of each target based on the historical trajectory data of the plurality of targets and the category information. The method and the device can determine motion trajectories for targets of multiple categories.
Description
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method and an apparatus for determining a target motion trajectory.
Background
In an autonomous-driving perception system, trajectory prediction must be performed in real time and simultaneously for targets of different types, including pedestrians, cars, large trucks, buses, tricycles, bicycles, motorcycles and the like, and the predicted trajectories are then used to plan the future driving path of the autonomous vehicle.
In real autonomous-driving scenarios, the trajectories of different classes of targets differ greatly: pedestrian trajectories are highly uncertain, whereas vehicles can only travel within defined roads. Existing deep-learning frameworks such as the social long short-term memory network (Social LSTM) and the social generative adversarial network (Social GAN) can only predict trajectories for a single class of target, or must treat multiple classes as a single class.
In addition, the interactions between different classes of targets differ: the interaction between two pedestrians, between a pedestrian and a car, and between a pedestrian and a bus are all different. Existing deep-learning-based methods such as Social LSTM or Social GAN cannot model the interactions between multiple categories in a targeted manner.
Disclosure of Invention
In view of this, the embodiments of the present invention provide a method and an apparatus for determining the motion trajectory of a target, which can determine motion trajectories for targets of multiple categories.
To achieve the above object, according to one aspect of the embodiments of the present invention, there is provided a method for determining a target motion trajectory, including: an identification step: identifying a plurality of targets and determining category information for each of the targets; an acquisition step: acquiring historical trajectory data of the plurality of targets; and a determining step: determining the motion trajectory of each target based on the historical trajectory data of the plurality of targets and the category information.
Preferably, in the determining step, the motion trajectory of each target is determined based on the historical trajectory data of the plurality of targets, the category information, and the interaction information between the plurality of targets.
Preferably, the determining step includes: calculating the relative positions between one target and the other targets among the plurality of targets; generating second vector data of the target based on the relative positions, first vector data representing the historical trajectory data of the target, and target attribute information; and determining the motion trajectory based on the second vector data, the first vector data representing the historical trajectory data of the target, and third vector data representing the category of the target.
Preferably, the historical trajectory data of the target is encoded by a long short-term memory network to obtain the first vector data representing the historical trajectory data of the target, wherein the long short-term memory network encodes a data sequence into a one-dimensional vector representing the characteristic values of the data sequence.
Preferably, the target attribute information includes at least one of a category, a length, a width, a height, a current speed, and a current heading of the target.
Preferably, the method further comprises a judging step: judging whether the determined motion trajectory of each target is real.
Preferably, in the judging step, whether the determined motion trajectory is real is judged either directly or based on the category of the target.
According to another aspect of the embodiments of the present invention, there is provided an apparatus for determining a target motion trajectory, including: an identification unit that identifies a plurality of targets and determines category information of each target; an acquisition unit that acquires historical trajectory data of the plurality of targets; and a determining unit that determines the motion trajectory of each target based on the historical trajectory data of the plurality of targets and the category information.
According to still another aspect of an embodiment of the present invention, there is provided an electronic device for determining a target motion trajectory, including: one or more processors; and the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors are enabled to realize the method for determining the target motion trail.
According to a further aspect of an embodiment of the present invention, there is provided a computer readable medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the method for determining a target motion trajectory provided by the present invention.
One embodiment of the above invention has the following advantage or benefit: it provides a method and a device for determining the motion trajectory of a target that can determine motion trajectories for targets of multiple categories.
Further effects of the above optional implementations are described below in connection with the embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
FIG. 1 is a schematic diagram of the main flow of a method for determining a target motion trajectory according to an embodiment of the invention;
FIG. 2 is a block diagram showing a specific example of a method of determining a target motion trajectory according to the present invention;
FIG. 3 illustrates a block diagram of a multi-category pooling module in a specific example of a method of determining a target motion trajectory in accordance with the present invention;
fig. 4 is a schematic diagram of main units of an apparatus for determining a target motion trajectory according to an embodiment of the present invention;
FIG. 5 is an exemplary system architecture diagram in which embodiments of the present invention may be applied;
fig. 6 is a schematic diagram of a computer system suitable for use in implementing an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention will now be described with reference to the accompanying drawings, in which various details of the embodiments of the present invention are included to facilitate understanding, and are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The method and the device for determining the motion trajectory of a target are applied to an autonomous-driving perception system to predict, in real time, the motion trajectories of targets of different types such as pedestrians, cars, large trucks, buses, tricycles, bicycles, motorcycles and the like.
Fig. 1 is a schematic diagram of a main flow of a method for determining a target motion trajectory according to an embodiment of the present invention. As shown in fig. 1, in the identifying step S1, a plurality of targets are identified, and category information of each target is determined; in the acquiring step S2, historical track data of a plurality of targets are acquired; and in the determining step S3, the movement trajectories of the respective targets are determined based on the history trajectory data and the category information of the plurality of targets.
Preferably, in the determining step S3, the movement trajectories of the respective targets are determined based on the history trajectory data of the plurality of targets, the category information, and the interaction information between the plurality of targets.
The determining step S3 includes, for example: calculating the relative positions between one of the plurality of targets and the other targets; generating second vector data of the target based on the calculated relative positions, the first vector data representing the historical trajectory data of the target, and the target attribute information; and determining the motion trajectory of each target based on the second vector data, the first vector data representing the historical trajectory data of the target, and the third vector data representing the category of the target.
The historical trajectory data of a target is encoded, for example, by a long short-term memory network, which encodes the data sequence into a one-dimensional vector representing the characteristic values of the data sequence, to obtain the first vector data representing the historical trajectory data of the target.
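The following is a minimal sketch of this encoding step. PyTorch is assumed (the patent names no framework), and all module names and dimensions are illustrative assumptions rather than the patent's reference implementation:

```python
import torch
import torch.nn as nn

class TrackEncoder(nn.Module):
    """Encode a target's historical track into a single one-dimensional feature vector."""
    def __init__(self, input_dim=2, hidden_dim=64):
        super().__init__()
        self.embed = nn.Linear(input_dim, hidden_dim)                  # embed (x, y) points
        self.lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)

    def forward(self, track):
        # track: (batch, obs_len, 2) historical positions of one target
        _, (h_n, _) = self.lstm(self.embed(track))
        return h_n[-1]                                                 # (batch, hidden_dim): the "first vector data"
```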
The target attribute information includes, for example, at least one of a category, a length, a width, a height, a current speed, and a current heading of the target.
According to an embodiment of the present invention, the method for determining the motion trajectory of a target further includes a judging step: judging whether the determined motion trajectory of each target is real.
In the judging step, whether the determined motion trajectory is real is preferably judged either directly or based on the category of the target.
Fig. 2 is a block diagram showing a specific example of a method of determining a target motion trajectory according to the present invention. Fig. 3 shows a block diagram of a multi-category pooling module in a specific example of a method of determining a target motion trajectory according to the present invention.
With reference to fig. 2 and 3, a specific example of the method of determining a target motion trajectory according to the present invention will be described. A plurality of targets i are identified (refer to fig. 2; for example, i = 1, 2, 3), and the category information of each target is determined. The historical trajectory X_i of each target i is obtained. The historical trajectories X_i of the targets i, which belong to different categories (for example pedestrians, cars, buses, etc.), are input to a social generative adversarial network with a generator G and a discriminator D.
The generator G determines the motion trajectories of the plurality of targets of different categories based on the input historical trajectory data of the plurality of targets and the category information of the plurality of targets.
The generator G preferably comprises a multi-category pooling module (MCPM). Its main purpose is to fuse the feature vector of the current target trajectory with the interaction information of the other obstacles in the current scene, and to output a feature vector containing this interaction information for further processing. In this way, trajectory prediction can be performed for different target classes and the interactions between them, with a number of possible trajectories predicted for each target. That is, the multi-category pooling module MCPM can be used to predict trajectories that account for the interactions between targets of different categories.
Referring to fig. 2, for example, the historical trajectory X_i of a target i (refer to fig. 2; for example, i = 1, 2, 3) input into the generator G is encoded as H_i^t by a long short-term memory network LSTM_i; the encoded data H_i^t is passed through the multi-category pooling module MCPM to generate a vector P_i; then the vector P_i generated by the MCPM, the encoded data H_i^t, a vector C representing the category of target i, and a random vector Z are concatenated; the concatenated result is input to another LSTM_i, which decodes it into the predicted trajectory of target i, for example the predicted trajectories 1, 2 and 3 shown in fig. 2. The LSTMs in the generator G encode a data sequence into a one-dimensional vector representing the characteristic values of the data sequence.
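A minimal sketch of this generator flow follows (PyTorch assumed; all dimensions illustrative). The pooling module that produces P_i is passed in as an argument; a sketch of it follows the next paragraph. This is an interpretation of the description above, not the patent's reference implementation:

```python
import torch
import torch.nn as nn

class TrajectoryGenerator(nn.Module):
    """Encoder LSTM -> multi-category pooling -> concat [P_i, H_i, C, Z] -> decoder LSTM."""
    def __init__(self, pooling, hidden_dim=64, class_dim=8, noise_dim=16, pred_len=12):
        super().__init__()
        self.pooling = pooling                                    # e.g. the MCPM sketched below
        self.embed = nn.Linear(2, hidden_dim)
        self.encoder = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)             # first LSTM_i
        self.decoder = nn.LSTM(2 * hidden_dim + class_dim + noise_dim,
                               hidden_dim, batch_first=True)                         # second LSTM_i
        self.to_xy = nn.Linear(hidden_dim, 2)
        self.noise_dim, self.pred_len = noise_dim, pred_len

    def forward(self, tracks, positions, class_vecs, attrs):
        # tracks: (N, obs_len, 2); positions: (N, 2) current positions;
        # class_vecs: (N, class_dim) one-hot category C; attrs: (N, attr_dim) attributes
        _, (h, _) = self.encoder(self.embed(tracks))
        h = h[-1]                                                 # H_i^t, one vector per target
        p = self.pooling(h, positions, attrs)                     # P_i with interaction information
        z = torch.randn(h.size(0), self.noise_dim)                # random vector Z
        dec_in = torch.cat([p, h, class_vecs, z], dim=1)          # concatenation step
        dec_in = dec_in.unsqueeze(1).repeat(1, self.pred_len, 1)
        out, _ = self.decoder(dec_in)
        return self.to_xy(out)                                    # (N, pred_len, 2) predicted trajectory
```

Under this sketch, the class vector C plays the role of the condition and Z the latent noise, so the generator corresponds to the conditional G(z, c) introduced in the model example below.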
Referring to fig. 3, the MCPM performs, for example, the following processing. The relative positions between one target and the other targets among the plurality of targets i (for example, i = 1, 2, 3) are calculated. For example, as shown in fig. 3, the positions of targets 1, 2 and 3 are (X1, Y1), (X2, Y2) and (X3, Y3), respectively; the relative positions of target 1 with respect to targets 2 and 3 are (X1-X2, Y1-Y2) and (X1-X3, Y1-Y3), respectively. The calculated relative positions are input to a first multilayer perceptron MLP1; the output of MLP1, the output of the first LSTM in the generator, and the vector data generated by a second multilayer perceptron MLP2 from the target attribute information are concatenated; the concatenated data is input to a third multilayer perceptron MLP3 and a maximum pooling layer (MAX), thereby generating the vector data of target i.
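Below is a minimal sketch of this pooling module under the same assumptions as the generator sketch above (PyTorch, illustrative dimensions). The text does not fully specify whether the encoder state and attributes used for each pair belong to the current target or the other target; the sketch uses the other targets' features:

```python
import torch
import torch.nn as nn

class MultiCategoryPooling(nn.Module):
    """Relative positions -> MLP1, concatenated with encoder states and MLP2(attributes),
    then MLP3 and max pooling over the other targets."""
    def __init__(self, hidden_dim=64, attr_dim=7):
        super().__init__()
        self.mlp1 = nn.Sequential(nn.Linear(2, hidden_dim), nn.ReLU())        # relative positions
        self.mlp2 = nn.Sequential(nn.Linear(attr_dim, hidden_dim), nn.ReLU()) # target attributes
        self.mlp3 = nn.Sequential(nn.Linear(3 * hidden_dim, hidden_dim), nn.ReLU())

    def forward(self, h, positions, attrs):
        # h: (N, hidden_dim) encoded trajectories; positions: (N, 2); attrs: (N, attr_dim)
        n = positions.size(0)
        rel = positions.unsqueeze(1) - positions.unsqueeze(0)      # rel[i, j] = pos_i - pos_j
        rel_feat = self.mlp1(rel)                                  # (N, N, hidden_dim)
        h_other = h.unsqueeze(0).expand(n, n, -1)                  # encoder state of target j
        attr_other = self.mlp2(attrs).unsqueeze(0).expand(n, n, -1)
        fused = self.mlp3(torch.cat([rel_feat, h_other, attr_other], dim=-1))
        return fused.max(dim=1).values                             # (N, hidden_dim): P_i per target
```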
The target attribute information includes, for example, at least one of a category, length, width, height, current speed, and current heading of the target.
Further, as shown in fig. 2, the predicted trajectory of target i produced by the generator is input to the discriminator D of the generative adversarial network, and the discriminator D judges whether the predicted trajectory of target i (for example, the predicted trajectories 1, 2 and 3 shown in fig. 2) is real. For example, the discriminator D may judge directly whether the predicted trajectory of target i is real (an unconditional loss), or may take the vector C representing the category of target i as a condition, concatenate the vector C with the data of the predicted trajectory, and judge whether the predicted trajectory is real (a conditional loss).
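A minimal sketch of such a discriminator, again under the PyTorch assumption with illustrative sizes; the two scoring heads correspond to the unconditional and conditional judgments described above:

```python
import torch
import torch.nn as nn

class TrajectoryDiscriminator(nn.Module):
    """Score a trajectory as real or fake, optionally conditioned on the class vector C."""
    def __init__(self, hidden_dim=64, class_dim=8):
        super().__init__()
        self.embed = nn.Linear(2, hidden_dim)
        self.lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.score_uncond = nn.Linear(hidden_dim, 1)               # unconditional head: D(x)
        self.score_cond = nn.Linear(hidden_dim + class_dim, 1)     # conditional head: D(x, c)

    def forward(self, trajectory, class_vec=None):
        # trajectory: (N, seq_len, 2) observed-plus-predicted trajectory of each target
        _, (h, _) = self.lstm(self.embed(trajectory))
        h = h[-1]
        if class_vec is None:
            logits = self.score_uncond(h)
        else:
            logits = self.score_cond(torch.cat([h, class_vec], dim=1))
        return torch.sigmoid(logits)                               # probability the trajectory is real
```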
A model example of a method of predicting a target motion trajectory according to the present invention is described below. A generative adversarial network (GAN) consists of two mutually opposed neural networks. The two adversarially trained models are: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than from G. The generative model G takes the latent variable z as input and outputs a sample G(z). The discriminative model D takes a sample x as input and outputs D(x), the probability that the sample is real. The training procedure resembles a two-player min-max game with the following objective function:
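The objective function itself is not reproduced in this text; its standard two-player min-max form, reconstructed from the surrounding definitions (an assumption, not a quotation from the patent), is:

```latex
\min_G \max_D \; V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]
```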
A GAN can be made conditional by providing an additional input c to both the generator and the discriminator, yielding G(z, c) and D(x, c).
Social conditional GAN adversarial loss function:
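This loss is likewise missing from the text; a standard conditional-GAN form consistent with the definitions above (an assumption) is:

```latex
\mathcal{L}_{\mathrm{adv}}(G, D) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}}\big[\log D(x, c)\big]
  + \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z, c), c)\big)\big]
```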
In addition to the above social conditional GAN adversarial loss function, an L2-norm loss function may also be employed to calculate the distance between the predicted trajectory and the actual data (the ground truth).
Since a trajectory has multiple possible futures, diversified samples may also be generated to capture the multiple possibilities of the predicted trajectory.
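The loss referred to in the next paragraph as function (2) is not reproduced in the text; based on the description that follows, it has the form of the standard variety (best-of-k) loss (an assumption):

```latex
\mathcal{L}_{\mathrm{variety}} = \min_{k}\,\bigl\lVert Y_i - \hat{Y}_i^{(k)} \bigr\rVert_2
```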
Y_i denotes the actual trajectory of the target and Ŷ_i^(k) denotes a generated trajectory; k is a hyperparameter denoting the number of generated trajectories. The specific meaning of function (2) is as follows: among the k generated trajectories, the one with the smallest L2 distance to the actual trajectory is selected, and that L2 distance is used as the loss for the backpropagation of the deep learning.
Fig. 4 is a schematic diagram of main units of an apparatus for determining a motion profile of a target according to an embodiment of the present invention.
As shown in fig. 4, the apparatus 200 for determining a target motion trajectory includes: an identification unit 201, an acquisition unit 202, and a determination unit 203.
In the recognition unit 201, a plurality of targets are recognized, and category information of each target is determined. In the acquisition unit 202, historical track data of a plurality of targets is acquired. In the determination unit 203, the movement trajectories of the respective targets are determined based on the history trajectory data and the category information of the plurality of targets.
Preferably, the determining unit 203 determines the movement trajectories of the respective targets based on the history trajectory data of the plurality of targets, the category information, and the interaction information between the plurality of targets.
The determination unit 203 includes, for example: a first module that calculates the relative positions between one of the plurality of targets and the other targets; and a second module that generates second vector data of the target based on the calculated relative positions, first vector data representing the historical trajectory data of the target, and target attribute information, and determines the motion trajectory based on the second vector data, the first vector data representing the historical trajectory data of the target, and third vector data representing the category of the target.
First vector data representing historical track data of the object is obtained, for example, by encoding the historical track data of the object via a long and short term memory network, wherein the long and short term memory network encodes a data sequence into a one-dimensional vector representing characteristic values of the data sequence.
The target attribute information includes, for example, at least one of a category, a length, a width, a height, a current speed, and a current heading of the target.
The apparatus for determining the motion trajectory of a target according to the embodiment of the present invention further comprises a judging unit that judges whether the determined motion trajectory of each target is real.
Preferably, the judging unit judges whether the determined motion trajectory is real either directly or based on the category of the target.
Fig. 5 illustrates an exemplary system architecture 400 of a method of determining a target motion profile or an apparatus of determining a target motion profile to which embodiments of the present invention may be applied.
As shown in fig. 5, the system architecture 400 may include terminal devices 401, 402, 403, a network 404, and a server 405. The network 404 is used as a medium to provide communication links between the terminal devices 401, 402, 403 and the server 405. The network 404 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
A user may interact with the server 405 via the network 404 using the terminal devices 401, 402, 403 to receive or send messages or the like. Various communication client applications, such as a web browser application, an instant messaging tool, a mailbox client, etc. (by way of example only) may be installed on the terminal devices 401, 402, 403.
The terminal devices 401, 402, 403 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablets, laptop and desktop computers, and the like.
The server 405 may be a server providing various services, such as a background management server (by way of example only) that provides support for communication of instant messaging tools by users utilizing the terminal devices 401, 402, 403.
It should be noted that, the method for determining the motion trail of the target provided by the embodiment of the present invention is generally executed by the server 405, and accordingly, the device for determining the motion trail of the target is generally disposed in the server 405.
It should be understood that the number of terminal devices, networks and servers in fig. 5 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 6, there is illustrated a schematic diagram of a computer system 500 suitable for use in implementing an embodiment of the present invention. The terminal device shown in fig. 6 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiment of the present invention.
As shown in fig. 6, the computer system 500 includes a Central Processing Unit (CPU) 501, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the system 500 are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output portion 507 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker, and the like; a storage portion 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card, a modem, or the like. The communication section 509 performs communication processing via a network such as the internet. The drive 510 is also connected to the I/O interface 505 as needed. A removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 510 as needed so that a computer program read therefrom is mounted into the storage section 508 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 509, and/or installed from the removable media 511. The above-described functions defined in the system of the present invention are performed when the computer program is executed by a Central Processing Unit (CPU) 501.
The computer readable medium shown in the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present invention may be implemented in software or in hardware. The described units may also be provided in a processor, for example, described as: a processor includes an identification unit, an acquisition unit, and a determination unit. The names of these units do not constitute a limitation of the unit itself in some cases, and for example, the identifying unit may also be described as "a unit that identifies a plurality of targets and determines category information of each of the targets".
As another aspect, the present invention also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be present alone without being fitted into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to include: identifying a plurality of targets and determining category information for each of the targets; acquiring historical track data of the plurality of targets; and determining a motion trajectory of each of the targets based on the historical trajectory data of the plurality of targets and the category information.
The technical solution provided by the embodiments of the present invention offers a method and a device for determining the motion trajectory of a target that can determine motion trajectories for targets of multiple categories.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives can occur depending upon design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.
Claims (9)
1. A method for determining a target motion profile, comprising:
an identification step: identifying a plurality of targets and determining category information for each of the targets;
an acquisition step: acquiring historical trajectory data of the plurality of targets; and
a determining step: determining a motion trajectory of each target based on the historical trajectory data of the plurality of targets and the category information;
wherein the determining step includes: for each target, fusing the feature vector of the current trajectory of the target with the interaction information with each of the other targets in the current scene, and outputting a feature vector containing the interaction information;
and a generative adversarial network determines the motion trajectory of each target based on the historical trajectory data of the plurality of targets, the feature vector containing the interaction information of each target, and the category information.
2. The method of claim 1, wherein the determining step comprises:
calculating the relative positions between one target and the other targets among the plurality of targets; and
generating second vector data of the target based on the relative positions, first vector data representing the historical trajectory data of the target, and target attribute information, and determining the motion trajectory based on the second vector data, the first vector data representing the historical trajectory data of the target, and third vector data representing the category of the target.
3. The method of claim 2, wherein the historical track data of the target is encoded by a long and short term memory network to obtain the first vector data representing the historical track data of the target, wherein the long and short term memory network encodes a sequence of data as a one-dimensional vector representing a characteristic value of the sequence of data.
4. The method of claim 2, wherein the target attribute information includes at least one of a category, a length, a width, a height, a current speed, and a current heading of the target.
5. The method according to any one of claims 1 to 4, further comprising a judging step: judging whether the determined motion trajectory of each target is real.
6. The method of claim 5, wherein,
in the judging step, whether the determined motion trajectory is real is judged either directly or based on the category of the target.
7. An apparatus for determining a motion profile of a target, comprising:
an identification unit that identifies a plurality of targets and determines category information of each of the targets;
an acquisition unit that acquires historical track data of the plurality of targets; and
A determining unit that determines a movement locus of each of the targets based on historical locus data of the plurality of targets and the category information;
wherein the determining unit, for each target, fuses through a generative adversarial network the feature vector of the current trajectory of the target with the interaction information with each of the other targets in the current scene, and outputs a feature vector containing the interaction information;
and determines the motion trajectory of each target based on the historical trajectory data of the plurality of targets, the feature vector containing the interaction information of each target, and the category information.
8. An electronic device for determining a motion trajectory of a target, comprising:
One or more processors;
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
9. A computer readable medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010255414.4A CN113496167B (en) | 2020-04-02 | 2020-04-02 | Method and device for determining movement track of target |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113496167A (en) | 2021-10-12
CN113496167B (en) | 2024-07-19
Family
ID=77994066
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010255414.4A Active CN113496167B (en) | 2020-04-02 | 2020-04-02 | Method and device for determining movement track of target |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113496167B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114140553A (en) * | 2021-11-26 | 2022-03-04 | 武汉中海庭数据技术有限公司 | Intersection guide line generation method and system based on condition variation self-encoder |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110223318A (en) * | 2019-04-28 | 2019-09-10 | 驭势科技(北京)有限公司 | A kind of prediction technique of multi-target track, device, mobile unit and storage medium |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105183800A (en) * | 2015-08-25 | 2015-12-23 | 百度在线网络技术(北京)有限公司 | Information prediction method and apparatus |
US9961099B2 (en) * | 2016-04-18 | 2018-05-01 | Acalvio Technologies, Inc. | Systems and methods for detecting and tracking adversary trajectory |
CN106951871B (en) * | 2017-03-24 | 2020-07-28 | 北京地平线机器人技术研发有限公司 | Motion trajectory identification method and device of operation body and electronic equipment |
CN109102531A (en) * | 2018-08-21 | 2018-12-28 | 北京深瞐科技有限公司 | A kind of target trajectory method for tracing and device |
CN109583151B (en) * | 2019-02-20 | 2023-07-21 | 阿波罗智能技术(北京)有限公司 | Method and device for predicting running track of vehicle |
CN110570383B (en) * | 2019-09-25 | 2022-05-06 | 北京字节跳动网络技术有限公司 | Image processing method and device, electronic equipment and storage medium |
- 2020-04-02: application CN202010255414.4A filed in CN, resulting in patent CN113496167B (active)
Also Published As
Publication number | Publication date |
---|---|
CN113496167A (en) | 2021-10-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10817714B2 (en) | Method and apparatus for predicting walking behaviors, data processing apparatus, and electronic device | |
CN109544396B (en) | Account recommendation method and device, server, terminal and storage medium | |
CN113264066B (en) | Obstacle track prediction method and device, automatic driving vehicle and road side equipment | |
CN111079619B (en) | Method and apparatus for detecting target object in image | |
CN108229419B (en) | Method and apparatus for clustering images | |
WO2020131140A1 (en) | Systems and methods for geolocation prediction | |
CN112734808B (en) | Trajectory prediction method for vulnerable road users in vehicle driving environment | |
CN113221677A (en) | Track abnormity detection method and device, road side equipment and cloud control platform | |
CN111553488A (en) | Risk recognition model training method and system for user behaviors | |
CN110929799A (en) | Method, electronic device, and computer-readable medium for detecting abnormal user | |
CN114758502B (en) | Dual-vehicle combined track prediction method and device, electronic equipment and automatic driving vehicle | |
CN113392793A (en) | Method, device, equipment, storage medium and unmanned vehicle for identifying lane line | |
CN110955965A (en) | Pedestrian motion prediction method and system considering interaction | |
CN112233428B (en) | Traffic flow prediction method, device, storage medium and equipment | |
CN113496167B (en) | Method and device for determining movement track of target | |
CN116088537B (en) | Vehicle obstacle avoidance method, device, electronic equipment and computer readable medium | |
CN113111692A (en) | Target detection method and device, computer readable storage medium and electronic equipment | |
CN114724116B (en) | Vehicle traffic information generation method, device, equipment and computer readable medium | |
CN109859254B (en) | Method and device for sending information in automatic driving | |
CN114620055B (en) | Road data processing method and device, electronic equipment and automatic driving vehicle | |
CN114756774A (en) | Travel scheme recommendation method, travel scheme recommendation device, model training equipment and storage medium | |
CN114912568A (en) | Method, apparatus and computer-readable storage medium for data processing | |
CN116842392B (en) | Track prediction method and training method, device, equipment and medium of model thereof | |
CN113963322B (en) | Detection model training method and device and electronic equipment | |
CN116194964A (en) | System and method for training machine learning visual attention models |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |