
CN111784730A - Object tracking method and device, electronic equipment and storage medium - Google Patents

Object tracking method and device, electronic equipment and storage medium

Info

Publication number
CN111784730A
Authority
CN
China
Prior art keywords
track information
same
information
track
image acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010628706.8A
Other languages
Chinese (zh)
Other versions
CN111784730B (en)
Inventor
龚晖
朱皓
戴华东
张天琦
曾杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202010628706.8A
Publication of CN111784730A
Application granted
Publication of CN111784730B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the present application provide an object tracking method and apparatus, an electronic device, and a storage medium, relating to the technical field of computer vision, and including the following steps: determining track information of the motion of each object in the image acquisition area corresponding to each image acquisition device, where the image acquisition areas corresponding to adjacent image acquisition devices have overlapping areas; selecting, from the determined track information, the track information of objects in the same overlapping area, determining, from the selected track information, the track information of the same object in each overlapping area, and fusing and associating the track information of the same object in the same overlapping area into one piece of track information in the order of motion time; and associating the track information of the same object in all the areas in the order of motion time to obtain the global track information of each object. By applying the solution provided by the embodiments of the present application, object tracking across image acquisition devices can be realized.

Description

Object tracking method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to an object tracking method and apparatus, an electronic device, and a storage medium.
Background
In a security scenario, multiple image acquisition devices need to be deployed in the scene in order to track objects such as vehicles and users. Since the field of view of an image acquisition device is limited, the image acquisition area corresponding to each image acquisition device is also limited. Here, the image acquisition area corresponding to an image acquisition device refers to the actual area in the scene that is reflected by the images acquired by that device.
In the prior art, each image acquisition device works independently and tracks the objects moving within its own image acquisition area. For each image acquisition device, the track information of an object moving within the device's image acquisition area can be obtained from the images acquired by that device; once the object leaves that area, however, its track information is difficult to obtain further. Yet an object will typically move through the image acquisition areas corresponding to multiple image acquisition devices. For example, in a frictionless payment scenario, a customer usually picks up goods in several goods areas, each of which is covered by an image acquisition device; to obtain the customer's complete shopping process, the customer's motion trajectory must be tracked across the image acquisition areas corresponding to multiple image acquisition devices. With the prior art, only the track information of an object moving within the image acquisition area corresponding to a single image acquisition device can be obtained, and the track information of the object moving across the image acquisition areas corresponding to multiple image acquisition devices is difficult to obtain.
That is, with the prior art, object tracking can only be performed based on a single image acquisition device, and object tracking across image acquisition devices is difficult to achieve. Therefore, in order to track an object and obtain its motion trajectory over the entire scene, an object tracking method that can track objects across image acquisition devices is needed.
Disclosure of Invention
An object of the embodiments of the present application is to provide an object tracking method and apparatus, an electronic device, and a storage medium, so as to realize object tracking across image acquisition devices. The specific technical solutions are as follows:
in a first aspect, an embodiment of the present application provides an object tracking method, where the method includes:
determining the motion track information of each object in the image acquisition area corresponding to each image acquisition device, wherein the image acquisition areas corresponding to the adjacent image acquisition devices have overlapping areas;
selecting the track information of the objects in the same overlapping area from the determined track information, determining the track information of the same object in each overlapping area from the selected track information, and fusing and associating the track information of the same object in the same overlapping area into one piece of track information in the order of motion time, where the motion time is the time during which the object moves along the track characterized by the track information;
and associating, in the order of motion time, the track information of the same object in all the areas to obtain the global track information of each object.
In an embodiment of the present application, the fusing and associating the track information of the same object in the same overlapping area into one piece of track information according to the motion time sequence includes:
for each overlapping area, determining, as first local track information, the local track information of the same object with the same motion time in the track information of motion within the different image acquisition areas to which the overlapping area belongs, and fusing the first local track information to obtain fused track information;
associating, in the order of motion time, second local track information and the fused track information into one piece of track information, where the second local track information is: the track information, among the track information of the same object in the same overlapping area, that belongs to a non-overlapping area.
In an embodiment of the application, the associating the track information of the same object in all the regions according to the motion time sequence to obtain the global track information of each object includes:
and judging whether target global track information belonging to the same object with the track information exists in the existing global track information or not aiming at each track information, if so, adding the track information to the target global track information according to the motion time sequence to obtain new global track information, and if not, directly taking the track information as the new global track information.
In an embodiment of the application, the selecting, from the determined trajectory information, trajectory information of objects in the same overlapping area includes:
selecting, from the determined track information, the track information whose latest position lies in the same overlapping area as the track information of the objects in the same overlapping area; or
selecting, from the determined track information, the track information of objects whose local track, within the track characterized by the track information, belongs to the same overlapping area as the track information of the objects in the same overlapping area.
In an embodiment of the application, the determining, from the selected trajectory information, trajectory information of the same object in each overlapping area includes:
for the selected track information of the objects in each overlapping area, determining the track information whose track similarity meets a first similarity condition as the track information of the same object; and/or
for the selected track information of the objects in each overlapping area, determining the track information of the objects whose appearance-feature similarity meets a second similarity condition as the track information of the same object.
In a second aspect, an embodiment of the present application provides an object tracking apparatus, including:
the track determining module is used for determining track information of the motion of each object in the image acquisition area corresponding to each image acquisition device, wherein the image acquisition areas corresponding to the adjacent image acquisition devices have overlapping areas;
a track association module, configured to select track information of objects in the same overlapping area from the determined track information, determine track information of the same object in each overlapping area from the selected track information, and fuse and associate the track information of the same object in the same overlapping area into one piece of track information according to a motion time sequence, where the motion time is: a time of movement of the object along the track characterized by the track information;
and the global track obtaining module is used for correlating the track information of the same object in all the areas according to the motion time sequence to obtain the global track information of each object.
In an embodiment of the application, the track association module is specifically configured to:
selecting the track information of the objects in the same overlapping area from the determined track information, and determining the track information of the same object in each overlapping area from the selected track information;
for each overlapping area, determining, as first local track information, the local track information of the same object with the same motion time in the track information of motion within the different image acquisition areas to which the overlapping area belongs, and fusing the first local track information to obtain fused track information;
associating, in the order of motion time, second local track information and the fused track information into one piece of track information, where the second local track information is: the track information, among the track information of the same object in the same overlapping area, that belongs to a non-overlapping area.
In an embodiment of the application, the global trajectory obtaining module is specifically configured to:
and judging whether target global track information belonging to the same object with the track information exists in the existing global track information or not aiming at each track information, if so, adding the track information to the target global track information according to the motion time sequence to obtain new global track information, and if not, directly taking the track information as the new global track information.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of the first aspect when executing a program stored in the memory.
In a fourth aspect, the present application provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method steps of any one of the first aspect.
Embodiments of the present application further provide a computer program product containing instructions, which when run on a computer, cause the computer to perform any one of the above object tracking methods.
The embodiment of the application has the following beneficial effects:
When the object tracking scheme provided by the embodiments of the present application is applied to track object trajectories across image acquisition devices, the track information of the motion of each object in the image acquisition area corresponding to each image acquisition device can be determined, where the image acquisition areas corresponding to adjacent image acquisition devices have overlapping areas; the track information of the objects in the same overlapping area is selected from the determined track information, the track information of the same object in each overlapping area is determined from the selected track information, and the track information of the same object in the same overlapping area is fused and associated into one piece of track information in the order of motion time, where the motion time is the time during which the object moves along the track characterized by the track information. Because each overlapping area contains track information of the object obtained by different image acquisition devices, fusing and associating the track information of the object in each overlapping area amounts to fusing and associating the track information of the object in the image acquisition areas corresponding to different image acquisition devices. Then, the track information of the same object in all the areas is associated in the order of motion time to obtain the global track information of each object. In this way, the track information of the object in the image acquisition areas corresponding to all the image acquisition devices is further associated, so that the global track information of the object moving through the image acquisition areas corresponding to all the image acquisition devices can be obtained. Therefore, object tracking across image acquisition devices can be realized by applying the scheme provided by the embodiments of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application and the technical solutions in the prior art, the drawings required in the description of the embodiments and the prior art are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained from these drawings by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of an object tracking method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an image capturing area according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an object tracking system according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a track information fusion association method according to an embodiment of the present application;
fig. 5 is a schematic diagram of local track information provided in an embodiment of the present application;
fig. 6 is a schematic diagram of a process of fusion and association of track information in a coincidence area according to an embodiment of the present application;
fig. 7 is a schematic diagram of a global track information obtaining process provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of an object tracking apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In order to realize tracking of an object across image acquisition devices, embodiments of the present application provide an object tracking method and apparatus, an electronic device, and a storage medium, which are described in detail below.
Referring to fig. 1, fig. 1 is a schematic flowchart of an object tracking method according to an embodiment of the present disclosure. The method may be applied to electronic devices such as a computer, a notebook computer, a server, and the like. Specifically, the object tracking method includes the following steps 101 to 103.
Step 101, determining track information of each object moving in an image acquisition area corresponding to each image acquisition device.
The image acquisition device may be a monocular camera, a binocular camera, or the like, and may be a depth camera or an ordinary camera. The depth camera may be a binocular camera that measures depth based on stereoscopic vision, or a camera that measures depth based on structured light or TOF (Time Of Flight) technology. Three-dimensional information of each object within the image acquisition area can be obtained using a depth camera. Each image acquisition device may be mounted perpendicular to the ground or at an oblique angle.
Each image acquisition device corresponds to an image acquisition area, and the image acquisition areas corresponding to the adjacent image acquisition devices have an overlapping area. For example, referring to fig. 2, fig. 2 is a schematic diagram of an image capturing area according to an embodiment of the present disclosure. As shown in fig. 2, black circles with numerical numbers represent image capturing devices, gray areas represent image capturing areas of the respective image capturing devices, and shaded areas represent overlapping areas between the image capturing areas corresponding to the respective adjacent image capturing devices.
Specifically, for each image capturing device, the image capturing device corresponds to an image capturing area, a plurality of objects may exist in the image capturing area, and trajectory information of the movement of the plurality of objects in the image capturing area may be obtained.
In an embodiment of the application, each image acquisition device may perform image acquisition on an object located in its image acquisition area, identify each object included in the image, track each object to obtain trajectory information of each object, and send the obtained trajectory information to the electronic device. The electronic equipment receives the track information sent by each image acquisition equipment, so that the track information of the motion of each object in the image acquisition area corresponding to each image acquisition equipment is obtained.
Specifically, for each image, the image acquisition device may identify each object included in the image, obtain position information of each object, obtain a plurality of position information of each object by sequentially identifying a series of images, and associate each position information according to the sequence of time for obtaining each position information, so as to obtain trajectory information of the object. When the image acquisition equipment acquires a new image of each frame, the new image can be identified to obtain new position information of each object, and the new position information is added into the existing track information to update the track information of each object.
After the track information is updated every time, the image acquisition equipment can send the updated track information to the electronic equipment, so that the electronic equipment can obtain the latest track information in time.
In an embodiment of the application, each image acquisition device may further perform image acquisition on an object located in its image acquisition area, and send the acquired image to the electronic device, and the electronic device receives the image sent by each image acquisition device, identifies the object included in the image, and tracks each object to obtain trajectory information of each object moving in each image acquisition area. The method for tracking each object by the electronic device according to the image is similar to the method for tracking the object in the image by the image acquisition device, and is not described herein again. The image acquisition equipment can send the newly acquired image to the electronic equipment in time, so that the electronic equipment can obtain new position information of each object in time according to the new image and update the track information of each object according to the new position information, thereby obtaining the new track information of each object.
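For illustration, a minimal Python sketch of this per-device track accumulation is given below; it assumes detections arrive per frame as (object identifier, x, y) tuples with a timestamp, and all names are hypothetical rather than taken from this application:

```python
from collections import defaultdict

class PerDeviceTracker:
    """Illustrative sketch: accumulate track information per object
    from the detections produced for one image acquisition device."""

    def __init__(self, device_id):
        self.device_id = device_id
        # object identifier -> list of (timestamp, x, y) in motion-time order
        self.tracks = defaultdict(list)

    def update(self, timestamp, detections):
        """detections: iterable of (object_id, x, y) for one newly acquired frame."""
        for object_id, x, y in detections:
            self.tracks[object_id].append((timestamp, x, y))

    def latest_tracks(self):
        """Return the current track information, e.g. to send to the
        electronic device after every update."""
        return {obj: list(points) for obj, points in self.tracks.items()}

# Usage: feed each newly acquired frame's detections into the tracker.
tracker = PerDeviceTracker(device_id="camera_1")
tracker.update(0.0, [("P001", 1.2, 3.4)])
tracker.update(0.5, [("P001", 1.4, 3.6)])
print(tracker.latest_tracks()["P001"])
```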
Step 102, selecting the track information of the objects in the same overlapping area from the determined track information, determining the track information of the same object in each overlapping area from the selected track information, and fusing and associating the track information of the same object in the same overlapping area into one piece of track information in the order of motion time.
Here, the motion time is the time during which the object moves along the track characterized by the track information. The motion time may be a time period; for example, if the object moves along the track from the 0th minute to the 3rd minute, the motion time may be minutes 0 to 3. The motion time may also be the start time of the motion along the track; for example, if the object starts moving along the track at the 5th second and stops at the 20th second, the motion time may be the 5th second. The motion time may likewise be the end time of the motion along the track.
Specifically, the areas to which the determined track information of each object belongs may first be divided, so as to obtain the area in which each piece of track information lies, where that area is either an overlapping area or a non-overlapping area. Then, the track information of the objects in the same overlapping area can be selected, the track information of the same object can be determined from the track information of the objects in that overlapping area, and the track information of the same object in the same overlapping area can be fused and associated into one piece of track information in the order of motion time.
Fusing and associating multiple pieces of track information means first fusing the track information generated by the same object at the same motion time among those pieces, and then associating the fused track information. Specifically, when an object moves into an overlapping area, that area belongs to the image acquisition areas corresponding to multiple image acquisition devices, so multiple pieces of track information for the object can be obtained at the same motion time based on those devices. Because obtaining the track information of an object in the image acquisition area corresponding to each device involves errors, the track information of the same object obtained at the same motion time based on multiple devices is not exactly the same, and the multiple pieces of track information therefore need to be fused. When fusing track information generated at the same motion time, the mean of the track information at that motion time may be calculated as the new track information at that time; the mean may be an arithmetic mean, a weighted mean, or the like. Alternatively, the track information with the highest confidence may be selected from the multiple pieces of track information at the same motion time as the fused track information, or the mean of a preset number of the highest-confidence pieces among them may be calculated as the fused track information.
When associating the fused track information, the pieces of track information can be spliced together directly in the order of motion time, thereby obtaining the associated track information.
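For illustration, the fusion and splicing options described above can be sketched as follows; the point format, confidence values, and function names are hypothetical:

```python
def fuse_same_time_points(points, mode="mean"):
    """points: list of (x, y, confidence) observed by different devices
    at the same motion time. Returns one fused (x, y)."""
    if mode == "mean":
        n = len(points)
        return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)
    if mode == "best":
        best = max(points, key=lambda p: p[2])  # highest-confidence observation
        return (best[0], best[1])
    raise ValueError(mode)

def splice_by_motion_time(*tracks):
    """Associate fused and unfused segments into one track by sorting
    all (timestamp, x, y) points in motion-time order."""
    merged = [pt for track in tracks for pt in track]
    return sorted(merged, key=lambda pt: pt[0])

# Example: two devices observe the same object at t=1.0 with slightly
# different positions; fuse them, then splice with an earlier segment.
fused_xy = fuse_same_time_points([(2.0, 5.0, 0.8), (2.2, 5.1, 0.6)])
trajectory = splice_by_motion_time([(0.0, 1.0, 1.0)], [(1.0, *fused_xy)])
print(trajectory)
```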
In an embodiment of the application, for the track information of each object in the image acquisition area corresponding to each image acquisition device, the track information of objects in the same overlapping area may be searched for only within the track information of objects in the image acquisition areas corresponding to the devices adjacent to that device. In this way, track information of objects in the same overlapping area is searched for only among image acquisition areas corresponding to adjacent image acquisition devices, rather than among the image acquisition areas corresponding to all devices; searching among areas corresponding to non-adjacent devices is avoided, which improves search efficiency.
In an embodiment of the application, when determining the track information of the same object in the same overlapping area, the current position of each object in each image acquisition area may be obtained based on the images currently acquired by the different image acquisition devices, so as to determine the objects located in the same overlapping area. For the objects in each overlapping area, the distance between the positions of the objects obtained based on different image acquisition devices is calculated; when the distance is less than a preset distance threshold, the objects can be considered the same object. For example, suppose an object is located in the overlapping area of the image acquisition areas corresponding to image acquisition devices O1 and O2. If the position of an object obtained based on O1 is M1, the position of an object obtained based on O2 is M2, both M1 and M2 lie in the overlapping area, and the distance between M1 and M2 is smaller than the distance threshold, the object in the image acquired by O1 and the object in the image acquired by O2 can be considered the same object.
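A minimal sketch of this distance-threshold check, assuming positions are two-dimensional coordinates in a common coordinate system and using an illustrative threshold value:

```python
import math

DISTANCE_THRESHOLD = 0.3  # illustrative threshold, e.g. in metres

def same_object(pos_from_o1, pos_from_o2, threshold=DISTANCE_THRESHOLD):
    """Treat two per-device observations in an overlapping area as the same
    object when the distance between their positions is below the threshold."""
    dx = pos_from_o1[0] - pos_from_o2[0]
    dy = pos_from_o1[1] - pos_from_o2[1]
    return math.hypot(dx, dy) < threshold

# M1 observed via device O1, M2 observed via device O2, both in the overlap.
print(same_object((4.10, 2.00), (4.25, 2.05)))  # True: within the threshold
```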
Step 103, associating the track information of the same object in all the areas in the order of motion time to obtain the global track information of each object.
Specifically, the track information in the individual areas may be associated. Since the track information in each area is divided based on the image acquisition area corresponding to an image acquisition device, and the same object may move through the image acquisition areas corresponding to multiple image acquisition devices, associating the track information of the same object in the individual areas yields the track information of that object across the image acquisition areas of all image acquisition devices, that is, its global track information.
In an embodiment of the application, track information of the same object in each area may be first searched in track information of each object in each area, and then the track information of the same object in each area is associated according to a motion time sequence, so as to obtain global track information of each object.
When the object tracking scheme provided by this embodiment is applied to track object trajectories across image acquisition devices, the track information of the motion of each object in the image acquisition area corresponding to each image acquisition device can be determined, where the image acquisition areas corresponding to adjacent image acquisition devices have overlapping areas; the track information of the objects in the same overlapping area is selected from the determined track information, the track information of the same object in each overlapping area is determined from the selected track information, and the track information of the same object in the same overlapping area is fused and associated into one piece of track information in the order of motion time, where the motion time is the time during which the object moves along the track characterized by the track information. Because each overlapping area contains track information of the object obtained by different image acquisition devices, fusing and associating the track information of the object in each overlapping area amounts to fusing and associating the track information of the object in the image acquisition areas corresponding to different image acquisition devices. Then, the track information of the same object in all the areas is associated in the order of motion time to obtain the global track information of each object. In this way, the track information of the object in the image acquisition areas corresponding to all the image acquisition devices is further associated, so that the global track information of the object moving through the image acquisition areas corresponding to all the image acquisition devices can be obtained. Therefore, object tracking across image acquisition devices can be realized by applying the scheme provided by this embodiment.
In an embodiment of the application, time calibration may be performed on each image acquisition device and each electronic device in advance, so that each image acquisition device and each electronic device maintain time synchronization. Therefore, the track information of the object in the image acquisition area corresponding to each image acquisition device is convenient to be fused and associated subsequently, and the accuracy of object tracking is further improved.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an object tracking system according to an embodiment of the present application. The object tracking system includes multiple image acquisition devices, a tracking server, and an NTP (Network Time Protocol) server. These devices may be connected by wire, or wirelessly via a router or a switch. The NTP server performs time calibration for each image acquisition device and the tracking server; the tracking server determines the track information of each object in the image acquisition area corresponding to each image acquisition device and obtains the global track information of each object from that track information. The object tracking system may further include a display device for displaying the track information, the global track information, and the like, so that staff can visually view the motion trajectory of each object on the electronic device.
In an embodiment of the present application, initial trajectory information sent by each image capturing device may be obtained, and each piece of trajectory information at the same motion time may be determined from each piece of initial trajectory information.
Specifically, each image acquisition device may acquire images of the objects in its image acquisition area, track each object according to the acquired images to obtain initial track information of each object, and then send each piece of initial track information to the electronic device. Because the data processing capabilities of the image acquisition devices differ, the time they take to track objects from images differs, and the transmission delays when sending the initial track information to the electronic device also differ. The initial track information received by the electronic device from the different devices may therefore correspond to different motion times. For this reason, the track information at the same motion time may be determined from the pieces of initial track information, so that it can be fused and associated subsequently. This improves the accuracy of object tracking.
In an embodiment of the present application, for step 101, the determined pieces of track information are track information in the same spatial coordinate system. Specifically, for the images acquired by each image acquisition device, the image track information of each object in the image may first be obtained, and then the track information of each object in the spatial coordinate system within the image acquisition area corresponding to the device may be obtained based on a pre-obtained correspondence between positions in the images acquired by the device and positions in a preset spatial coordinate system. In this way, the track information of objects in the image acquisition areas corresponding to the individual devices can be unified into the same spatial coordinate system, which facilitates subsequent fusion and association of the track information and improves object tracking efficiency.
In an embodiment of the present application, the coordinates of each image acquisition device in the preset spatial coordinate system and the installation angle of the device may also be obtained in advance, and the correspondence between positions in the images acquired by the device and positions in a preset ground coordinate system may be calibrated. The image track information of each object in the image can then be converted into the ground coordinate system to obtain ground track information; the ground track information is converted into a transfer spatial coordinate system established with the image acquisition device as reference; and finally the track information in the transfer spatial coordinate system is converted into the preset spatial coordinate system, so as to obtain track information in the same spatial coordinate system for the objects in the image acquisition area of each device.
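For illustration, the conversion of an image position into the common spatial coordinate system can be sketched as follows, assuming a ground-plane homography has been calibrated for the device together with a rigid transform from the device-referenced transfer coordinate system into the preset spatial coordinate system; the matrices shown are placeholders, not calibration results from this application:

```python
import numpy as np

def image_to_preset_space(pixel_xy, homography, rotation, translation):
    """pixel_xy: (u, v) image position of an object.
    homography: 3x3 matrix mapping image coordinates to ground coordinates.
    rotation, translation: transform from the device-referenced (transfer)
    coordinate system into the preset spatial coordinate system."""
    u, v = pixel_xy
    ground = homography @ np.array([u, v, 1.0])
    ground = ground / ground[2]                       # ground-plane point (x, y, 1)
    transfer = np.array([ground[0], ground[1], 0.0])  # object assumed on the ground
    return rotation @ transfer + translation          # point in the preset system

# Placeholder calibration data (illustrative only).
H = np.eye(3)
R = np.eye(3)
t = np.array([10.0, 5.0, 0.0])
print(image_to_preset_space((320, 240), H, R, t))
```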
In an embodiment of the application, each image acquisition device may record the track information of each object in its image acquisition area in the form of a track linked list. For the track linked list recorded by each device, the list may include the device identifier of the device, the track identifier of the track information of each object in its image acquisition area, and the track information of each object. The device may update the objects and their track information recorded in the track linked list according to the acquired images, and send the updated track linked list to the electronic device in a timely manner. The device identifier may be an English character string, such as "camera" or "C", or a numeric character, such as "1" or "2". Since each object corresponds to one piece of track information within the image acquisition area corresponding to one device, the track identifier may also be understood as an identifier of the object; it may be an English character string, such as "people", "car", or "P", or a numeric character, such as "001" or "002", which is not limited in the embodiments of the present application.
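One possible representation of such a track linked list, with hypothetical field names, is sketched below:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class TrackLinkedList:
    """Per-device record: device identifier plus, for every object in the
    device's image acquisition area, a track identifier and its points."""
    device_id: str  # e.g. "camera", "C", or "1"
    # track identifier (e.g. "P001") -> list of (timestamp, x, y)
    tracks: Dict[str, List[Tuple[float, float, float]]] = field(default_factory=dict)

    def append_point(self, track_id, timestamp, x, y):
        self.tracks.setdefault(track_id, []).append((timestamp, x, y))

linked_list = TrackLinkedList(device_id="C1")
linked_list.append_point("P001", 0.0, 1.2, 3.4)
linked_list.append_point("P001", 0.5, 1.4, 3.6)
print(linked_list)
```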
Referring to fig. 4, when fusing and associating the pieces of track information in step 102 above, the following steps 401 and 402 may be included.
Step 401, for each overlapping area, determining local trajectory information of the same object with the same motion time in the trajectory information of the motion of the different image acquisition areas to which the overlapping area belongs, and fusing the first local trajectory information to obtain fused trajectory information.
Specifically, for each overlapping area, since the overlapping area belongs to the image capturing areas corresponding to the plurality of image capturing devices, in the same motion time, the same object can generate track information in the plurality of image capturing areas, and the plurality of track information of the object can be obtained based on the plurality of image capturing devices. Since there is an error in obtaining the trajectory information of the object in the image capturing area corresponding to each image capturing device, the trajectory information of the same object obtained based on the plurality of image capturing devices at the same motion time is not completely the same. Therefore, it is necessary to identify local trajectory information having the same motion time among the trajectory information of the same object, and to obtain the fused trajectory information by fusing the first local trajectory information as the first local trajectory information.
And step 402, associating the second local track information and the fusion track information into a piece of track information according to the motion time sequence.
Wherein the second local track information is: and track information belonging to a non-overlapped area in the track information of the same object in the same overlapped area. That is, the second local trajectory information is: and track information of the same object in the same overlapping area except the first local track information.
Referring to fig. 5, fig. 5 is a schematic diagram of local track information provided in an embodiment of the present application. As shown in fig. 5, the first track information is the track information of an object moving in the first image acquisition area corresponding to image acquisition device 1, and the second track information is the track information of the object moving in the second image acquisition area corresponding to image acquisition device 2. The first and second image acquisition areas have an overlapping area, and the object moves within that overlapping area during the same time; the local track information inside the overlapping area is therefore the first local track information. The local track information inside the overlapping area can be fused and then associated with the second local track information outside the overlapping area, so as to obtain a new piece of track information.
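For illustration, the splitting of a track into its first local part (inside the overlapping area, fused with the other device's observation) and its second local part, followed by association in motion-time order, can be sketched as follows; the overlap predicate and the sample values are placeholders:

```python
def split_by_overlap(track, in_overlap):
    """track: list of (timestamp, x, y); in_overlap: predicate on (x, y).
    Returns (first_local, second_local): points inside / outside the overlap."""
    first_local = [pt for pt in track if in_overlap(pt[1], pt[2])]
    second_local = [pt for pt in track if not in_overlap(pt[1], pt[2])]
    return first_local, second_local

def fuse_first_local(segment_a, segment_b):
    """Fuse two same-object, same-motion-time segments, here by averaging
    points that share a timestamp."""
    by_time_b = {pt[0]: pt for pt in segment_b}
    fused = []
    for t, x, y in segment_a:
        if t in by_time_b:
            _, xb, yb = by_time_b[t]
            fused.append((t, (x + xb) / 2.0, (y + yb) / 2.0))
        else:
            fused.append((t, x, y))
    return fused

# Illustrative overlap region: 5 <= x <= 7 belongs to the overlapping area.
in_overlap = lambda x, y: 5.0 <= x <= 7.0
track_1 = [(0, 1.0, 0.0), (1, 5.2, 0.0), (2, 6.0, 0.0)]   # from device 1
track_2 = [(1, 5.0, 0.1), (2, 6.2, 0.1), (3, 8.0, 0.1)]   # from device 2
first_1, second_1 = split_by_overlap(track_1, in_overlap)
first_2, second_2 = split_by_overlap(track_2, in_overlap)
fused = fuse_first_local(first_1, first_2)
associated = sorted(second_1 + fused + second_2, key=lambda pt: pt[0])
print(associated)
```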
In one embodiment of the present application, when determining the track information of the objects in the same overlapping area in step 102, the track information whose latest position lies in the same overlapping area may be selected from the determined track information as the track information of the objects in that overlapping area.
Specifically, from the determined track information, the position of the object to which each piece of track information belongs at the most recent time can be obtained as the current position of that object. The area to which the current position of each object belongs may then be determined as the area to which the object's track information belongs, where the areas include overlapping areas and non-overlapping areas. The track information of objects whose latest position lies in the same overlapping area may be determined as the track information of the objects within that overlapping area.
In an embodiment of the present application, the position ranges of the overlapping areas and the non-overlapping areas in the actual scene may be preset, and the area to which the object currently belongs may be determined according to the position range in which the current position of the object falls.
The current position of each object in the actual scene can be determined according to the image currently acquired by each image acquisition device and the position of each object in the image currently acquired, and further the track information of the objects in the same overlapping area can be determined.
In an embodiment of the present application, when each image acquisition device is fixedly installed, the image acquisition area corresponding to each device is also fixed, so the overlapping areas between different devices are fixed as well. The correspondence between image regions in the images acquired by each device and the overlapping areas can be determined, so that for each acquired image it can be identified whether each object in the image lies within an image region corresponding to an overlapping area; if so, the object can be considered to be in that overlapping area. The track information of objects within the same overlapping area can then be determined.
In an embodiment of the application, the track information of objects whose local track, within the track characterized by the track information, belongs to the same overlapping area may be selected from the determined track information as the track information of the objects in that overlapping area.
Specifically, the area to which the local track belongs may be determined based on the track characterized by the track information of an object in the image acquisition area corresponding to each device; the area to which the local track belongs is taken as the area to which the track information belongs, and the track information of objects whose local tracks belong to the same overlapping area is determined as the track information of the objects in that overlapping area. The local track may be a track spanning a motion-time interval of preset length, for example 3 seconds or 5 seconds, or a track whose motion distance satisfies a preset distance, for example 1 meter or 2 meters.
In this way, the area where an object is located is determined based on its local track, and an object whose dwell time in an overlapping area is too short is not assigned to that overlapping area, which further improves the accuracy of the determined area.
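A small sketch of this local-track check, with illustrative parameter values, is given below; a track is assigned to an overlapping area only when its recent local segment lies entirely inside that area:

```python
LOCAL_WINDOW = 3.0  # illustrative motion-time span of the local track, in seconds

def belongs_to_overlap(track, in_overlap, window=LOCAL_WINDOW):
    """track: list of (timestamp, x, y) in motion-time order.
    Return True only if every point of the most recent `window` seconds
    of the track lies inside the overlapping area, so that objects dwelling
    there too briefly are not assigned to it."""
    if not track:
        return False
    t_end = track[-1][0]
    recent = [pt for pt in track if pt[0] >= t_end - window]
    return all(in_overlap(x, y) for _, x, y in recent)

in_overlap = lambda x, y: 5.0 <= x <= 7.0   # placeholder overlapping area
track = [(0, 4.0, 0.0), (2, 5.5, 0.0), (4, 6.0, 0.0), (5, 6.5, 0.0)]
print(belongs_to_overlap(track, in_overlap))  # True: the last 3 s are inside
```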
Referring to fig. 6, fig. 6 is a schematic diagram of the process of fusing and associating track information in an overlapping area according to an embodiment of the present application. Each image acquisition device can send the track information of objects in its image acquisition area to the electronic device. After obtaining each piece of track information, the electronic device determines the local area to which each piece belongs and groups together the track information belonging to the same local area, where the local areas include overlapping and non-overlapping areas. For the track information belonging to each local area, it is then judged whether that local area is an overlapping area; if so, the track information of the same object within the overlapping area is fused and associated.
In an embodiment of the present application, for step 102, when determining the track information of the same object in the same overlapping area, the track information whose track similarity meets the first similarity condition may be determined, among the selected track information of the objects in each overlapping area, as the track information of the same object.
Specifically, within the same overlapping area, track information of an object may be obtained based on multiple image acquisition devices. The track information of the object in the overlapping area obtained based on each device can be fitted to a curve, and the similarity between the different curves can be calculated as the similarity of the tracks of the object in the overlapping area obtained based on the different devices. The track information whose similarity meets the first similarity condition is then determined as the track information of the same object.
The first similarity condition may be that the similarity satisfies a preset similarity threshold, or that a preset number of pieces of track information with the highest similarity are determined as the track information of the same object. When calculating the similarity between curves, quantities such as the slope and the rate of change of the curves can be compared to obtain the similarity between them.
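One way such a track-similarity test might be sketched, assuming the two tracks have been resampled at the same motion times and using an illustrative threshold, is as follows:

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.9  # illustrative first similarity condition

def trajectory_similarity(points_a, points_b):
    """points_a, points_b: arrays of (x, y) sampled at the same motion times
    inside the overlapping area. Compare the local slopes (directions of
    motion) of the two curves and return a similarity in [0, 1]."""
    a = np.diff(np.asarray(points_a, dtype=float), axis=0)
    b = np.diff(np.asarray(points_b, dtype=float), axis=0)
    a = a / (np.linalg.norm(a, axis=1, keepdims=True) + 1e-9)
    b = b / (np.linalg.norm(b, axis=1, keepdims=True) + 1e-9)
    cosines = np.sum(a * b, axis=1)          # agreement of motion directions
    return float((cosines.mean() + 1.0) / 2.0)

def same_object_by_trajectory(points_a, points_b, threshold=SIMILARITY_THRESHOLD):
    return trajectory_similarity(points_a, points_b) >= threshold

track_a = [(0, 0), (1, 1), (2, 2), (3, 3)]
track_b = [(0.1, 0), (1.1, 1), (2.1, 2), (3.1, 3)]
print(same_object_by_trajectory(track_a, track_b))  # True: parallel tracks
```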
In an embodiment of the present application, the relative positional relationship between the tracks characterized by the pieces of track information, such as their relative orientation and distance interval, may also be used as the similarity between the pieces of track information. The first similarity condition may then be that the relative orientations of the tracks characterized by the track information are the same and that the difference between their distance intervals is smaller than a preset difference threshold.
Specifically, the relative positional relationship of the tracks characterized by the track information of the objects in each image acquisition area may be obtained as the similarity between the pieces of track information, and the track information whose similarity satisfies this condition may be determined as the track information of the same object.
For example, suppose the first similarity condition is that the relative orientations of the tracks characterized by the track information are the same and that the difference threshold for the distance intervals is 10 centimeters. Suppose the track information of the same overlapping area includes track information of objects in image acquisition area C8 and image acquisition area C9. In C8 there are two objects X1 and Y1, where the track characterized by the track information of X1 lies to the left of the track characterized by the track information of Y1 and the distance interval between the tracks is 30 centimeters; in C9 there are two objects X2 and Y2, where the track of X2 lies to the left of the track of Y2 and the distance interval between the tracks is 32 centimeters. The relative orientation of X1 with respect to Y1 is the same as that of X2 with respect to Y2, and the difference between the distance intervals is 2 centimeters, which is smaller than the difference threshold. The relative positional relationship of X1 to Y1 can therefore be considered similar to that of X2 to Y2, so the track information of X1 and X2 can be regarded as track information of the same object, and the track information of Y1 and Y2 can be regarded as track information of the same object.
In an embodiment of the application, for the selected track information of the objects in each overlapping area, the track information of the objects whose appearance-feature similarity meets the second similarity condition may be determined as the track information of the same object.
When the object is a user, the appearance feature may be at least one of features of the user, such as a face feature, a body feature, a head-shoulder feature, and the like; when the object is a vehicle, the appearance feature may be at least one of a license plate feature, a color feature, a shape feature, and the like of the vehicle, or may be a feature of a user in the vehicle.
Specifically, for each overlapping region, the appearance features of the objects in the overlapping region may be identified, then the similarity between the appearance features may be calculated, and when the similarity between the appearance features of a plurality of objects satisfies a preset second similarity condition, the plurality of objects may be regarded as the same object, and then the trajectory information of the plurality of objects may be determined as the trajectory information of the same object.
The second similarity condition may be that the similarity satisfies a preset similarity threshold, or that a preset number of objects with the highest appearance-feature similarity are determined as the same object. When calculating the similarity between appearance features, a cosine similarity calculation, a Euclidean distance algorithm, a Hamming distance algorithm, or the like may be used.
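A minimal sketch of the appearance-feature comparison using cosine similarity is given below; the feature vectors would come from an upstream appearance model, and the threshold is illustrative:

```python
import numpy as np

APPEARANCE_THRESHOLD = 0.8  # illustrative second similarity condition

def cosine_similarity(feat_a, feat_b):
    a, b = np.asarray(feat_a, float), np.asarray(feat_b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def same_object_by_appearance(feat_a, feat_b, threshold=APPEARANCE_THRESHOLD):
    """Treat two detections in the overlapping area as the same object when
    the similarity of their appearance features meets the condition."""
    return cosine_similarity(feat_a, feat_b) >= threshold

# Illustrative appearance feature vectors (e.g. from a re-identification model).
print(same_object_by_appearance([0.2, 0.9, 0.1], [0.25, 0.85, 0.12]))  # True
```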
In an embodiment of the present application, the track information of the same object in an overlapping area may first be preliminarily determined based on the appearance features of the objects to which the pieces of track information belong, and the determined track information of the same object may then be verified according to the similarity of the tracks characterized by those pieces. Alternatively, the track information of the same object in the overlapping area may first be determined based on the similarity of the tracks characterized by the track information, and then verified against the track information of the same object determined from the appearance features of the objects. In this way, the track information of the same object in the overlapping area is determined by combining appearance features and track similarity, so that the determined track information satisfies both the similarity of the objects' appearance features and the similarity of the characterized tracks, which improves the accuracy of the determined track information of the same object and, in turn, the accuracy of object tracking.
In an embodiment of the present application, for step 103, when obtaining the global track information of the same object, it may be judged, for each piece of track information, whether target global track information belonging to the same object as that track information exists in the existing global track information. If so, the track information is added to the target global track information in the order of motion time to obtain new global track information; if not, the track information is directly used as new global track information.
Specifically, the electronic device may record the obtained global track information, after obtaining new track information, may first search for whether global track information that belongs to the same object as the track information exists in the global track information stored in advance, and if so, may take the global track information as target global track information, and add the track information to the target global track information according to the sequence of the motion time; if not, the track information can be directly used as new global track information.
In an embodiment of the application, the electronic device may create a global track identifier for each piece of global track information. For each piece of track information, if target global track information belonging to the same object as that track information exists in the existing global track information, the track information is added to the target global track information to obtain updated global track information, and the global track identifier of the updated global track information remains unchanged. If no such global track information exists, the track information is used as new global track information, and a new global track identifier is created for it.
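For illustration, the matching-and-update logic for global track information with global track identifiers can be sketched as follows; the function belongs_to_same_object stands in for the appearance or track-similarity test described elsewhere in this application, and all names are hypothetical:

```python
import itertools

class GlobalTrackStore:
    """Illustrative sketch of maintaining global track information with
    global track identifiers."""

    def __init__(self, belongs_to_same_object):
        self._next_id = itertools.count(1)
        self.global_tracks = {}            # global track identifier -> points
        self._match = belongs_to_same_object

    def add(self, track):
        """track: list of (timestamp, x, y). Append to an existing global
        track of the same object if one exists, otherwise create a new one."""
        for gid, existing in self.global_tracks.items():
            if self._match(existing, track):
                merged = sorted(existing + track, key=lambda pt: pt[0])
                self.global_tracks[gid] = merged   # identifier stays unchanged
                return gid
        gid = next(self._next_id)                  # new global track identifier
        self.global_tracks[gid] = sorted(track, key=lambda pt: pt[0])
        return gid

# Toy matching rule: same object if the new track starts near the old one's end.
def near_in_space(existing, new):
    (_, x1, y1), (_, x2, y2) = existing[-1], new[0]
    return abs(x1 - x2) + abs(y1 - y2) < 1.0

store = GlobalTrackStore(near_in_space)
gid = store.add([(0, 0.0, 0.0), (1, 1.0, 0.0)])
print(store.add([(2, 1.2, 0.1), (3, 2.0, 0.1)]) == gid)  # True: appended
```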
Referring to fig. 7, fig. 7 is a schematic diagram of the process of obtaining global track information provided in an embodiment of the present application. Track information of objects in multiple areas can be obtained; then, for each piece of track information, the existing global track information is searched for target global track information belonging to the same object. If such information is found, the track information can be added to the target global track information, realizing track information association; if it does not exist, the track information can be directly created as new global track information.
In an embodiment of the application, when it is determined whether target global track information belonging to the same object as the track information exists in existing global track information, it may be determined whether each piece of global track information and the track information are track information belonging to the same object based on appearance characteristics of the object to which the track information belongs and/or similarity of tracks represented by the track information.
In an embodiment of the application, the track identifiers of the pieces of track information in the image acquisition areas corresponding to the image acquisition devices may also be used to associate the track information of objects across the areas, so as to obtain the global track information.
Specifically, different image acquisition devices may assign identifiers to the track information of objects in their image acquisition areas in different ways, so the track information of the same object moving in the image acquisition areas corresponding to different devices has different track identifiers. Since the image acquisition areas corresponding to adjacent devices have overlapping areas, when the track information of objects in each overlapping area is associated, the association relationship between track information of adjacent image acquisition areas can be determined, that is, the association relationship between pieces of track information with different track identifiers can be obtained. Different image acquisition areas have different overlapping areas; that is, the image acquisition area corresponding to each device has a different overlapping area with each of its different adjacent devices. Therefore, according to the association relationships between track information with different track identifiers in the different areas, the association relationships of the track information across all image acquisition areas can be determined one by one, and the associated track information can then be linked to obtain the global track information.
For example, suppose that the first overlapping area contains track information Q1 from the image acquisition area corresponding to image acquisition device W1 and track information Q2 from the image acquisition area corresponding to image acquisition device W2, and that the second overlapping area contains track information Q2 from the image acquisition area corresponding to image acquisition device W2 and track information Q3 from the image acquisition area corresponding to image acquisition device W3. When track information association is carried out in the first overlapping area, Q1 and Q2 can be determined to be track information of the same object; when track information association is carried out in the second overlapping area, Q2 and Q3 can be determined to be track information of the same object. It follows that Q1, Q2 and Q3 are all track information of the same object, so Q1, Q2 and Q3 can be associated to obtain the global track information.
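The transitive association of track identifiers described above can be realized, for example, with a union-find (disjoint-set) structure over per-device track identifiers; the sketch below reproduces the Q1/Q2/Q3 example and is illustrative only.

class TrackIdUnionFind:
    # Disjoint-set over per-device track identifiers.
    def __init__(self):
        self.parent = {}

    def find(self, tid):
        self.parent.setdefault(tid, tid)
        while self.parent[tid] != tid:
            self.parent[tid] = self.parent[self.parent[tid]]  # path halving
            tid = self.parent[tid]
        return tid

    def union(self, tid_a, tid_b):
        # Record that two track identifiers belong to the same object.
        self.parent[self.find(tid_a)] = self.find(tid_b)

uf = TrackIdUnionFind()
uf.union("Q1", "Q2")   # first overlapping area: Q1 and Q2 are the same object
uf.union("Q2", "Q3")   # second overlapping area: Q2 and Q3 are the same object
assert uf.find("Q1") == uf.find("Q3")   # Q1, Q2 and Q3 share one global identity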
Referring to fig. 8, fig. 8 is a schematic structural diagram of an object tracking apparatus provided in an embodiment of the present application, where the apparatus includes:
a track determining module 801, configured to determine track information of the motion of each object in the image acquisition area corresponding to each image acquisition device, where the image acquisition areas corresponding to adjacent image acquisition devices have overlapping areas;
a track association module 802, configured to select track information of objects in the same overlapping area from the determined track information, determine track information of the same object in each overlapping area from the selected track information, and fuse and associate the track information of the same object in the same overlapping area into one piece of track information according to a motion time sequence, where the motion time is: a time of movement of the object along the track characterized by the track information;
a global track obtaining module 803, configured to associate track information of the same object in all the regions according to the motion time sequence, so as to obtain global track information of each object.
In an embodiment of the present application, the track association module 802 is specifically configured to:
selecting the track information of the objects in the same overlapping area from the determined track information, and determining the track information of the same object in each overlapping area from the selected track information;
for each overlapping area, determining, as first local track information, the local track information that belongs to the same object and has the same motion time among the track information of motion in the different image acquisition areas to which the overlapping area belongs, and fusing the first local track information to obtain fused track information;
according to the motion time sequence, associating second local track information and the fused track information into one piece of track information, wherein the second local track information is the track information, among the track information of the same object in the same overlapping area, that belongs to a non-overlapping area. A sketch of this fuse-then-associate step follows.
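A minimal sketch of that step, assuming each piece of track information is a list of (timestamp, x, y) points and that points observed at the same motion time are fused by simple averaging (the averaging rule is an assumption, not something the patent specifies):

def fuse_and_associate(track_a, track_b):
    # track_a, track_b: track information of the same object in the same
    # overlapping area, obtained from two adjacent image acquisition devices.
    a_by_t = {t: (x, y) for t, x, y in track_a}
    b_by_t = {t: (x, y) for t, x, y in track_b}
    result = []
    for t in set(a_by_t) | set(b_by_t):
        if t in a_by_t and t in b_by_t:
            # First local track information: same object, same motion time -> fuse.
            result.append((t, (a_by_t[t][0] + b_by_t[t][0]) / 2,
                              (a_by_t[t][1] + b_by_t[t][1]) / 2))
        else:
            # Second local track information: observed by only one device -> keep.
            x, y = a_by_t[t] if t in a_by_t else b_by_t[t]
            result.append((t, x, y))
    return sorted(result)   # associate according to the motion time sequence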
In an embodiment of the present application, the global track obtaining module 803 is specifically configured to:
for each piece of track information, judging whether target global track information belonging to the same object as the track information exists in the existing global track information; if so, adding the track information to the target global track information according to the motion time sequence to obtain new global track information; and if not, directly taking the track information as new global track information.
In an embodiment of the present application, the track association module 802 is specifically configured to:
selecting, from the determined track information and as the track information of the objects in the same overlapping area, the track information of objects whose latest positions lie in that overlapping area, determining the track information of the same object in each overlapping area from the selected track information, and fusing and associating the track information of the same object in the same overlapping area into one piece of track information according to the motion time sequence; or
selecting, from the determined track information and as the track information of the objects in the same overlapping area, the track information of objects whose tracks, as characterized by the track information, have a local track lying in that overlapping area, determining the track information of the same object in each overlapping area from the selected track information, and fusing and associating the track information of the same object in the same overlapping area into one piece of track information according to the motion time sequence. Both selection criteria are sketched below.
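Both selection criteria can be expressed as simple membership tests, assuming each track is a list of (timestamp, x, y) points and each overlapping area is an axis-aligned rectangle; both assumptions are made only for this sketch.

def in_region(point, region):
    # region = (x_min, y_min, x_max, y_max); point = (timestamp, x, y)
    _, x, y = point
    x_min, y_min, x_max, y_max = region
    return x_min <= x <= x_max and y_min <= y <= y_max

def select_by_latest_position(tracks, region):
    # Criterion 1: the object's latest position lies in the overlapping area.
    return [trk for trk in tracks if trk and in_region(max(trk), region)]

def select_by_local_segment(tracks, region):
    # Criterion 2: some local segment of the track lies in the overlapping area.
    return [trk for trk in tracks if any(in_region(p, region) for p in trk)]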
In an embodiment of the present application, the track association module 802 is specifically configured to:
selecting the track information of the objects in the same overlapping area from the determined track information, determining, for each overlapping area, the track information whose tracks in the overlapping area have a similarity meeting a first similarity condition as the track information of the same object, and fusing and associating the track information of the same object in the same overlapping area into one piece of track information according to the motion time sequence; and/or
selecting the track information of the objects in the same overlapping area from the determined track information, determining, from the selected track information of the objects in each overlapping area, the track information of objects whose appearance characteristics have a similarity meeting a second similarity condition as the track information of the same object, and fusing and associating the track information of the same object in the same overlapping area into one piece of track information according to the motion time sequence.
When the object tracking scheme provided by this embodiment is applied to track object tracks across image acquisition devices, the track information of the motion of each object in the image acquisition area corresponding to each image acquisition device can be determined, where the image acquisition areas corresponding to adjacent image acquisition devices have overlapping areas. The track information of the objects in the same overlapping area is selected from the determined track information, the track information of the same object in each overlapping area is determined from the selected track information, and the track information of the same object in the same overlapping area is fused and associated into one piece of track information according to the motion time sequence, the motion time being the time at which the object moves along the track characterized by the track information. Because each overlapping area contains track information of objects obtained by different image acquisition devices, fusing and associating the track information of objects in each overlapping area amounts to fusing and associating the track information of objects in the image acquisition areas corresponding to different image acquisition devices. Then, according to the motion time sequence, the track information of the same object in all the areas is associated to obtain the global track information of each object. In this way, the track information of an object in the image acquisition areas corresponding to all the image acquisition devices is further associated, so that the global track information of the object moving across the image acquisition areas corresponding to all the image acquisition devices can be obtained. Therefore, object tracking across image acquisition devices can be realized by applying the scheme provided by this embodiment.
The embodiment of the present application further provides an electronic device, as shown in fig. 9, which includes a processor 901, a communication interface 902, a memory 903, and a communication bus 904, where the processor 901, the communication interface 902, and the memory 903 complete mutual communication through the communication bus 904,
a memory 903 for storing computer programs;
the processor 901 is configured to implement the steps of the object tracking method described above when executing the program stored in the memory 903.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, or a discrete hardware component.
In yet another embodiment provided by the present application, a computer-readable storage medium is further provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of any of the above object tracking methods.
In yet another embodiment provided by the present application, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the object tracking methods of the above embodiments.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that incorporates one or more available media. The available medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, apparatus embodiments, electronic device embodiments, computer-readable storage medium embodiments, and computer program product embodiments are substantially similar to method embodiments and therefore are described with relative ease, as appropriate, with reference to the partial description of the method embodiments.
The above description is only for the preferred embodiment of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application are included in the protection scope of the present application.

Claims (10)

1. An object tracking method, the method comprising:
determining the motion track information of each object in the image acquisition area corresponding to each image acquisition device, wherein the image acquisition areas corresponding to the adjacent image acquisition devices have overlapping areas;
selecting the track information of the objects in the same overlapping area from the determined track information, determining the track information of the same object in each overlapping area from the selected track information, and fusing and associating the track information of the same object in the same overlapping area into one piece of track information according to the motion time sequence, wherein the motion time is as follows: a time of movement of the object along the track characterized by the track information;
and according to the motion time sequence, correlating the track information of the same object in all the areas to obtain the global track information of each object.
2. The method according to claim 1, wherein the fusing and associating the track information of the same object in the same overlapping area into one piece of track information according to the motion time sequence comprises:
for each overlapping area, determining, as first local track information, the local track information that belongs to the same object and has the same motion time among the track information of motion in the different image acquisition areas to which the overlapping area belongs, and fusing the first local track information to obtain fused track information;
according to the motion time sequence, associating second local track information and the fused track information into one piece of track information, wherein the second local track information is the track information, among the track information of the same object in the same overlapping area, that belongs to a non-overlapping area.
3. The method according to claim 1 or 2, wherein the associating the track information of the same object in all the regions according to the motion time sequence to obtain the global track information of each object comprises:
for each piece of track information, judging whether target global track information belonging to the same object as the track information exists in the existing global track information; if so, adding the track information to the target global track information according to the motion time sequence to obtain new global track information; and if not, directly taking the track information as new global track information.
4. The method according to claim 1 or 2, wherein the selecting the track information of the objects in the same overlapping area from the determined track information comprises:
selecting, from the determined track information and as the track information of the objects in the same overlapping area, the track information of objects whose latest positions lie in that overlapping area; or
selecting, from the determined track information and as the track information of the objects in the same overlapping area, the track information of objects whose tracks, as characterized by the track information, have a local track lying in that overlapping area.
5. The method according to claim 1 or 2, wherein the determining the track information of the same object in each overlapping area from the selected track information comprises:
determining, from the selected track information of the objects in each overlapping area, the track information whose tracks have a similarity meeting a first similarity condition as the track information of the same object; and/or
determining, from the selected track information of the objects in each overlapping area, the track information of objects whose appearance characteristics have a similarity meeting a second similarity condition as the track information of the same object.
6. An object tracking apparatus, characterized in that the apparatus comprises:
the track determining module is used for determining track information of the motion of each object in the image acquisition area corresponding to each image acquisition device, wherein the image acquisition areas corresponding to the adjacent image acquisition devices have overlapping areas;
a track association module, configured to select track information of objects in the same overlapping area from the determined track information, determine track information of the same object in each overlapping area from the selected track information, and fuse and associate the track information of the same object in the same overlapping area into one piece of track information according to a motion time sequence, where the motion time is: a time of movement of the object along the track characterized by the track information;
and the global track obtaining module is used for correlating the track information of the same object in all the areas according to the motion time sequence to obtain the global track information of each object.
7. The apparatus of claim 6, wherein the trajectory correlation module is specifically configured to:
selecting the track information of the objects in the same overlapping area from the determined track information, and determining the track information of the same object in each overlapping area from the selected track information;
for each overlapping area, determining, as first local track information, the local track information that belongs to the same object and has the same motion time among the track information of motion in the different image acquisition areas to which the overlapping area belongs, and fusing the first local track information to obtain fused track information;
according to the motion time sequence, associating second local track information and the fused track information into one piece of track information, wherein the second local track information is the track information, among the track information of the same object in the same overlapping area, that belongs to a non-overlapping area.
8. The apparatus according to claim 6 or 7, wherein the global trajectory obtaining module is specifically configured to:
for each piece of track information, judging whether target global track information belonging to the same object as the track information exists in the existing global track information; if so, adding the track information to the target global track information according to the motion time sequence to obtain new global track information; and if not, directly taking the track information as new global track information.
9. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any one of claims 1 to 5 when executing a program stored in the memory.
10. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of the claims 1-5.
CN202010628706.8A 2020-07-01 2020-07-01 Object tracking method and device, electronic equipment and storage medium Active CN111784730B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010628706.8A CN111784730B (en) 2020-07-01 2020-07-01 Object tracking method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010628706.8A CN111784730B (en) 2020-07-01 2020-07-01 Object tracking method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111784730A true CN111784730A (en) 2020-10-16
CN111784730B CN111784730B (en) 2024-05-03

Family

ID=72759217

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010628706.8A Active CN111784730B (en) 2020-07-01 2020-07-01 Object tracking method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111784730B (en)

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050073585A1 (en) * 2003-09-19 2005-04-07 Alphatech, Inc. Tracking systems and methods
JP2008250999A (en) * 2007-03-08 2008-10-16 Omron Corp Object tracing method, object tracing device and object tracing program
US20110135154A1 (en) * 2009-12-04 2011-06-09 Canon Kabushiki Kaisha Location-based signature selection for multi-camera object tracking
US20140294231A1 (en) * 2013-03-28 2014-10-02 International Business Machines Corporation Automatically determining field of view overlap among multiple cameras
AU2013242830A1 (en) * 2013-10-10 2015-04-30 Canon Kabushiki Kaisha A method for improving tracking in crowded situations using rival compensation
CN106257301A (en) * 2016-05-12 2016-12-28 内蒙古工业大学 Distributed space time correlation model trace tracking method based on statistical inference
US20180144481A1 (en) * 2016-11-18 2018-05-24 Kabushiki Kaisha Toshiba Moving object tracking device, display device, and moving object tracking method
CN106970353A (en) * 2017-03-16 2017-07-21 重庆邮电大学 A kind of tracking and track approach based on communication base station three-dimensional localization
CN107358622A (en) * 2017-06-19 2017-11-17 三峡大学 A kind of video information processing method and system based on visualization movement locus
CN109309809A (en) * 2017-07-28 2019-02-05 阿里巴巴集团控股有限公司 The method and data processing method, device and system of trans-regional target trajectory tracking
WO2019019943A1 (en) * 2017-07-28 2019-01-31 阿里巴巴集团控股有限公司 Method for tracing track of target in cross regions, and data processing method, apparatus and system
CN109584265A (en) * 2017-09-28 2019-04-05 杭州海康威视数字技术股份有限公司 A kind of method for tracking target and device
CN108051777A (en) * 2017-12-01 2018-05-18 北京迈格威科技有限公司 Method for tracing, device and the electronic equipment of target
CN108509896A (en) * 2018-03-28 2018-09-07 腾讯科技(深圳)有限公司 A kind of trace tracking method, device and storage medium
JP2019175165A (en) * 2018-03-28 2019-10-10 Kddi株式会社 Object tracking device, object tracking method, and object tracking program
CN108986158A (en) * 2018-08-16 2018-12-11 新智数字科技有限公司 A kind of across the scene method for tracing identified again based on target and device and Computer Vision Platform
CN110969644A (en) * 2018-09-28 2020-04-07 杭州海康威视数字技术股份有限公司 Personnel trajectory tracking method, device and system
CN111127582A (en) * 2018-10-31 2020-05-08 驭势(上海)汽车科技有限公司 Method, device and system for identifying track overlapping section and storage medium
JP2020091519A (en) * 2018-12-03 2020-06-11 Kddi株式会社 Object tracking apparatus, object tracking method and object tracking program
CN110443828A (en) * 2019-07-31 2019-11-12 腾讯科技(深圳)有限公司 Method for tracing object and device, storage medium and electronic device
CN110619658A (en) * 2019-09-16 2019-12-27 北京地平线机器人技术研发有限公司 Object tracking method, object tracking device and electronic equipment
CN111309780A (en) * 2020-01-21 2020-06-19 腾讯云计算(北京)有限责任公司 Track data processing method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
WEIHUA CHEN ET AL: "An equalized global graph model-based approach for multi-camera object tracking", ARXIV:1502.03532V2, 19 June 2016 (2016-06-19), pages 1 - 14 *
CHANG FALIANG ET AL: "Multi-object tracking algorithm oriented to humans in video sequences" (in Chinese), CONTROL AND DECISION, no. 04, 30 April 2007 (2007-04-30), pages 418 - 422 *
XU MENGXI ET AL: "Video object tracking based on multi-level information fusion" (in Chinese), JOURNAL OF NANJING INSTITUTE OF TECHNOLOGY (NATURAL SCIENCE EDITION), no. 02, 15 June 2010 (2010-06-15), pages 15 - 24 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651386A (en) * 2020-10-30 2021-04-13 杭州海康威视系统技术有限公司 Identity information determination method, device and equipment
CN112651386B (en) * 2020-10-30 2024-02-27 杭州海康威视系统技术有限公司 Identity information determining method, device and equipment
CN112598703A (en) * 2020-12-14 2021-04-02 北京爱笔科技有限公司 Article tracking method and device
CN113034546A (en) * 2021-04-07 2021-06-25 杭州海康威视数字技术股份有限公司 Track merging method and device, electronic equipment and storage medium
CN117011816A (en) * 2022-05-04 2023-11-07 动态Ad有限责任公司 Trace segment cleaning of trace objects

Also Published As

Publication number Publication date
CN111784730B (en) 2024-05-03

Similar Documents

Publication Publication Date Title
CN111784730A (en) Object tracking method and device, electronic equipment and storage medium
Liu et al. “Seeing is not always believing”: detecting perception error attacks against autonomous vehicles
CN108027877B (en) System and method for non-obstacle area detection
CN111784729B (en) Object tracking method and device, electronic equipment and storage medium
Cai et al. Robust hybrid approach of vision-based tracking and radio-based identification and localization for 3D tracking of multiple construction workers
CN103238163B (en) For specifying the equipment as ad distribution object, and ad distribution equipment
CN110587597B (en) SLAM closed loop detection method and detection system based on laser radar
WO2023016271A1 (en) Attitude determining method, electronic device, and readable storage medium
CN103716687A (en) Method and system for using fingerprints to track moving objects in video
US20240320840A1 (en) Target tracking method, target tracking apparatus, electronic device and computer readable medium
CN111381586A (en) Robot and movement control method and device thereof
CN111462226A (en) Positioning method, system, device, electronic equipment and storage medium
KR101678004B1 (en) node-link based camera network monitoring system and method of monitoring the same
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN111754388B (en) Picture construction method and vehicle-mounted terminal
CN115393681A (en) Target fusion method and device, electronic equipment and storage medium
CN116681739A (en) Target motion trail generation method and device and electronic equipment
CN115407355B (en) Library position map verification method and device and terminal equipment
CN107832598B (en) Unlocking control method and related product
CN114066974A (en) Target track generation method and device, electronic equipment and medium
Yin et al. Calibration and object correspondence in camera networks with widely separated overlapping views
Zhang et al. Reidentification-based automated matching for 3D localization of workers in construction sites
CN115457592A (en) Pedestrian identification method and device
Xu et al. Space-time vehicle tracking at the edge of the network
CN117635657A (en) BEV fusion sensing and multi-target tracking method based on road side end

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant