CN113781524A - Target tracking system and method based on two-dimensional label - Google Patents
- Publication number
- CN113781524A (application CN202111071116.0A)
- Authority
- CN
- China
- Prior art keywords
- target
- aircraft
- tracking
- dimensional
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
- G06K7/1404—Methods for optical code recognition
- G06K7/1408—Methods for optical code recognition the method being specifically adapted for the type of code
- G06K7/1417—2D bar codes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Abstract
The invention provides a target tracking system and method based on a two-dimensional tag, used for tracking a moving target object provided with a two-dimensional tag. The system comprises: an aircraft provided with a camera, used for shooting continuous multi-frame target images of the moving target object in real time during flight; and a target tracking module, used for sequentially identifying each frame of target image, processing the target images containing the two-dimensional tag to obtain relative position information between the aircraft and the moving target object, and processing the relative position information to obtain a tracking signal that controls the aircraft to follow the moving target object, so that the two-dimensional tag stays at the center of the captured target images. The beneficial effects are that the aircraft can identify and autonomously track a specific moving target object, reducing the manpower required for tracking, improving tracking efficiency and stability, and reducing system risk.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a target tracking system and method based on a two-dimensional label.
Background
Technology for identifying and tracking a specific target in a moving state is of great significance in fields such as industrial scene monitoring, intelligent traffic system management, power maintenance, and even military applications. The rapid development of machine vision has further pushed forward automated tracking technology. However, existing systems of this kind suffer from technical defects such as bulky platforms, high cost, and heavy manpower requirements.
AprilTag is a visual positioning method developed in recent years that localizes against two-dimensional code landmarks; it can compute the precise three-dimensional position, orientation, and tag ID of a two-dimensional tag relative to a camera. AprilTag already plays an important role in multi-agent cooperation and indoor positioning. How to combine unmanned aerial vehicles with machine vision so that a UAV can identify and autonomously track a specific target has therefore become a technical problem in urgent need of a solution.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a target tracking system based on a two-dimensional label, which is used for tracking a moving target object, wherein the moving target object is provided with the two-dimensional label;
the target tracking system includes:
the aircraft is provided with a camera and is used for shooting continuous multiframe target images of the moving target object in real time in the flying process of the aircraft and outputting the target images;
the target tracking module is respectively connected with the camera and the aircraft, and the target tracking module includes:
the image processing submodule is used for sequentially identifying each frame of target image and processing the target image containing the two-dimensional label to obtain relative position information between the aircraft and the moving target object;
and the tracking control sub-module is connected with the image processing sub-module and used for processing according to the relative position information to obtain a tracking signal so as to control the aircraft to track the moving target object to fly, so that the two-dimensional tag is positioned in the center of the target image in the target image obtained by shooting.
Preferably, the two-dimensional tag is an AprilTag tag.
Preferably, the image processing sub-module includes:
the first processing unit is used for processing each frame of target image in sequence to obtain the gradient direction and amplitude of each pixel in the target image, and performing cluster analysis on each gradient direction and amplitude to obtain a plurality of line segments contained in the target image;
the second processing unit is connected with the first processing unit and used for traversing each line segment to identify a quadrangle and outputting an identification result indicating that the target image of the current frame contains the two-dimensional label when the quadrangle is identified for the first time;
and the third processing unit is connected with the second processing unit and used for starting a tracking mode according to the identification result, then sequentially processing the target image of the current frame and each frame of target image after the current frame respectively and continuously outputting the relative position information between the aircraft and the moving target object obtained through processing.
Preferably, the second processing unit performs quadrilateral recognition by traversing each line segment by using a recursive depth-first search algorithm with a depth of 4.
Preferably, the third processing unit includes:
the first processing subunit is configured to obtain a homography matrix representing a position mapping relationship of the two-dimensional tag between a tag coordinate system and an image coordinate system according to a pre-acquired focal length of the camera, a size of the two-dimensional tag, and the target image, where the tag coordinate system uses a center of the two-dimensional tag as an origin and a plane where the two-dimensional tag is located is an XOY plane;
and the second processing subunit is connected with the first processing subunit and is used for processing the internal reference matrix of the camera obtained by calibration in advance and the homography matrix to obtain the position information of the two-dimensional label in the image coordinate system as the relative position information between the aircraft and the moving target object.
Preferably, the first processing subunit processes the homography matrix by using a direct linear transformation algorithm.
Preferably, the second processing subunit obtains the position information by processing according to the following formula:

H = sPE, with E = [[R00, R01, Tx], [R10, R11, Ty], [R20, R21, Tz], [0, 0, 1]]

wherein H is used to represent the homography matrix; s is used to represent a scale factor; P is used to represent the internal reference matrix; Rij (i = 0, 1, 2; j = 0, 1) is used to represent the rotation components of the two-dimensional tag in the image coordinate system; and Tx, Ty, Tz are used to represent the distance components of the two-dimensional tag in the image coordinate system;

the position information includes Tx and Ty of the distance components, wherein Tx represents a first relative distance between the aircraft and the moving target object in the x-axis direction of the tag coordinate system, and Ty represents a second relative distance between the aircraft and the moving target object in the y-axis direction of the tag coordinate system, and the relative position information includes the first relative distance and the second relative distance.
Preferably, the image processing sub-module further comprises a position correction unit connected to the third processing unit, the position correction unit comprising:
the first correction subunit is used for acquiring an Euler angle and a flight altitude of the aircraft in real time, and respectively processing the Euler angle and the flight altitude to obtain a first position deviation between the aircraft and the moving target object in the x-axis direction in a label coordinate system and a second position deviation between the aircraft and the moving target object in the y-axis direction in the label coordinate system;
the second correcting subunit is connected with the first correcting subunit and is used for correcting the first relative distance and the second relative distance respectively according to the first position deviation and the second position deviation to obtain corrected relative position information;
and the tracking control sub-module processes the corrected relative position information to obtain the tracking signal so as to control the aircraft to track the moving target object to fly.
Preferably, the first correcting subunit obtains the first position deviation and the second position deviation by processing according to the following formulas:
L=h*tanθ
wherein h represents the flying height; when θ represents the roll angle of the aircraft, L represents the first position deviation; and when θ represents the yaw angle of the aircraft, L represents the second position deviation.
The invention also provides a target tracking method based on the two-dimensional label, which is applied to the target tracking system, and the target tracking method comprises the following steps:
step S1, the target tracking system controls a camera arranged on an aircraft to shoot continuous multi-frame target images of a moving target object provided with a two-dimensional tag in real time during the flight of the aircraft;
step S2, the target tracking system receives the target images, sequentially identifies each frame of target image, and processes the target images containing the two-dimensional tag to obtain relative position information between the aircraft and the moving target object;
step S3, the target tracking system processes the relative position information to obtain a tracking signal to control the aircraft to track the moving target object in flight, so that the two-dimensional tag is located at the center of the captured target images.
The technical scheme has the following advantages or beneficial effects: the aircraft can recognize and autonomously track a specific moving target object, the manpower input in the tracking process is reduced, the tracking efficiency and stability are improved, and the system risk is reduced.
Drawings
Fig. 1 is a schematic structural diagram of a target tracking system based on two-dimensional code tags according to a preferred embodiment of the present invention;
FIG. 2 is a schematic diagram of a two-dimensional label in accordance with a preferred embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a position correction principle according to a preferred embodiment of the present invention;
fig. 4 is a flowchart illustrating a target tracking method based on two-dimensional tags according to a preferred embodiment of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. The invention is not limited to these embodiments; other embodiments fall within its scope of protection provided they satisfy the gist of the invention.
In a preferred embodiment of the present invention, based on the above problems in the prior art, a target tracking system based on two-dimensional tags is provided, which is used for tracking a moving target object 1, wherein a two-dimensional tag is arranged on the moving target object 1;
as shown in fig. 1, the target tracking system includes:
the device comprises an aircraft 2, wherein the aircraft 2 is provided with a camera 21 and is used for shooting and outputting target images of continuous multi-frame moving target objects 1 in real time in the flying process of the aircraft 2;
the image processing submodule 31 is used for sequentially identifying each frame of target image and processing the target image containing the two-dimensional label to obtain the relative position information between the aircraft and the moving target object;
and the tracking control sub-module 32 is connected with the image processing sub-module 31 and is used for processing the relative position information to obtain a tracking signal so as to control the aircraft to track the moving target object to fly, so that the two-dimensional tag is located at the center of the target image in the shot target image.
Specifically, in this embodiment, the two-dimensional tag is an AprilTag tag, including but not limited to Tag36H11-0, as shown in fig. 2. Before target tracking is performed, the generated AprilTag tag may be printed and attached to the surface of the moving target object 1, preferably its upper surface, so that the camera 21 on the aircraft 2 flying above the moving target object 1 can reliably capture target images containing the two-dimensional tag during tracking.
In actual tracking, because both the moving target object 1 and the aircraft 2 move rapidly, in order to maintain the stability of the initial tracking state and the success rate of tracking, the system starts the tracking mode and keeps the moving target object 1 as close as possible to the center of the field of view of the aircraft 2. During operation, the moving target object 1 is kept at the center of the camera's field of view by adjusting the attitude of the aircraft; this is the essence of the tracking scheme. More preferably, the camera may be mounted pointing vertically downward, so that the positive Z-axis of the camera coordinate system is opposite to the positive Z-axis of the body coordinate system.
During target tracking, the aircraft 2 flies above the moving target object 1, the camera 21 is controlled to shoot target images in real time, and the target images are sent to the target tracking module 3 for processing. Preferably, the target tracking module 3 may be a local host computer, a remote server, or a control chip integrated in the aircraft 2. When the target tracking module 3 is integrated in the control chip of the aircraft 2, the processed position information is preferably sent to the controller of the aircraft 2 through a serial port; a 64-byte circular queue is preferably pre-allocated in the controller, and position information received over the serial port is moved into this circular queue. Meanwhile, to guarantee the correctness of the position information, a check byte is set for each frame of position data; it is located at the end of the frame and equals the sum of all data bytes. After a frame is received, its data bytes are summed again, and the frame is accepted as correct data only if the sum matches the check byte, thereby ensuring safety.
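The check-byte scheme described above can be sketched as follows. The exact frame layout (payload bytes followed by a one-byte sum modulo 256) and the function names are assumptions for illustration, not taken from the patent.

```python
from collections import deque

RING_SIZE = 64                 # the 64-byte circular queue mentioned above
ring = deque(maxlen=RING_SIZE)  # received frames would be moved into here

def make_frame(payload: bytes) -> bytes:
    """Append a check byte equal to the sum of all payload bytes (mod 256)."""
    return payload + bytes([sum(payload) & 0xFF])

def verify_frame(frame: bytes) -> bool:
    """Re-sum the payload and compare against the trailing check byte."""
    if len(frame) < 2:
        return False
    payload, check = frame[:-1], frame[-1]
    return (sum(payload) & 0xFF) == check
```

Only frames that pass `verify_frame` would be copied out of the ring buffer and used as position data.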
After a target image is received, image recognition is first performed on it. When the two-dimensional tag is recognized, the corresponding moving target object is taken as the tracking target; the target image is then processed to obtain the relative position information between the aircraft and the moving target object, and the aircraft is controlled to follow the moving target object according to that information, so that the two-dimensional tag stays at the center of the captured target images. In other words, the aircraft 2 flies directly above the moving target object, achieving autonomous tracking.
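At a high level, this recognise-then-track behaviour can be sketched as a simple loop. `detect_tag` and `send_velocity_setpoint` are hypothetical placeholders for the tag detector and the flight-controller interface, and the proportional gain is purely illustrative — the patent does not specify a control law.

```python
KP = 0.8  # illustrative proportional gain: image offset -> velocity command

def tracking_loop(frames, detect_tag, send_velocity_setpoint):
    """Identify each frame in turn; once a tag is seen, steer so that it
    returns to the image centre (hypothetical interfaces)."""
    tracking = False
    for frame in frames:
        det = detect_tag(frame)      # None until a tag quadrilateral is found
        if det is None:
            if tracking:             # tag lost after acquisition: hover
                send_velocity_setpoint(0.0, 0.0)
            continue
        tracking = True              # first detection starts the tracking mode
        tx, ty = det                 # tag offset from the image centre
        send_velocity_setpoint(KP * tx, KP * ty)
```

Driving the offsets toward zero keeps the tag at the image centre, i.e. the aircraft directly above the target.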
In a preferred embodiment of the present invention, the image processing sub-module 31 includes:
the first processing unit 311 is configured to sequentially obtain, for each frame of target image, a gradient direction and an amplitude of each pixel in the target image, and perform cluster analysis on each gradient direction and amplitude to obtain a plurality of line segments included in the target image;
the second processing unit 312, connected to the first processing unit 311, is configured to traverse each line segment to perform quadrilateral recognition, and output a recognition result indicating that the current frame target image includes a two-dimensional tag when a quadrilateral is recognized for the first time;
the third processing unit 313 is connected to the second processing unit 312, and is configured to start the tracking mode according to the recognition result, sequentially process the current frame target image and each frame target image after the current frame target image, and continuously output the processed relative position information between the aircraft and the moving target object.
Specifically, in this embodiment, before the tracking mode is started the two-dimensional tag of the moving target object may not yet be in the camera's field of view, so a search process is needed: multiple frames of target images are continuously captured and identified during flight, and the first one or more captured target images may contain no two-dimensional tag.
When performing image recognition on a target image, the line segments in the image are identified first: the gradient direction and magnitude of each pixel are computed, and pixels with similar gradient direction and magnitude are clustered into line segments. The clustering algorithm used by the first processing unit 311 is similar to the graph-based method of Felzenszwalb: each pixel of the image captured by the camera 21 is a node, edges are added between adjacent pixels, and the weight of an edge equals the difference in gradient direction between the two pixels. The edges are then sorted by weight and considered in order to decide whether the two pixel components they connect should be merged into one class (a line segment). More specifically, for a component n, D(n) denotes the range of gradient directions within the component (the gradient direction of a pixel is a scalar giving the direction of fastest change), and M(n) denotes the range of gradient magnitudes (the difference between the maximum and minimum). Two components n and m are joined into one line segment when the following two conditions are satisfied:

D(n ∪ m) ≤ min(D(n), D(m)) + K_D/|n ∪ m|
M(n ∪ m) ≤ min(M(n), M(m)) + K_M/|n ∪ m|

where |n ∪ m| is the number of pixels in the merged component, and K_D and K_M are tuning parameters, preferably K_D = 100 and K_M = 1200.
When actually sorting the pixels, a linear-time counting sort is preferably used, and the upper and lower bounds of gradient direction and magnitude are maintained during sorting. This gradient-based clustering is sensitive to image noise: even small amounts of noise change local gradient directions, a problem addressed by low-pass filtering the image. Because the AprilTag pattern consists of large-scale edge features, low-pass filtering does not blur away the useful information, unlike in many other problem domains; in this design a filter with σ = 0.8 is chosen. After clustering, the line segments can be fitted by the traditional least-squares method and connected, and are classified according to the brightness on their two sides, which facilitates the quadrilateral extraction of the next processing stage. This is also the slowest stage of the detection scheme; in practical development the resolution of the target image is preferably halved, and experiments show this speeds up recognition by a factor of 4.
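The clustering merge rule described above can be illustrated with a minimal union-find sketch. The class layout, the bookkeeping of direction/magnitude ranges per component, and the merge test are simplified and illustrative, not the patent's implementation.

```python
K_D, K_M = 100.0, 1200.0  # tuning parameters named in the text

class Component:
    """One pixel component with bounds on gradient direction D and magnitude M."""
    def __init__(self, d, m):
        self.parent = self
        self.size = 1
        self.d_min = self.d_max = d   # gradient-direction bounds
        self.m_min = self.m_max = m   # gradient-magnitude bounds

def find(c):
    while c.parent is not c:
        c.parent = c.parent.parent    # path halving
        c = c.parent
    return c

def try_merge(a, b):
    """Merge two components if the combined spread stays within the
    adaptive thresholds K_D/|n∪m| and K_M/|n∪m|."""
    ra, rb = find(a), find(b)
    if ra is rb:
        return False
    d_min, d_max = min(ra.d_min, rb.d_min), max(ra.d_max, rb.d_max)
    m_min, m_max = min(ra.m_min, rb.m_min), max(ra.m_max, rb.m_max)
    n = ra.size + rb.size
    if (d_max - d_min) <= min(ra.d_max - ra.d_min, rb.d_max - rb.d_min) + K_D / n and \
       (m_max - m_min) <= min(ra.m_max - ra.m_min, rb.m_max - rb.m_min) + K_M / n:
        rb.parent = ra
        ra.size = n
        ra.d_min, ra.d_max, ra.m_min, ra.m_max = d_min, d_max, m_min, m_max
        return True
    return False
```

Processing edges in ascending weight order with `try_merge` grows line segments only while their gradient statistics stay coherent.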
After the line segments in the target image have been identified, the quadrilaterals are detected. The previous stage produced a set of directed line segments, which makes it convenient to search for line-segment sequences forming a quadrilateral, i.e., a rectangle. The system uses a recursive depth-first search of depth 4 as the rectangle-recognition scheme, with each depth level contributing one edge of the quadrilateral. At the first depth level all segments are visited, and each serves as the starting segment of a candidate rectangle; starting from it, adjacent segments are searched until a closed quadrilateral is obtained, with the whole search following a counter-clockwise winding order. By choosing a suitable threshold for deciding whether segments belong to the same quadrilateral, recognition accuracy and the success rate under occlusion are improved. Searching all line segments is a heavy workload and consumes considerable MCU resources, so this design preferably uses a two-dimensional lookup table to accelerate the queries. With this optimization and the counter-clockwise search order, the number of times each line is examined is bounded, greatly reducing the running time of quadrilateral detection.
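The depth-4 search can be illustrated on a toy adjacency structure. Here segments are abstract indices and `neighbors` maps each segment to its counter-clockwise successor candidates; this is an illustrative simplification, omitting the thresholding and lookup-table acceleration described above.

```python
def find_quads(segments, neighbors):
    """Depth-4 DFS for closed four-segment loops.

    segments  -- indexable collection of segments (only its length is used)
    neighbors -- dict: segment index -> list of follow-on segment indices
                 in counter-clockwise order
    """
    quads = []
    for s0 in range(len(segments)):          # depth 1: every segment starts a path
        stack = [(s0, [s0])]
        while stack:
            cur, path = stack.pop()
            if len(path) == 4:
                # depth 4 reached: accept only if the loop closes back on s0
                if s0 in neighbors.get(cur, ()):
                    quads.append(tuple(path))
                continue
            for nxt in neighbors.get(cur, ()):
                if nxt > s0 and nxt not in path:   # nxt > s0 avoids duplicates
                    stack.append((nxt, path + [nxt]))
    return quads
```

Each quadrilateral is reported exactly once, starting from its lowest-indexed segment.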
In a preferred embodiment of the present invention, the third processing unit 313 includes:
a first processing subunit 3131, configured to obtain, according to a focal length of the camera, a size of the two-dimensional tag, and a target image, a homography matrix representing a position mapping relationship of the two-dimensional tag between a tag coordinate system and an image coordinate system, where the tag coordinate system uses a center of the two-dimensional tag as an origin and a plane where the two-dimensional tag is located is an XOY plane;
and the second processing subunit 3132, connected to the first processing subunit 3131, is configured to process, according to the internal reference matrix and the homography matrix of the camera obtained through calibration in advance, to obtain position information of the two-dimensional tag in the image coordinate system as relative position information between the aircraft and the moving target object.
Specifically, in this embodiment, once the focal length of the camera and the size of the two-dimensional tag are known, the three-dimensional coordinates of the two-dimensional tag in the camera coordinate system can be obtained from the camera's imaging model, and the homography matrix is then computed from the target image using a direct linear transformation algorithm.
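The direct linear transformation step mentioned above can be sketched in pure Python: the 3×3 homography H mapping four tag-corner points to their image positions is estimated by fixing h22 = 1 and solving the resulting 8×8 linear system. The corner values in the usage below are illustrative; a production implementation would use an SVD-based solver.

```python
def dlt_homography(src, dst):
    """src, dst: four (x, y) point pairs; returns H as a list of three rows."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h00 x + h01 y + h02) / (h20 x + h21 y + 1), likewise for v
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b)
    return [h[0:3], h[3:6], [h[6], h[7], 1.0]]

def solve(A, b):
    """Gaussian elimination with partial pivoting (teaching-quality)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

For example, mapping the unit square to a scaled, translated rectangle recovers the expected affine homography.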
In a preferred embodiment of the present invention, the second processing subunit 3132 obtains the position information by using the following formula:

H = sPE, with E = [[R00, R01, Tx], [R10, R11, Ty], [R20, R21, Tz], [0, 0, 1]]

wherein H is used to represent the homography matrix; s is used to represent a scale factor; P is used to represent the internal reference matrix; Rij (i = 0, 1, 2; j = 0, 1) is used to represent the rotation components of the two-dimensional tag in the image coordinate system; and Tx, Ty, Tz are used to represent the distance components of the two-dimensional tag in the image coordinate system.

The position information includes Tx and Ty of the distance components: Tx represents a first relative distance between the aircraft and the moving target object in the x-axis direction of the tag coordinate system, Ty represents a second relative distance in the y-axis direction, and the relative position information includes the first and second relative distances.
Specifically, in this embodiment, H is a 3 × 3 matrix and P is a 3 × 4 matrix, specifically:

P = [[fx, 0, 0, 0], [0, fy, 0, 0], [0, 0, 1, 0]]

where fx and fy are the focal lengths of the camera. Substituting P into the formula above and reading off the third column gives the equivalent equations:

h02 = s·fx·Tx,  h12 = s·fy·Ty,  h22 = s·Tz

Solving this system of equations (the scale factor s is fixed by the unit norm of the rotation columns once the intrinsics are removed) yields Tx and Ty of the distance components, i.e., the position information of the two-dimensional tag in the image coordinate system.
The position information is referenced to the image coordinate system: the center of the target image is chosen as the origin, with the X axis positive to the right and the Y axis positive upward in the image plane, so the output position information is the deviation from the image center. With this design, the position information can be used directly as the relative position information between the aircraft and the moving target object; the aircraft then follows the moving target object by adjusting its attitude according to this relative position information, keeping the moving target object at the center of the target image and thereby achieving the tracking effect.
Further, since the flight of the aircraft is dynamic and the camera is fixed relative to the airframe, the camera's viewing angle changes with the body attitude. During these changes the camera's line of sight cannot stay perpendicular to the ground at every moment, and in addition the images captured in flight exhibit distortion. The position information, which is computed entirely from image pixels, therefore needs to be corrected to remove the attitude-induced interference and guarantee the correctness of the position data. To this end, the image processing sub-module 31 further includes a position correction unit 314 connected to the third processing unit 313, the position correction unit 314 including:
the first correction subunit 3141 is configured to obtain an euler angle and a flight altitude of the aircraft in real time, and respectively process the euler angle and the flight altitude to obtain a first position deviation between the aircraft and the moving target object in the x-axis direction in the tag coordinate system and a second position deviation between the aircraft and the moving target object in the y-axis direction in the tag coordinate system;
a second correcting subunit 3142, connected to the first correcting subunit 3141, and configured to correct the first relative distance and the second relative distance according to the first position deviation and the second position deviation, respectively, to obtain corrected relative position information;
and the tracking control sub-module 32 processes the corrected relative position information to obtain a tracking signal so as to control the aircraft to track the moving target object to fly.
Specifically, in this embodiment, the principle and method of position correction are described using the Roll angle of the aircraft as an example, as shown in fig. 3, where the straight line L1 represents the ground plane on which the moving target object 1 is located; the point O is the aircraft; the straight line L2 is the body plane of the aircraft; the straight line L3 is the line of sight of the camera carried by the aircraft; and the straight line L4 passes through the point O parallel to the horizon. The dotted line passes through the moving target object 1 perpendicular to the horizon and meets the straight line L4 perpendicularly at the point N; the roll angle of the body is θ. Because of the body tilt, the aircraft is not directly above the moving target object 1, yet as long as the moving target object 1 is at the center of the image captured by the camera, the tracking appears unbiased from the viewpoint of the target position data. The position data must therefore be corrected. The dotted side of the right triangle formed by the moving target object 1, the point O and the point N is the flying height h of the aircraft, acquired in advance, and by trigonometry in this triangle:

tan θ = L/h

so the actual deviation is L = h·tan θ.
Wherein h represents the flying height; l represents a first positional deviation when θ represents a roll angle of the aircraft; and theta represents the yaw angle of the aircraft, and L represents the second position deviation. The calculation of the second position deviation is analogized, and the description is omitted here.
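The correction above can be sketched in code; the function names, the use of radians, and the simple subtraction of the deviations from the homography-derived distances are illustrative assumptions of this example, not the patent's implementation:

```python
import math

def position_correction(h, roll, yaw):
    """Tilt-induced ground offsets via L = h * tan(theta).

    h    -- flight altitude of the aircraft (metres)
    roll -- roll angle theta (radians), giving the first (x-axis) deviation
    yaw  -- angle used for the second (y-axis) deviation (radians)
    """
    dx = h * math.tan(roll)  # first position deviation
    dy = h * math.tan(yaw)   # second position deviation
    return dx, dy

def corrected_relative_position(tx, ty, h, roll, yaw):
    # Correct the relative distances (Tx, Ty) obtained from the homography
    # by the tilt-induced deviations to get the corrected relative position.
    dx, dy = position_correction(h, roll, yaw)
    return tx - dx, ty - dy
```

Whether the deviation is added or subtracted depends on the sign conventions of the tag coordinate system; the sketch assumes deviations measured in the same direction as the relative distances.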
The present invention further provides a target tracking method based on a two-dimensional tag, which is applied to the target tracking system described above, as shown in fig. 4, the target tracking method includes:
step S1, the target tracking system controls a camera arranged on an aircraft to shoot continuous multi-frame target images, in real time during the flight of the aircraft, of a moving target object provided with a two-dimensional label;
step S2, the target tracking system receives the target images, identifies each frame of target image in turn, and processes the target images containing the two-dimensional label to obtain the relative position information between the aircraft and the moving target object;
and step S3, the target tracking system processes the relative position information to obtain a tracking signal to control the aircraft to track the moving target object to fly, so that the two-dimensional label is located at the center of each target image obtained by shooting.
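A minimal sketch of one iteration of steps S2 and S3, assuming a hypothetical detector output (tx, ty) and a simple proportional controller; the function name and the gain are illustrative, not part of the patent:

```python
def tracking_step(detection, gain=0.8):
    """Map one tag detection to a tracking signal.

    detection -- (tx, ty) relative position of the tag, or None when the
                 current frame contains no two-dimensional tag (step S2)
    Returns a velocity command driving the tag toward the image centre
    (step S3), or None when there is nothing to track.
    """
    if detection is None:
        return None
    tx, ty = detection
    # Proportional tracking signal: the error is the tag's offset from the
    # image centre, so commanding velocity along it reduces the error.
    return (gain * tx, gain * ty)
```

Called once per captured frame (step S1), this keeps the tag centred when the detection and command channels are wired to the real camera and flight controller.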
While the invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention.
Claims (10)
1. A target tracking system based on two-dimensional labels is characterized by being used for tracking a moving target object, wherein a two-dimensional label is arranged on the moving target object;
the target tracking system includes:
the aircraft is provided with a camera and is used for shooting continuous multiframe target images of the moving target object in real time in the flying process of the aircraft and outputting the target images;
the target tracking module is connected to the camera and the aircraft, respectively, and the target tracking module includes:
the image processing submodule is used for sequentially identifying each frame of target image and processing the target image containing the two-dimensional label to obtain relative position information between the aircraft and the moving target object;
and the tracking control sub-module is connected with the image processing sub-module and is used for processing the relative position information to obtain a tracking signal so as to control the aircraft to track the moving target object to fly, so that the two-dimensional tag is positioned at the center of each target image obtained by shooting.
2. The object tracking system of claim 1, wherein the two-dimensional tag is an AprilTag.
3. The target tracking system of claim 2, wherein the image processing sub-module comprises:
the first processing unit is used for processing each frame of target image in sequence to obtain the gradient direction and amplitude of each pixel in the target image, and performing cluster analysis on each gradient direction and amplitude to obtain a plurality of line segments contained in the target image;
the second processing unit is connected with the first processing unit and used for traversing each line segment to identify a quadrangle and outputting an identification result indicating that the target image of the current frame contains the two-dimensional label when the quadrangle is identified for the first time;
and the third processing unit is connected with the second processing unit and used for starting a tracking mode according to the identification result, then sequentially processing the target image of the current frame and each frame of target image after the current frame respectively and continuously outputting the relative position information between the aircraft and the moving target object obtained through processing.
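The gradient computation performed by the first processing unit of claim 3 can be sketched as follows; the use of NumPy central differences is an assumption for illustration, not the patent's algorithm:

```python
import numpy as np

def gradient_field(img):
    """Per-pixel gradient direction and magnitude of a grey-level image,
    the quantities that are subsequently cluster-analysed into line
    segments. img is a 2-D array of grey values."""
    gy, gx = np.gradient(img.astype(float))  # row and column derivatives
    magnitude = np.hypot(gx, gy)             # gradient magnitude
    direction = np.arctan2(gy, gx)           # gradient direction (radians)
    return direction, magnitude
```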
4. The object tracking system of claim 3, wherein the second processing unit performs quadrilateral recognition by traversing each of the line segments using a recursive depth-first search algorithm with a depth of 4.
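A depth-4 recursive depth-first search over line segments, as named in claim 4, can be sketched as follows; the segment representation ((x0, y0), (x1, y1)) and the closeness tolerance are assumptions of this example:

```python
def find_quadrilaterals(segments, tol=0.1):
    """Return index tuples of four segments that chain end-to-start and
    close back on the first segment, i.e. candidate quadrilaterals."""
    def near(p, q):
        # Two points count as joined when within the tolerance.
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol * tol

    quads = []

    def dfs(chain):
        if len(chain) == 4:  # recursion depth 4: one candidate per chain
            if near(segments[chain[-1]][1], segments[chain[0]][0]):
                quads.append(tuple(chain))  # chain closes: a quadrilateral
            return
        for j, seg in enumerate(segments):
            if j not in chain and near(segments[chain[-1]][1], seg[0]):
                dfs(chain + [j])

    for i in range(len(segments)):
        dfs([i])
    return quads
```

Each quadrilateral is reported once per starting segment (four rotations); a real detector would deduplicate these before reporting that the current frame contains a two-dimensional label.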
5. The target tracking system of claim 3, wherein the third processing unit comprises:
the first processing subunit is configured to obtain a homography matrix representing a position mapping relationship of the two-dimensional tag between a tag coordinate system and an image coordinate system according to a pre-acquired focal length of the camera, a size of the two-dimensional tag, and the target image, where the tag coordinate system uses a center of the two-dimensional tag as an origin and a plane where the two-dimensional tag is located is an XOY plane;
and the second processing subunit is connected with the first processing subunit and is used for processing the internal reference matrix of the camera obtained by calibration in advance and the homography matrix to obtain the position information of the two-dimensional label in the image coordinate system as the relative position information between the aircraft and the moving target object.
6. The object tracking system of claim 5, wherein the first processing subunit processes the homography matrix using a direct linear transformation algorithm.
7. The object tracking system of claim 5, wherein the second processing subunit obtains the position information using the following formula:

H = s · P · | R00 R01 Tx |
            | R10 R11 Ty |
            | R20 R21 Tz |

wherein H is used to represent the homography matrix; s is used to represent a scale factor; P is used to represent the internal reference matrix; Rij (i = 0, 1, 2; j = 0, 1) is used to represent the rotation components of the two-dimensional label in the image coordinate system; and Tx, Ty, Tz are used to represent the distance components of the two-dimensional label in the image coordinate system;

the position information includes Tx and Ty among the distance components, wherein Tx represents a first relative distance between the aircraft and the moving target object in the x-axis direction in the tag coordinate system, and Ty represents a second relative distance between the aircraft and the moving target object in the y-axis direction in the tag coordinate system, the relative position information including the first relative distance and the second relative distance.
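Assuming the standard planar-homography relation H = s·P·[r0 r1 T] described in this claim, the distance components can be recovered by stripping the intrinsics and normalising the scale; the function below is an illustrative sketch, not the patent's code:

```python
import numpy as np

def decompose_homography(H, P):
    """Recover the distance components (Tx, Ty, Tz) of the tag from the
    homography H and the internal reference (intrinsic) matrix P."""
    M = np.linalg.inv(P) @ H        # remove the camera intrinsics
    s = np.linalg.norm(M[:, 0])     # scale factor: rotation columns have unit norm
    M = M / s
    # Columns of M are now [r0, r1, T]; the last column holds Tx, Ty, Tz.
    return M[:, 2]
```

Tx and Ty of the returned vector are the first and second relative distances used as the relative position information.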
8. The object tracking system of claim 7, wherein the image processing sub-module further comprises a position correction unit coupled to the third processing unit, the position correction unit comprising:
the first correction subunit is used for acquiring an Euler angle and a flight altitude of the aircraft in real time, and respectively processing the Euler angle and the flight altitude to obtain a first position deviation between the aircraft and the moving target object in the x-axis direction in a label coordinate system and a second position deviation between the aircraft and the moving target object in the y-axis direction in the label coordinate system;
the second correcting subunit is connected with the first correcting subunit and is used for correcting the first relative distance and the second relative distance respectively according to the first position deviation and the second position deviation to obtain corrected relative position information;
and the tracking control sub-module processes the corrected relative position information to obtain the tracking signal so as to control the aircraft to track the moving target object to fly.
9. The object tracking system of claim 8, wherein the first correction subunit processes the first and second position deviations using the following equations:
L=h*tanθ
wherein h represents the flying height; when θ represents the roll angle of the aircraft, L represents the first position deviation; and when θ represents the yaw angle of the aircraft, L represents the second position deviation.
10. A target tracking method based on two-dimensional tags, which is applied to the target tracking system according to any one of claims 1 to 9, and comprises the following steps:
step S1, the target tracking system controls a camera arranged on an aircraft to shoot continuous multi-frame target images, in real time during the flight of the aircraft, of a moving target object provided with a two-dimensional label;
step S2, the target tracking system receives the target images, sequentially identifies each frame of the target images, and processes the target images containing the two-dimensional labels to obtain the relative position information between the aircraft and the moving target object;
step S3, the target tracking system processes the relative position information to obtain a tracking signal to control the aircraft to track the moving target object to fly, so that the two-dimensional tag is located at the center of each target image obtained by shooting.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111071116.0A CN113781524B (en) | 2021-09-13 | 2021-09-13 | Target tracking system and method based on two-dimensional label |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113781524A true CN113781524A (en) | 2021-12-10 |
CN113781524B CN113781524B (en) | 2023-12-08 |
Family
ID=78843283
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111071116.0A Active CN113781524B (en) | 2021-09-13 | 2021-09-13 | Target tracking system and method based on two-dimensional label |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113781524B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114157813A (en) * | 2022-02-07 | 2022-03-08 | 深圳市慧为智能科技股份有限公司 | Electronic scale camera motion control method and device, control terminal and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101287190B1 (en) * | 2012-06-04 | 2013-07-17 | 주식회사 로드코리아 | Photographing position automatic tracking method of video monitoring apparatus |
CN107103615A (en) * | 2017-04-05 | 2017-08-29 | 合肥酷睿网络科技有限公司 | A kind of monitor video target lock-on tracing system and track lock method |
CN107463181A (en) * | 2017-08-30 | 2017-12-12 | 南京邮电大学 | A kind of quadrotor self-adoptive trace system based on AprilTag |
KR20200114924A (en) * | 2019-03-26 | 2020-10-07 | 주식회사 에프엠웍스 | Method and apparatus of real-time tracking a position using drones, traking a position system including the apparatus |
Non-Patent Citations (1)
Title |
---|
SHI Xiangbin; ZHANG Jian; DAI Qin; ZHANG Deyuan; ZHANG Liguo: "Deformable Target Tracking Method Using Saliency Segmentation and Target Detection", Journal of Computer-Aided Design & Computer Graphics, no. 04 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106981073B (en) | A kind of ground moving object method for real time tracking and system based on unmanned plane | |
CN108363946B (en) | Face tracking system and method based on unmanned aerial vehicle | |
CN115035260B (en) | Three-dimensional semantic map construction method for indoor mobile robot | |
CN109949361A (en) | A kind of rotor wing unmanned aerial vehicle Attitude estimation method based on monocular vision positioning | |
CN114004977B (en) | Method and system for positioning aerial data target based on deep learning | |
CN106845491B (en) | Automatic correction method based on unmanned plane under a kind of parking lot scene | |
Li et al. | UAV autonomous landing technology based on AprilTags vision positioning algorithm | |
CN111829532B (en) | Aircraft repositioning system and method | |
CN111123962A (en) | Rotor unmanned aerial vehicle repositioning photographing method for power tower inspection | |
CN110560373A (en) | multi-robot cooperation sorting and transporting method and system | |
CN109035294B (en) | Image extraction system and method for moving target | |
CN110231835A (en) | A kind of accurate landing method of unmanned plane based on machine vision | |
CN112700498A (en) | Wind driven generator blade tip positioning method and system based on deep learning | |
CN114972767A (en) | Vehicle track and course angle extraction method based on high-altitude unmanned aerial vehicle video | |
CN116866719B (en) | Intelligent analysis processing method for high-definition video content based on image recognition | |
CN114815871A (en) | Vision-based autonomous landing method for vertical take-off and landing unmanned mobile platform | |
CN113781524B (en) | Target tracking system and method based on two-dimensional label | |
CN116466586A (en) | Transformer network-based blocking target space-ground collaborative tracking method | |
CN114689030A (en) | Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision | |
CN109240319A (en) | The method and device followed for controlling unmanned plane | |
CN110382358A (en) | Holder attitude rectification method, holder attitude rectification device, holder, clouds terrace system and unmanned plane | |
CN110191311A (en) | A kind of real-time video joining method based on multiple no-manned plane | |
CN109283942A (en) | For controlling the flying method and device that unmanned plane is tracked | |
CN109283933A (en) | The control method and device that unmanned plane follows | |
CN109542120A (en) | The method and device that target object is tracked by unmanned plane |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||