
CN107330917B - Tracking-shooting method and tracking device for a moving target - Google Patents

Tracking-shooting method and tracking device for a moving target Download PDF

Info

Publication number
CN107330917B
CN107330917B CN201710488228.3A CN201710488228A CN107330917B CN 107330917 B CN107330917 B CN 107330917B CN 201710488228 A CN201710488228 A CN 201710488228A CN 107330917 B CN107330917 B CN 107330917B
Authority
CN
China
Prior art keywords
target
visual angle
characteristic point
image
shooting visual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710488228.3A
Other languages
Chinese (zh)
Other versions
CN107330917A (en)
Inventor
曹维雨 (Cao Weiyu)
陈翔 (Chen Xiang)
赵大川 (Zhao Dachuan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Inc
Original Assignee
Goertek Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Inc filed Critical Goertek Inc
Priority to CN201710488228.3A priority Critical patent/CN107330917B/en
Priority to PCT/CN2017/095247 priority patent/WO2018232837A1/en
Priority to US15/756,545 priority patent/US10645299B2/en
Publication of CN107330917A publication Critical patent/CN107330917A/en
Application granted granted Critical
Publication of CN107330917B publication Critical patent/CN107330917B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/28 Indexing scheme for image data processing or generation, in general involving image processing hardware
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a tracking-shooting method and a tracking device for a moving target. The feature points of the template image corresponding to each shooting angle are extracted in advance, so that at matching time only the feature points of the currently captured target image need to be computed. The feature points of the target image are then matched against those of each shooting angle's template image, and the shooting angle corresponding to the matched template image is determined as the shooting angle of the currently captured target image. If the shooting angle of the currently captured target image is inconsistent with the preset shooting angle for the target, the tracking device is moved to track and shoot the target, so that the shooting angle for the target becomes consistent with the preset shooting angle for the target. This greatly reduces the computation required for feature-point detection and matching and improves the real-time performance of tracking shooting.

Description

Tracking-shooting method and tracking device for a moving target
Technical field
The present invention relates to the field of visual tracking, and in particular to a tracking-shooting method and tracking device for a moving target.
Background art
Moving-target detection and tracking has long been a core research problem in fields such as computer vision and pattern recognition. Its key task is to use computer-vision techniques and pattern-classification methods to detect the moving target in a video sequence and to track the target region effectively and stably. For example, in intelligent transportation systems, moving-target tracking enables automatic monitoring of vehicles; in home smart-entertainment devices, it allows a system to automatically follow a moving person; and in the military field it supports precision guidance of weapons.
When existing computer-vision techniques identify a moving target object, locate it, and analyze its motion, they must first perceive the surrounding environment, acquire depth information, and build a 2D or 3D map, then search out an optimal route by algorithmic planning. For follow-shooting a known target, however, using the camera to perform target recognition, environment recognition, and path planning simultaneously involves far too much computation and leaves the processor's resources insufficient. For an automatic follow-shoot mode with high real-time requirements, such a path-planning approach is very likely to cause the device's tracking to lag.
Summary of the invention
To solve the above problems, the present invention provides a tracking-shooting method and tracking device for a moving target that can improve the real-time performance of tracking and shooting a moving target.
The present invention provides a tracking-shooting method for a moving target, comprising:
extracting the feature points of the currently captured target image;
matching the feature points of the target image against the pre-extracted feature points of the template image corresponding to each shooting angle, and determining the template image with the most matched feature points as the template image matching the target image;
determining the shooting angle corresponding to the matched template image as the shooting angle of the currently captured target image;
if the shooting angle of the currently captured target image is inconsistent with the preset shooting angle for the target, moving the tracking device to track and shoot the target, so that the shooting angle for the target becomes consistent with the preset shooting angle for the target.
Optionally, before matching the feature points of the target image against the pre-extracted feature points of the template image corresponding to each shooting angle, the method comprises:
shooting the target in advance from multiple shooting angles, obtaining the template image corresponding to each shooting angle, and extracting the feature points of the template image corresponding to each shooting angle.
Optionally, matching the feature points of the target image against the pre-extracted feature points of the template image corresponding to each shooting angle comprises:
computing feature-point descriptors of the template image corresponding to each shooting angle with the speeded-up robust features (SURF) algorithm;
computing feature-point descriptors of the target image with the SURF algorithm;
matching the feature-point descriptors of the target image against those of the template image corresponding to each shooting angle with the random sample consensus (RANSAC) algorithm, rejecting wrongly matched descriptors, and determining the correctly matched descriptors;
determining the template image with the most correctly matched descriptors as the template image with the most matched feature points.
Optionally, the method further comprises:
presetting, according to the shooting parameters of the tracking device, a first position region of the target image on the tracking-device screen, such that the shooting effect of the target image is best when the target image lies within the first position region;
comparing the feature points of the target image with those of the matched template image, and determining the matched feature point located at the top-left corner and the matched feature point located at the bottom-right corner;
determining a second position region by taking the line between the top-left matched feature point and the bottom-right matched feature point as the diagonal of the second position region, the shape of the second position region including a quadrilateral, and taking the midpoint of the diagonal as the center of the second position region; the second position region is the position region the target image currently occupies on the tracking-device screen.
Optionally, the method further comprises:
comparing the pixel count of the target image with a preset maximum pixel-count threshold and minimum pixel-count threshold; moving the tracking device backwards when the pixel count of the target image is determined to exceed the maximum threshold, and forwards when it is determined to fall below the minimum threshold, so that the pixel count of the target image stays below the maximum threshold and above the minimum threshold; and/or
moving the tracking device when the center of the second position region is not within the first position region, so that the center of the second position region comes to lie within the first position region.
The application also provides a tracking device for tracking and shooting a moving target, comprising a camera for capturing target images, and further comprising a processor and a memory;
the memory stores a program enabling the tracking device to execute the tracking-shooting method for a moving target, the program comprising one or more computer instructions to be called and executed by the processor;
the processor is configured, when executing the program stored in the memory, to perform the following steps:
extracting the feature points of the target image currently captured by the camera; matching the feature points of the target image against the pre-extracted feature points of the template image corresponding to each shooting angle; determining the template image with the most matched feature points as the template image matching the target image; determining the shooting angle corresponding to the matched template image as the shooting angle of the currently captured target image; and, if the shooting angle of the currently captured target image is inconsistent with the preset shooting angle for the target, sending a movement instruction to a moving device, so that the moving device moves the tracking device according to the instruction to track and shoot the target, until the shooting angle for the target is consistent with the preset shooting angle for the target.
In the embodiments of the present invention, to reduce the computation required for image matching, the feature points of the template image corresponding to each shooting angle are first extracted in advance, so that at matching time only the feature points of the currently captured target image need to be computed. These are then matched against the feature points of each shooting angle's template image, and the shooting angle corresponding to the matched template image is determined as the shooting angle of the currently captured target image. If this shooting angle is inconsistent with the preset shooting angle for the target, the tracking device is moved to track and shoot the target until the shooting angle for the target is consistent with the preset one. This greatly reduces the computation for feature-point detection and matching and improves the real-time performance of tracking shooting.
Brief description of the drawings
To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the tracking-shooting method for a moving target provided by an embodiment of the invention;
Fig. 2 is a schematic diagram of a circular region of radius 3 centered on a point P under test, according to the invention;
Fig. 3 is a schematic diagram of a circular region of radius 3 centered on a feature point, according to the invention;
Fig. 4 shows the Haar wavelet response filters of the invention in the x direction (a) and the y direction (b);
Fig. 5 is a schematic diagram of the first position region and the second position region of the invention;
Fig. 6 is a schematic flowchart of a concrete implementation of step 105 in the embodiment of Fig. 1;
Fig. 7 is a structural schematic diagram of the tracking device provided by an embodiment of the invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art on the basis of the described embodiments without creative effort fall within the protection scope of the present invention.
The terms used in the embodiments of the present invention serve only to describe particular embodiments and are not intended to limit the invention. The singular forms "a", "said", and "the" used in the embodiments and in the appended claims are intended to include the plural forms as well, unless the context clearly indicates otherwise; "multiple" generally means at least two, without excluding the case of at least one.
It should be understood that the term "and/or" used herein merely describes an association between related objects and indicates that three relationships may exist: A and/or B may mean that A exists alone, that A and B exist together, or that B exists alone. The character "/" herein generally indicates an "or" relationship between the objects before and after it.
It should also be understood that although the terms first, second, third, etc. may be used in the embodiments of the present invention to describe XXX, these XXX should not be limited by those terms; the terms serve only to distinguish the XXX from one another. For example, without departing from the scope of the embodiments of the present invention, a first XXX could also be called a second XXX and, similarly, a second XXX could be called a first XXX.
It should further be noted that the terms "include", "comprise", and any other variants thereof are intended to cover a non-exclusive inclusion, so that a product or system including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a product or system. Absent further limitation, an element qualified by "including a ..." does not exclude the presence of other identical elements in the product or system that includes it.
Fig. 1 is a schematic flowchart of the tracking-shooting method for a moving target provided by an embodiment of the invention. As shown in Fig. 1, the method includes:
100. Shooting the target in advance from multiple shooting angles, and obtaining the template image corresponding to each shooting angle.
In the prior art, to determine the shooting angle of the currently captured target, feature points are usually extracted from and matched between the successively captured images in chronological order, so as to determine the shooting angle for the moving target. This requires feature-point extraction on every image captured in real time, the computation is very large, and the tracking therefore lags, with poor real-time performance.
Therefore, in the embodiments of the present invention, to improve the real-time performance of tracking shooting and reduce the computation of image matching, the target can be shot in advance from multiple shooting angles to obtain the template image corresponding to each shooting angle — for example, the shooting angle with the target to the right of the tracking device, the shooting angle with it to the left, and the middle shooting angle.
Table 1. A correspondence between template images and shooting angles:

  Template image 1 — right shooting angle
  Template image 2 — left shooting angle
  Template image 3 — middle shooting angle
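By way of illustration only (the names and structure below are ours, not the patent's), the correspondence of Table 1 can be held as a simple lookup from a matched template to its shooting angle:

```python
# Illustrative sketch: the Table 1 correspondence between pre-captured
# template images and their shooting angles, held as a dictionary.
TEMPLATE_ANGLES = {
    "template_1": "right",
    "template_2": "left",
    "template_3": "middle",
}

def angle_for_template(template_id):
    """Return the shooting angle associated with a matched template image."""
    return TEMPLATE_ANGLES[template_id]

print(angle_for_template("template_3"))  # middle
```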
101. Extracting the feature points of the template image corresponding to each shooting angle.
In a concrete implementation, the template image corresponding to each shooting angle is converted to a grayscale image, the gray value of each point on the grayscale image is computed, and feature points are detected, for example with an improved FAST algorithm. As shown in Fig. 2, a circle of radius 3 centered on the point P under test is chosen, and the 16 pixels on its circumference are labeled 1, 2, ..., 15, 16. Let the gray value of P be Ip, let the gray value of each pixel on the circle be Ix, where x = 1, 2, ..., 15, 16, let t be the gray threshold, and let S be a set formed by N consecutive pixels on the circumference, with N = 9.
If every pixel x of such a set S satisfies the condition Ix ≥ Ip + t, or every such pixel satisfies Ix ≤ Ip − t, then, taking P as the center of symmetry, the points inside the circle are divided into symmetric pairs di-di′; if some pair satisfies |Ip − di| < t and |Ip − di′| < t, the point P under test is a corner, i.e. a feature point; otherwise it is not.
If no pixel x on the circumference satisfies the condition Ix ≥ Ip + t or Ix ≤ Ip − t, the next point under test is chosen.
Finally, the feature points detected in the template image corresponding to each shooting angle are saved in the corresponding feature-point library.
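The corner test described above can be sketched in pure Python. This is a minimal, illustrative implementation of the standard FAST segment test (16-pixel circle of radius 3, N = 9 contiguous pixels); the patent's additional symmetric-pair refinement is omitted for brevity, and all names are our own:

```python
# The 16 circumference offsets of a Bresenham circle of radius 3,
# listed in order around the ring (standard FAST layout).
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def _max_run(flags):
    """Longest cyclic run of True values (ring doubled to handle wrap-around)."""
    best = run = 0
    for f in flags + flags:
        run = run + 1 if f else 0
        best = max(best, run)
    return min(best, len(flags))

def is_fast_corner(img, r, c, t=20, n=9):
    """FAST segment test at pixel (r, c) of a 2-D list of gray values."""
    ip = img[r][c]
    ring = [img[r + dr][c + dc] for dr, dc in CIRCLE]
    # P is a corner if n contiguous ring pixels are all brighter than
    # Ip + t or all darker than Ip - t.
    bright = [v >= ip + t for v in ring]
    dark = [v <= ip - t for v in ring]
    return _max_run(bright) >= n or _max_run(dark) >= n
```

On a flat 7x7 patch the test rejects the center; brightening 10 contiguous ring pixels makes it a corner.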
Table 2. A correspondence between template images and feature-point libraries:

  Template image 1 — feature-point library 1
  Template image 2 — feature-point library 2
  Template image 3 — feature-point library 3
102. Extracting, in real time, the feature points of the currently captured target image.
After the template image and the corresponding feature-point library for each shooting angle have been prepared, the tracking device can track and shoot the moving target. When the target moves, the feature points of the currently captured target image must be extracted in order to determine the current shooting angle for the target; a concrete implementation can detect the feature points with the algorithm shown in Fig. 2, which is not repeated here.
103. Matching the feature points of the target image against the pre-extracted feature points of the template image corresponding to each shooting angle, and determining the template image with the most matched feature points as the template image matching the target image.
In an optional embodiment, step 103, when implemented, includes:
computing feature-point descriptors of the template image corresponding to each shooting angle with the speeded-up robust features (SURF) algorithm;
computing feature-point descriptors of the target image with the SURF algorithm;
matching the feature-point descriptors of the target image against those of the template image corresponding to each shooting angle with the random sample consensus (RANSAC) algorithm, rejecting wrongly matched descriptors, and determining the correctly matched descriptors;
determining the template image with the most correctly matched descriptors as the template image with the most matched feature points.
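A hedged sketch of the template-selection step: the template whose descriptors attract the most good matches wins. A Lowe-style ratio test stands in here for the patent's RANSAC-based rejection, purely for brevity; the function names and data layout are assumptions:

```python
import numpy as np

def count_matches(target_desc, template_desc, ratio=0.8):
    """Count target descriptors whose nearest template descriptor passes a
    distance-ratio test (a stand-in for the patent's RANSAC rejection)."""
    # pairwise Euclidean distances, shape (n_target, n_template)
    d = np.linalg.norm(target_desc[:, None, :] - template_desc[None, :, :], axis=2)
    good = 0
    for row in d:
        idx = np.argsort(row)
        if len(row) >= 2 and row[idx[0]] < ratio * row[idx[1]]:
            good += 1
    return good

def best_template(target_desc, templates):
    """templates: dict name -> descriptor array; pick the name with most matches."""
    return max(templates, key=lambda k: count_matches(target_desc, templates[k]))
```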
It should be noted that in the embodiments of the present invention the SURF algorithm can be used to compute the feature-point descriptors of the currently captured target image and of the template image corresponding to each shooting angle, after which the descriptors of the target image are matched against those of each template image. Because the SURF algorithm greatly shortens the time required for feature-point extraction, it improves the real-time performance of feature-point detection.
In this embodiment, the concrete procedure for establishing feature-point descriptors with the SURF algorithm includes:
First, the direction of the feature point is computed using a circle of radius 3 centered on the feature point: a circular region of radius 3 is chosen around the feature point, and a sector of fixed central angle is rotated about the center in 15° steps, giving 24 sector regions, as shown in Fig. 3. Using a filter of side length 2, the Haar wavelet response in the x-axis direction and the Haar wavelet response in the y-axis direction are computed for each point in each sector region, as shown in Fig. 4, and a second-order Gaussian centered on the feature point weights the x-direction and y-direction Haar responses respectively; the weighted x-direction and y-direction Haar responses serve as each point's horizontal response along the x-axis and vertical response along the y-axis within its sector region. The horizontal responses and vertical responses of all points in each sector region are summed separately, giving a local vector; among the 24 sector regions, the direction of the longest local vector is taken as the direction of the feature point, and the direction angle of the feature point is denoted θ.
Next, the descriptor is established: a 9 × 9 region centered on the feature point is chosen and divided into nine 3 × 3 subdomains; four values are computed in each subdomain, generating a 36-dimensional vector as the descriptor of the feature point. Specifically:
1. A 9 × 9 region centered on the feature point is chosen and divided into nine 3 × 3 subdomains.
2. Using a filter of side length 2, the horizontal Haar wavelet response hx(i, j) and the vertical Haar wavelet response hy(i, j) of each subdomain are computed separately, where i = 1, 2, ..., 4 and j = 1, 2, ..., 9. A second-order Gaussian centered on the feature point weights hx(i, j) and hy(i, j) respectively, and a rotation transformation carries the weighted responses into the direction of the feature point, yielding the components hX(i, j) = w(hx(i, j) cos θ + hy(i, j) sin θ) and hY(i, j) = w(−hx(i, j) sin θ + hy(i, j) cos θ), where w is the second-order Gaussian weight centered on the feature point and θ is the direction angle of the feature point.
3. For each subdomain, the four sums Σ hX, Σ hY, Σ |hX|, Σ |hY| are computed separately, so each subdomain generates a 4-dimensional description vector; concatenating the description vectors generated by the nine subdomains gives a description vector of length 36, i.e. the 36-dimensional descriptor.
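The 36-dimensional descriptor assembly can be sketched as follows, assuming the hX and hY arrays already hold the weighted, rotated Haar responses of one subdomain's sample points; the final normalization is common practice, not stated in the text:

```python
import numpy as np

def subdomain_vector(hX, hY):
    """The four sums of one subdomain: (sum hX, sum hY, sum |hX|, sum |hY|)."""
    return np.array([hX.sum(), hY.sum(), np.abs(hX).sum(), np.abs(hY).sum()])

def build_descriptor(subdomains):
    """subdomains: list of 9 (hX, hY) response pairs -> 36-D descriptor."""
    vecs = [subdomain_vector(hX, hY) for hX, hY in subdomains]
    desc = np.concatenate(vecs)
    # Unit-normalize for contrast invariance (an assumption on our part).
    n = np.linalg.norm(desc)
    return desc / n if n > 0 else desc
```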
It should be noted that the RANSAC algorithm used in the embodiments of the present invention can assign a quality to each feature-point match, order the matches from high to low quality, and preferentially use combinations of high-quality feature points as samples for computing the model, instead of selecting samples at random. Because high-quality feature points are more likely to yield a correct model, this greatly reduces the number of RANSAC iterations and increases the algorithm's speed, thereby greatly speeding up the rejection of wrongly matched feature points; the time required for feature-point matching is further shortened, and the real-time performance of matching is further improved.
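The quality-ordered sampling idea can be illustrated with a toy RANSAC whose model is a 2-D translation (the patent does not specify the model); matches are sorted by quality, and early samples are drawn from the best of them, PROSAC-style:

```python
import random

def ransac_translation(pairs, qualities, thresh=2.0, iters=20, seed=0):
    """pairs: list of ((x1, y1), (x2, y2)) correspondences; qualities: match
    scores. Fits a translation (dx, dy), sampling high-quality pairs first."""
    rng = random.Random(seed)
    order = sorted(range(len(pairs)), key=lambda i: -qualities[i])
    best_model, best_inliers = None, -1
    for it in range(iters):
        # Widen the sampling pool gradually: top 2, then top 3, ...
        pool = order[:min(len(order), it + 2)]
        i = rng.choice(pool)
        (x1, y1), (x2, y2) = pairs[i]
        dx, dy = x2 - x1, y2 - y1
        inliers = sum(1 for (a, b), (c, d) in pairs
                      if abs((c - a) - dx) < thresh and abs((d - b) - dy) < thresh)
        if inliers > best_inliers:
            best_model, best_inliers = (dx, dy), inliers
    return best_model, best_inliers
```

With three pairs sharing translation (5, 5) and one low-quality outlier, the outlier never wins because its model explains only itself.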
104. Determining the shooting angle corresponding to the matched template image as the shooting angle of the currently captured target image.
That is, the currently captured target image matches one of the preset template images best, and the shooting angle corresponding to that best-matching template image is the shooting angle of the currently captured target image.
105. If the shooting angle of the currently captured target image is inconsistent with the preset shooting angle for the target, moving the tracking device to track and shoot the target, so that the shooting angle for the target becomes consistent with the preset shooting angle for the target.
It should be noted that, to move the tracking device so that it tracks and shoots the target, the embodiments of the present invention preset the position region on the tracking-device screen in which the shooting effect of the target image is best (the position region shown by the solid frame in Fig. 5 is the first position region) and determine the current position region of the captured target image on the screen (the position region shown by the dashed frame in Fig. 5 is the second position region).
The first position region is set as follows: according to the shooting parameters of the tracking device, a first position region of the target image on the tracking-device screen is preset, such that the shooting effect of the target image is best when the target image lies within the first position region.
The second position region is determined as follows: the feature points of the target image are compared with those of the matched template image, and the matched feature point located at the top-left corner and the matched feature point located at the bottom-right corner are determined; the line between the top-left matched feature point and the bottom-right matched feature point is taken as the diagonal of the second position region, which is thereby determined. The second position region is the current position region of the captured target image on the tracking-device screen; its shape includes, but is not limited to, a quadrilateral, and the midpoint of the diagonal is taken as its center. With the first position region set as above for best shooting effect, whether the center of the second position region lies inside the first position region can be judged simply and efficiently; if it does not, the tracking device is simply moved until the center of the second position region lies inside the first position region, which greatly improves the efficiency of position tracking and shooting.
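A small sketch of the region logic, under the simplifying assumption that the top-left and bottom-right matched points can be approximated by the bounding box of all matched points; the region layout and names are ours:

```python
def second_region(matched_points):
    """matched_points: list of (x, y) matched feature points in the frame.
    Returns (top_left, bottom_right, center) of the second position region."""
    xs = [p[0] for p in matched_points]
    ys = [p[1] for p in matched_points]
    top_left = (min(xs), min(ys))
    bottom_right = (max(xs), max(ys))
    # Center of the region = midpoint of the diagonal.
    center = ((top_left[0] + bottom_right[0]) / 2,
              (top_left[1] + bottom_right[1]) / 2)
    return top_left, bottom_right, center

def center_inside(center, first_region):
    """first_region: ((x0, y0), (x1, y1)) axis-aligned preset rectangle."""
    (x0, y0), (x1, y1) = first_region
    cx, cy = center
    return x0 <= cx <= x1 and y0 <= cy <= y1
```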
Fig. 6 is a flow diagram of a concrete implementation of step 105 in the embodiment shown in Fig. 1. In practical applications, in order to achieve the best shooting effect, the method further includes:
201. Comparing the pixel count of the target image with a preset maximum pixel-count threshold and a preset minimum pixel-count threshold.
Optionally, step 201 is followed by step 202 or step 203.
202. If the pixel count of the target image is determined to be greater than the maximum pixel-count threshold, moving the tracking device backward so that the pixel count of the target image becomes less than the maximum pixel-count threshold; or
203. If the pixel count of the target image is determined to be less than the minimum pixel-count threshold, moving the tracking device forward so that the pixel count of the target image becomes greater than the minimum pixel-count threshold.
Optionally, step 202 or 203 is followed by:
204. When the center point of the second position region is not within the first position region, moving the tracking device so that the center point of the second position region falls within the first position region.
If the center point of the second position region lies to the right of the first position region, the tracking device is moved to the right; if it lies to the left of the first position region, the tracking device is moved to the left.
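The decision logic of steps 201-204 — distance correction from the pixel count first, then lateral correction from the region center — can be sketched as a single dispatch function. This is an illustrative sketch, not the patent's code: the command strings and the checking order after the size test are assumptions, and only horizontal drift is handled, as in the description above.

```python
def movement_command(pixel_count, px_max, px_min, center, first_region):
    """Map the current frame measurements to a movement command for the
    tracking device. first_region is (x_min, y_min, x_max, y_max) on the
    screen; center is the center point of the second position region."""
    if pixel_count > px_max:      # target fills too much of the frame: back off
        return "move_backward"
    if pixel_count < px_min:      # target too small in the frame: approach
        return "move_forward"
    x, y = center
    x_min, y_min, x_max, y_max = first_region
    if x > x_max:                 # center drifted right of the first region
        return "move_right"
    if x < x_min:                 # center drifted left of the first region
        return "move_left"
    return "hold"                 # size and position both acceptable
```

A controller loop would call this once per frame and forward the command to the movement apparatus.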
205. If the shooting visual angle of the currently shot target image is inconsistent with the preset shooting visual angle for the target, moving the tracking device to track and shoot the target so that the shooting visual angle for the target becomes consistent with the preset shooting visual angle for the target.
For example, if the preset shooting visual angle is the middle (frontal) shooting visual angle for the target, and the right side of the target is detected facing the camera — that is, the shooting visual angle of the currently shot target image is the right-side shooting visual angle — then the camera may be rotated counterclockwise in place so that the shooting visual angle for the target becomes consistent with the preset shooting visual angle for the target.
In the embodiment of the present invention, the feature point descriptors of the template image corresponding to each shooting visual angle are first precomputed using the SURF algorithm, so that during matching only the feature point descriptors of the currently shot target image need to be computed, greatly reducing the time spent on real-time feature point detection. The feature point descriptors of the target image are then matched against the feature point descriptors of the template image corresponding to each shooting visual angle using the RANSAC algorithm, which greatly reduces the number of RANSAC iterations, and the shooting visual angle corresponding to the matched template image is determined as the shooting visual angle of the currently shot target image. Finally, when the shooting visual angle of the currently shot target image is determined to be inconsistent with the preset shooting visual angle for the target, the tracking device is moved to track and shoot the target so that the shooting visual angle for the target becomes consistent with the preset shooting visual angle. The computation required for feature point detection and matching during real-time tracking shooting is thereby greatly reduced, feature point detection and matching take less time, and the real-time performance of tracking shooting is improved.
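The angle-determination step — matching the current frame's descriptors against each precomputed template and keeping the template with the most good matches — can be sketched in pure Python. Descriptors are shown here as short float vectors; a real SURF pipeline would produce 64- or 128-dimensional descriptors (e.g. via `cv2.xfeatures2d.SURF_create()` from opencv-contrib), and Lowe's ratio test is used below as a simple stand-in for the patent's RANSAC-based rejection of erroneous matches.

```python
import math

def match_count(target_desc, template_desc, ratio=0.75):
    """Count target descriptors whose nearest neighbour in the template is
    clearly better than the second nearest (Lowe's ratio test), used here
    as a simple outlier filter in place of RANSAC."""
    count = 0
    for d in target_desc:
        dists = sorted(math.dist(d, t) for t in template_desc)
        if len(dists) >= 2 and dists[0] < ratio * dists[1]:
            count += 1
    return count

def best_view_angle(target_desc, templates):
    """templates maps an angle label to that template image's descriptor
    list; the label with the most matches is taken as the shooting visual
    angle of the current frame."""
    return max(templates, key=lambda ang: match_count(target_desc, templates[ang]))
```

With precomputed template descriptors, only `target_desc` has to be extracted per frame, which is the source of the speed-up the description claims.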
Fig. 7 is a structural schematic diagram of the tracking device provided by an embodiment of the present invention. The tracking device is used to track and shoot a moving target and, as shown in Fig. 7, includes a camera for shooting target images, and further includes a processor and a memory.
The memory is used to store a program supporting the tracking device in executing the track-and-shoot method for a moving target, the program including one or more computer instructions to be called and executed by the processor.
The processor is configured to perform the following steps when executing the program stored in the memory:
extracting the feature points of the target image currently shot by the camera; matching the feature points of the target image against the pre-extracted feature points of the template image corresponding to each shooting visual angle; determining the template image with the most matched feature points as the template image matching the target image; determining the shooting visual angle corresponding to the matched template image as the shooting visual angle of the currently shot target image; and, if the shooting visual angle of the currently shot target image is inconsistent with the preset shooting visual angle for the target, sending a movement instruction to a movement apparatus, so that the movement apparatus moves the tracking device according to the movement instruction to track and shoot the target until the shooting visual angle for the target is consistent with the preset shooting visual angle for the target.
Optionally, in this embodiment, the camera shoots the target from multiple shooting visual angles in advance to obtain the template image corresponding to each shooting visual angle;
the processor extracts the feature points of the template image corresponding to each shooting visual angle;
and the memory stores the template image corresponding to each shooting visual angle together with its corresponding feature points.
Optionally, in this embodiment, the processor is further configured to perform the following steps when executing the program stored in the memory:
computing the feature point descriptors of the template image corresponding to each shooting visual angle using the Speeded-Up Robust Features (SURF) algorithm; computing the feature point descriptors of the target image using the SURF algorithm; matching the feature point descriptors of the target image against the feature point descriptors of the template image corresponding to each shooting visual angle using the Random Sample Consensus (RANSAC) algorithm, rejecting erroneously matched feature point descriptors, and determining the correctly matched feature point descriptors; and determining the template image with the most correctly matched feature point descriptors as the template image with the most matched feature points.
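The RANSAC rejection of erroneous matches can be illustrated with a toy model. A real embodiment would fit a homography between template and target keypoints (e.g. `cv2.findHomography(src, dst, cv2.RANSAC)`); the sketch below fits only a 2D translation, which keeps the sample size at one pair and shows why few iterations suffice once descriptor matching has already removed most outliers. All names and parameters here are illustrative assumptions.

```python
import random

def ransac_translation(pairs, threshold=2.0, iterations=100, seed=0):
    """Toy RANSAC: model the template->target mapping as a pure 2D
    translation and return the largest consistent set of matched pairs.
    Each pair is ((sx, sy), (tx, ty)); a fixed seed makes runs repeatable."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iterations):
        (sx, sy), (tx, ty) = rng.choice(pairs)   # one pair fully determines a translation
        dx, dy = tx - sx, ty - sy
        inliers = [((ax, ay), (bx, by)) for (ax, ay), (bx, by) in pairs
                   if abs((bx - ax) - dx) <= threshold
                   and abs((by - ay) - dy) <= threshold]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers
```

Feeding in three pairs consistent with a (+5, +5) shift plus one stray match returns only the three consistent pairs; their count is the "correctly matched feature point descriptors" tally used to rank templates.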
Optionally, in this embodiment, the memory also stores a first position region of the target image on the tracking device screen, preset according to the shooting parameters of the tracking device, wherein the first position region is the position region in which the shooting effect of the target image is optimal;
the processor compares the feature points of the target image with the feature points of the matched template image, identifies the matched feature point located at the upper-left corner and the matched feature point located at the lower-right corner, takes the straight line between them as the diagonal of the second position region, and thereby determines the second position region; the shape of the second position region includes a quadrilateral, the midpoint of the diagonal is taken as the center point of the second position region, and the second position region is the current position region of the target image on the tracking device screen.
Optionally, in this embodiment, the processor is further configured to perform the following steps when executing the program stored in the memory:
comparing the pixel count of the target image with a preset maximum pixel-count threshold and a preset minimum pixel-count threshold, and, when the pixel count of the target image is determined to be greater than the maximum pixel-count threshold, sending a movement instruction to the movement apparatus, so that the movement apparatus moves the tracking device backward according to the movement instruction until the pixel count of the target image is less than the maximum pixel-count threshold; and/or
comparing the pixel count of the target image with the preset maximum pixel-count threshold and minimum pixel-count threshold, and, when the pixel count of the target image is determined to be less than the minimum pixel-count threshold, sending a movement instruction to the movement apparatus, so that the movement apparatus moves the tracking device forward according to the movement instruction until the pixel count of the target image is greater than the minimum pixel-count threshold; and/or
when the center point of the second position region is not within the first position region, sending a movement instruction to the movement apparatus, so that the movement apparatus moves the tracking device according to the movement instruction until the center point of the second position region is located within the first position region.
The device described in this embodiment of the present invention can execute the method shown in Fig. 1; its implementation principles and technical effects are not repeated here.
An embodiment of the present invention also provides a computer storage medium for storing the computer software instructions used by the tracking device shown in Fig. 7 to track and shoot a moving target.
Finally, it should be noted that the above embodiments are merely illustrative of the technical solutions of the present invention, rather than limiting them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some of the technical features, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the various embodiments of the present invention.

Claims (10)

1. A track-and-shoot method for a moving target, comprising:
extracting the feature points of a currently shot target image;
matching the feature points of the target image against pre-extracted feature points of a template image corresponding to each shooting visual angle, and determining the template image with the most matched feature points as the template image matching the target image;
determining the shooting visual angle corresponding to the matched template image as the shooting visual angle of the currently shot target image; and
if the shooting visual angle of the currently shot target image is inconsistent with a preset shooting visual angle for the target, moving a tracking device to track and shoot the target so that the shooting visual angle for the target becomes consistent with the preset shooting visual angle for the target.
2. The method according to claim 1, wherein before matching the feature points of the target image against the pre-extracted feature points of the template image corresponding to each shooting visual angle, the method comprises:
shooting the target from multiple shooting visual angles in advance to obtain the template image corresponding to each shooting visual angle, and extracting the feature points of the template image corresponding to each shooting visual angle.
3. The method according to claim 1 or 2, wherein matching the feature points of the target image against the pre-extracted feature points of the template image corresponding to each shooting visual angle comprises:
computing the feature point descriptors of the template image corresponding to each shooting visual angle using the Speeded-Up Robust Features (SURF) algorithm;
computing the feature point descriptors of the target image using the SURF algorithm;
matching the feature point descriptors of the target image against the feature point descriptors of the template image corresponding to each shooting visual angle using the Random Sample Consensus (RANSAC) algorithm, rejecting erroneously matched feature point descriptors, and determining the correctly matched feature point descriptors; and
determining the template image with the most correctly matched feature point descriptors as the template image with the most matched feature points.
4. The method according to claim 1, further comprising:
presetting a first position region of the target image on the tracking device screen according to the shooting parameters of the tracking device, such that when the target image is located within the first position region, the shooting effect of the target image is optimal;
comparing the feature points of the target image with the feature points of the matched template image, and identifying the matched feature point located at the upper-left corner and the matched feature point located at the lower-right corner; and
taking the straight line between the matched feature point located at the upper-left corner and the matched feature point located at the lower-right corner as the diagonal of a second position region, and determining the second position region, wherein the shape of the second position region includes a quadrilateral, the midpoint of the diagonal is taken as the center point of the second position region, and the second position region is the current position region of the target image on the tracking device screen.
5. The method according to claim 4, further comprising:
comparing the pixel count of the target image with a preset maximum pixel-count threshold and a preset minimum pixel-count threshold; when the pixel count of the target image is determined to be greater than the maximum pixel-count threshold, moving the tracking device backward, and when the pixel count of the target image is determined to be less than the minimum pixel-count threshold, moving the tracking device forward, so that the pixel count of the target image is less than the maximum pixel-count threshold and greater than the minimum pixel-count threshold; and/or
when the center point of the second position region is not within the first position region, moving the tracking device so that the center point of the second position region is located within the first position region.
6. A tracking device for tracking and shooting a moving target, comprising a camera for shooting target images, and further comprising a processor and a memory, wherein:
the memory is used to store a computer program supporting the tracking device in executing the track-and-shoot method for a moving target, the computer program being called and executed by the processor; and
the processor is configured to perform the following steps when executing the computer program stored in the memory:
extracting the feature points of the target image currently shot by the camera; matching the feature points of the target image against pre-extracted feature points of the template image corresponding to each shooting visual angle; determining the template image with the most matched feature points as the template image matching the target image; determining the shooting visual angle corresponding to the matched template image as the shooting visual angle of the currently shot target image; and, if the shooting visual angle of the currently shot target image is inconsistent with a preset shooting visual angle for the target, sending a movement instruction to a movement apparatus, so that the movement apparatus moves the tracking device according to the movement instruction to track and shoot the target until the shooting visual angle for the target is consistent with the preset shooting visual angle for the target.
7. The device according to claim 6, wherein:
the camera shoots the target from multiple shooting visual angles in advance to obtain the template image corresponding to each shooting visual angle;
the processor extracts the feature points of the template image corresponding to each shooting visual angle; and
the memory stores the template image corresponding to each shooting visual angle together with its corresponding feature points.
8. The device according to claim 6 or 7, wherein the processor is further configured to perform the following steps when executing the program stored in the memory:
computing the feature point descriptors of the template image corresponding to each shooting visual angle using the Speeded-Up Robust Features (SURF) algorithm; computing the feature point descriptors of the target image using the SURF algorithm; matching the feature point descriptors of the target image against the feature point descriptors of the template image corresponding to each shooting visual angle using the Random Sample Consensus (RANSAC) algorithm, rejecting erroneously matched feature point descriptors, and determining the correctly matched feature point descriptors; and determining the template image with the most correctly matched feature point descriptors as the template image with the most matched feature points.
9. The device according to claim 6, wherein:
the memory also stores a first position region of the target image on the tracking device screen, preset according to the shooting parameters of the tracking device, wherein the first position region is the position region in which the shooting effect of the target image is optimal; and
the processor compares the feature points of the target image with the feature points of the matched template image, identifies the matched feature point located at the upper-left corner and the matched feature point located at the lower-right corner, takes the straight line between them as the diagonal of the second position region, and thereby determines the second position region, wherein the shape of the second position region includes a quadrilateral, the midpoint of the diagonal is taken as the center point of the second position region, and the second position region is the current position region of the target image on the tracking device screen.
10. The device according to claim 9, wherein the processor is further configured to perform the following steps when executing the program stored in the memory:
comparing the pixel count of the target image with a preset maximum pixel-count threshold and a preset minimum pixel-count threshold, and, when the pixel count of the target image is determined to be greater than the maximum pixel-count threshold, sending a movement instruction to the movement apparatus, so that the movement apparatus moves the tracking device backward according to the movement instruction until the pixel count of the target image is less than the maximum pixel-count threshold; and/or
comparing the pixel count of the target image with the preset maximum pixel-count threshold and minimum pixel-count threshold, and, when the pixel count of the target image is determined to be less than the minimum pixel-count threshold, sending a movement instruction to the movement apparatus, so that the movement apparatus moves the tracking device forward according to the movement instruction until the pixel count of the target image is greater than the minimum pixel-count threshold; and/or
when the center point of the second position region is not within the first position region, sending a movement instruction to the movement apparatus, so that the movement apparatus moves the tracking device according to the movement instruction until the center point of the second position region is located within the first position region.
CN201710488228.3A 2017-06-23 2017-06-23 The track up method and tracking equipment of mobile target Active CN107330917B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201710488228.3A CN107330917B (en) 2017-06-23 2017-06-23 The track up method and tracking equipment of mobile target
PCT/CN2017/095247 WO2018232837A1 (en) 2017-06-23 2017-07-31 Tracking photography method and tracking apparatus for moving target
US15/756,545 US10645299B2 (en) 2017-06-23 2017-07-31 Method for tracking and shooting moving target and tracking device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710488228.3A CN107330917B (en) 2017-06-23 2017-06-23 The track up method and tracking equipment of mobile target

Publications (2)

Publication Number Publication Date
CN107330917A CN107330917A (en) 2017-11-07
CN107330917B true CN107330917B (en) 2019-06-25

Family

ID=60195593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710488228.3A Active CN107330917B (en) 2017-06-23 2017-06-23 The track up method and tracking equipment of mobile target

Country Status (3)

Country Link
US (1) US10645299B2 (en)
CN (1) CN107330917B (en)
WO (1) WO2018232837A1 (en)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109074657B (en) * 2018-07-18 2022-04-08 达闼机器人有限公司 Target tracking method and device, electronic equipment and readable storage medium
CN110750094A (en) * 2018-07-23 2020-02-04 杭州海康威视数字技术股份有限公司 Method, device and system for determining pose change information of movable equipment
CN110881117A (en) * 2018-09-06 2020-03-13 杭州海康威视数字技术股份有限公司 Inter-picture area mapping method and device and multi-camera observation system
CN109788191A (en) * 2018-12-21 2019-05-21 中国科学院自动化研究所南京人工智能芯片创新研究院 Photographic method, device, computer equipment and storage medium
CN110097586B (en) * 2019-04-30 2023-05-30 青岛海信网络科技股份有限公司 Face detection tracking method and device
KR102629225B1 (en) * 2019-05-16 2024-01-29 캐논 가부시끼가이샤 Batch determination device, system, batch determination method and recording medium
CN111951211B (en) * 2019-05-17 2024-05-14 株式会社理光 Target detection method, device and computer readable storage medium
CN110213566B (en) * 2019-05-20 2021-06-01 歌尔光学科技有限公司 Image matching method, device, equipment and computer readable storage medium
CN111738282A (en) * 2019-10-22 2020-10-02 腾讯科技(深圳)有限公司 Image recognition method based on artificial intelligence and related equipment
CN112749722B (en) * 2019-10-31 2023-11-17 深圳云天励飞技术有限公司 Model distribution management method and related products thereof
CN110633612B (en) * 2019-11-20 2020-09-11 中通服创立信息科技有限责任公司 Monitoring method and system for inspection robot
CN111626978B (en) * 2019-12-24 2023-05-09 西安元智系统技术有限责任公司 Cultural relic fracture monitoring method based on feature points
CN111275739B (en) * 2020-01-19 2023-05-05 广东工业大学 Automatic tracking device and method
CN111917989B (en) * 2020-09-15 2022-01-21 苏州臻迪智能科技有限公司 Video shooting method and device
CN112487946A (en) * 2020-11-26 2021-03-12 努比亚技术有限公司 Human body moving target removing method and device, mobile terminal and storage medium
CN112929567B (en) * 2021-01-27 2023-04-28 咪咕音乐有限公司 Shooting position determining method, electronic device and storage medium
CN112926593A (en) * 2021-02-20 2021-06-08 温州大学 Image feature processing method and device for dynamic image enhancement presentation
CN113079320B (en) * 2021-04-13 2022-03-11 浙江科技学院 Sectional type multifunctional camera shooting method based on whole body mirror
CN113190455B (en) * 2021-05-13 2024-06-07 统信软件技术有限公司 Element positioning method and computing equipment
CN113408465B (en) * 2021-06-30 2022-08-26 平安国际智慧城市科技股份有限公司 Identity recognition method and device and related equipment
CN113610134B (en) * 2021-07-29 2024-02-23 Oppo广东移动通信有限公司 Image feature point matching method, device, chip, terminal and storage medium
CN113837246B (en) * 2021-09-06 2022-12-27 广州极飞科技股份有限公司 Image matching method and device and unmanned equipment
CN113807224B (en) * 2021-09-07 2023-11-21 金华市浙工大创新联合研究院 Method for detecting and tracking illegal behaviors of factory
CN113985830B (en) * 2021-11-08 2024-10-29 武汉逸飞激光股份有限公司 Feeding control method and device of sealing nail, electronic equipment and storage medium
CN114173060A (en) * 2021-12-10 2022-03-11 北京瞰瞰智能科技有限公司 Intelligent mobile shooting method and device and controller
CN114972629B (en) * 2022-04-14 2024-10-15 广州极飞科技股份有限公司 Feature point matching method, device, equipment and storage medium
CN114926903A (en) * 2022-05-27 2022-08-19 深圳市瑞立视多媒体科技有限公司 Method, device and related equipment for matching rigid body mark points based on template graph
CN114689030A (en) * 2022-06-01 2022-07-01 中国兵器装备集团自动化研究所有限公司 Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN115100144B (en) * 2022-06-23 2023-04-07 常州市新创智能科技有限公司 Method and device for detecting scraps in glass fiber cloth production process
CN116723402B (en) * 2023-05-26 2024-10-22 深圳市腾浩科技有限公司 Video monitoring method and device

Citations (4)

Publication number Priority date Publication date Assignee Title
US6687386B1 (en) * 1999-06-15 2004-02-03 Hitachi Denshi Kabushiki Kaisha Object tracking method and object tracking apparatus
CN1477599A (en) * 2002-08-23 2004-02-25 英业达股份有限公司 Focusing method of moving body
CN102663777A (en) * 2012-04-26 2012-09-12 安科智慧城市技术(中国)有限公司 Target tracking method and system based on multi-view video
CN102929288A (en) * 2012-08-23 2013-02-13 山东电力集团公司电力科学研究院 Unmanned aerial vehicle inspection head control method based on visual servo

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
US6829384B2 (en) * 2001-02-28 2004-12-07 Carnegie Mellon University Object finder for photographic images
US8238612B2 (en) * 2008-05-06 2012-08-07 Honeywell International Inc. Method and apparatus for vision based motion determination
US9378431B2 (en) * 2011-11-18 2016-06-28 Metaio Gmbh Method of matching image features with reference features and integrated circuit therefor
WO2013086475A1 (en) * 2011-12-08 2013-06-13 Cornell University System and methods for world-scale camera pose estimation
CN104933755B (en) * 2014-03-18 2017-11-28 华为技术有限公司 A kind of stationary body method for reconstructing and system
US9317921B2 (en) * 2014-07-10 2016-04-19 Qualcomm Incorporated Speed-up template matching using peripheral information
US10237477B2 (en) * 2017-05-22 2019-03-19 Fyusion, Inc. Loop closure


Non-Patent Citations (1)

Title
SURF algorithm and its detection and tracking effect on moving targets; Tong Ruqiang et al.; Journal of Southwest University of Science and Technology; 30 September 2011; Vol. 26, No. 3; pp. 63-67

Also Published As

Publication number Publication date
WO2018232837A1 (en) 2018-12-27
US20190281224A1 (en) 2019-09-12
CN107330917A (en) 2017-11-07
US10645299B2 (en) 2020-05-05

Similar Documents

Publication Publication Date Title
CN107330917B (en) The track up method and tracking equipment of mobile target
CN108010067B (en) A kind of visual target tracking method based on combination determination strategy
US11361459B2 (en) Method, device and non-transitory computer storage medium for processing image
CN103824070B (en) A kind of rapid pedestrian detection method based on computer vision
CN105989608B (en) A kind of vision capture method and device towards intelligent robot
US11093737B2 (en) Gesture recognition method and apparatus, electronic device, and computer-readable storage medium
CN109165589A (en) Vehicle based on deep learning recognition methods and device again
US20180321776A1 (en) Method for acting on augmented reality virtual objects
CN103886325B (en) Cyclic matrix video tracking method with partition
CN103514432A (en) Method, device and computer program product for extracting facial features
CN110298860B (en) High pole hydrangea detection count system based on machine vision
CN109886951A (en) Method for processing video frequency, device and electronic equipment
CN108875730A (en) A kind of deep learning sample collection method, apparatus, equipment and storage medium
CN106097385B (en) A kind of method and apparatus of target following
CN110287907A (en) A kind of method for checking object and device
CN110097586A (en) A kind of Face datection method for tracing and device
CN106412441B (en) A kind of video stabilization control method and terminal
WO2021004186A1 (en) Face collection method, apparatus, system, device, and medium
CN110298281A (en) Video structural method, apparatus, electronic equipment and storage medium
CN108021852A (en) A kind of demographic method, passenger number statistical system and electronic equipment
CN109636828A (en) Object tracking methods and device based on video image
CN108256567A (en) A kind of target identification method and system based on deep learning
Sokolova et al. Human identification by gait from event-based camera
CN109948630A (en) Recognition methods, device, system and the storage medium of target sheet image
CN110458857B (en) Central symmetry primitive detection method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant