
CN101394546B - Video target profile tracing method and device - Google Patents

Video target profile tracing method and device

Info

Publication number
CN101394546B
CN101394546B (application CN2007101541207A)
Authority
CN
China
Prior art keywords
profile
barycenter
target
particle
random particles
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2007101541207A
Other languages
Chinese (zh)
Other versions
CN101394546A (en)
Inventor
于纪征
曾贵华
赵光耀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN2007101541207A
Publication of CN101394546A
Application granted
Publication of CN101394546B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

Embodiments of the invention disclose a video target profile tracking method in which the parameters used for the state transition of the random particles during profile tracking are adjusted according to the position of the target centroid obtained by tracking the centroid of the target. The state-transition parameters of the random particles therefore change as the position of the target centroid changes, which makes the tracking of the target profile more accurate. Embodiments of the invention further disclose another video target profile tracking method in which each random particle is assessed using the position of the target centroid obtained by centroid tracking: the weight of each particle is adjusted according to how close the centroid of that particle's profile is to the tracked target centroid, so that the target profile obtained by the weighted accumulation of the particles lies closer to the true target profile and the tracking of the target profile is more accurate. Embodiments of the invention also disclose two corresponding video target profile tracking devices.

Description

Video target profile tracing method and device
Technical field
Embodiments of the invention relate to the technical fields of computer vision and image processing, and in particular to a video target profile (contour) tracking method and device.
Background technology
At present, vision is the main channel through which humans obtain external information. Moving-target detection and tracking are important topics in the field of vision, and real-time target tracking in particular is a key technology in computer vision. Video security surveillance systems are being applied more and more widely in banking, traffic and other sectors; tracking video target objects in real time can, in particular, provide early warning, so real-time tracking of target objects has received increasing attention.
There are many video target tracking methods. According to whether pattern matching is performed between frames, they can be divided into detection-based methods and recognition-based methods. Detection-based methods extract the target directly from each frame according to the target's features and do not need to pass the target's motion parameters between frames for matching; differential detection is one example. Recognition-based methods usually first extract some feature of the target and then search each frame for the region that best matches this feature, which is taken as the tracked target. According to the tracking result, the methods can also be divided into those that track the profile and those that track local points of the target. The most common profile-tracking method is particle filter tracking; the main local-point tracking method is mean-shift tracking.
Among profile-tracking methods, particle filter tracking is the most commonly used. The particle filter, also called the sequential Monte Carlo (SMC) method, implements Bayesian recursive filtering with Monte Carlo techniques. According to Bayesian filtering theory, given the observation sequence $z_{1:k}$ up to the current time, the posterior probability of the state $x_k$ can be estimated recursively from the posterior at time $k-1$, $p(x_{k-1} \mid z_{k-1})$, i.e.
$p(x_k \mid z_k) \propto p(z_k \mid x_k) \int_{x_{k-1}} p(x_k \mid x_{k-1})\, p(x_{k-1} \mid z_{k-1})$   (1)
where $p(z_k \mid x_k)$ is the likelihood.
The particle filter does not need the concrete form of the probability function; instead, it uses $N_s$ weighted random samples (particles) $\{x_{k-1}^i, w_{k-1}^i\}$ $(i = 1, \dots, N_s)$ to represent the posterior probability function $p(x_{k-1} \mid z_{k-1})$. The integral in formula (1) can then be estimated by a weighted sum over the sample set, i.e.
$p(x_k \mid z_k) \approx p(z_k \mid x_k) \sum_i w_{k-1}^i\, p(x_k \mid x_{k-1}^i)$   (2)
When the number of samples is sufficiently large, this estimate is equivalent to the posterior probability density function.
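To make the recursion of formulas (1) and (2) concrete, the following minimal Python sketch shows one weight-update cycle for a scalar state; the random-walk transition and Gaussian likelihood are illustrative assumptions, not part of the patent.

```python
import numpy as np

def particle_filter_step(particles, weights, observation, trans_noise=1.0, obs_noise=1.0, rng=None):
    """One recursion of a generic particle filter: propagate, re-weight, normalize."""
    rng = rng or np.random.default_rng(0)
    # Draw x_k^i from p(x_k | x_{k-1}^i); a random walk is assumed here.
    propagated = particles + rng.normal(0.0, trans_noise, size=particles.shape)
    # Likelihood p(z_k | x_k^i); a Gaussian around the observation is assumed here.
    likelihood = np.exp(-0.5 * ((observation - propagated) / obs_noise) ** 2)
    # Weighted-sample approximation of the posterior, as in formula (2).
    new_weights = weights * likelihood
    new_weights /= new_weights.sum()
    estimate = np.sum(new_weights * propagated)   # posterior mean estimate
    return propagated, new_weights, estimate

Ns = 100
particles = np.zeros(Ns)
weights = np.full(Ns, 1.0 / Ns)
particles, weights, estimate = particle_filter_step(particles, weights, observation=0.8)
```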
Below, conditional density propagation tracking is taken as an example to introduce how a particle filter is used for video target tracking.
Conditional density propagation tracking is based on the conditional density propagation (Condensation) algorithm, which is one kind of particle filter. When the Condensation algorithm is used for profile tracking, a profile representation based on an active contour model and a shape space can be adopted: for example, the contour curve is characterized by the control points of a B-spline (B-Snake), and the possible variations of the contour curve are characterized by a shape space. The motion state of the target profile is $T = (TX, TY, \theta, SX, SY)$, where $TX$ and $TY$ are the positions of the profile centroid in the x and y directions, $\theta$ is the rotation angle of the profile, and $SX$ and $SY$ are the scales of the target in the x and y directions. The shape space parameter $S$ of the target is expressed as:
$S = (TX,\ TY,\ SX\cos\theta - 1,\ SY\cos\theta - 1,\ -SY\sin\theta,\ SX\sin\theta)$   (3)
In this way, both the target's contour curve and its variations can be expressed.
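As a small illustration of formula (3), a hypothetical helper mapping a motion state to its shape-space vector could look as follows (the numeric example values are assumptions):

```python
import math

def shape_space(TX, TY, theta, SX, SY):
    """Map the motion state T = (TX, TY, theta, SX, SY) to the shape-space parameter S of formula (3)."""
    return (TX,
            TY,
            SX * math.cos(theta) - 1.0,
            SY * math.cos(theta) - 1.0,
            -SY * math.sin(theta),
            SX * math.sin(theta))

# Example: a contour centred at (120, 80), rotated by 0.1 rad, scaled 1.05 in x and 0.95 in y.
S = shape_space(120.0, 80.0, 0.1, 1.05, 0.95)
```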
Fig. 1 is a flow chart of video target profile tracking with the Condensation algorithm in the prior art. As shown in Fig. 1, tracking the target profile with the Condensation algorithm mainly comprises the following steps.
Step 101: judge from the input image data and tracking target information whether this is a new object, i.e. whether a new tracking target needs to be established; if so, execute step 102, otherwise execute step 104.
The input image data may be image data that has undergone background segmentation, and a moving object in the initial frame image may be selected as the tracking target.
Step 102: use an existing profile-extraction technique to obtain the profile vector of the target, calculate the centroid position of the profile, obtain the B-spline control points $QX_0$ and $QY_0$ according to the B-spline technique, and obtain the initial motion state of the target from the profile vector:
$T_0 = (TX_0, TY_0, \theta_0, SX_0, SY_0)$   (4)
where $TX_0$ and $TY_0$ are the positions of the profile centroid in the x and y directions, $\theta_0$ is the initial rotation angle of the profile (taken as 0), and $SX_0$ and $SY_0$ are the scales of the profile in the x and y directions.
Step 103: initialize $N_s$ particles. The initial weight $w_0^i$ of each particle is $1/N_s$, and the motion state and shape space parameter are $T_0^i$ and $S_0^i$ $(i = 1, 2, \dots, N_s)$ respectively:
$TX_0^i = TX_0 + B_1 \times \xi$   (5)
$TY_0^i = TY_0 + B_2 \times \xi$   (6)
$\theta_0^i = \theta_0 + B_3 \times \xi$   (7)
$SX_0^i = SX_0 + B_4 \times \xi$   (8)
$SY_0^i = SY_0 + B_5 \times \xi$   (9)
where $B_1, B_2, B_3, B_4, B_5$ are constants and $\xi$ is a random number in $[-1, +1]$;
$S_0^i = (TX_0^i,\ TY_0^i,\ SX_0^i\cos\theta_0^i - 1,\ SY_0^i\cos\theta_0^i - 1,\ -SY_0^i\sin\theta_0^i,\ SX_0^i\sin\theta_0^i)$   (10)
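A minimal sketch of the particle initialization of formulas (5)-(10), with independently drawn random numbers and placeholder constants $B_1 \dots B_5$ (both assumptions), might look like this:

```python
import numpy as np

def init_particles(T0, Ns, B=(3.0, 3.0, 0.05, 0.3, 0.3), seed=0):
    """Initialize Ns particles around T0 = (TX0, TY0, theta0, SX0, SY0) by adding
    B_j * xi with xi uniform in [-1, +1], as in formulas (5)-(9)."""
    rng = np.random.default_rng(seed)
    T0 = np.asarray(T0, dtype=float)            # shape (5,)
    B = np.asarray(B, dtype=float)              # shape (5,)
    xi = rng.uniform(-1.0, 1.0, size=(Ns, 5))   # one random number per particle and per component
    particles = T0 + B * xi                     # motion states T_0^i, shape (Ns, 5)
    weights = np.full(Ns, 1.0 / Ns)             # every particle starts with weight 1/Ns
    return particles, weights

particles, weights = init_particles(T0=(120.0, 80.0, 0.0, 1.0, 1.0), Ns=200)
```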
Step 104: when the k-th frame of image data is input, apply a state transition to the state of every particle. The system state transition equations are:
$TX_k^i = TX_{k-1}^i + B_1 \times \xi_{1\text{-}k}^i$   (11)
$TY_k^i = TY_{k-1}^i + B_2 \times \xi_{2\text{-}k}^i$   (12)
$\theta_k^i = \theta_{k-1}^i + B_3 \times \xi_{3\text{-}k}^i$   (13)
$SX_k^i = SX_{k-1}^i + B_4 \times \xi_{4\text{-}k}^i$   (14)
$SY_k^i = SY_{k-1}^i + B_5 \times \xi_{5\text{-}k}^i$   (15)
where $B_1, B_2, B_3, B_4, B_5$ are constants and each $\xi$ is a random number in $[-1, +1]$.
Step 105: use the motion state of each particle obtained in step 104 to calculate the shape space parameter of each particle:
$S_k^i = (TX_k^i,\ TY_k^i,\ SX_k^i\cos\theta_k^i - 1,\ SY_k^i\cos\theta_k^i - 1,\ -SY_k^i\sin\theta_k^i,\ SX_k^i\sin\theta_k^i)$   (16)
Step 106: calculate the B-spline control-point vector of each particle.
For particle $i$, the control-point vector of its B-spline can be obtained from its motion parameters $T^i$ and shape space parameter $S^i$:
$(QX_k^i\ QY_k^i)^T = W^i S_k^i + (QX_0^i\ QY_0^i)^T$   (17)
where $W^i = \begin{pmatrix} 1 & 0 & QX_0^i & 0 & 0 & QY_0^i \\ 0 & 1 & 0 & QY_0^i & QX_0^i & 0 \end{pmatrix}$, each element of which is an $N_c \times 1$ matrix, $N_c$ is the number of control points, and the initial control points $QX_0^i$ and $QY_0^i$ are obtained from the shape space parameter of each particle.
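For illustration, a sketch of the control-point computation of formula (17): the block matrix W is assembled from $N_c$-element columns and the control points are recovered from the shape-space vector (variable names are assumptions):

```python
import numpy as np

def control_points_from_shape(S, QX0, QY0):
    """Recover B-spline control points from a shape-space vector S, as in formula (17)."""
    QX0, QY0 = np.asarray(QX0, float), np.asarray(QY0, float)
    Nc = QX0.shape[0]
    ones, zeros = np.ones((Nc, 1)), np.zeros((Nc, 1))
    # W is a (2*Nc x 6) matrix whose "elements" are Nc x 1 blocks, as described above.
    W = np.block([
        [ones,  zeros, QX0[:, None], zeros,        zeros,        QY0[:, None]],
        [zeros, ones,  zeros,        QY0[:, None], QX0[:, None], zeros],
    ])
    Q = W @ np.asarray(S, float) + np.concatenate([QX0, QY0])
    return Q[:Nc], Q[Nc:]   # (QX, QY)
```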
Step 107: after the control-point vector of each particle is obtained, the contour curve corresponding to each particle can be fitted with the B-spline method. The fitting formula is:
$y_k^i(x) = \sum_{k=0}^{N_c - 1} P_k B_{k,m}(x)$   (18)
where $P_k$ $(k = 0, 1, \dots, N_c - 1)$ is the coordinate of the k-th control point and $B_{k,m}$ $(k = 0, 1, \dots, N_c - 1)$ is the standard B-spline basis function of order m.
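A sketch of the curve evaluation of formula (18), using a Cox-de Boor implementation of the basis functions; the uniform knot vector and the sample control points are assumptions:

```python
import numpy as np

def bspline_basis(k, m, t, knots):
    """Cox-de Boor recursion for the B-spline basis function B_{k,m}(t) of degree m."""
    if m == 0:
        return 1.0 if knots[k] <= t < knots[k + 1] else 0.0
    left = right = 0.0
    if knots[k + m] != knots[k]:
        left = (t - knots[k]) / (knots[k + m] - knots[k]) * bspline_basis(k, m - 1, t, knots)
    if knots[k + m + 1] != knots[k + 1]:
        right = (knots[k + m + 1] - t) / (knots[k + m + 1] - knots[k + 1]) * bspline_basis(k + 1, m - 1, t, knots)
    return left + right

def fit_contour(QX, QY, m=3, samples=200):
    """Evaluate the contour curve of formula (18) from control points (QX, QY)."""
    QX, QY = np.asarray(QX, float), np.asarray(QY, float)
    Nc = len(QX)
    knots = np.arange(Nc + m + 1, dtype=float)           # uniform knot vector (an assumption)
    ts = np.linspace(knots[m], knots[Nc] - 1e-9, samples)
    pts = []
    for t in ts:
        bases = np.array([bspline_basis(k, m, t, knots) for k in range(Nc)])
        pts.append((bases @ QX, bases @ QY))
    return np.array(pts)                                  # (samples, 2) points on the fitted curve

QX = np.array([100, 140, 160, 150, 120, 90, 80, 85], dtype=float)
QY = np.array([ 60,  70, 100, 140, 160, 150, 120, 80], dtype=float)
contour = fit_contour(QX, QY)
```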
Step 108: randomly draw N sampling points on the contour curve. For each sampling point, find in the current frame's image data the pixel with the maximum grayscale gradient along the normal direction; this pixel is a measured value of the true profile point, i.e. the pixel closest to the true profile of the target as computed from the current frame's image data.
For particle $i$, the contour curve can be obtained from its motion parameters $T^i$ and shape space parameter $S^i$. N points are sampled on the contour curve, and on both sides of the normal at each sampling point, pixels are taken at regular intervals along the normal direction and the grayscale gradient of each pixel in the current frame is calculated. The number of pixels examined can be chosen within a certain range around the sampling point, since the true profile point cannot lie too far from the sampling point. The more pixels that are examined, the closer the resulting measured value is to the true profile point, but the higher the demands on the computing capability of the device.
The distance $DIS_i(n)$ $(n = 1, 2, \dots, N)$ between each sampling point and the measured value of the true profile point at that point is then obtained. Because the grayscale gradient at a true profile point of the target is large, the distance between a point on a particle's profile and the measured value of the true profile point at that point can be used as a standard for weighing the particle's weight: a large distance indicates that the particle's profile differs greatly from the true profile, while a small distance indicates that the particle's profile is close to the true profile.
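The measurement of step 108 can be sketched as follows: for each sampled contour point, pixels along its normal are examined and the one with the largest grayscale gradient magnitude is taken as the measured true profile point; the search range and pixel rounding are assumptions.

```python
import numpy as np

def measure_contour_points(gray, contour, normals, search_range=10):
    """Return DIS(n): the distance from each sampled contour point to the strongest-gradient
    pixel found along its normal in the grayscale image.

    gray    : 2-D grayscale image (float array)
    contour : (N, 2) sampled contour points (x, y)
    normals : (N, 2) unit normal vectors at those points
    """
    gy, gx = np.gradient(gray)
    grad_mag = np.hypot(gx, gy)
    h, w = gray.shape
    offsets = np.arange(-search_range, search_range + 1)
    distances = np.empty(len(contour))
    for n, (p, nrm) in enumerate(zip(contour, normals)):
        candidates = p[None, :] + offsets[:, None] * nrm[None, :]   # pixels along the normal
        xs = np.clip(np.round(candidates[:, 0]).astype(int), 0, w - 1)
        ys = np.clip(np.round(candidates[:, 1]).astype(int), 0, h - 1)
        best = np.argmax(grad_mag[ys, xs])                          # measured "true" profile pixel
        distances[n] = abs(offsets[best])                           # DIS(n) in pixels
    return distances
```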
Step 109: from the distances $DIS_i(n)$ $(n = 1, 2, \dots, N)$ between each sampling point and the measured value of the true profile point at that point in the current frame's image data, the observation probability density $p_k^i$ of each particle can be obtained:
$p_k^i = \exp\{-\tfrac{1}{2}(\tfrac{1}{\sigma^2}\Phi)\}$   (19)
where $\Phi = \tfrac{1}{N}\sum_{n=1}^{N} DIS_i(n)$.
Step 110: update the weight of each particle from the previous frame to obtain the weight of each particle in the current frame:
$w_k^i = w_{k-1}^i\, p_k^i$   (20)
where $p_k^i$ is the observation probability density of the i-th particle in frame k and $w_k^i$ is the weight of the i-th particle in frame k.
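A compact sketch of formulas (19) and (20); the final normalization so that the weights sum to one is an added assumption (it is needed for the weighted accumulation in the next step):

```python
import numpy as np

def update_weights(prev_weights, distances, sigma=5.0):
    """Observation density of formula (19) and weight update of formula (20).

    distances : (Ns, N) array of DIS_i(n) for each particle i and sampling point n."""
    phi = distances.mean(axis=1)               # Phi = (1/N) * sum_n DIS_i(n)
    p = np.exp(-0.5 * phi / sigma ** 2)        # observation probability density p_k^i
    w = prev_weights * p                       # w_k^i = w_{k-1}^i * p_k^i
    return w / w.sum()                         # normalized (an assumption; see lead-in)
```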
Step 111: obtain the expected motion state parameters by the weighted sum of each particle's motion state parameters and weights:
$TX_k = \sum_{i=1}^{N_s} w_k^i\, TX_k^i$   (21)
$TY_k = \sum_{i=1}^{N_s} w_k^i\, TY_k^i$   (22)
$\theta_k = \sum_{i=1}^{N_s} w_k^i\, \theta_k^i$   (23)
$SX_k = \sum_{i=1}^{N_s} w_k^i\, SX_k^i$   (24)
$SY_k = \sum_{i=1}^{N_s} w_k^i\, SY_k^i$   (25)
where $w_k^i$ is the weight of the i-th particle in frame k, $T_k = (TX_k, TY_k, \theta_k, SX_k, SY_k)$ is the motion state of the target profile in frame k, $T_k^i = (TX_k^i, TY_k^i, \theta_k^i, SX_k^i, SY_k^i)$ is the motion state of the i-th particle in frame k, and $N_s$ is the total number of particles.
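The weighted accumulation of formulas (21)-(25) amounts to a weight-averaged particle state; a one-line sketch, with the array layout assumed as in the earlier sketches:

```python
import numpy as np

def expected_state(particles, weights):
    """T_k = sum_i w_k^i * T_k^i for particles of shape (Ns, 5) and normalized weights (Ns,)."""
    return weights @ particles   # returns (TX_k, TY_k, theta_k, SX_k, SY_k)
```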
Step 112: the shape space parameter of the target profile in frame k can then be obtained from the motion state parameters:
$S_k = (TX_k,\ TY_k,\ SX_k\cos\theta_k - 1,\ SY_k\cos\theta_k - 1,\ -SY_k\sin\theta_k,\ SX_k\sin\theta_k)$   (26)
Step 113: calculate the control-point vectors $QX_k$ and $QY_k$ of the profile from $S_k$:
$(QX_k\ QY_k)^T = W S_k + (QX_0\ QY_0)^T$   (27)
where $W = \begin{pmatrix} 1 & 0 & QX_0 & 0 & 0 & QY_0 \\ 0 & 1 & 0 & QY_0 & QX_0 & 0 \end{pmatrix}$, each element of which is an $N_c \times 1$ matrix, $N_c$ is the number of control points, $S_k$ is the shape space parameter of the target profile in frame k, and $(QX_k\ QY_k)^T$ is the B-spline control-point vector of the target profile in frame k.
Step 114: fit the contour curve $y_k(x)$ of the target:
$y_k(x) = \sum_{k=0}^{N_c - 1} P_k B_{k,m}(x)$   (28)
where $P_k$ $(k = 0, 1, \dots, N_c - 1)$ is the coordinate of the k-th control point and $B_{k,m}$ $(k = 0, 1, \dots, N_c - 1)$ is the standard B-spline basis function of order m.
This completes one tracking pass for the target object's profile.
Other commonly used particle filter methods for profile tracking include sequential importance resampling (SIR), auxiliary sampling importance resampling (ASIR) and the regularized particle filter (RPF). These algorithms have state transition equations of the same form when the particle states are transferred (particle propagation), so the target profile tracking process is similar.
The profile-tracking method described above can track the profile of a target object, but because it extracts and analyses only the profile information of the target object from the video data, the tracked profile exhibits a jitter of "leading" or "lagging" when the speed of the video target changes frequently, and the tracking becomes inaccurate; moreover, when similar profile information appears around the video target, the target cannot be tracked accurately.
Summary of the invention
In view of this, the embodiments of the invention provide two video target profile tracking methods that track the profile of a video target more accurately.
The embodiments of the invention also provide two video target profile tracking devices whose tracking results for the profile of a video target are more accurate.
In one aspect, embodiments of the invention provide a video target profile tracking method comprising the following steps:
extracting a target profile from the initial frame of image data, and using the extracted target profile to generate a plurality of random particles;
applying a state transition to each random particle in the current frame of image data to obtain the profile of each random particle;
randomly drawing a plurality of sampling points on the profile of each random particle and calculating, for each sampling point, the measured value of its true profile point;
calculating the distance between each sampling point and the measured value of its true profile point;
determining the weight of each random particle according to the distances between the sampling points on the particle's profile and the measured values of their true profile points; and
obtaining the profile of the tracked target in the current frame image by the weighted accumulation of all the random particles;
wherein, before the state transition is applied to each random particle in the current frame of image data, the method further comprises:
tracking the centroid of the target to obtain the position of the target centroid;
using the obtained position of the target centroid to adjust the parameters of the random-particle state transition.
Embodiments of the invention provide another video target profile tracking method comprising the following steps:
extracting a target profile from the initial frame of image data, and using the extracted target profile to generate a plurality of random particles;
applying a state transition to each random particle in the current frame of image data to obtain the profile of each random particle;
randomly drawing a plurality of sampling points on the profile of each random particle and calculating, for each sampling point, the measured value of its true profile point;
calculating the distance between each sampling point and the measured value of its true profile point;
determining the weight of each random particle according to the distances between the sampling points on the particle's profile and the measured values of their true profile points; and
obtaining the profile of the tracked target in the current frame image by the weighted accumulation of all the random particles;
wherein determining the weight of each random particle according to the distances between the sampling points on the particle's profile and the measured values of their true profile points further comprises:
tracking the centroid of the target to obtain the position of the target centroid;
obtaining the centroid of each particle's profile from the obtained profile of each random particle;
calculating the distance between the target centroid and the centroid of each particle's profile using the obtained position of the target centroid;
adjusting the weight of each particle according to the distance between the target centroid and the centroid of that particle's profile.
In another aspect, embodiments of the invention provide a video target profile tracking device comprising:
a profile extraction module, configured to extract a target profile from the initial frame of image data;
a random particle generation module, configured to generate a plurality of random particles using the target profile extracted by the profile extraction module;
a particle state transition module, configured to apply a state transition to each random particle in the current frame of image data and obtain the profile of each random particle;
a particle weight calculation module, configured to randomly draw a plurality of sampling points on the profile of each random particle, calculate for each sampling point the measured value of its true profile point, calculate the distance between each sampling point and the measured value of its true profile point, and determine the weight of each random particle according to the distances between the sampling points on the particle's profile and the measured values of their true profile points; and
a profile fitting module, configured to obtain the profile of the tracked target in the current frame image by the weighted accumulation of all the random particles;
wherein the device further comprises:
a centroid calculation module, configured to track the centroid of the target and obtain the position of the target centroid;
and the particle state transition module is further configured to use the position of the target centroid obtained by the centroid calculation module to adjust the parameters of the random-particle state transition before applying the state transition to each random particle in the current frame of image data.
Embodiments of the invention also provide another video target profile tracking device comprising:
a profile extraction module, configured to extract a target profile from the initial frame of image data;
a random particle generation module, configured to generate a plurality of random particles using the target profile extracted by the profile extraction module;
a particle state transition module, configured to apply a state transition to each random particle in the current frame of image data and obtain the profile of each random particle;
a particle weight calculation module, configured to randomly draw a plurality of sampling points on the profile of each random particle, calculate for each sampling point the measured value of its true profile point, calculate the distance between each sampling point and the measured value of its true profile point, and determine the weight of each random particle according to the distances between the sampling points on the particle's profile and the measured values of their true profile points; and
a profile fitting module, configured to obtain the profile of the tracked target in the current frame image by the weighted accumulation of all the random particles;
wherein the device further comprises a centroid calculation module, configured to track the centroid of the target and obtain the position of the target centroid;
and the particle weight calculation module is further configured to obtain the centroid of each particle's profile from the profile of each random particle obtained by the particle state transition module, calculate the distance between the target centroid and the centroid of each particle's profile using the position of the target centroid obtained by the centroid calculation module, and adjust the weight of each particle according to the distance between the target centroid and the centroid of that particle's profile.
As can be seen from the above technical solutions, in one video target profile tracking method provided by the embodiments of the invention, the state-transition parameters of each random particle in the profile tracking process are adjusted according to the position of the target centroid obtained by tracking the centroid of the target, so that the state-transition parameters change as the position of the target centroid changes: when the centroid position changes greatly, the position changes of the random particles grow accordingly, which makes the tracking of the target profile more accurate.
In the other video target profile tracking method provided by the embodiments of the invention, the position of the target centroid obtained by centroid tracking is used to assess each random particle, and the weight of each particle is adjusted according to how close the centroid of that particle's profile is to the tracked target centroid. The weights of particles close to the true profile increase and those of particles that deviate from the true profile decrease, so the target profile obtained by the weighted accumulation of the particles lies closer to the real target profile and the tracking of the target profile is more accurate.
In one video target profile tracking device provided by the embodiments of the invention, the position of the target centroid obtained by tracking the centroid of the target is used to adjust the state-transition parameters of the random particles used in profile tracking, so that the parameters change as the target centroid position changes and the position changes of the random particles grow when the centroid position changes greatly, giving a more accurate tracking result for the target profile.
In the other video target profile tracking device provided by the embodiments of the invention, the position of the target centroid obtained by centroid tracking is used to assess each random particle, and the weight of each particle is adjusted according to how close the centroid of that particle's profile is to the tracked target centroid. The weights of particles close to the true profile increase and those that deviate from it decrease, so the target profile obtained by the weighted accumulation of the particles is closer to the real target profile and the tracking result is more accurate.
Description of drawings
Fig. 1 is a flow chart of video target profile tracking with the prior-art Condensation algorithm.
Fig. 2 is a flow chart of the video target profile tracking method in embodiment one of the invention.
Fig. 3 is a structure diagram of the video target profile tracking device in embodiment one of the invention.
Fig. 4 is a flow chart of the video target profile tracking method in embodiment two of the invention.
Fig. 5 is a structure diagram of the video target profile tracking device in embodiment two of the invention.
Fig. 6 is a flow chart of the video target profile tracking method in embodiment three of the invention.
Fig. 7 shows the tracking result obtained with the prior-art Condensation algorithm.
Fig. 8 shows the tracking result obtained with the video target profile tracking method of embodiment three of the invention.
Embodiment
To make the purpose, technical solution and advantages of the embodiments of the invention clearer, the embodiments of the invention are further described below with reference to the accompanying drawings.
According to the embodiments of the invention, when the target profile is tracked, the centroid of the target is also tracked. The position of the target centroid obtained by centroid tracking, together with its change, is used to adjust the position-transfer parameters of the profile centroid in the profile tracking process, so that the obtained target profile "jitters" less because of changes in the speed of the target object and the profile is tracked more accurately; alternatively, the position of the target centroid obtained by centroid tracking and its change are used to adjust the weight of each random particle (random sample) in the profile tracking process, so that the target profile obtained by the weighted accumulation of the random samples is more accurate.
Commonly used particle filter methods for target profile tracking include the SIR algorithm, the Condensation algorithm, the ASIR algorithm and the RPF algorithm.
In the following embodiments, tracking the target profile with the Condensation algorithm and tracking the target centroid with the mean-shift tracking algorithm are used as examples to describe the specific implementation of the embodiments of the invention.
The target profile tracking algorithms listed above have state transition equations of the same form when the particle states are transferred (particle propagation), so those skilled in the art can easily implement the embodiments of the invention by substituting any of the other algorithms for the Condensation algorithm.
The mean-shift tracking algorithm can track a target object in video fairly accurately and obtain a relatively accurate target centroid value; the principle of this algorithm is first introduced briefly.
Suppose $\{x_i^*\}_{i=1,\dots,n}$ are the normalized pixel positions of the tracking target model, with centroid coordinate O. The color gray values are quantized into m levels, and $b(x)$ is the function that quantizes the gray value of the pixel at position x. The probability of color u is then:
$\bar{q}_u = C \sum_{i=1}^{n} k(\lVert x_i^* \rVert^2)\, \delta[b(x_i^*) - u]$   (29)
where $k(x)$ is any kernel function, which gives smaller weight to pixels farther from the centroid;
C is a constant whose expression is:
$C = \dfrac{1}{\sum_{i=1}^{n} k(\lVert x_i^* \rVert^2)}$   (30)
The tracking target model is then expressed as:
$\bar{q} = \{\bar{q}_u\}_{u=1,\dots,m}, \quad \sum_{u=1}^{m} \bar{q}_u = 1$   (31)
Suppose $\{x_i\}_{i=1,\dots,n_h}$ are the pixel positions of the candidate target in the current frame, with centroid position y. Using the same kernel function $k(x)$ within the range h, the probability that color u appears in the candidate target can be expressed as:
$\bar{p}_u(y) = C_h \sum_{i=1}^{n_h} k\!\left(\left\lVert \tfrac{y - x_i}{h} \right\rVert^2\right) \delta[b(x_i) - u]$   (32)
where $C_h$ is a constant whose expression is:
$C_h = \dfrac{1}{\sum_{i=1}^{n_h} k\!\left(\left\lVert \tfrac{y - x_i}{h} \right\rVert^2\right)}$   (33)
The candidate target model is then expressed as:
$\bar{p}(y) = \{\bar{p}_u(y)\}_{u=1,\dots,m}, \quad \sum_{u=1}^{m} \bar{p}_u = 1$   (34)
With the tracking target model and candidate target model defined above, the distance between them is:
$d(y) = \sqrt{1 - \rho[\bar{p}(y), \bar{q}]}$   (35)
where $\bar{\rho}(y) \equiv \rho[\bar{p}(y), \bar{q}] = \sum_{u=1}^{m} \sqrt{\bar{p}_u(y)\, \bar{q}_u}$, $\bar{p}_u(y)$ is the probability that color u appears in the candidate target, and $\bar{q}_u$ is the probability that color u appears in the tracking target.
The optimal candidate target is the candidate target closest to the tracking target model, i.e. the candidate region that minimizes d(y); therefore the target centroid y that minimizes d(y) must be found. The following iterative formula can be used to find the minimum of d(y):
$\bar{y}_1 = \dfrac{\sum_{i=1}^{n_h} x_i\, w_i\, g\!\left(\left\lVert \tfrac{\bar{y}_0 - x_i}{h} \right\rVert^2\right)}{\sum_{i=1}^{n_h} w_i\, g\!\left(\left\lVert \tfrac{\bar{y}_0 - x_i}{h} \right\rVert^2\right)}$   (36)
where $\bar{y}_0$ is the current position, $\bar{y}_1$ is the new position at the next instant, $\{x_i\}_{i=1,\dots,n_h}$ are the pixel positions of the candidate target in the current frame, h is the target range, the function $g(x)$ is the derivative of the kernel function $k(x)$, and the expression for $w_i$ is:
$w_i = \sum_{u=1}^{m} \sqrt{\dfrac{\bar{q}_u}{\bar{p}_u(y_0)}}\, \delta[b(x_i) - u]$   (37)
This iterative formula can then be applied in each frame to find the candidate target, and its centroid position, that minimizes d(y); this candidate target is the optimal candidate for the tracking target, and tracking of the target object's centroid in the video is thereby achieved.
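A condensed Python sketch of the centroid tracking described above, using grayscale histograms and an Epanechnikov-like profile (for which g(x) = -k'(x) is constant); the bin count, circular window and stopping rule are assumptions:

```python
import numpy as np

def gray_histogram(gray, center, h, bins=16):
    """Kernel-weighted grayscale histogram q_u / p_u(y) of formulas (29)-(34).
    center = (x, y); profile k(r) = 1 - r for r <= 1, else 0."""
    ys, xs = np.mgrid[0:gray.shape[0], 0:gray.shape[1]]
    r2 = ((xs - center[0]) ** 2 + (ys - center[1]) ** 2) / h ** 2
    inside = r2 <= 1.0
    k = np.where(inside, 1.0 - r2, 0.0)
    b = (gray * bins / 256.0).astype(int).clip(0, bins - 1)   # quantization b(x)
    hist = np.bincount(b[inside], weights=k[inside], minlength=bins)
    return hist / max(hist.sum(), 1e-12)

def mean_shift_centroid(gray, y0, h, q, bins=16, iters=10):
    """Iterate formula (36); for this profile g(x) = 1, so each step is a weighted centroid
    of the pixels inside the window, with the weights w_i of formula (37)."""
    ys, xs = np.mgrid[0:gray.shape[0], 0:gray.shape[1]]
    b = (gray * bins / 256.0).astype(int).clip(0, bins - 1)
    y = np.asarray(y0, dtype=float)
    for _ in range(iters):
        p = gray_histogram(gray, y, h, bins)
        ratio = np.where(p > 0, np.sqrt(q / np.maximum(p, 1e-12)), 0.0)   # sqrt(q_u / p_u(y0))
        w = ratio[b]                                                       # w_i for every pixel
        r2 = ((xs - y[0]) ** 2 + (ys - y[1]) ** 2) / h ** 2
        m = r2 <= 1.0
        denom = w[m].sum()
        if denom < 1e-12:
            break
        new_y = np.array([(xs[m] * w[m]).sum() / denom, (ys[m] * w[m]).sum() / denom])
        if np.linalg.norm(new_y - y) < 0.5:                                # converged
            return new_y
        y = new_y
    return y
```

In this sketch the target model q would be built once from the initial frame with gray_histogram, and mean_shift_centroid would then be called on each new frame with the previous centroid as y0.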
Embodiment one
In this embodiment, the position of the target centroid obtained with the mean-shift algorithm is used to adjust the particle state-transition parameters, so that the state-transition parameters are adjusted as the position of the target centroid changes, which makes the tracking of the target profile more accurate.
Fig. 2 is a flow chart of video target profile tracking in this embodiment, which mainly comprises the following steps.
Step 201: judge from the input image data whether this is a new tracking object, i.e. whether a new tracking target needs to be established; if so, execute step 202; if not, execute step 215.
The input image data can be image data obtained by background segmentation together with an index of the selected tracking target, or a data image together with a tracking target region entered manually by the user.
Step 202: according to the input tracking target index or region, use an existing profile-extraction technique to obtain the target's profile vector and the B-spline control points $QX_0$ and $QY_0$, calculate the profile centroid $TX_0$ and $TY_0$ from the obtained profile vector, and calculate the initial state vector $T_0$ of the target using formula (4). This step is the same as step 102 in the background art.
Step 203: initialize the random particles from the initial motion state vector $T_0$: initialize $N_s$ particles, the initial weight $w_0^i$ of each particle being $1/N_s$ and the motion states being $T_0^i$ $(i = 1, 2, \dots, N_s)$, calculated with formulas (5) to (9).
$B_3, B_4, B_5$ are constants; $B_4$ and $B_5$ are taken in $[0.15, 0.5]$, the initial values of $B_1$ and $B_2$ are taken in $[3, 15]$, and $\xi$ is a random number in $[-1, +1]$.
This step is the same as step 103 in the background art.
Step 215: use the mean-shift algorithm to calculate the position coordinates $\{(CX_t, CY_t)\}_{t=1,2,\dots,M}$ of the target centroid in the previous M frames.
Step 216: from the coordinates of the target centroid obtained in step 215, the speed of the target centroid in the x and y directions from frame t to frame t+1 can be obtained:
$V_t^x = \mathrm{fabs}(CX_{t+1} - CX_t)$   (38)
$V_t^y = \mathrm{fabs}(CY_{t+1} - CY_t)$   (39)
Step 217: adjust the parameters $B_1$ and $B_2$.
The state transition equations $TX_k^i = TX_{k-1}^i + B_1 \times \xi_{1\text{-}k}^i$ and $TY_k^i = TY_{k-1}^i + B_2 \times \xi_{2\text{-}k}^i$, i.e. formulas (11) and (12), transfer (or predict) the target motion state (the centroid position of the target profile) from the current frame to the next frame, so the profile-centroid transfer parameters in the state transition equations should change as the speed of the profile centroid changes. If the profile centroid speeds up in the x and y directions, the parameters $B_1$ and $B_2$ should correspondingly increase; otherwise they should decrease. In this way, the tracking process is stable and the "leading" and "lagging" jitter does not appear.
Because the mean-shift tracking algorithm can track the target centroid fairly accurately, its tracking result is introduced here to adjust the parameters $B_1$ and $B_2$ so that the target profile is tracked more stably.
From the set of target speeds $\{(V_t^x, V_t^y)\}_{t=0,1,\dots,M-1}$ (t = 0 denotes the initial frame), the parameters $B_1^k$ and $B_2^k$ of frame k can be predicted. The prediction formula is:
$(B_1^k, B_2^k) = f\big(\{(V_t^x, V_t^y)\}_{t=0,1,\dots,M-1},\ B_1^{ini},\ B_2^{ini}\big)$   (40)
where $f(\cdot)$ is the prediction function, i.e. $B_1^k$ and $B_2^k$ depend on the initial values $B_1^{ini}$ and $B_2^{ini}$ and on the set of profile-centroid speeds in the previous frames; $V_t^x$ and $V_t^y$ are the speeds of the target in the x and y directions from frame t to frame t+1, and $B_1^{ini}$ and $B_2^{ini}$ are the transfer-equation parameters set initially.
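One possible form of the prediction function $f(\cdot)$ in formula (40) is a linear mapping from the recent centroid speed to $B_1$ and $B_2$, in the spirit of the linear relationship used in embodiment three (Tables 1 and 2); the gains and the use of the mean speed over the last M frames are assumptions:

```python
def adapt_transition_params(velocities, B1_ini=3.0, B2_ini=3.0, gain_x=0.55, gain_y=0.9):
    """A hypothetical f(.) for formula (40): B grows linearly with the mean recent centroid speed."""
    vx = sum(v[0] for v in velocities) / len(velocities)
    vy = sum(v[1] for v in velocities) / len(velocities)
    return B1_ini + gain_x * vx, B2_ini + gain_y * vy

# velocities collected over the previous M frames via formulas (38) and (39)
B1_k, B2_k = adapt_transition_params([(2.0, 1.0), (5.0, 3.0), (8.0, 5.0)])
```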
Step 204: using the adjusted parameters $B_1^k$ and $B_2^k$ obtained in step 217, apply a state transition to each particle to obtain its motion state vector:
$TX_k^i = TX_{k-1}^i + B_1^k \times \xi_{1\text{-}k}^i$   (41)
$TY_k^i = TY_{k-1}^i + B_2^k \times \xi_{2\text{-}k}^i$   (42)
$\theta_k^i = \theta_{k-1}^i + B_3 \times \xi_{3\text{-}k}^i$   (13)
$SX_k^i = SX_{k-1}^i + B_4 \times \xi_{4\text{-}k}^i$   (14)
$SY_k^i = SY_{k-1}^i + B_5 \times \xi_{5\text{-}k}^i$   (15)
where $B_1^k$ and $B_2^k$ are obtained in step 217 and $\xi_{1\text{-}k}^i, \xi_{2\text{-}k}^i, \xi_{3\text{-}k}^i, \xi_{4\text{-}k}^i, \xi_{5\text{-}k}^i$ are random numbers in $[-1, +1]$.
Step 205: from the motion state vector of each particle obtained in step 204, calculate the corresponding shape space parameter $S_k^i$ of each particle according to formula (16).
Step 206: from the obtained shape space parameter of each particle, calculate its B-spline control-point vector according to formula (17).
Step 207: fit the contour curve corresponding to each particle with formula (18), where $B_{k,m}$ is the standard B-spline basis function of order m; m can be taken as 3.
Step 208: calculate the distance between each sampling point and the measured value of the true profile point at that point; this step is the same as step 108 in the background art.
Step 209: from the distances $DIS_i(n)$ $(n = 1, 2, \dots, N)$ between each sampling point and the measured value of the true profile point at that point in the current frame's image data, obtain the observation probability density $p_k^i$ of each particle according to formula (19).
Step 210: update the weight of each particle from the previous frame according to formula (20) to obtain the weight of each particle in the current frame.
Step 211: obtain the expected motion state parameters by the weighted sum of each particle's motion state parameters and weights according to formulas (21) to (25).
Step 212: obtain the shape space parameter $S_k$ from the motion state values according to formula (26).
Step 213: calculate the control-point vectors $QX_k$ and $QY_k$ of the profile from $S_k$ according to formula (27).
Step 214: fit the contour curve $y_k(x)$ of the target according to formula (28).
This completes one tracking pass for the target profile.
Fig. 3 is a structure diagram of the video target profile tracking device of this embodiment. As shown in Fig. 3, the device comprises: a memory module 301, a profile extraction module 302, a random particle generation module 303, a particle state transition module 304, a centroid calculation module 305, a control module 306, a particle weight calculation module 307 and a profile fitting module 308. The method used by the device has been described in detail above, so only a brief description of the device's functions is given below.
The memory module 301 stores the input image data.
The control module 306 controls each module to complete the corresponding operations.
On receiving an order from the control module 306 to establish a new tracking target, the profile extraction module 302 uses an existing profile-extraction technique to extract the target's profile from the initial image frame data in the memory module 301, calculates the target's profile vector and B-spline control points $QX_0$ and $QY_0$, calculates the profile centroid $TX_0$ and $TY_0$ from the obtained profile vector, and calculates the initial state vector $T_0$ of the target.
The random particle generation module 303 initializes $N_s$ particles from the initial motion state vector $T_0$ calculated by the profile extraction module 302; the initial weight $w_0^i$ of each particle is $1/N_s$ and the motion states are $T_0^i$ $(i = 1, 2, \dots, N_s)$.
Each image frame after the initial frame is then processed in turn as the current image frame data, as follows.
The centroid calculation module 305 uses the mean-shift algorithm to track the target centroid and calculate its position in the current image frame data stored in the memory module 301, and/or calculates the speed of the target centroid in the x and y directions.
The particle state transition module 304 adjusts the parameters $B_1$ and $B_2$ of the particle state transition equations according to the position of the target centroid and/or the speed of the target centroid in the x and y directions calculated by the centroid calculation module 305, applies a state transition to each particle with the adjusted parameters $B_1^k$ and $B_2^k$, and obtains the profile of each particle.
The control module 306 controls each module to complete the corresponding operations.
The particle weight calculation module 307 randomly draws a plurality of sampling points on the profile of each random particle, calculates for each sampling point the measured value of its true profile point in the current image frame data, calculates the distance between each sampling point and the measured value of its true profile point, and determines the weight of each random particle from the distances between the sampling points on the particle's profile and the measured values of their true profile points.
The profile fitting module 308 fits the contour curve of the tracked target in the current image frame data from all the particles according to their weights, and outputs it.
As long as new image frame data continue to be input to the device, or unprocessed image frame data remain in the device's memory module, the device calculates the contour curve of the tracking target in those image frames.
In this embodiment, the speed of the target centroid obtained with the mean-shift algorithm is used to adjust the position-transfer parameters of the profile centroid in the state transition equations, so that the profile centroid position changes correspondingly as the speed of the target changes. This reduces, to a certain extent, the "leading" or "lagging" jitter of the tracked profile relative to the real target profile, increases the stability of profile tracking, and makes the profile tracking result more accurate.
Embodiment two
In this embodiment, the position and speed of the tracked target centroid obtained with the mean-shift algorithm are used to assess the candidate particles: the weight of each particle is influenced by how close the centroid of that particle's profile is to the tracked target centroid, so that the target profile obtained by the weighted accumulation of the particles is closer to the real target profile and the target profile is tracked more accurately.
Fig. 4 is a flow chart of video target profile tracking in this embodiment, which mainly comprises the following steps.
Step 401: judge from the input image data and tracking target information whether this is a new object, i.e. whether a new tracking target needs to be established; if so, execute step 402, otherwise execute step 415.
The input image data can be image data that has undergone background segmentation, and a moving object in the initial frame image can be selected as the tracking target.
Step 402: according to the input tracking target index or region, use an existing profile-extraction technique to obtain the target's profile vector and the B-spline control points $QX_0$ and $QY_0$, calculate the profile centroid $TX_0$ and $TY_0$ from the obtained profile vector, and calculate the initial state vector $T_0$ of the target using formula (4). This step is the same as step 102 in the background art.
Step 403: initialize the random particles from the initial motion state vector $T_0$: initialize $N_s$ particles, the initial weight $w_0^i$ of each particle being $1/N_s$ and the motion states being $T_0^i$ $(i = 1, 2, \dots, N_s)$, calculated with formulas (5) to (9).
$B_1, B_2, B_3, B_4, B_5$ are constants; $B_1$ and $B_2$ are taken in $[3, 15]$, $B_4$ and $B_5$ in $[0.15, 0.5]$, and $\xi$ is a random number in $[-1, +1]$.
This step is the same as step 103 in the background art.
Step 415: use the mean-shift algorithm to calculate the position coordinates $\{(CX_t, CY_t)\}_{t=1,2,\dots,M}$ of the target centroid in the previous M frames.
Step 404: when the k-th frame of image data is input, apply a state transition to the state of every particle; the system state transition equations are formulas (11) to (15). This step is the same as step 104 in the background art.
Step 405: use the motion state of each particle obtained in step 404 to calculate the shape space parameter $S_k^i$ of each particle according to formula (16).
Step 406: calculate the B-spline control-point vector of each particle according to formula (17).
Step 407: after the control-point vector of each particle is obtained, fit the contour curve corresponding to each particle with the B-spline method; the fitting formula is formula (18).
Step 408: assess the candidate particles.
The method of this embodiment assesses each particle with two measurement factors of the current frame: a. the distance between the particle's profile points and the measured values of the true profile points; b. the distance between the particle's centroid position and the tracked target's centroid position. Measurement factor a alone is the existing technique.
The distance $DIS_i(n)$ between each sampling point of the profile and the measured value of the true profile point at that point is obtained according to step 108 in the background art; this distance serves as the first measurement factor.
The prior art uses only measurement factor a to assess the particles. To track the profile more accurately, this embodiment also introduces the more accurate centroid position obtained with the mean-shift tracking algorithm to assess each candidate particle.
The mean-shift tracking algorithm can trace the target profile centroid position $(CX_t, CY_t)$ in each frame fairly accurately. The distance $DIS_i(C_k)$ between this centroid (the tracked target centroid, i.e. the centroid obtained in step 415) and the centroid of each particle's profile serves as the other measurement factor: the larger its value, the more the particle's centroid position deviates from the true target centroid; the smaller its value, the smaller the deviation between them.
Step 409: from the two distance measurement factors obtained in step 408, the observation probability density of each particle is:
$p_k^i = \exp\!\left\{-\tfrac{1}{2}\!\left(\tfrac{1}{\sigma_1^2}\Phi_1 + \tfrac{1}{\sigma_2^2}\Phi_2\right)\right\}$   (43)
where $\Phi_1 = \tfrac{1}{N}\sum_{n=1}^{N} DIS_i(n)$, $\Phi_2 = DIS_i(C_k)$, $DIS_i(n)$ is the distance between each sampling point of the profile and the measured value of the true profile point at that point, $DIS_i(C_k)$ is the distance between the tracked target's centroid and the centroid of the i-th particle's profile, $\sigma_1$ is the dispersion between the particle's profile points and the true profile points, and $\sigma_2$ is the dispersion between the particle's centroid position and the tracked target's centroid position.
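For illustration, a short sketch of formula (43); the sigma values are placeholders, not values from the patent:

```python
import numpy as np

def two_factor_density(contour_distances, centroid_distance, sigma1=5.0, sigma2=10.0):
    """Observation probability density of formula (43), combining Phi_1 and Phi_2."""
    phi1 = np.mean(contour_distances)      # Phi_1 = (1/N) * sum_n DIS_i(n)
    phi2 = centroid_distance               # Phi_2 = DIS_i(C_k)
    return np.exp(-0.5 * (phi1 / sigma1 ** 2 + phi2 / sigma2 ** 2))

# A particle whose profile centroid lies far from the mean-shift centroid receives a lower density:
p_near = two_factor_density(np.array([1.0, 2.0, 1.5]), centroid_distance=2.0)
p_far  = two_factor_density(np.array([1.0, 2.0, 1.5]), centroid_distance=40.0)
```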
Step 410: from the observation probability density $p_k^i$ of each particle obtained in step 409, update the weight of each particle according to formula (20).
Step 411: take the weighted average of each particle's motion state parameters and weights according to formulas (21) to (25) to obtain the motion state parameters $T_k = (TX_k, TY_k, \theta_k, SX_k, SY_k)$ of the tracked target profile in frame k.
Step 412: obtain the shape space parameter $S_k$ from the motion state parameters $T_k$.
Step 413: calculate the control-point vectors $QX_k$ and $QY_k$ of the profile from $S_k$ according to formula (27).
Step 414: fit the contour curve $y_k(x)$ of the object according to formula (28). This completes one tracking pass for the tracked target's profile.
Fig. 5 is a structure diagram of the video target profile tracking device of this embodiment. As shown in Fig. 5, the device comprises: a memory module 501, a profile extraction module 502, a random particle generation module 503, a particle state transition module 504, a centroid calculation module 505, a control module 506, a particle weight calculation module 507 and a profile fitting module 508. The method used by the device has been described in detail above, so only a brief description of the device's functions is given below.
The memory module 501 stores the input image data.
The control module 506 controls each module to complete the corresponding operations.
On receiving an order from the control module 506 to establish a new tracking target, the profile extraction module 502 uses an existing profile-extraction technique on the initial image frame data in the memory module 501 to calculate the target's profile vector and B-spline control points $QX_0$ and $QY_0$, calculates the profile centroid $TX_0$ and $TY_0$ from the obtained profile vector, and calculates the initial state vector $T_0$ of the target.
The random particle generation module 503 initializes $N_s$ particles from the initial motion state vector $T_0$ calculated by the profile extraction module 502; the initial weight $w_0^i$ of each particle is $1/N_s$ and the motion states are $T_0^i$ $(i = 1, 2, \dots, N_s)$.
Each image frame after the initial frame is then processed in turn as the current image frame data, as follows.
The centroid calculation module 505 uses the mean-shift algorithm to calculate the position coordinates $\{(CX_t, CY_t)\}_{t=1,2,\dots,M}$ of the target centroid in the previous M frames.
The particle state transition module 504 applies a state transition to each particle according to the prior art.
The centroid calculation module 505 uses the mean-shift algorithm to calculate the position of the target centroid in the current image frame data.
The particle weight calculation module 507 calculates the observation probability density of each particle from two factors, namely the distance between the particle's profile points and the observed profile points in the current image frame data, and the distance between the particle's centroid position and the tracked target centroid position calculated by the centroid calculation module 505; it then determines the weight of each particle from the observation probability density obtained.
The profile fitting module 508 fits the contour curve of the target in the current image frame data from the particles and their weights, and outputs it.
As long as new image frame data continue to be input to the device, or unprocessed image frame data remain in the device's memory module, the device calculates the contour curve of the tracking target in those image frames.
In this embodiment, the tracked target centroid obtained with the mean-shift algorithm participates in assessing each candidate particle, so that the weight of each particle is related to the distance between its profile centroid and the tracked target centroid. The target profile obtained by the weighted accumulation of the particles is therefore closer to the real target profile, which increases the accuracy of profile tracking.
Embodiment three
In this embodiment, the position of the target centroid obtained with the mean-shift algorithm and its change are used to adjust the particle state-transition parameters, so that the particle state transitions are adjusted as the position of the target centroid changes; at the same time, the position and speed of the tracked target centroid obtained with the mean-shift algorithm are also used to assess the candidate particles, so that the weight of each particle is influenced by how close the centroid of that particle's profile is to the tracked target centroid and the target profile obtained by the weighted accumulation of the particles is closer to the real target profile. Together these two aspects make the tracking of the target profile more accurate.
Fig. 6 is a flow chart of video target profile tracking in this embodiment, which mainly comprises the following steps.
Step 601: judge from the input image data whether this is a new tracking object, i.e. whether a new tracking target needs to be established; if so, execute step 602; if not, execute step 615.
The input image data can be image data obtained by background segmentation together with an index of the selected tracking target, or a data image together with a tracking target region entered manually by the user.
Step 602: according to the input tracking target index or region, use an existing profile-extraction technique to obtain the target's profile vector and the B-spline control points $QX_0$ and $QY_0$, calculate the profile centroid $TX_0$ and $TY_0$ from the obtained profile vector, and calculate the initial state vector $T_0$ of the target using formula (4). This step is the same as step 102 in the background art.
Step 603: initialize the random particles from the initial motion state vector $T_0$: initialize $N_s$ particles, the initial weight $w_0^i$ of each particle being $1/N_s$ and the motion states being $T_0^i$ $(i = 1, 2, \dots, N_s)$, calculated with formulas (5) to (9).
$B_3, B_4, B_5$ are constants; $B_4$ and $B_5$ are taken in $[0.15, 0.5]$, the initial values of $B_1$ and $B_2$ are taken in $[3, 15]$, and $\xi$ is a random number in $[-1, +1]$. This step is the same as step 103 in the background art.
Step 615: use the mean-shift algorithm to calculate the position coordinates $\{(CX_t, CY_t)\}_{t=1,2,\dots,M}$ of the target centroid in the previous M frames.
Step 616: from the coordinates of the target centroid obtained in step 615, obtain the speeds $V_t^x$ and $V_t^y$ of the target in the x and y directions from frame t to frame t+1 according to formulas (38) and (39).
Step 617: adjust the parameters $B_1$ and $B_2$ according to formula (40); this step is the same as step 217 of embodiment one.
Step 604, the B parameter that the process self adaptation that obtains according to step 617 is adjusted 1 kAnd B 2 k, each particle is carried out the motion state vector parameter that state transitions obtains each particle, this step is identical with the step 204 of embodiment one.
Step 605, the motion state vector parameter of each particle that obtains according to step 604 is calculated each particle corresponding shape spatial parameter S according to formula (16) k i
The shape space parameter of each particle that step 606 basis obtains is according to formula (17) calculating B batten control point vector separately.
Step 607 simulates the contour curve of each particle correspondence, wherein B with formula (18) K, mBe m standard B spline base function, m can be taken as 3.
Step 608 is assessed candidate's particle, the distance D IS between the measured value of the true profile point at calculating each sampling point of profile and this some place i(n) and the distance D IS between particle centroid position and the tracking target centroid position i(C k).This step is identical with step 408 among the embodiment two.
Step 609 is by two that obtain in the step 608 observation probability density function p that can obtain each particle apart from the measurement factor according to formula (43) k iThis step is identical with the step 409 of embodiment two.
Step 610 is according to the observation probability density function p of each particle that obtains in the step 609 k iCarry out the right value update of each particle according to formula (20).
Step 611: compute the weighted average of the particles' motion state parameters with their weights according to formulas (21) to (25), obtaining the motion state parameters of the tracked target
T_k = (TX_k, TY_k, θ_k, SX_k, SY_k).
Step 612: obtain the shape-space parameter S_k from the motion state parameters T_k.
Step 613: compute the contour control point vectors QX_k and QY_k from S_k according to formula (27).
Step 614: fit the target's contour curve y_k(x) according to formula (28). One round of contour tracking of the target is thus completed.
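Steps 611 to 614 collapse the particle set into a single tracked contour by weighted averaging. A sketch, reusing the cubic_bspline_contour helper above and assuming a user-supplied mapping state_to_control_points in place of steps 612 and 613 (formula (27)):

```python
import numpy as np

def fuse_particles(states, weights, state_to_control_points):
    """Weighted average of particle states and final contour fit (steps 611-614).

    states  : (N_s, 5) array of (TX, TY, theta, SX, SY) per particle.
    weights : (N_s,) normalized particle weights.
    state_to_control_points : assumed callable mapping a fused state T_k
                              to control point vectors (QX_k, QY_k).
    """
    t_k = np.average(states, axis=0, weights=weights)  # T_k, formulas (21)-(25)
    qx_k, qy_k = state_to_control_points(t_k)          # stands in for steps 612-613
    x_k, y_k = cubic_bspline_contour(qx_k, qy_k)       # contour curve, as in step 614
    return t_k, x_k, y_k
```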
By combining the two techniques of embodiment one and embodiment two, the present embodiment gives the contour tracking both the high tracking stability of embodiment one and the high tracking accuracy of embodiment two.
Those skilled in the art can readily combine the two devices of embodiment one and embodiment two into a new device having the functions of both; such a device implements the video target contour tracking of the present embodiment and tracks the video target accurately.
Fig. 7 shows the result of video target contour tracking using the prior-art Condensation algorithm; Fig. 8 shows the tracking result of the video target contour tracking method of embodiment three of the invention. Representative frames from the beginning, middle, and end of the sequence were chosen. The car contour drawn as a white curve in each picture is the car contour "seen" by the computer using the contour tracking method. It can be seen that, without the mean-shift algorithm, the tracking is unstable, because the state transition parameters remain fixed. Once the mean-shift algorithm is introduced, so that the state transition parameters adapt to changes in the car's motion velocity, and each particle is additionally evaluated against the car centroid tracked by the mean-shift method, the tracking result improves considerably and the tracking becomes more stable and accurate.
The following tables show how the state transition parameters B_1 and B_2 vary with the car's centroid motion velocity when the method of embodiment three is used (a linear relation is adopted here).
Table 1: variation of B_1 with V_x

  V_x:  1    2    5    8    10
  B_1:  3.2  4.5  5.8  7.1  8.4

Table 2: variation of B_2 with V_y

  V_y:  1    2    3    5    6
  B_2:  3.2  4.5  5.8  7.1  8.4
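If one prefers to reproduce the tabulated relation directly instead of a closed-form function, a simple lookup with linear interpolation over the table values is enough; the table entries above are the only inputs, and the clamping outside the tabulated range is an assumption of this sketch.

```python
import numpy as np

# Values taken from Table 1 and Table 2 above.
VX_TABLE, B1_TABLE = [1, 2, 5, 8, 10], [3.2, 4.5, 5.8, 7.1, 8.4]
VY_TABLE, B2_TABLE = [1, 2, 3, 5, 6],  [3.2, 4.5, 5.8, 7.1, 8.4]

def b_params_from_speed(vx, vy):
    """Interpolate B_1 and B_2 from the centroid speeds using the tables."""
    b1 = np.interp(abs(vx), VX_TABLE, B1_TABLE)  # np.interp clamps outside the range
    b2 = np.interp(abs(vy), VY_TABLE, B2_TABLE)
    return float(b1), float(b2)
```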
The present embodiment takes the Condensation algorithm, among particle filter tracking algorithms, and the mean-shift algorithm, among centroid tracking algorithms, as examples.
Other common particle filter methods include the SIR algorithm, the ASIR algorithm, the RPF algorithm, and so on.
These particle filter algorithms use state transition equations of the same form as the Condensation algorithm when performing the particle state transition (particle propagation), so those skilled in the art can apply the method of the embodiments of the invention to these other particle filter algorithms and obtain better contour tracking results.
In addition, any tracking algorithm that can obtain the target centroid position can replace the mean-shift algorithm in implementing the embodiments of the invention.
As can be seen from the above embodiments, the video target contour tracking method of the embodiments of the invention uses the position and motion velocity of the target centroid, obtained by tracking the target's centroid, to adjust the particle state transition parameters in the contour tracking process, so that the particles' state transitions change accordingly as the target centroid position changes, making the tracking of the target contour more accurate.
Another video target contour tracking method of the embodiments of the invention uses the position and motion velocity of the target centroid, obtained by tracking the target's centroid, to evaluate the candidate particles, and adjusts each particle's weight according to how close the centroid of its contour is to the obtained target centroid, so that the target contour obtained by weighted accumulation of the particles is closer to the true target contour, making the tracking of the target contour more accurate.
A video target contour tracking device of the embodiments of the invention is used to obtain the position of the target centroid by tracking the target's centroid and, using this centroid position, to adjust the state transition parameters of the random particles used in contour tracking, so that the state transition parameters of the random particles change correspondingly as the target centroid position changes; when the centroid position of the target changes greatly, the position changes of the random particles grow accordingly, and the resulting target contour tracking result is more accurate.
Another video target contour tracking device of the embodiments of the invention is used to obtain the position of the target centroid by tracking the target's centroid and, using this centroid position, to evaluate each random particle, adjusting each particle's weight according to how close the centroid of its contour is to the tracked target centroid, so that the weights of particles close to the true contour increase and the weights of particles that deviate from the true contour decrease; the target contour obtained by weighted accumulation of the particles is then closer to the true target contour, and the resulting tracking result is more accurate.
In summary, the above are only some embodiments of the present invention and are not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (17)

1. A video target contour tracking method, comprising:
extracting a target contour from initial frame image data, and generating a plurality of random particles using the extracted target contour;
performing a state transition on each random particle in current frame image data to obtain a contour of each random particle;
randomly selecting a plurality of sampling points on the contour of each random particle, and computing, for each sampling point, a measured value of its true contour point;
computing the distance between each sampling point and the measured value of its true contour point;
determining a weight of each random particle according to the distances between the sampling points on the obtained contour of each random particle and the measured values of their true contour points; and
obtaining, by weighted accumulation of all the random particles, the tracked target's contour in the current frame image;
characterized in that, before the state transition is performed on each random particle in the current frame image data, the method further comprises:
tracking the centroid of the target to obtain the position of the target centroid; and
adjusting a parameter of the random particle state transition using the obtained position of the target centroid.
2. The method according to claim 1, characterized in that the parameter of the random particle state transition is a parameter for shifting the position of the centroid of a random particle's contour;
and the adjusting of the parameter of the random particle state transition using the obtained position of the target centroid comprises:
making the parameter for shifting the position of the centroid of the random particle's contour increase as the position difference of the target centroid between two adjacent frames increases, and decrease as that position difference decreases.
3. The method according to claim 1, characterized in that tracking the centroid of the target to obtain the position of the target centroid comprises:
computing, from the positions of the target centroid in preceding frames obtained by centroid tracking, a set of motion velocities of the target centroid in each direction between consecutive frames;
and the adjusting of the parameter of the random particle state transition using the obtained position of the target centroid comprises:
predicting and adjusting the parameter of the random particle state transition using the velocity set, to obtain an adjusted parameter of the random particle state transition.
4. The method according to claim 3, characterized in that performing the state transition on each random particle comprises:
performing the state transition on each random particle using the following state transition equations:
(State transition equations for TX, TY, θ, SX, SY, reproduced as images in the original publication.)
wherein TX and TY are the positions of the target contour centroid in the x and y directions respectively, θ is the rotation angle of the target contour, SX and SY are the scales of the target in the x and y directions respectively, k is the frame number of the image data, i = 1, 2, ..., N_s, N_s is the number of initialized particles, ξ_{1-k}^i, ξ_{2-k}^i, ξ_{3-k}^i, ξ_{4-k}^i, ξ_{5-k}^i are random numbers in [-1, +1], B_3, B_4, B_5 are constants, and the adjusted state transition parameters are B_1 and B_2;
the set of motion velocities of the target centroid in each direction between consecutive frames, computed from the positions of the target centroid in the preceding frames obtained by centroid tracking, is {(V_t^x, V_t^y)}, t = 0, 1, ..., M-1, wherein V_t^x is the velocity of the target centroid in the x direction from frame t to frame t+1, V_t^y is the velocity of the target centroid in the y direction from frame t to frame t+1, and the initial frame is frame 0;
and the predicting and adjusting of the parameter of the random particle state transition using the velocity set, to obtain the adjusted parameter of the random particle state transition, comprises:
predicting, from the set {(V_t^x, V_t^y)}, t = 0, 1, ..., M-1, of motion velocities of the target centroid between frames, the random particle state transition parameters B_1^k and B_2^k at frame k:
(Prediction formula for B_1^k and B_2^k, reproduced as an image in the original publication.)
wherein f(·) is a prediction function, and B_1^ini and B_2^ini are the initial values set for B_1 and B_2.
5. The method according to claim 1, characterized in that tracking the centroid of the target comprises: tracking the centroid of the target using a mean-shift tracking algorithm.
6. The method according to claim 1, characterized in that the target contour and the contours of the random particles are represented using B-splines.
7. The method according to any one of claims 1 to 6, characterized in further comprising:
obtaining the centroid of each particle's contour from the contour of each random particle;
computing, using the obtained position of the target centroid, the distance between the target centroid and the centroid of each particle's contour; and
adjusting the weight of each random particle using the distance between the target centroid and the centroid of each particle's contour.
8. The method according to claim 7, characterized in that adjusting the weight of each particle comprises:
computing the observation probability density of each particle according to the following formula:
(Observation probability density formula, reproduced as an image in the original publication.)
wherein Φ_1 is defined, in terms of the distances DIS_i(n), by a formula reproduced as an image in the original publication, Φ_2 = DIS_i(C_k), DIS_i(n) is the distance between each sampling point of each particle's contour and the measured value of its true contour point, DIS_i(C_k) is the distance between the computed target centroid and the centroid of each particle's contour, N is the number of sampling points on the contour curve, σ_1 is the dispersion between the particle contour points and the measured values of the true contour points, and σ_2 is the dispersion between the particle centroid position and the tracked target centroid position;
and updating the weight of each particle from the previous frame to obtain the weight of each particle in the current frame:
(Weight update formula, reproduced as an image in the original publication.)
wherein p_k^i is the observation probability density of the i-th particle at frame k, and w_k^i is the weight of the i-th particle at frame k.
9. The method according to claim 1, characterized in that the number of random particles is more than 50.
10. A video target contour tracking method, comprising:
extracting a target contour from initial frame image data, and generating a plurality of random particles using the extracted target contour;
performing a state transition on each random particle in current frame image data to obtain a contour of each random particle;
randomly selecting a plurality of sampling points on the contour of each random particle, and computing, for each sampling point, a measured value of its true contour point;
computing the distance between each sampling point and the measured value of its true contour point;
determining a weight of each random particle according to the distances between the sampling points on the obtained contour of each random particle and the measured values of their true contour points; and
obtaining, by weighted accumulation of all the random particles, the tracked target's contour in the current frame image;
characterized in that determining the weight of each random particle according to the distances between the sampling points on the obtained contour of each random particle and the measured values of their true contour points further comprises:
tracking the centroid of the target to obtain the position of the target centroid;
obtaining the centroid of each particle's contour from the obtained contour of each random particle;
computing, using the obtained position of the target centroid, the distance between the target centroid and the centroid of each particle's contour; and
adjusting the weight of each particle according to the distance between the obtained target centroid and the contour centroid of each particle.
11. The method according to claim 10, characterized in that adjusting the weight of each particle comprises:
computing the observation probability density of each particle according to the following formula:
(Observation probability density formula, reproduced as an image in the original publication.)
wherein Φ_1 is defined, in terms of the distances DIS_i(n), by a formula reproduced as an image in the original publication, Φ_2 = DIS_i(C_k), DIS_i(n) is the distance between each sampling point of each particle's contour and the measured value of its true contour point, DIS_i(C_k) is the distance between the obtained target centroid and the centroid of each particle's contour, N is the number of sampling points on the contour curve, σ_1 is the dispersion between the particle contour points and the measured values of the true contour points, and σ_2 is the dispersion between the particle centroid position and the tracked target centroid position;
and updating the weight of each particle from the previous frame to obtain the weight of each particle in the current frame:
(Weight update formula, reproduced as an image in the original publication.)
wherein p_k^i is the observation probability density of the i-th particle at frame k, and w_k^i is the weight of the i-th particle at frame k.
12. The method according to claim 10, characterized in that tracking the centroid of the target comprises: tracking the centroid of the target using a mean-shift tracking algorithm.
13. The method according to claim 10, characterized in that the target contour and the contours of the random particles are represented using B-splines.
14. The method according to claim 10, characterized in that the number of random particles is more than 50.
15. A video target contour tracking device, comprising:
a contour extraction module, configured to extract a target contour from initial frame image data;
a random particle generation module, configured to generate a plurality of random particles using the target contour extracted by the contour extraction module;
a particle state transition module, configured to perform a state transition on each random particle in current frame image data to obtain a contour of each random particle;
a particle weight computation module, configured to randomly select a plurality of sampling points on the contour of each random particle, compute for each sampling point a measured value of its true contour point, compute the distance between each sampling point and the measured value of its true contour point, and determine a weight of each random particle according to the distances between the sampling points on the obtained contour of each random particle and the measured values of their true contour points; and
a contour fitting module, configured to obtain, by weighted accumulation of all the random particles, the tracked target's contour in the current frame image;
characterized in further comprising:
a centroid computation module, configured to track the centroid of the target to obtain the position of the target centroid;
wherein the particle state transition module is further configured to adjust, before the state transition is performed on each random particle in the current frame image data, the parameter of the random particle state transition using the position of the target centroid obtained by the centroid computation module.
16. The device according to claim 15, characterized in that
the particle weight computation module is further configured to obtain the centroid of each particle's contour from the contour of each random particle obtained by the particle state transition module, compute, using the position of the target centroid obtained by the centroid computation module, the distance between the target centroid and the centroid of each particle's contour, and adjust the weight of each particle according to the distance between the obtained target centroid and the contour centroid of each particle.
17. A video target contour tracking device, comprising:
a contour extraction module, configured to extract a target contour from initial frame image data;
a random particle generation module, configured to generate a plurality of random particles using the target contour extracted by the contour extraction module;
a particle state transition module, configured to perform a state transition on each random particle in current frame image data to obtain a contour of each random particle;
a particle weight computation module, configured to randomly select a plurality of sampling points on the contour of each random particle, compute for each sampling point a measured value of its true contour point, compute the distance between each sampling point and the measured value of its true contour point, and determine a weight of each random particle according to the distances between the sampling points on the obtained contour of each random particle and the measured values of their true contour points; and
a contour fitting module, configured to obtain, by weighted accumulation of all the random particles, the tracked target's contour in the current frame image;
characterized in further comprising a centroid computation module, configured to track the centroid of the target to obtain the position of the target centroid;
wherein the particle weight computation module is further configured to obtain the centroid of each particle's contour from the contour of each random particle obtained by the particle state transition module, compute, using the position of the target centroid obtained by the centroid computation module, the distance between the target centroid and the centroid of each particle's contour, and adjust the weight of each particle according to the distance between the obtained target centroid and the contour centroid of each particle.
CN2007101541207A 2007-09-17 2007-09-17 Video target profile tracing method and device Active CN101394546B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2007101541207A CN101394546B (en) 2007-09-17 2007-09-17 Video target profile tracing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2007101541207A CN101394546B (en) 2007-09-17 2007-09-17 Video target profile tracing method and device

Publications (2)

Publication Number Publication Date
CN101394546A CN101394546A (en) 2009-03-25
CN101394546B true CN101394546B (en) 2010-08-25

Family

ID=40494580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2007101541207A Active CN101394546B (en) 2007-09-17 2007-09-17 Video target profile tracing method and device

Country Status (1)

Country Link
CN (1) CN101394546B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102831423B (en) * 2012-07-26 2014-12-03 武汉大学 SAR (synthetic aperture radar) image road extracting method
KR20140031613A (en) * 2012-09-05 2014-03-13 삼성전자주식회사 Apparatus and method for processing image
CN103559723B (en) * 2013-10-17 2016-04-20 同济大学 A kind of human body tracing method based on self-adaptive kernel function and mean shift
CN104376576B (en) * 2014-09-04 2018-06-05 华为技术有限公司 A kind of method for tracking target and device
CN104392469B (en) * 2014-12-15 2017-05-31 辽宁工程技术大学 A kind of method for tracking target based on soft characteristic theory
CN106297292A (en) * 2016-08-29 2017-01-04 苏州金螳螂怡和科技有限公司 Based on highway bayonet socket and the Trajectory System of comprehensively monitoring
CN106791294A (en) * 2016-11-25 2017-05-31 益海芯电子技术江苏有限公司 Motion target tracking method
CN106875426B (en) * 2017-02-21 2020-01-21 中国科学院自动化研究所 Visual tracking method and device based on related particle filtering
CN107818651A (en) * 2017-10-27 2018-03-20 华润电力技术研究院有限公司 A kind of illegal cross-border warning method and device based on video monitoring
CN108010032A (en) * 2017-12-25 2018-05-08 北京奇虎科技有限公司 Video landscape processing method and processing device based on the segmentation of adaptive tracing frame
CN110830846B (en) * 2018-08-07 2022-02-22 阿里巴巴(中国)有限公司 Video clipping method and server
CN109212480B (en) * 2018-09-05 2020-07-28 浙江理工大学 Sound source tracking method based on distributed auxiliary particle filtering
CN112214535A (en) * 2020-10-22 2021-01-12 上海明略人工智能(集团)有限公司 Similarity calculation method and system, electronic device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1606033A (en) * 2004-11-18 2005-04-13 上海交通大学 Weak target detecting and tracking method in infrared image sequence
CN101026759A (en) * 2007-04-09 2007-08-29 华为技术有限公司 Visual tracking method and system based on particle filtering
JP2007233798A (en) * 2006-03-02 2007-09-13 Nippon Hoso Kyokai <Nhk> Video object tracking device and video object tracking program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1606033A (en) * 2004-11-18 2005-04-13 上海交通大学 Weak target detecting and tracking method in infrared image sequence
JP2007233798A (en) * 2006-03-02 2007-09-13 Nippon Hoso Kyokai <Nhk> Video object tracking device and video object tracking program
CN101026759A (en) * 2007-04-09 2007-08-29 华为技术有限公司 Visual tracking method and system based on particle filtering

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Sun Junxi et al. Real-time target tracking system based on particle filtering. Video Engineering, 2000, 31(3): 85-87. *
Meng Bo et al. Application of particle filter algorithm in nonlinear target tracking system. Optics and Precision Engineering, 2007, 15(9): 1421-1426. *

Also Published As

Publication number Publication date
CN101394546A (en) 2009-03-25

Similar Documents

Publication Publication Date Title
CN101394546B (en) Video target profile tracing method and device
CN108615027B (en) Method for counting video crowd based on long-term and short-term memory-weighted neural network
CN102542289B (en) Pedestrian volume statistical method based on plurality of Gaussian counting models
Feng et al. Cross-frame keypoint-based and spatial motion information-guided networks for moving vehicle detection and tracking in satellite videos
CN110533695A (en) A kind of trajectory predictions device and method based on DS evidence theory
CN106128121B (en) Vehicle queue length fast algorithm of detecting based on Local Features Analysis
CN110276785B (en) Anti-shielding infrared target tracking method
CN103886325B (en) Cyclic matrix video tracking method with partition
CN101882217B (en) Target classification method of video image and device
CN108550161A (en) A kind of dimension self-adaption core correlation filtering fast-moving target tracking method
CN111311647B (en) Global-local and Kalman filtering-based target tracking method and device
CN104200485A (en) Video-monitoring-oriented human body tracking method
CN104899590A (en) Visual target tracking method and system for unmanned aerial vehicle
Xin et al. A self-adaptive optical flow method for the moving object detection in the video sequences
CN101877134B (en) Robust tracking method of target in airport monitoring video
CN113516853B (en) Multi-lane traffic flow detection method for complex monitoring scene
CN106780727B (en) Vehicle head detection model reconstruction method and device
CN102063625B (en) Improved particle filtering method for multi-target tracking under multiple viewing angles
CN109977818A (en) A kind of action identification method and system based on space characteristics and multi-target detection
CN108292367A (en) Image processing apparatus, semiconductor device, pattern recognition device, mobile body device and image processing method
Liu et al. Experimental study on relaxation time in direction changing movement
CN110827320A (en) Target tracking method and device based on time sequence prediction
CN103077533B (en) A kind of based on frogeye visual characteristic setting movement order calibration method
CN108182410A (en) A kind of joint objective zone location and the tumble recognizer of depth characteristic study
CN112991394B (en) KCF target tracking method based on cubic spline interpolation and Markov chain

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant