
CN102156991A - Quaternion based object optical flow tracking method - Google Patents


Info

Publication number
CN102156991A
CN102156991A (application CN201110089324A; granted as CN102156991B)
Authority
CN
Prior art keywords
optical flow
color
quaternion
corner
degree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201110089324
Other languages
Chinese (zh)
Other versions
CN102156991B (en)
Inventor
徐奕
杨小康
陈尔康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University
Priority to CN201110089324
Publication of CN102156991A
Application granted
Publication of CN102156991B
Legal status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a quaternion-based object optical flow tracking method in the technical field of video image processing. The method uses the quaternion representation of color to process color and estimate optical flow as one holistic signal. In this way, more accurate optical flow estimates are obtained at pixel positions with spatial color variation, so that feature points on an object can be tracked more robustly and tracking errors are reduced. The method also uses quaternion color corners as reliable feature points during tracking: quaternion color corners and gray-value corners together form a good feature point set, and object optical flow tracking is performed in combination with a quaternion optical flow estimation algorithm.

Description

Quaternion-based object optical flow tracking method
Technical field
The present invention relates to a method in the technical field of video image processing, and more specifically to a quaternion-based object optical flow tracking method.
Background technology
Object optical flow tracking first detects, in the first frame of an image sequence, feature points on the tracked object that can be tracked reliably. An optical flow estimation algorithm then computes the optical flow of these feature points between successive pairs of adjacent frames, yielding each point's position in the next frame; repeating this over all frames gives the positions of all feature points on the tracked object, and hence the position of the tracked object itself. Optical flow estimation algorithms have been used in image processing and computer vision for many years. The Lucas-Kanade optical flow estimation algorithm is highly efficient and can estimate the flow of sparse feature points, which makes it suitable for real-time object tracking. However, the original algorithm is based only on gray-value images, whereas color, taken as a whole, carries more information. Moreover, when used for object tracking, some color features are very stable and well suited to tracking, yet lose their distinctiveness once converted to gray values, where the Lucas-Kanade algorithm cannot work well. In other situations, such as an illumination difference between the two frames used to estimate the flow, color information is also more stable than intensity information and helps estimate the flow more accurately. Exploiting color information in optical flow estimation is therefore very important.
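For reference, the classic grayscale Lucas-Kanade step described above can be sketched as a least-squares solve of the optical flow constraint over a local window. This is a minimal illustrative implementation, not the patent's method; all names are hypothetical.

```python
import numpy as np

def lucas_kanade_point(I0, I1, x, y, w=7):
    """Estimate the flow (vx, vy) of the pixel (x, y) from frame I0 to I1."""
    h = w // 2
    patch0 = I0[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    patch1 = I1[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    # Spatial gradients (np.gradient returns derivatives along axis 0 = y,
    # then axis 1 = x) and the temporal derivative.
    Iy, Ix = np.gradient(patch0)
    It = patch1 - patch0
    # Optical flow constraint Ix*vx + Iy*vy + It = 0 at every window pixel,
    # stacked into an overdetermined system A v = b and solved by least squares.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)   # n x 2
    b = -It.ravel()                                   # n x 1
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # [vx, vy]
```

On a horizontal intensity ramp shifted by one pixel, this recovers a flow of one pixel in x.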
A search of the prior art finds that C. Lei and Y. Yang, in "Optical Flow Estimation on Coarse-to-Fine Region-Trees using Discrete Optimization" (Proceedings of the 12th International Conference on Computer Vision, pp. 1562-1569), proposed a method that improves optical flow estimates using color segmentation results. However, this method is based on global energy minimization and is therefore unsuited to tracking sparse feature points. G. Demarcq et al., in "The Color Monogenic Signal: A new framework for color image processing. Application to color optical flow" (Proceedings of the 2009 International Conference on Image Processing, pp. 481-484), proposed the concept of a local color phase based on the color monogenic signal of a color image, and used the local color phase to estimate optical flow.
P. Golland and A. M. Bruckstein, in "Motion from Color" (Computer Vision and Image Understanding, vol. 68, no. 3, pp. 346-362, 1997), and Kelson Aires et al., in "Optical flow using color information: preliminary results" (Proceedings of the 2008 ACM Symposium on Applied Computing, pp. 1607-1611), used a direct three-channel approach: the Lucas-Kanade optical flow constraint is applied to each of the RGB channels separately, and the constraints are merged to estimate the flow. Other work transforms the color from RGB into another color space and estimates the flow in the transformed space: X. Xiang et al., in "A method of optical flow computation based on LUV color space" (Proceedings of the 2009 International Conference on Test and Measurement, pp. 378-381), transform the color from RGB to LUV, while the aforementioned article by P. Golland and A. M. Bruckstein transforms the color from RGB to HSV. Whether in RGB or another color space, this earlier work still treats color as separate channel signals rather than as one holistic signal. This motivates the search for a new color optical flow estimation algorithm that exploits color information as a whole to improve the accuracy of flow estimation. In addition, successful object optical flow tracking requires detecting and tracking reliable feature points. Intuitively, color features are reliable for tracking and remain more stable than gray-level features under illumination changes. This motivates the search for a color feature suited to object optical flow tracking that, combined with a color optical flow estimation algorithm, improves the accuracy of object tracking.
Summary of the invention
To address the above shortcomings of the prior art, the present invention provides a quaternion-based object optical flow tracking method that uses the quaternion representation of color to process color and estimate optical flow as one holistic signal. In this way, more accurate flow estimates are obtained at pixel locations with spatial color variation, so feature points on the object can be tracked more robustly and tracking errors are reduced. The invention also adopts quaternion color corners as reliable feature points for tracking: quaternion color corners and gray-value corners together form a feature point set well suited to tracking, and object optical flow tracking is performed in combination with a quaternion optical flow estimation algorithm.
The present invention is achieved through the following technical solution, comprising the following steps:
Step 1: in the first frame I of the image sequence, set the range of the target to be tracked, detect Harris corners within this range, and keep the n_h Harris corners whose cornerness measure exceeds the threshold γ. The concrete steps comprise:
1.1) compute the gradients I_x and I_y of the first frame I in the two-dimensional spatial directions x and y as the amplitude variation of each pixel's local neighborhood;
1.2) compute the cornerness measure from the amplitude correlation matrix M to obtain the Harris corners whose cornerness exceeds the threshold γ.
The amplitude correlation matrix is

M = g(σ) * [ I_x², I_x·I_y ; I_x·I_y, I_y² ]

where g(σ) is a Gaussian smoothing filter, σ is the filter size parameter, and * denotes convolution.
The cornerness measure is Cornerness = det(M) − k·trace²(M), where det and trace denote the determinant and trace of the amplitude correlation matrix M, and k is a tuning parameter.
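Step 1 can be sketched as follows. This is a minimal numpy illustration of the Harris cornerness map as described (Gaussian-smoothed correlation matrix, then det(M) − k·trace²(M)); the Gaussian smoothing helper is an assumption, not taken from the patent.

```python
import numpy as np

def gauss_smooth(a, sigma):
    """Separable Gaussian smoothing g(sigma) implemented with 1D convolutions."""
    r = int(3 * sigma)
    t = np.arange(-r, r + 1)
    k = np.exp(-t**2 / (2.0 * sigma**2))
    k /= k.sum()
    a = np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 0, a)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 1, a)

def harris_cornerness(I, sigma=3.0, k=0.04):
    """Cornerness = det(M) - k*trace(M)^2 with M the smoothed gradient matrix."""
    Iy, Ix = np.gradient(I.astype(float))
    # Entries of M = g(sigma) * [[Ix^2, Ix*Iy], [Ix*Iy, Iy^2]]
    Sxx = gauss_smooth(Ix * Ix, sigma)
    Sxy = gauss_smooth(Ix * Iy, sigma)
    Syy = gauss_smooth(Iy * Iy, sigma)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace**2   # keep points where this exceeds gamma
```

On a synthetic white square, the strongest response lands near a corner of the square while flat regions score zero.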
Step 2: within the range of the target to be tracked, detect quaternion color corners, and keep the n_q quaternion color corners whose color cornerness measure exceeds the threshold γ_q. The concrete steps comprise:
2.1) express the first frame I of the image sequence as a pure quaternion matrix I_q, apply the following hypercomplex filters to I_q, and take the saturation of the filter outputs as the color variation degree C_x in the x direction and C_y in the y direction:

H_x = [ R 0 R* ; R 0 R* ; R 0 R* ] [I_q] [ R* 0 R ; R* 0 R ; R* 0 R ],   C_x = Saturation(H_x),

H_y = [ R R R ; 0 0 0 ; R* R* R* ] [I_q] [ R* R* R* ; 0 0 0 ; R R R ],   C_y = Saturation(H_y),

where R = S·e^{μπ/4} = S(cos(π/4) + μ·sin(π/4)), μ = (i + j + k)/√3 is the unit pure quaternion along the gray axis, S is a scale factor, and R* is the quaternion conjugate of R.
2.2) compute the color cornerness measure from the color correlation matrix M_q to obtain the quaternion color corners whose color cornerness exceeds the threshold γ_q.
The color correlation matrix is

M_q = g(σ) * [ C_x², C_x·C_y ; C_x·C_y, C_y² ]

where g(σ) is a Gaussian smoothing filter, σ is the filter size parameter, and * denotes convolution.
The color cornerness measure is Cornerness_q = det(M_q)/(trace(M_q) + δ), where det and trace denote the determinant and trace of the color correlation matrix M_q, and δ = 2.2204e-16.
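The key property of the hypercomplex H_x filter above is that it responds to chromatic change but not to pure intensity change. The sketch below applies the left/right coefficient pattern of H_x at one pixel. Several details are assumptions, not stated in the patent text: the scale factor S = 1/√6, correlation (rather than flipped convolution) ordering, and "saturation" taken as the magnitude of the component of the result's vector part perpendicular to the gray axis μ.

```python
import numpy as np

MU = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)   # gray-axis unit pure quaternion mu
S = 1.0 / np.sqrt(6.0)                          # assumed scale factor
# R = S*exp(mu*pi/4) = S*(cos(pi/4) + mu*sin(pi/4)), stored as (w, x, y, z)
R = S * np.array([np.cos(np.pi / 4), *(np.sin(np.pi / 4) * MU)])
Rc = R * np.array([1.0, -1.0, -1.0, -1.0])      # quaternion conjugate R*

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw])

def hx_response(img_q, y, x):
    """3x3 hypercomplex H_x filter at (y, x): left-column terms R q R*,
    right-column terms R* q R, middle column zero (per the filter masks)."""
    acc = np.zeros(4)
    for dy in (-1, 0, 1):
        acc = acc + qmul(qmul(R, img_q[y + dy, x - 1]), Rc)
        acc = acc + qmul(qmul(Rc, img_q[y + dy, x + 1]), R)
    return acc

def saturation(q):
    """Assumed saturation: vector-part component perpendicular to mu."""
    v = q[1:]
    return float(np.linalg.norm(v - np.dot(v, MU) * MU))
```

A red-to-blue edge produces a clearly nonzero saturation, while a uniform color region and a pure gray-level edge both give (numerically) zero, which is what lets C_x single out color corners.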
Step 3: combine the n_h Harris corners and the n_q quaternion color corners into the feature set for optical flow tracking. For each feature point in the feature set, apply the quaternion-based optical flow estimation algorithm between adjacent frames of the image sequence, and update the feature point's position in the second frame with the resulting flow value to achieve tracking. The concrete steps comprise:
3.1) at time t, estimate the optical flow of each feature point in the feature set between the adjacent frames I_t and I_{t+1} of the image sequence using the following steps; add the flow to the feature point's position at time t to obtain its position at time t+1, then perform the flow estimation at time t+1. Initially t = 1.
3.2) downsample I_t and I_{t+1} to generate p-level image pyramids I_{t,l} and I_{t+1,l}, where l ∈ [1, p] is the pyramid level. First perform the flow estimation described in steps 3.3) and 3.4) at the top pyramid level; magnify the resulting flow to the next lower pyramid level as the initial flow at that level, and perform steps 3.3) and 3.4) again. Repeat down to the bottom pyramid level to obtain the final estimated flow. To further improve estimation accuracy, the flow estimation of steps 3.3) and 3.4) is iterated q times at each pyramid level.
3.3) at pyramid level l, express I_{t,l} and I_{t+1,l} as pure quaternion images Q_{t,l} and Q_{t+1,l} respectively. For any given Harris corner or quaternion color corner in the feature set at space-time coordinate (x, y, t), take the local neighborhood of size w × w around it in the spatial directions. Let Q^(c) and Q^(i) be the quaternion color values of the pixel at the neighborhood center (x, y) and of another pixel i in the neighborhood, and compute the quaternion inner product of Q^(c) and Q^(i) to express the color similarity of pixel i to the center pixel; pixel i is excluded from the following steps when its color similarity is below the threshold α.
The color similarity is L_i = <Q^(i), Q^(c)> / (|Q^(i)| |Q^(c)|), where <,> denotes the quaternion inner product and |·| the quaternion modulus.
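The color-similarity gate of step 3.3) can be sketched directly: the quaternion inner product is the four-component dot product and the modulus is the Euclidean norm, so L_i is a cosine similarity in quaternion space. The helper names here are illustrative.

```python
import numpy as np

def color_similarity(qi, qc):
    """L_i = <Q_i, Q_c> / (|Q_i||Q_c|) for quaternions as (w, x, y, z) arrays."""
    return float(np.dot(qi, qc) / (np.linalg.norm(qi) * np.linalg.norm(qc)))

def neighborhood_mask(patch_q, alpha=0.97):
    """Boolean mask over a w x w quaternion patch keeping only pixels whose
    color similarity to the center pixel reaches the threshold alpha."""
    h = patch_q.shape[0] // 2
    qc = patch_q[h, h]
    sims = np.array([[color_similarity(patch_q[r, c], qc)
                      for c in range(patch_q.shape[1])]
                     for r in range(patch_q.shape[0])])
    return sims >= alpha
```

Note the measure is scale-invariant: a brighter pixel of the same hue scores 1.0, while a chromatically different pixel (e.g. red vs. blue) scores near 0 and is excluded.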
3.4) compute the quaternion gradients Q_x and Q_y of the quaternion image Q_{t,l} in the spatial directions x and y. For any given Harris corner or quaternion color corner in the feature set at space-time coordinate (x, y, t), take the local neighborhood of size w × w around it in the spatial directions; in the corresponding neighborhood of Q_{t+1,l} pointed to by the initial flow, compute the quaternion gradient Q_t of the neighborhood in the time direction. After the exclusion step of 3.3), the remaining pixels in the neighborhood yield the quaternion optical flow equations

Q_x^(m)·v_x + Q_y^(m)·v_y = −Q_t^(m),   m = 1, …, n,

where (v_x, v_y) is the flow to be estimated and the subscript m indexes the remaining pixels in the neighborhood. In quaternion matrix form this is A_q·v_q = b_q, where A_q is an n × 2 pure quaternion matrix of the gradients, b_q is the n × 1 pure quaternion vector of the −Q_t^(m), and v_q is a 2 × 1 quaternion vector. The modulus of an n × 1 quaternion vector q is defined as ||q|| = sqrt(Σ_n |q_n|²), with |q_n| the modulus of each quaternion element. Compute v_q = (A_q^H A_q)^{-1} A_q^H b_q; the required flow is (S(v_q,1), S(v_q,2)), where S(·) takes the scalar part of a quaternion.
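One way to evaluate the quaternion least-squares solution v_q = (A_q^H A_q)^{-1} A_q^H b_q is to embed each quaternion as a 4×4 real left-multiplication matrix and solve the equivalent real least-squares problem; this real embedding is an assumption of this sketch, not a step stated in the patent, and the names are illustrative.

```python
import numpy as np

def lmat(q):
    """4x4 real matrix of left multiplication by quaternion q = (w, x, y, z),
    i.e. lmat(p) @ q_vec equals the Hamilton product p*q."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def solve_quaternion_flow(A, b):
    """Least-squares solve A_q v_q = b_q.
    A: (n, 2, 4) quaternion matrix, b: (n, 4) quaternion vector.
    Returns the scalar parts (S(v_1), S(v_2)) as the flow, per the text."""
    n = A.shape[0]
    M = np.zeros((4 * n, 8))
    for i in range(n):
        M[4*i:4*i+4, 0:4] = lmat(A[i, 0])
        M[4*i:4*i+4, 4:8] = lmat(A[i, 1])
    x, *_ = np.linalg.lstsq(M, b.reshape(-1), rcond=None)
    return x[0], x[4]   # scalar parts of v_q,1 and v_q,2
```

When the system is consistent with a real-valued flow, the scalar parts recover it exactly.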
The principle of the invention is to adopt the quaternion representation of color and process color as one holistic signal; to detect color corners based on quaternions and combine them with gray-level corners into the feature point set for object tracking, reducing the number of features required and increasing tracking speed; and to perform optical flow tracking of the object with the quaternion-based optical flow estimation algorithm. Hypercomplex filtering of the color image yields the color variation degree, and the saturation of the filter output is computed to detect color corners. The quaternion-based optical flow estimation algorithm estimates the flow more accurately in regions with color variation, and excluding pixels with large color difference from the local neighborhood better satisfies the flow consistency assumption; during object optical flow tracking, more feature points therefore remain at the correct positions, achieving robust tracking.
Compared with the prior art, the present invention detects color corners based on quaternions and combines them with gray-level corners as the feature point set for object optical flow tracking; the quaternion-based optical flow estimation algorithm processes color as one holistic signal rather than as three separate channels. In a public optical flow evaluation test, the flow estimation error is reduced by 11.9% compared with the Lucas-Kanade algorithm, and in object optical flow tracking tests feature point position errors are effectively reduced.
A quantitative evaluation of the quaternion optical flow estimation algorithm was carried out on a public optical flow test set: on the Middlebury optical flow benchmark, the flow estimation error is 11.9% lower than that of the Lucas-Kanade algorithm. Object tracking combined with quaternion color corners also improves tracking accuracy, demonstrating that the invention achieves real-time, robust tracking.
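The Middlebury-style metrics used in this evaluation, average angular error (AAE, the angle between space-time vectors (u, v, 1)) and average endpoint error (AEPE), can be sketched as follows; this is an illustrative implementation, not the benchmark's official code.

```python
import numpy as np

def flow_errors(u, v, ugt, vgt):
    """AAE (degrees) and AEPE between an estimated flow (u, v) and
    ground truth (ugt, vgt), all given as same-shaped arrays."""
    # Angular error between the 3D vectors (u, v, 1) and (ugt, vgt, 1).
    num = u * ugt + v * vgt + 1.0
    den = np.sqrt((u**2 + v**2 + 1.0) * (ugt**2 + vgt**2 + 1.0))
    aae = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0))).mean()
    # Euclidean distance between flow endpoints.
    aepe = np.sqrt((u - ugt)**2 + (v - vgt)**2).mean()
    return aae, aepe
```

For example, an estimate of one pixel of horizontal flow against a zero ground truth gives an AAE of 45 degrees and an AEPE of one pixel.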
Description of drawings
Fig. 1 is the flowchart of object optical flow tracking with the method of the invention.
Fig. 2 shows the distribution of quaternion color corners and gray-level corners in the embodiment.
Fig. 3 shows the object optical flow tracking results of the embodiment.
Embodiment
An embodiment of the invention is described in detail below. The embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation and concrete operating procedures are given, but the protection scope of the invention is not limited to the following embodiment.
Embodiment:
Step 1: in the first frame I of the image sequence, set the range of the target to be tracked, detect Harris corners within this range, and keep the n_h Harris corners whose cornerness measure exceeds the threshold γ. In this embodiment, γ = 2000. The concrete steps comprise:
1.1) compute the gradients I_x and I_y of the first frame I in the two-dimensional spatial directions x and y as the amplitude variation of each pixel's local neighborhood;
1.2) compute the cornerness measure from the amplitude correlation matrix M to obtain the Harris corners whose cornerness exceeds the threshold γ.
The amplitude correlation matrix is

M = g(σ) * [ I_x², I_x·I_y ; I_x·I_y, I_y² ]

where g(σ) is a Gaussian smoothing filter, σ is the filter size parameter, and * denotes convolution. In this embodiment, σ = 3.
The cornerness measure is Cornerness = det(M) − k·trace²(M), where det and trace denote the determinant and trace of the amplitude correlation matrix M, and k is a tuning parameter. In this embodiment, k = 0.04.
Step 2: within the range of the target to be tracked, detect quaternion color corners, and keep the n_q quaternion color corners whose color cornerness measure exceeds the threshold γ_q. In this embodiment, γ_q = 4.5e-4. The concrete steps comprise:
2.1) express the first frame I of the image sequence as a pure quaternion matrix I_q, apply the following hypercomplex filters to I_q, and take the saturation of the filter outputs as the color variation degree C_x in the x direction and C_y in the y direction:

H_x = [ R 0 R* ; R 0 R* ; R 0 R* ] [I_q] [ R* 0 R ; R* 0 R ; R* 0 R ],   C_x = Saturation(H_x),

H_y = [ R R R ; 0 0 0 ; R* R* R* ] [I_q] [ R* R* R* ; 0 0 0 ; R R R ],   C_y = Saturation(H_y),

where R = S·e^{μπ/4} = S(cos(π/4) + μ·sin(π/4)), μ = (i + j + k)/√3 is the unit pure quaternion along the gray axis, S is a scale factor, and R* is the quaternion conjugate of R.
2.2) compute the color cornerness measure from the color correlation matrix M_q to obtain the quaternion color corners whose color cornerness exceeds the threshold γ_q.
The color correlation matrix is

M_q = g(σ) * [ C_x², C_x·C_y ; C_x·C_y, C_y² ]

where g(σ) is a Gaussian smoothing filter, σ is the filter size parameter, and * denotes convolution. In this embodiment, σ = 3.
The color cornerness measure is Cornerness_q = det(M_q)/(trace(M_q) + δ), where det and trace denote the determinant and trace of the color correlation matrix M_q, and δ = 2.2204e-16.
Step 3: combine the n_h Harris corners and the n_q quaternion color corners into the feature set for optical flow tracking. For each feature point in the feature set, apply the quaternion-based optical flow estimation algorithm between adjacent frames of the image sequence, and update the feature point's position in the second frame with the resulting flow value to achieve tracking. The concrete steps comprise:
3.1) at time t, estimate the optical flow of each feature point in the feature set between the adjacent frames I_t and I_{t+1} of the image sequence using the following steps; add the flow to the feature point's position at time t to obtain its position at time t+1, then perform the flow estimation at time t+1. Initially t = 1.
3.2) downsample I_t and I_{t+1} to generate p-level image pyramids I_{t,l} and I_{t+1,l}, where l ∈ [1, p] is the pyramid level. First perform the flow estimation described in steps 3.3) and 3.4) at the top pyramid level; magnify the resulting flow to the next lower pyramid level as the initial flow at that level, and perform steps 3.3) and 3.4) again. Repeat down to the bottom pyramid level to obtain the final estimated flow. To further improve estimation accuracy, the flow estimation of steps 3.3) and 3.4) is iterated q times at each pyramid level. In this embodiment, p = 3 and q = 3.
3.3) at pyramid level l, express I_{t,l} and I_{t+1,l} as pure quaternion images Q_{t,l} and Q_{t+1,l} respectively. For any given Harris corner or quaternion color corner in the feature set at space-time coordinate (x, y, t), take the local neighborhood of size w × w around it in the spatial directions. Let Q^(c) and Q^(i) be the quaternion color values of the pixel at the neighborhood center (x, y) and of another pixel i in the neighborhood, and compute the quaternion inner product of Q^(c) and Q^(i) to express the color similarity of pixel i to the center pixel; pixel i is excluded from the following steps when its color similarity is below the threshold α. In this embodiment, w = 9 and α = 0.97.
The color similarity is L_i = <Q^(i), Q^(c)> / (|Q^(i)| |Q^(c)|), where <,> denotes the quaternion inner product and |·| the quaternion modulus.
3.4) compute the quaternion gradients Q_x and Q_y of the quaternion image Q_{t,l} in the spatial directions x and y. For any given Harris corner or quaternion color corner in the feature set at space-time coordinate (x, y, t), take the local neighborhood of size w × w around it in the spatial directions; in the corresponding neighborhood of Q_{t+1,l} pointed to by the initial flow, compute the quaternion gradient Q_t of the neighborhood in the time direction. After the exclusion step of 3.3), the remaining pixels in the neighborhood yield the quaternion optical flow equations

Q_x^(m)·v_x + Q_y^(m)·v_y = −Q_t^(m),   m = 1, …, n,

where (v_x, v_y) is the flow to be estimated and the subscript m indexes the remaining pixels in the neighborhood. In quaternion matrix form this is A_q·v_q = b_q, where A_q is an n × 2 pure quaternion matrix of the gradients, b_q is the n × 1 pure quaternion vector of the −Q_t^(m), and v_q is a 2 × 1 quaternion vector. The modulus of an n × 1 quaternion vector q is defined as ||q|| = sqrt(Σ_n |q_n|²), with |q_n| the modulus of each quaternion element. Compute v_q = (A_q^H A_q)^{-1} A_q^H b_q; the required flow is (S(v_q,1), S(v_q,2)), where S(·) takes the scalar part of a quaternion.
Implementation results
Following the above steps, object (player) optical flow tracking was performed on a soccer match video sequence. The sequence contains players of a blue-and-yellow team and a white team, corresponding respectively to objects with salient color features and objects with salient gray-level features. As can be seen in Fig. 2, Harris corners are mostly scattered over the white team's players while quaternion color corners are mostly scattered over the blue-and-yellow team's players, and together the two form the tracking feature set. Fig. 3 compares Lucas-Kanade tracking (middle column of Fig. 3) with the quaternion-based optical flow tracking (right column of Fig. 3); it can be seen that with the quaternion-based optical flow tracking the feature point positions are correct (red boxes mark mispositioned feature points). On the Middlebury optical flow benchmark, with flow estimates assessed by average angular error (AAE) and average endpoint error (AEPE), the flow estimation error of the algorithm of Step 3 above is reduced by 11.9%. All tests were run on a PC whose main specifications are: Intel Core™ 2 Duo CPU E6600 @ 2.40 GHz, 2 GB of RAM.
Experiments show that, compared with existing object optical flow tracking methods, the feature point set adopted in this embodiment covers both features with salient gray-level variation and features with salient color variation. Used in the subsequent flow estimation, the adopted optical flow estimation algorithm effectively reduces the flow estimation error and computes feature point positions more accurately during tracking, improving tracking accuracy.

Claims (7)

1. A quaternion-based object optical flow tracking method, characterized in that it comprises the following steps:
Step 1: in the first frame of an image sequence, setting the range of a target to be tracked, detecting Harris corners within this range, and keeping the Harris corners whose cornerness measure exceeds a threshold;
Step 2: within the range of the target to be tracked, detecting quaternion color corners, and keeping the quaternion color corners whose color cornerness measure exceeds a threshold;
Step 3: combining the Harris corners and the quaternion color corners into a feature set for optical flow tracking, and, for each feature point in the feature set, applying a quaternion-based optical flow estimation algorithm between adjacent frames of the image sequence and updating the feature point's position in the second frame with the resulting flow value to achieve tracking.
2. The quaternion-based object optical flow tracking method according to claim 1, characterized in that Step 1 comprises the following steps:
1.1) computing the gradients I_x and I_y of the first frame I of the image sequence in the two-dimensional spatial directions x and y as the amplitude variation of each pixel's local neighborhood;
1.2) computing the cornerness measure from the amplitude correlation matrix M to obtain the Harris corners whose cornerness exceeds the threshold γ.
3. The quaternion-based object optical flow tracking method according to claim 2, characterized in that the amplitude correlation matrix is

M = g(σ) * [ I_x², I_x·I_y ; I_x·I_y, I_y² ]

where g(σ) is a Gaussian smoothing filter, σ is the filter size parameter, and * denotes convolution; and the cornerness measure is Cornerness = det(M) − k·trace²(M), where det and trace denote the determinant and trace of the amplitude correlation matrix M, and k is a tuning parameter.
4. The quaternion-based object optical flow tracking method according to claim 1, characterized in that Step 2 comprises the following steps:
2.1) expressing the first frame I of the image sequence as a pure quaternion matrix I_q, applying the following hypercomplex filters to I_q, and taking the saturation of the filter outputs as the color variation degree C_x in the x direction and C_y in the y direction:

H_x = [ R 0 R* ; R 0 R* ; R 0 R* ] [I_q] [ R* 0 R ; R* 0 R ; R* 0 R ],   C_x = Saturation(H_x),

H_y = [ R R R ; 0 0 0 ; R* R* R* ] [I_q] [ R* R* R* ; 0 0 0 ; R R R ],   C_y = Saturation(H_y),

where R = S·e^{μπ/4} = S(cos(π/4) + μ·sin(π/4)), μ = (i + j + k)/√3 is the unit pure quaternion along the gray axis, S is a scale factor, and R* is the quaternion conjugate of R;
2.2) computing the color cornerness measure from the color correlation matrix M_q to obtain the quaternion color corners whose color cornerness exceeds the threshold γ_q.
5. The quaternion-based object optical flow tracking method according to claim 4, characterized in that the color correlation matrix is

M_q = g(σ) * [ C_x², C_x·C_y ; C_x·C_y, C_y² ]

where g(σ) is a Gaussian smoothing filter, σ is the filter size parameter, and * denotes convolution; and the color cornerness measure is Cornerness_q = det(M_q)/(trace(M_q) + δ), where det and trace denote the determinant and trace of the color correlation matrix M_q, and δ = 2.2204e-16.
6. the object light stream tracking based on hypercomplex number according to claim 1 is characterized in that, described the 3rd step may further comprise the steps:
3.1) t is constantly the time, at the adjacent two frame I of image sequence tAnd I T+1Between use following step to estimate the light stream of each unique point in the feature point set, and t is added its light stream in the position of this unique point constantly, obtain the t+1 position of this unique point constantly, and carry out t+1 light stream constantly and estimate t=1 in the time of initially;
3.2) with I tAnd I T+1Carry out down-sampling respectively and generate p tomographic image pyramid I T, lAnd I T+1, lL ∈ [1, p] is the image pyramid level number, at first carry out step 3.3 at the pyramid top layer) and 3.4) described light stream estimation, the light stream that is obtained is amplified to next aspect of pyramid, as the initial value of this aspect light stream, carry out step 3.3 once more) and 3.4) described light stream estimation, so repeatedly, until the pyramid bottom, the light stream of finally being estimated is in order further to improve accuracy of estimation, step 3.3) and 3.4) described light stream estimates that all carrying out q time in each aspect of image pyramid circulates;
3.3) at pyramid level l, express I_(t,l) and I_(t+1,l) as the pure quaternion images
Figure FDA0000054634670000024
and
Figure FDA0000054634670000025
respectively. For any Harris corner or quaternion color corner in the given feature set, with three-dimensional coordinates (x, y, t), take a local neighborhood of size w × w in its spatial dimensions. Let Q^(c) and Q^(i) be the quaternion color values of the pixel at the neighborhood center (x, y) and of another pixel i; compute the quaternion inner product of Q^(c) and Q^(i) as the color similarity between pixel i and the center pixel, and exclude pixel i from the following steps when the color similarity is less than the threshold α;
3.4) for the quaternion image
Figure FDA0000054634670000026
compute its quaternion gradients Q_x and Q_y along the spatial x and y directions. For any Harris corner or quaternion color corner in the given feature set, with three-dimensional coordinates (x, y, t), take a local neighborhood of size w × w in its spatial dimensions, and in
Figure FDA0000054634670000027
take the corresponding neighborhood pointed to by the initial flow and compute the quaternion gradient Q_t of that neighborhood along the temporal direction. After the exclusion step of 3.3), the remaining pixels in the neighborhood yield the quaternion optical flow equations:
Figure FDA0000054634670000028
where:
Figure FDA0000054634670000031
is the flow to be estimated, and the numeric subscripts index the remaining pixels in the neighborhood. The quaternion matrix form of the above is A_q v_q = b_q, where A_q is an n × 2 pure quaternion matrix, b_q is an n × 1 pure quaternion vector, and v_q is a 2 × 1 quaternion vector,
Figure FDA0000054634670000032
The modulus of an n × 1 quaternion vector q is defined as ||q|| = sqrt(Σ_n |q_n|²), where |q_n| is the modulus of each quaternion element. Compute v_q = (A_q^H A_q)^(-1) A_q^H b_q; the estimated flow is
Figure FDA0000054634670000033
where S(·) takes the scalar part of a quaternion.
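The coarse-to-fine scheme of step 3.2) and the least-squares solve of step 3.4) can be sketched as follows. The quaternion arithmetic is replaced here by stacking the three color-channel gradients into real rows, so the solved system is the real analogue of v_q = (A_q^H A_q)^(-1) A_q^H b_q; the block-average pyramid filter, the function names, and the no-op default refinement are illustrative assumptions rather than the patent's exact construction.

```python
import numpy as np

def downsample(img):
    # 2x2 block-average down-sampling (the patent does not fix the
    # pyramid filter; this choice is an assumption).
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    i = img[:h, :w]
    return 0.25 * (i[0::2, 0::2] + i[1::2, 0::2] + i[0::2, 1::2] + i[1::2, 1::2])

def solve_flow(Qx, Qy, Qt):
    # Least-squares flow for one neighborhood: rows of A hold the
    # per-channel spatial gradients of the retained pixels, b the
    # negated temporal gradients.  np.linalg.lstsq gives the same
    # minimizer as the normal equations v = (A^T A)^-1 A^T b.
    A = np.column_stack([Qx.ravel(), Qy.ravel()])
    b = -Qt.ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # estimated (u, v)

def coarse_to_fine_flow(I_t, I_t1, p=3, q=2, refine=None):
    # Build p pyramid levels, estimate at the top, then magnify the
    # flow to each finer level, refining q times per level.  Steps
    # 3.3)-3.4) are abstracted into `refine`; the no-op default is a
    # placeholder, not the patent's estimator.
    if refine is None:
        refine = lambda a, b, uv: uv
    pyr = [(I_t, I_t1)]
    for _ in range(p - 1):
        pyr.append((downsample(pyr[-1][0]), downsample(pyr[-1][1])))
    uv = np.zeros(2)                    # initial flow at the coarsest level
    for level, (a, b) in enumerate(reversed(pyr)):
        for _ in range(q):              # q refinement passes per level
            uv = refine(a, b, uv)
        if level < p - 1:
            uv = uv * 2.0               # magnify flow for the next finer level
    return uv
```

In a full implementation, `refine` would extract the w × w neighborhood gradients at the current level, apply the similarity gating of step 3.3), and call `solve_flow` to update the flow increment.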
7. The quaternion-based object optical flow tracking method according to claim 6, wherein the color similarity is L_i = &lt;Q^(i), Q^(c)&gt; / (|Q^(i)| |Q^(c)|), where &lt;·,·&gt; denotes the quaternion inner product and |·| denotes the quaternion modulus.
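To illustrate claim 7's similarity test: with a color held as a pure quaternion whose imaginary parts are the R, G, B values, the quaternion inner product reduces to the ordinary dot product of 3-vectors, so L_i can be sketched as below. The function names and the threshold value α = 0.9 are illustrative assumptions.

```python
import numpy as np

def color_similarity(q_i, q_c):
    """Normalized inner product L_i = <Q_i, Q_c> / (|Q_i| |Q_c|).

    Pure quaternions are held as 3-vectors of imaginary parts
    (R, G, B), so the quaternion inner product reduces to the real
    dot product and the quaternion modulus to the Euclidean norm.
    """
    q_i = np.asarray(q_i, dtype=np.float64)
    q_c = np.asarray(q_c, dtype=np.float64)
    return q_i @ q_c / (np.linalg.norm(q_i) * np.linalg.norm(q_c))

def keep_pixel(q_i, q_c, alpha=0.9):
    # Exclude pixel i from the flow equations when its similarity to
    # the neighborhood center falls below alpha (the value 0.9 is a
    # hypothetical choice, not from the patent).
    return color_similarity(q_i, q_c) >= alpha
```

Identical colors score 1 and orthogonal colors score 0, so the gate discards neighborhood pixels whose hue differs sharply from the center pixel before the flow equations are assembled.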
CN 201110089324 2011-04-11 2011-04-11 Quaternion based object optical flow tracking method Expired - Fee Related CN102156991B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110089324 CN102156991B (en) 2011-04-11 2011-04-11 Quaternion based object optical flow tracking method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110089324 CN102156991B (en) 2011-04-11 2011-04-11 Quaternion based object optical flow tracking method

Publications (2)

Publication Number Publication Date
CN102156991A true CN102156991A (en) 2011-08-17
CN102156991B CN102156991B (en) 2013-05-01

Family

ID=44438472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110089324 Expired - Fee Related CN102156991B (en) 2011-04-11 2011-04-11 Quaternion based object optical flow tracking method

Country Status (1)

Country Link
CN (1) CN102156991B (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6480615B1 (en) * 1999-06-15 2002-11-12 University Of Washington Motion estimation within a sequence of data frames using optical flow with adaptive gradients
US6559848B2 (en) * 2000-12-13 2003-05-06 Intel Corporation Coding and decoding three-dimensional data
CN101183460A (en) * 2007-11-27 2008-05-21 西安电子科技大学 Color picture background clutter quantizing method
CN101216941A (en) * 2008-01-17 2008-07-09 上海交通大学 Motion estimation method under violent illumination variation based on corner matching and optic flow method
US7535463B2 (en) * 2005-06-15 2009-05-19 Microsoft Corporation Optical flow-based manipulation of graphical objects
US20100194741A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Depth map movement tracking via optical flow and velocity prediction
CN101923719A (en) * 2009-06-12 2010-12-22 新奥特(北京)视频技术有限公司 Particle filter and light stream vector-based video target tracking method
CN101923718A (en) * 2009-06-12 2010-12-22 新奥特(北京)视频技术有限公司 Optimization method of visual target tracking method based on particle filtering and optical flow vector

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Yi Xu et al., "2D Quaternion Fourier Transform: The Spectrum Properties and its Application in Color Image Registration", Multimedia and Expo, 2007 IEEE International Conference on, 2 July 2007. *
Shen Meili et al., "Optical flow target tracking algorithm based on corner detection", Electronic Devices (电子器件), vol. 30, no. 4, August 2007. *
Chen Zhen et al., "Optical flow field computation based on corner tracking", Computer Engineering (计算机工程), vol. 29, no. 3, August 2003. *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400395A (en) * 2013-07-24 2013-11-20 佳都新太科技股份有限公司 Optical flow tracking method based on HAAR feature detection
CN103693532A (en) * 2013-12-26 2014-04-02 江南大学 Method of detecting violence in elevator car
CN103693532B (en) * 2013-12-26 2016-04-27 江南大学 Method for detecting violent behavior in an elevator car
CN105374049A (en) * 2014-09-01 2016-03-02 浙江宇视科技有限公司 Multi-corner tracking method and apparatus based on sparse optical flow
CN105374049B (en) * 2014-09-01 2020-01-14 浙江宇视科技有限公司 Multi-corner tracking method and apparatus based on sparse optical flow
CN105551048A (en) * 2015-12-21 2016-05-04 华南理工大学 Space surface patch-based three-dimensional corner detection method
CN106204645A (en) * 2016-06-30 2016-12-07 南京航空航天大学 Multi-object tracking method
CN108460786A (en) * 2018-01-30 2018-08-28 中国航天电子技术研究院 High-speed tracking method for a UAV light spot
CN111382784A (en) * 2020-03-04 2020-07-07 厦门脉视数字技术有限公司 Moving target tracking method
CN112529936A (en) * 2020-11-17 2021-03-19 中山大学 Monocular sparse optical flow algorithm for outdoor unmanned aerial vehicle
CN112529936B (en) * 2020-11-17 2023-09-05 中山大学 Monocular sparse optical flow algorithm for outdoor unmanned aerial vehicle

Also Published As

Publication number Publication date
CN102156991B (en) 2013-05-01

Similar Documents

Publication Publication Date Title
CN102156991B (en) Quaternion based object optical flow tracking method
CN105631861B (en) Restore the method for 3 D human body posture from unmarked monocular image in conjunction with height map
US8086027B2 (en) Image processing apparatus and method
CN103514441B (en) Facial feature point locating tracking method based on mobile platform
US9280832B2 (en) Methods, systems, and computer readable media for visual odometry using rigid structures identified by antipodal transform
CN102075686B (en) Robust real-time on-line camera tracking method
CN104794737B (en) A kind of depth information Auxiliary Particle Filter tracking
CN112801074B (en) Depth map estimation method based on traffic camera
CN106157372A (en) A kind of 3D face grid reconstruction method based on video image
Huang et al. Robust human body shape and pose tracking
CN104050685B (en) Moving target detecting method based on particle filter visual attention model
CN111797688A (en) Visual SLAM method based on optical flow and semantic segmentation
US20160267678A1 (en) Methods, systems, and computer readable media for visual odometry using rigid structures identified by antipodal transform
CN113011401B (en) Face image posture estimation and correction method, system, medium and electronic equipment
CN113435336A (en) Running intelligent timing system and method based on artificial intelligence
CN101539989A (en) Human face detection-based method for testing incorrect reading posture
Zhang et al. 3D head tracking under partial occlusion
CN114612933B (en) Monocular social distance detection tracking method
Dornaika et al. A new framework for stereo sensor pose through road segmentation and registration
CN113139504B (en) Identity recognition method, device, equipment and storage medium
CN108694348B (en) Tracking registration method and device based on natural features
CN101710421A (en) Two-dimensional human posture processing method based on sketch
Lefevre et al. Structure and appearance features for robust 3d facial actions tracking
Lin et al. System Implementation of Multiple License Plate Detection And Correction On Wide-Angle Images Using an Instance Segmentation Network Model
CN115272450A (en) Target positioning method based on panoramic segmentation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130501

Termination date: 20170411
