CN103473565B - Image matching method and device - Google Patents
Image matching method and device
- Publication number
- CN103473565B CN103473565B CN201310374229.7A CN201310374229A CN103473565B CN 103473565 B CN103473565 B CN 103473565B CN 201310374229 A CN201310374229 A CN 201310374229A CN 103473565 B CN103473565 B CN 103473565B
- Authority
- CN
- China
- Prior art keywords
- image
- feature point
- point
- projection point
- first feature point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses an image matching method and device. The image matching method comprises the steps of: acquiring an initial feature point set matched from a first image and a second image; determining, successively based on the matching relationship of each pair of feature points in the initial feature point set, the projection point in the second image of a preset reference point in the first image, thereby obtaining multiple projection points; determining, from the multiple projection points, at least one first projection point whose distance from a first position, where the projection points are densest in the second image, is greater than a preset threshold; and removing the feature point pairs corresponding to the first projection points from the initial feature point set, then determining the matching relationship between the first image and the second image according to the feature point pairs remaining in the set. The method can improve both the accuracy and the reliability of image matching.
Description
Technical field
The present invention relates to the technical field of image processing, and more particularly to an image matching method and device.
Background art
Image matching technology is applied in many fields, such as image registration, target recognition, face recognition, and medical image processing. However, no matter which image matching technique is adopted, mismatches cannot be entirely avoided, so the matched point set contains a number of falsely matched point pairs.
To determine the matching relationship between two images, the transformation model parameters between them must be calculated. When the matched point set contains a large number of false matches, the estimation accuracy of the transformation model parameters suffers, which in turn affects the determination of the matching relationship between the images. Algorithms such as Random Sample Consensus (RANSAC) and the least median of squares method are currently used to estimate the image transformation model parameters and can reduce the impact of false matches on the determined matching relationship. However, when the number of false matches is large, parameter estimation with these algorithms still suffers from low accuracy, which impairs the reliability of the determined matching relationship between the images.
Summary of the invention
In view of this, the present invention provides an image matching method and device that can improve the precision of image matching and thereby its reliability.
To achieve the above object, a first aspect of the present invention provides an image matching method, including:
acquiring an initial feature point set matched from a first image and a second image, wherein the initial feature point set includes multiple pairs of feature points;
determining, successively based on the matching relationship of each pair of feature points, the projection point in the second image of a preset reference point in the first image, thereby obtaining multiple projection points;
determining, from the multiple projection points, at least one first projection point whose distance from a first position, where the projection points are densest in the second image, is greater than a preset threshold;
removing the feature point pairs corresponding to the first projection points from the initial feature point set, and determining the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set.
With reference to the first aspect, in a first possible implementation, the step of determining, successively based on the matching relationship of each pair of feature points, the projection point in the second image of the preset reference point in the first image includes:
acquiring feature point information of the first feature point and the second feature point in each pair of feature points, wherein the first feature point is a feature point in the first image, the second feature point is a feature point in the second image, and the feature point information includes the position of the feature point in its image, its scale, and its principal direction;
determining the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first and second feature points and the position of the reference point in the first image.
With reference to the first possible implementation of the first aspect, in a second possible implementation, the determining of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first and second feature points and the position of the reference point in the first image, includes:
calculating, according to the feature point information of the first feature point, a first distance between the first feature point and the reference point, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point;
calculating a scale factor of the second feature point relative to the first feature point according to the scales of the first and second feature points;
calculating the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance, and the scale factor.
With reference to the second possible implementation of the first aspect, in a third possible implementation, the calculating of the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance, and the scale factor, includes:
calculating the position P(X_{r,kj}, Y_{r,kj}) of the projection point of the reference point in the second image as follows:
X_{r,kj} = x_{kj} + S_{ri,kj} · d_{r,ri} · cos(w_{kj} + Δφ_{r,ri});
Y_{r,kj} = y_{kj} + S_{ri,kj} · d_{r,ri} · sin(w_{kj} + Δφ_{r,ri});
wherein x_{kj} and y_{kj} are respectively the abscissa and ordinate of the second feature point in the second image; S_{ri,kj} is the scale factor; d_{r,ri} is the first distance; w_{kj} is the principal direction of the second feature point; and Δφ_{r,ri} is the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point.
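As a plain check of the two equations above, the projection position follows directly from the second feature point's position and principal direction together with the precomputed scale factor, first distance, and angle. The sketch below is illustrative only; the function and variable names are not part of the patent.

```python
import math

def project_reference_point(x_kj, y_kj, w_kj, S_ri_kj, d_r_ri, dphi_r_ri):
    """Position of the reference point's projection in the second image.

    Implements X = x_kj + S*d*cos(w_kj + dphi), Y = y_kj + S*d*sin(w_kj + dphi).
    """
    X = x_kj + S_ri_kj * d_r_ri * math.cos(w_kj + dphi_r_ri)
    Y = y_kj + S_ri_kj * d_r_ri * math.sin(w_kj + dphi_r_ri)
    return X, Y

# Example: second feature point at (10, 20), principal direction 0 rad,
# unit scale factor, first distance 5, angle pi/2 -> projection at (10, 25).
print(project_reference_point(10.0, 20.0, 0.0, 1.0, 5.0, math.pi / 2))
```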
With reference to the first aspect and its first, second, and third possible implementations, in a fourth possible implementation, the determining, from the multiple projection points, of at least one first projection point whose distance from the first position where the projection points are densest in the second image is greater than the preset threshold includes:
counting the number of projection points at each coordinate position in the second image, calculating, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image, and determining the first position as the position of the density maximum.
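One hedged way to realize this step is a simple kernel density estimate over the projection points themselves: evaluate a Gaussian kernel of a chosen window width at each projection point and take the point of maximum density as the first position. This is a sketch under assumed names and a fixed bandwidth, not the patent's prescribed implementation.

```python
import math

def densest_position(points, bandwidth=2.0):
    """Return the projection point at which a Gaussian kernel density
    estimate with the given kernel window width is highest."""
    def density_at(px, py):
        return sum(
            math.exp(-((px - x) ** 2 + (py - y) ** 2) / (2.0 * bandwidth ** 2))
            for x, y in points
        )
    return max(points, key=lambda p: density_at(p[0], p[1]))

# Four projections cluster near (5, 5); one stray false-match projection at (50, 50).
projections = [(5.0, 5.0), (5.5, 4.8), (4.9, 5.2), (5.1, 5.0), (50.0, 50.0)]
print(densest_position(projections))  # a point inside the (5, 5) cluster
```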
With reference to the first aspect and its first, second, third, and fourth possible implementations, in a fifth possible implementation, the determining of the matching relationship between the first image and the second image according to the remaining feature point pairs in the initial feature point set includes:
performing transformation model parameter estimation on the remaining feature points in the initial feature point set using the Random Sample Consensus algorithm, and determining the matching relationship between the first image and the second image according to the estimated transformation model parameters.
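A minimal Random Sample Consensus loop for a pure-translation transformation model illustrates the idea of this step; real implementations (e.g., for similarity or homography models) sample more pairs per iteration. The model choice and all names here are assumptions for illustration, not the patent's specification.

```python
import math
import random

def ransac_translation(pairs, threshold=2.0, iterations=100, seed=0):
    """Estimate a translation (dx, dy) mapping first-image points to
    second-image points, tolerating falsely matched pairs."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iterations):
        (x1, y1), (x2, y2) = rng.choice(pairs)   # minimal sample: one pair
        dx, dy = x2 - x1, y2 - y1
        inliers = [
            p for p in pairs
            if math.hypot(p[0][0] + dx - p[1][0], p[0][1] + dy - p[1][1]) <= threshold
        ]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (dx, dy), inliers
    return best_model, best_inliers

# Three pairs consistent with a (+10, +5) shift, plus one false match.
pairs = [((0, 0), (10, 5)), ((1, 2), (11, 7)), ((3, 1), (13, 6)), ((2, 2), (40, 40))]
model, inliers = ransac_translation(pairs)
print(model, len(inliers))  # -> (10, 5) 3
```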
A second aspect of the present invention provides an image matching device, including:
a feature point pair acquiring unit, configured to acquire an initial feature point set matched from a first image and a second image, wherein the initial feature point set includes multiple pairs of feature points;
a projection point determining unit, configured to determine, successively based on the matching relationship of each pair of feature points, the projection point in the second image of a preset reference point in the first image, thereby obtaining multiple projection points;
a projection point screening unit, configured to determine, from the multiple projection points, at least one first projection point whose distance from a first position, where the projection points are densest in the second image, is greater than a preset threshold;
a feature point pair removing unit, configured to remove the feature point pairs corresponding to the first projection points from the initial feature point set;
a matching unit, configured to determine the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set after the feature point pairs corresponding to the first projection points have been removed.
With reference to the second aspect, in a first possible implementation, the projection point determining unit includes:
an information acquiring unit, configured to acquire feature point information of the first feature point and the second feature point in each pair of feature points, wherein the first feature point is a feature point in the first image, the second feature point is a feature point in the second image, and the feature point information includes the position of the feature point in its image, its scale, and its principal direction;
a projection point determining subunit, configured to determine the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first and second feature points and the position of the reference point in the first image.
With reference to the first possible implementation of the second aspect, in a second possible implementation, the projection point determining subunit includes:
a first calculating unit, configured to calculate, according to the feature point information of the first feature point, a first distance between the first feature point and the reference point, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point;
a second calculating unit, configured to calculate a scale factor of the second feature point relative to the first feature point according to the scales of the first and second feature points;
a projection point calculating unit, configured to calculate the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance, and the scale factor.
With reference to the second aspect and its first and second possible implementations, in a third possible implementation, the projection point screening unit includes:
a projection point screening subunit, configured to count the number of projection points at each coordinate position in the second image, calculate, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image, and determine the first position as the position of the density maximum.
With reference to the second aspect and its first, second, and third possible implementations, in a fourth possible implementation, the matching unit includes:
a matching subunit, configured to perform transformation model parameter estimation, using the Random Sample Consensus algorithm, on the feature points remaining in the initial feature point set after the feature point pairs corresponding to the first projection points have been removed, and to determine the matching relationship between the first image and the second image according to the estimated transformation model parameters.
It can be seen from the above technical solutions that, after acquiring the initial feature point set matched from the first image and the second image to be matched, the present invention determines, based on the matching relationship of each pair of feature points in the initial feature point set, the projection point in the second image of a preset reference point in the first image, and identifies the first projection points whose distance from the first position, where the projection points are densest in the second image, exceeds a preset threshold. Because the projection point determined from a correctly matched feature point pair should fall within a specified range, the feature point pairs corresponding to the first projection points are falsely matched pairs. Therefore, determining the matching relationship between the first image and the second image using the feature point pairs remaining in the initial feature point set after the pairs corresponding to the first projection points have been removed improves the precision and reliability of image matching.
Description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these drawings without creative efforts.
Fig. 1 is a schematic flowchart of an image matching method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of an image matching method according to another embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an image matching device according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an image matching device according to another embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a computing node according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.
An embodiment of the present invention provides an image matching method to improve the precision of image matching.
Referring to Fig. 1, which shows a schematic flowchart of an embodiment of an image matching method according to the present invention, the method includes:
Step 101: Acquire an initial feature point set matched from a first image and a second image.
The initial feature point set includes multiple pairs of feature points. Each pair includes two feature points, one from the first image and one from the second image, and the two feature points have a matching relationship. The feature point pairs in the initial feature point set are matched by an existing feature search and matching algorithm, and the set may therefore contain falsely matched pairs.
The first image and the second image are the two images to be matched. Image matching essentially maps two images of different resolutions, for example images at different positions related by translation, rotation, or different scale ratios, onto each other and determines the corresponding matching relationship between them. The matching relationship between the two images can be determined through the transformation relationship between them, for example what rotation, translation, or scaling exists between the two images. To determine this matching relationship, one of the two images serves as the reference image during matching, while the other is the image to be matched against the reference image. In the embodiments of the present application, the first image is the reference image and the second image is the image to be matched against it.
Step 102: Determine, successively based on the matching relationship of each pair of feature points, the projection point in the second image of a preset reference point in the first image, thereby obtaining multiple projection points.
The reference point is a point selected in advance in the reference image, i.e., the first image, and may be any point selected in the first image. Optionally, the reference point may be the center point of the first image.
For any pair of feature points in the initial feature point set, on the premise that the two feature points of the pair are two mutually matched points in the first and second images, the position of the projection point of the reference point of the first image on the second image is determined. Thus, from the matching relationships of the multiple feature point pairs, multiple projection points of the reference point in the second image are obtained; their positions may coincide, or they may lie at different positions.
Step 103: Determine, from the multiple projection points, at least one first projection point whose distance from the first position, where the projection points are densest in the second image, is greater than a preset threshold.
Since the reference point is a fixed point in the first image, and once the two images are determined their matching relationship is likewise a fixed relationship, if every feature point pair were correctly matched, the projected positions of the reference point would all fall on the same position. It follows that the projection points determined from the matching relationships of the correctly matched pairs in the initial feature point set should coincide at one point in the second image, while the projection points determined from the matching relationships of falsely matched pairs should be scattered over various positions in the second image.
In practice, determining the projection point involves some error, so the projected positions determined from correctly matched pairs cannot all fall exactly on the same point; they are instead distributed within a certain range. Therefore, the position where the projection points are densest in the second image can be determined, and the projection points whose distance from this first position is within the preset threshold are taken as projection points determined from the matching relationships of correctly matched feature point pairs.
Accordingly, if the distance between a projection point's position and the first position is greater than the preset threshold, the feature point pair corresponding to that projection point is determined to be a falsely matched pair. Here, if the projection point of the reference point in the second image is determined based on the matching relationship of a feature point pair, that pair is the feature point pair corresponding to the projection point. For convenience, a projection point whose distance from the first position exceeds the preset threshold is called a first projection point in this application; when the projection point determined from the matching relationship of a feature point pair is a first projection point, that pair is a falsely matched pair.
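Given the first position and the preset threshold, picking out the first projection points, and hence the falsely matched pairs, reduces to a distance test. A minimal sketch, with assumed names:

```python
import math

def first_projection_indices(projections, first_position, threshold):
    """Indices of projection points farther than `threshold` from the
    densest position; their feature point pairs are treated as false matches."""
    fx, fy = first_position
    return [
        i for i, (x, y) in enumerate(projections)
        if math.hypot(x - fx, y - fy) > threshold
    ]

projections = [(5.0, 5.0), (5.4, 5.1), (30.0, 12.0)]
print(first_projection_indices(projections, (5.0, 5.0), threshold=2.0))  # -> [2]
```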
Step 104: Remove the feature point pairs corresponding to the first projection points from the initial feature point set, and determine the matching relationship between the first image and the second image according to the remaining feature point pairs in the initial feature point set.
Since the feature point pairs corresponding to the first projection points are falsely matched pairs, the present application removes them from the initial feature point set and analyzes the image matching relationship using the feature point pairs that remain after the falsely matched pairs have been removed, thereby determining the matching relationship between the first image and the second image.
In the embodiments of the present application, when the initial feature point set matched from the first and second images to be matched is acquired, the matching relationship between the two images is not analyzed directly with the feature point pairs of the initial set. Instead, based on the principle that, once the matching relationship between two images is determined, the projection of a fixed point in one image into the other image should fall within a limited range, the projection point in the second image of the preset reference point in the first image is determined from the matching relationship of each pair of feature points in the initial set. Among the resulting projection points, the first projection points whose distance from the first position, where the projection points are densest in the second image, exceeds the preset threshold are then identified, and the corresponding feature point pairs are determined to be falsely matched. The matching relationship between the first image and the second image is then analyzed using the feature point pairs remaining after the falsely matched pairs have been removed from the initial feature point set, which improves the precision and reliability of image matching.
When determining the matching relationship between the first image and the second image according to the remaining feature point pairs in the initial feature point set, the transformation model parameters may be estimated in any existing manner, after which the matching relationship between the two images is determined. For example, the least median of squares method or M-estimators may be used to estimate the transformation model parameters and thereby determine the matching relationship between the two images. In estimating the transformation model parameters with these algorithms, any false matches still present among the remaining feature point pairs are further removed, which further improves the accuracy of the determined image matching relationship.
Optionally, considering that the Random Sample Consensus (RANSAC) algorithm achieves high accuracy in parameter estimation, the RANSAC algorithm may be used to perform transformation model parameter estimation on the remaining feature point pairs in the initial feature point set, and the matching relationship between the first image and the second image is determined according to the estimated transformation model parameters.
Because RANSAC is a nondeterministic algorithm, estimating the image transformation model parameters with RANSAC requires iterative estimation of the model parameters; it yields a reasonable result only with a certain probability, and raising that probability requires more iterations. Therefore, when the initial feature point set contains many falsely matched points, estimating the transformation model parameters with RANSAC may require many iterations, incurring a large computational load and failing to meet real-time requirements. With the method of the present invention, the falsely matched feature point pairs corresponding to the first projection points are removed first, and RANSAC is then applied to estimate the transformation model parameters, which greatly reduces the number of iterations, lowers the computational cost of the parameter estimation, and meets real-time requirements.
It can be understood that, in any embodiment of the present invention, the process of acquiring the matched initial feature point set is the same as the existing process of matching feature point sets; for example, the initial feature point set is obtained through operations such as feature point extraction and feature point matching, which are not described here again.
Optionally, multi-scale invariant feature detection may be performed on the first and second images respectively to extract a feature point set to be matched from the first image and a corresponding feature point set to be matched from the second image. For example, the Scale-Invariant Feature Transform (SIFT) algorithm, the Speeded Up Robust Features (SURF) algorithm, or the random ferns algorithm may be used for multi-scale invariant feature detection. After the feature point sets to be matched are obtained for the first and second images, the matching relationship between the feature points of the two sets is calculated to match out the initial feature point set; for example, a feature search and matching algorithm is used to match feature point pairs, yielding the initial feature point set containing multiple feature point pairs.
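As a hedged illustration of the feature search and matching step, the sketch below brute-force matches descriptor vectors with the nearest-neighbor distance-ratio test commonly used with SIFT descriptors; the descriptor contents, names, and the 0.8 ratio are assumptions for illustration, not the patent's specification.

```python
import math

def match_descriptors(desc1, desc2, ratio=0.8):
    """Match each descriptor in desc1 to its nearest neighbor in desc2,
    keeping a match only if the nearest distance is clearly smaller than
    the second-nearest (Lowe-style ratio test)."""
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    matches = []
    for i, d1 in enumerate(desc1):
        ranked = sorted(range(len(desc2)), key=lambda j: dist(d1, desc2[j]))
        best, second = ranked[0], ranked[1]
        if dist(d1, desc2[best]) < ratio * dist(d1, desc2[second]):
            matches.append((i, best))
    return matches

# Toy 2-D "descriptors": each point in desc1 has one clear neighbor in desc2.
desc1 = [(1.0, 0.0), (0.0, 1.0)]
desc2 = [(0.0, 0.9), (0.9, 0.1), (5.0, 5.0)]
print(match_descriptors(desc1, desc2))  # -> [(0, 1), (1, 0)]
```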
Optionally, in any embodiment of the present invention, to determine the projection position of the reference point in the second image according to the matching relationship between the first and second feature points of each pair, the feature point information of each feature point in each pair may be acquired; the feature point information includes the position of the feature point in its image, its scale, and its principal direction. For convenience, the feature point of a pair that belongs to the first image is called the first feature point, and the feature point of the pair that belongs to the second image is called the second feature point. For example, the feature point information of the first feature point includes its coordinate position in the first image, its scale, and its principal direction. The position, scale, and principal direction of a feature point may be determined in the existing manner. For example, when the principal direction of a feature point is calculated by the SIFT algorithm, a region of specified size may be selected in the vicinity of the feature point, the gradient histogram of the region is calculated, and the direction of the histogram peak represents the principal direction of the feature point.
The feature point information may be calculated when the feature point pairs between the two images are determined, so that when the projection point of the reference point in the second image is determined, the already calculated feature point information is accessed directly.
Of course, there may be various ways of determining the projection of the reference point according to the feature point information of a pair and the matching relationship between its first and second feature points.
Referring to Fig. 2, which shows a schematic flowchart of another embodiment of an image matching method according to the present invention, this embodiment provides a preferred manner of determining the projection points. The method of this embodiment includes:
Step 201: Acquire an initial feature point set matched from a first image and a second image.
The initial feature point set contains multiple feature point pairs. This step is similar to the corresponding description in the foregoing embodiment and is not described here again.
Step 202: Acquire the feature point information of the first feature point and the second feature point in each pair of feature points.
For any pair of feature points, the first feature point is the feature point of the pair that belongs to the first image, and the second feature point is the feature point of the pair that belongs to the second image. The feature point information includes the position of the feature point in its image, its scale, and its principal direction.
Step 203:According to the characteristic point information of fisrt feature point, default ginseng in fisrt feature point and the first image is calculated
First distance of examination point, and fisrt feature point is to the folder between the principal direction of the direction vector and fisrt feature point of the reference point
Angle.
When the fisrt feature point position and the reference point position determine after, can calculate the fisrt feature point with
First distance of the reference point.
For convenience, in this embodiment a feature point pair is assumed to consist of a first feature point P_ri and a second feature point P_kj. If the position of the reference point in the first image is (x_r, y_r) and the position of the first feature point in the first image is (x_ri, y_ri), the first distance d_r,ri is:
d_r,ri = sqrt((x_r - x_ri)^2 + (y_r - y_ri)^2)  (Formula 1)
Likewise, the vector from the first feature point to the reference point can be determined from the positions of the first feature point and the reference point, giving the direction vector, and then the angle Δφ_r,ri between that direction vector and the principal direction of the first feature point is obtained.
Step 204: According to the scale of the first feature point and the scale of the second feature point, calculate the scaling factor of the second feature point relative to the first feature point.
The scaling factor equals the ratio of the scale of the second feature point to the scale of the first feature point, i.e. the scaling factor S_ri,kj = σ_kj / σ_ri, where σ_ri is the scale of the first feature point and σ_kj is the scale of the second feature point.
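Steps 203 and 204 can be sketched as follows; this is illustrative only, and the tuple layout of a feature point (x, y, scale, principal direction) is an assumption:

```python
import math

def pair_parameters(ref, p1, p2):
    """Compute, for one feature point pair: the first distance d,
    the angle dphi between the vector p1->ref and p1's principal
    direction, and the scaling factor S.
    ref: (x_r, y_r); p1 and p2: (x, y, scale, principal_direction)."""
    (xr, yr), (x1, y1, s1, w1) = ref, p1
    s2 = p2[2]
    d = math.hypot(xr - x1, yr - y1)          # first distance (Formula 1)
    dphi = math.atan2(yr - y1, xr - x1) - w1  # angle to principal direction
    scale = s2 / s1                           # scaling factor S = sigma_kj / sigma_ri
    return d, dphi, scale
```

All three quantities depend only on the first image and the pair's scales, so they can be computed once per pair and reused.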
Step 205: Based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance, and the scaling factor, calculate the position of the projection point of the reference point in the second image.
From the parameters computed in the above steps, the position of the projection point of the reference point in the second image, based on the matching relationship of this feature point pair, can be determined.
Specifically, for the matching relationship of this feature point pair, the position P(X_r,kj, Y_r,kj) of the projection point of the reference point in the second image is calculated as follows:
X_r,kj = x_kj + S_ri,kj * d_r,ri * cos(w_kj + Δφ_r,ri)  (Formula 2)
Y_r,kj = y_kj + S_ri,kj * d_r,ri * sin(w_kj + Δφ_r,ri)  (Formula 3)
where x_kj and y_kj are respectively the horizontal and vertical coordinates of the second feature point in the second image; S_ri,kj is the scaling factor; d_r,ri is the first distance; w_kj is the principal direction of the second feature point; and Δφ_r,ri is the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point.
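Formulas 2 and 3 translate directly into code; a minimal sketch:

```python
import math

def project_reference(x_kj, y_kj, w_kj, scale, d, dphi):
    """Project the reference point into the second image using one
    matched pair (Formulas 2 and 3): place it at distance scale*d
    from the second feature point, at angle w_kj + dphi."""
    x = x_kj + scale * d * math.cos(w_kj + dphi)  # Formula 2
    y = y_kj + scale * d * math.sin(w_kj + dphi)  # Formula 3
    return x, y
```

Running this once per feature point pair yields the multiple projection points whose concentration is examined in the following steps.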
Step 206: From the multiple projection points, determine at least one first projection point whose distance from the first position, where the projection points in the second image are most dense, is greater than a preset threshold.
Step 207: Remove the feature point pairs corresponding to the first projection points from the initial feature point set, and determine the matching relationship between the first image and the second image according to the remaining feature point pairs in the initial feature point set.
The operations of step 206 and step 207 may refer to the related descriptions of the above embodiment and are not repeated here.
In this embodiment of the present application, since the position, scale and principal direction of the two feature points in a feature point pair can be computed while the feature points are being matched, there is no need to compute the feature point information separately, which avoids adding to the amount of computation. Moreover, determining the projection points of the reference point from this feature point information requires relatively little computation, so the positions of the projection points can be determined relatively quickly, the mismatched feature points can then be removed according to the positions of the projection points, and the speed of removing mismatched points is improved.
Optionally, in any one of the above embodiments, after the positions of the multiple projection points of the reference point in the second image are obtained, determining the first position where the projection points in the second image are most dense may be performed by: counting the number of projection points at each coordinate position in the second image, computing, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image, and determining the first position where the density maximum is located. For example, an accumulator may be used to count the number of projection points at each coordinate position in the second image: each time the position of a projection point of the reference point in the second image is determined based on the matching relationship of the two feature points in a feature point pair, the accumulator increments the value at the position of the current projection point by one; once the multiple projection points have been obtained, the value accumulated at each position gives the number of projection points at that position.
The idea of kernel density estimation here is to take, at each position, the proportion of the total number of samples that fall within the kernel window width, and thereby obtain the kernel window position where the density maximum is located.
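A minimal NumPy sketch of the accumulator and the kernel-window density search described above; the grid size and window width are assumptions, and a uniform box kernel stands in for whatever kernel an implementation would choose:

```python
import numpy as np

def densest_position(points, shape, window=5):
    """Accumulate projection points into a grid, then find the
    position whose surrounding kernel window holds the largest
    share of all samples."""
    acc = np.zeros(shape)
    for x, y in points:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < shape[0] and 0 <= xi < shape[1]:
            acc[yi, xi] += 1  # accumulator: one count per projection point
    # Density at each position: sliding-window sum over the kernel
    # window width, normalised by the total number of samples.
    half = window // 2
    padded = np.pad(acc, half)
    density = sum(padded[dy:dy + shape[0], dx:dx + shape[1]]
                  for dy in range(window) for dx in range(window))
    density /= max(len(points), 1)
    yi, xi = np.unravel_index(np.argmax(density), density.shape)
    return xi, yi
```

Projection points far from the returned position (beyond the preset threshold) would then be treated as first projection points and their pairs removed.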
Corresponding to the method of the present invention, the present invention also provides an image matching apparatus. Referring to Fig. 3, which shows a schematic structural diagram of an embodiment of an image matching apparatus of the present invention, the apparatus of this embodiment may include: a feature point pair acquiring unit 301, a projection point determining unit 302, a projection point screening unit 303, a feature point pair removing unit 304 and a matching unit 305.
The feature point pair acquiring unit 301 is configured to obtain the initial feature point set matched from the first image and the second image, where the initial feature point set includes multiple feature point pairs.
The projection point determining unit 302 is configured to determine, successively based on the matching relationship of each feature point pair, the projection point, in the second image, of the reference point preset in the first image, obtaining multiple projection points.
The projection point screening unit 303 is configured to determine, from the multiple projection points, at least one first projection point whose distance from the first position, where the projection points in the second image are most dense, is greater than a preset threshold.
The feature point pair removing unit 304 is configured to remove the feature point pairs corresponding to the first projection points from the initial feature point set.
The matching unit 305 is configured to determine the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set after the feature point pairs corresponding to the first projection points have been removed.
Optionally, the feature point pair acquiring unit is more specifically configured to: perform scale-invariant feature detection on the first image and the second image to obtain the sets of feature points to be matched in the first image and the second image respectively; and calculate the matching relationship of each feature point in the sets of feature points to be matched in the first image and the second image respectively, obtaining the matched initial feature point set.
Optionally, the matching unit 305 may include: a matching subunit, configured to use a random sample consensus (RANSAC) algorithm to perform transformation model parameter estimation on the feature points remaining in the initial feature point set after the feature point pairs corresponding to the first projection points have been removed, and to determine the matching relationship between the first image and the second image according to the estimated transformation model parameters.
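As an illustration of transformation model parameter estimation by RANSAC (the patent does not fix the model; a simple 2-D translation model is used here purely to keep the sketch short):

```python
import numpy as np

def ransac_translation(src, dst, n_iters=100, tol=2.0, seed=0):
    """Estimate a 2-D translation dst ~ src + t by RANSAC:
    repeatedly hypothesise t from one randomly chosen pair, keep
    the hypothesis with the most inliers, then refit on them."""
    rng = np.random.default_rng(seed)
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iters):
        i = rng.integers(len(src))
        t = dst[i] - src[i]                       # model from a minimal sample
        residual = np.linalg.norm(dst - (src + t), axis=1)
        inliers = residual < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit the model on all inliers of the best hypothesis.
    t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return t, best_inliers
```

Because the mismatched pairs have already been screened out by the projection-point step, RANSAC here starts from a cleaner set and converges with fewer iterations.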
Referring to Fig. 4, which shows a schematic structural diagram of another embodiment of an image matching apparatus of the present invention, the difference from the previous apparatus embodiment is that:
In this embodiment the projection point determining unit 302 may include:
an information acquiring unit 3021, configured to obtain the feature point information of the first feature point and the second feature point in each feature point pair, where the first feature point is a feature point in the first image, the second feature point is a feature point in the second image, and the feature point information includes: the position of the feature point in the image it belongs to, its scale and its principal direction; and
a projection point determining subunit 3022, configured to determine the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first feature point and the second feature point and the position of the reference point in the first image.
Further, the projection point determining subunit may include:
a first calculating unit, configured to calculate, according to the feature point information of the first feature point, the first distance between the first feature point and the reference point, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point;
a second calculating unit, configured to calculate the scaling factor of the second feature point relative to the first feature point according to the scale of the first feature point and the scale of the second feature point; and
a projection point calculating unit, configured to calculate the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance and the scaling factor.
Specifically, the projection point calculating unit calculates the position P(X_r,kj, Y_r,kj) of the projection point of the reference point in the second image as follows:
X_r,kj = x_kj + S_ri,kj * d_r,ri * cos(w_kj + Δφ_r,ri);
Y_r,kj = y_kj + S_ri,kj * d_r,ri * sin(w_kj + Δφ_r,ri);
where x_kj and y_kj are respectively the horizontal and vertical coordinates of the second feature point in the second image; S_ri,kj is the scaling factor; d_r,ri is the first distance; w_kj is the principal direction of the second feature point; and Δφ_r,ri is the angle described above.
Optionally, in any one of the above embodiments, the projection point screening unit may include:
a projection point screening subunit, configured to count the number of projection points at each coordinate position in the second image, compute, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image respectively, and determine the first position where the density maximum is located.
In another aspect, the present invention also provides a computing node. Referring to Fig. 5, which shows a schematic structural diagram of an embodiment of a computing node of the present invention, the computing node may be a host server having computing capability, a personal computer (PC), a portable computer, a terminal, or the like. The computing node 500 of this embodiment may include: a memory 501, a processor 502, a communication interface 503 and a communication bus 504.
The memory 501, the processor 502 and the communication interface 503 communicate with each other through the communication bus 504.
The communication interface 503 is configured to communicate with network elements, such as nodes in a shared storage system, other network terminals, and the like.
The memory 501 is configured to store the initial feature point set matched from the first image and the second image, and to store information including the program run by the processor 502, where the initial feature point set contains multiple feature point pairs. The memory may include a high-speed RAM memory, and may also include a non-volatile memory.
The processor 502 is configured to: determine, successively based on the matching relationship of each feature point pair, the projection point, in the second image, of the reference point preset in the first image, obtaining multiple projection points; determine, from the multiple projection points, at least one first projection point whose distance from the first position, where the projection points in the second image are most dense, is greater than a preset threshold; and remove the feature point pairs corresponding to the first projection points from the initial feature point set, and determine the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set.
The processor may be a central processing unit (CPU), or an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may refer to one another. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively brief, and the relevant parts may refer to the description of the method.
Those skilled in the art will further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of function. Whether these functions are performed in hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered beyond the scope of the present invention.
The steps of the method or algorithm described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), internal memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, a register, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (12)
1. An image matching method, comprising:
obtaining an initial feature point set matched from a first image and a second image, wherein the initial feature point set includes multiple feature point pairs, each feature point pair includes two feature points, from the first image and the second image respectively, and the two feature points have a matching relationship;
determining, successively based on the matching relationship of each feature point pair, a projection point, in the second image, of a reference point preset in the first image, to obtain multiple projection points;
determining, from the multiple projection points, at least one first projection point whose distance from a first position, where the projection points in the second image are most dense, is greater than a preset threshold;
removing the feature point pairs corresponding to the first projection points from the initial feature point set; and
performing transformation model parameter estimation according to the feature point pairs remaining in the initial feature point set, and determining the matching relationship between the first image and the second image according to the estimated transformation model parameters.
2. The method according to claim 1, wherein the determining, successively based on the matching relationship of each feature point pair, the projection point of the preset reference point in the first image in the second image comprises:
obtaining feature point information of a first feature point and a second feature point in each feature point pair, wherein the first feature point is a feature point in the first image, the second feature point is a feature point in the second image, and the feature point information includes: a position of the feature point in the image it belongs to, a scale, and a principal direction; and
determining the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first feature point and the second feature point and the position of the reference point in the first image.
3. The method according to claim 2, wherein the determining the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first feature point and the second feature point and the position of the reference point in the first image comprises:
calculating, according to the feature point information of the first feature point, a first distance between the first feature point and the reference point, and an angle between a direction vector from the first feature point to the reference point and the principal direction of the first feature point;
calculating a scaling factor of the second feature point relative to the first feature point according to the scale of the first feature point and the scale of the second feature point; and
calculating the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance, and the scaling factor.
4. The method according to claim 3, wherein the calculating the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance, and the scaling factor comprises:
calculating the position P(X_r,kj, Y_r,kj) of the projection point of the reference point in the second image as follows:
X_r,kj = x_kj + S_ri,kj * d_r,ri * cos(w_kj + Δφ_r,ri);
Y_r,kj = y_kj + S_ri,kj * d_r,ri * sin(w_kj + Δφ_r,ri);
wherein x_kj and y_kj are respectively the horizontal and vertical coordinates of the second feature point in the second image; S_ri,kj is the scaling factor; d_r,ri is the first distance; w_kj is the principal direction of the second feature point; and Δφ_r,ri is the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point.
5. The method according to any one of claims 1 to 4, wherein the determining, from the multiple projection points, at least one first projection point whose distance from the first position, where the projection points in the second image are most dense, is greater than the preset threshold comprises:
counting the number of projection points at each coordinate position in the second image, computing, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image respectively, and determining the first position where the density maximum is located.
6. The method according to any one of claims 1 to 4, wherein the determining the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set comprises:
performing transformation model parameter estimation on the feature points remaining in the initial feature point set using a random sample consensus (RANSAC) algorithm, and determining the matching relationship between the first image and the second image according to the estimated transformation model parameters.
7. The method according to any one of claims 1 to 4, wherein the obtaining the initial feature point set matched from the first image and the second image comprises:
performing multi-scale invariant feature detection on the first image and the second image to obtain sets of feature points to be matched in the first image and the second image respectively; and
calculating the matching relationship of each feature point in the sets of feature points to be matched in the first image and the second image respectively, to obtain the matched initial feature point set.
8. An image matching apparatus, comprising:
a feature point pair acquiring unit, configured to obtain an initial feature point set matched from a first image and a second image, wherein the initial feature point set includes multiple feature point pairs, each feature point pair includes two feature points, from the first image and the second image respectively, and the two feature points have a matching relationship;
a projection point determining unit, configured to determine, successively based on the matching relationship of each feature point pair, a projection point, in the second image, of a reference point preset in the first image, to obtain multiple projection points;
a projection point screening unit, configured to determine, from the multiple projection points, at least one first projection point whose distance from a first position, where the projection points in the second image are most dense, is greater than a preset threshold;
a feature point pair removing unit, configured to remove the feature point pairs corresponding to the first projection points from the initial feature point set; and
a matching unit, configured to perform transformation model parameter estimation according to the feature point pairs remaining in the initial feature point set after the feature point pairs corresponding to the first projection points have been removed, and to determine the matching relationship between the first image and the second image according to the estimated transformation model parameters.
9. The apparatus according to claim 8, wherein the projection point determining unit includes:
an information acquiring unit, configured to obtain feature point information of a first feature point and a second feature point in each feature point pair, wherein the first feature point is a feature point in the first image, the second feature point is a feature point in the second image, and the feature point information includes: a position of the feature point in the image it belongs to, a scale, and a principal direction; and
a projection point determining subunit, configured to determine the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first feature point and the second feature point and the position of the reference point in the first image.
10. The apparatus according to claim 8, wherein the projection point determining subunit includes:
a first calculating unit, configured to calculate, according to the feature point information of the first feature point, a first distance between the first feature point and the reference point, and an angle between a direction vector from the first feature point to the reference point and the principal direction of the first feature point;
a second calculating unit, configured to calculate a scaling factor of the second feature point relative to the first feature point according to the scale of the first feature point and the scale of the second feature point; and
a projection point calculating unit, configured to calculate the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance, and the scaling factor.
11. The apparatus according to any one of claims 8 to 10, wherein the projection point screening unit includes:
a projection point screening subunit, configured to count the number of projection points at each coordinate position in the second image, compute, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image respectively, and determine the first position where the density maximum is located.
12. The apparatus according to any one of claims 8 to 10, wherein the matching unit includes:
a matching subunit, configured to use a random sample consensus (RANSAC) algorithm to perform transformation model parameter estimation on the feature points remaining in the initial feature point set after the feature point pairs corresponding to the first projection points have been removed, and to determine the matching relationship between the first image and the second image according to the estimated transformation model parameters.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310374229.7A CN103473565B (en) | 2013-08-23 | 2013-08-23 | Image matching method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310374229.7A CN103473565B (en) | 2013-08-23 | 2013-08-23 | Image matching method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103473565A CN103473565A (en) | 2013-12-25 |
CN103473565B true CN103473565B (en) | 2017-04-26 |
Family
ID=49798409
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310374229.7A Expired - Fee Related CN103473565B (en) | 2013-08-23 | 2013-08-23 | Image matching method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103473565B (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9558426B2 (en) * | 2014-04-24 | 2017-01-31 | Nant Holdings Ip, Llc | Robust feature identification for image-based object recognition |
KR101932547B1 (en) * | 2014-10-23 | 2018-12-27 | 한화테크윈 주식회사 | Camera system and Method of image registration thereof |
CN105095344B (en) * | 2015-05-29 | 2018-08-17 | 南昌正业科技有限公司 | A kind of modeling method and device of space-time loop data structure |
CN105205494B (en) * | 2015-08-31 | 2018-12-11 | 小米科技有限责任公司 | Similar pictures recognition methods and device |
KR20170070415A (en) * | 2015-12-14 | 2017-06-22 | 현대자동차주식회사 | Image projectiing apparatus and control method thereof |
AU2016409676B2 (en) * | 2016-06-08 | 2020-01-30 | Huawei Technologies Co., Ltd. | Processing method and terminal |
CN106779055B (en) * | 2017-01-10 | 2019-06-21 | 北京邮电大学 | Image characteristic extracting method and device |
CN108182457B (en) * | 2018-01-30 | 2022-01-28 | 百度在线网络技术(北京)有限公司 | Method and apparatus for generating information |
CN109186625B (en) * | 2018-10-24 | 2020-05-05 | 北京奥特贝睿科技有限公司 | Method and system for accurately positioning intelligent vehicle by using hybrid sampling filtering |
CN110335315B (en) * | 2019-06-27 | 2021-11-02 | Oppo广东移动通信有限公司 | Image processing method and device and computer readable storage medium |
CN111010496B (en) * | 2019-12-24 | 2022-07-08 | 维沃移动通信(杭州)有限公司 | Image processing method and electronic equipment |
CN111083456B (en) * | 2019-12-24 | 2023-06-16 | 成都极米科技股份有限公司 | Projection correction method, apparatus, projector, and readable storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007334795A (en) * | 2006-06-19 | 2007-12-27 | Sony Corp | Information processor and information processing method, and program |
CN101984463A (en) * | 2010-11-02 | 2011-03-09 | 中兴通讯股份有限公司 | Method and device for synthesizing panoramic image |
CN102005047A (en) * | 2010-11-15 | 2011-04-06 | 无锡中星微电子有限公司 | Image registration system and method thereof |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101556692A (en) * | 2008-04-09 | 2009-10-14 | 西安盛泽电子有限公司 | Image mosaic method based on neighborhood Zernike pseudo-matrix of characteristic points |
KR20110064197A (en) * | 2009-12-07 | 2011-06-15 | 삼성전자주식회사 | Object recognition system and method the same |
JP2011259044A (en) * | 2010-06-04 | 2011-12-22 | Panasonic Corp | Image processing device and image processing method |
CN102256111B (en) * | 2011-07-17 | 2013-06-12 | 西安电子科技大学 | Multi-channel panoramic video real-time monitoring system and method |
- 2013-08-23 CN CN201310374229.7A patent/CN103473565B/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007334795A (en) * | 2006-06-19 | 2007-12-27 | Sony Corp | Information processor and information processing method, and program |
CN101984463A (en) * | 2010-11-02 | 2011-03-09 | 中兴通讯股份有限公司 | Method and device for synthesizing panoramic image |
CN102005047A (en) * | 2010-11-15 | 2011-04-06 | 无锡中星微电子有限公司 | Image registration system and method thereof |
Non-Patent Citations (1)
Title |
---|
An Improved Harris Feature Point Matching Algorithm; Zhang Bo et al.; Computer Systems & Applications; 2013-07-31; Vol. 22, No. 7; full text *
Also Published As
Publication number | Publication date |
---|---|
CN103473565A (en) | 2013-12-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103473565B (en) | Image matching method and device | |
US9942535B2 (en) | Method for 3D scene structure modeling and camera registration from single image | |
CN109141364A (en) | Obstacle detection method, system and robot | |
CN103034982B (en) | Image super-resolution reconstruction method based on variable-focal-length video sequence | |
CN109029381A (en) | Tunnel crack detection method, system and terminal device | |
CN104504723B (en) | Image registration method based on salient visual features | |
KR101645292B1 (en) | System and method for automatic planning of two-dimensional views in 3d medical images | |
CN107194959A (en) | Method and apparatus for image registration based on slices | |
TW201710989A (en) | System and method for determining whether a product image includes a logo pattern | |
CN105953773B (en) | Ramp slope angle acquisition methods and device | |
CN110210067B (en) | Method and device for determining threshold straight line based on measurement track | |
CN109239654A (en) | Neural-network-based error correction method for TDOA positioning results | |
CN108122280A (en) | Method and device for reconstructing a three-dimensional point cloud | |
CN108737694A (en) | Camera arrangement and image providing method | |
CN109948439A (en) | Biopsy method, system and terminal device | |
CN113112486B (en) | Tumor motion estimation method and device, terminal equipment and storage medium | |
CN108765447A (en) | Image segmentation method, image segmentation device and electronic equipment | |
CN107918688A (en) | Scene model dynamic estimation method, data analysis method and device, electronic equipment | |
CN104573737B (en) | Feature point positioning method and device | |
CN101894369A (en) | Real-time method for computing focal length of camera from image sequence | |
CN109684950A (en) | Processing method and electronic equipment | |
Clyne et al. | Physically-based feature tracking for CFD data | |
CN111814514A (en) | Number recognition device and method and electronic equipment | |
CN112991445B (en) | Model training method, gesture prediction method, device, equipment and storage medium | |
CN116309643A (en) | Face shielding score determining method, electronic equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20170426; Termination date: 20180823 |
|