CN114119952B - Image matching method and device based on edge information - Google Patents
- Publication number
- CN114119952B CN114119952B CN202111190561.9A CN202111190561A CN114119952B CN 114119952 B CN114119952 B CN 114119952B CN 202111190561 A CN202111190561 A CN 202111190561A CN 114119952 B CN114119952 B CN 114119952B
- Authority
- CN
- China
- Prior art keywords
- layer
- image
- pyramid
- matching
- current
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
Abstract
The invention provides an image matching method based on edge information. The method comprises: layering a template image and a target image to be detected into pyramids respectively; setting a rotation step for each layer of the template image; rotating the template image under each pyramid layer; acquiring edge gradient information of each image in the rotated image set; acquiring edge gradient information of each pyramid-layer image of the target image to be detected; setting the search range of the current pyramid layer based on the result with the maximum similarity in the matching information of the layer above the current pyramid layer; and performing matching within the search range of the current pyramid layer to obtain the matching information of the current pyramid layer. When the current pyramid layer is the bottommost pyramid layer, the matching information of the current pyramid layer is recorded as the final matching position of the target. The method achieves fast, stable, and high-precision matching and positioning even when the target image to be detected is scaled, rotated, occluded, unevenly illuminated, or has a cluttered background.
Description
Technical Field
The present invention relates to the field of image recognition technologies, and in particular, to an image matching method and apparatus based on edge information.
Background
Image matching is the process of searching a target image to be detected for an image region similar to a template image. Existing image matching algorithms fall broadly into the following categories: gray-scale-based, feature-based, transform-domain-based, and model-based matching algorithms.
A gray-scale-based matching algorithm performs matching by computing the similarity of gray values between the template image and the target image to be detected. Such an algorithm can produce satisfactory results under good illumination, but it is sensitive to noise, and changes in external conditions or defects and occlusions in the target image can greatly degrade matching precision. A feature-based matching algorithm first extracts features from the template image and then matches it against the target image to be detected according to the degree of feature similarity; the extracted features include point features, edge features, region features, and the like. A transform-domain-based matching algorithm matches images in the frequency domain, where the frequency-domain transforms are mainly the Fourier transform and the wavelet transform; this class of algorithms has developed relatively slowly. A model-based matching algorithm first establishes a motion model, then estimates the target transformation parameters, and finally obtains the matching position through Kalman filtering. For this class of algorithms, accurately establishing the motion model is the key to a correct matching result, but the computation, analysis, and model updating are complex, and real-time performance is poor.
In practical scenes, feature-based matching algorithms are more widely used. However, objects in the target image to be detected often differ in size and angle from those in the template image, so the two cannot be matched exactly, and the target image to be detected is easily affected by illumination, occlusion, noise, and so on. As a result, image matching often suffers from low precision, low speed, and poor stability.
Disclosure of Invention
In order to solve the above technical problems, the invention provides an image matching method and apparatus based on edge information, which address the problems of low accuracy, low speed, and poor stability of image matching results in the prior art.
According to a first aspect of the present invention, there is provided an image matching method based on edge information, the method comprising the steps of:
step S101: setting the number of pyramid layers, and respectively layering the template image and the target image to be tested;
Step S102: setting the rotation step of each layer of the template image; rotating the template image under each pyramid layer through the full circumferential range according to the rotation step, each rotation yielding one rotated image, so as to obtain the rotated image set M_in of each pyramid layer; wherein i denotes the pyramid layer and n denotes the number of rotated images in the i-th layer;
Step S103: acquiring edge gradient information of each image in the rotated image set M_in;
Step S104: acquiring edge gradient information of each pyramid layer image in the target image to be detected;
Step S105: taking the topmost pyramid layer of the target image to be detected as a current pyramid layer, searching a template image in the current pyramid layer, and calculating the similarity between the template image and the target image to be detected to obtain first matching information;
Step S106: if the current pyramid layer is the bottommost pyramid layer, recording the matching information of the current pyramid layer as the final matching position (u_final, v_final, θ_final) of the target, and ending the method; otherwise, setting the next layer below the current pyramid layer as the current pyramid layer; wherein u_final, v_final, and θ_final are respectively the x coordinate of the final matching position, the y coordinate of the final matching position, and the rotation angle of the template image corresponding to the final matching position;
Step S107: setting the search range (u_{i+1} ± Δu, v_{i+1} ± Δv, θ_{i+1} ± AngleStep_{i+1}) of the current pyramid layer based on the result (u_{i+1}, v_{i+1}, θ_{i+1}) with the maximum similarity in the matching information of the layer above the current pyramid layer; wherein (u_{i+1}, v_{i+1}) is the coordinate of the matching point with the maximum similarity obtained in the layer above the current pyramid layer, θ_{i+1} is the angle obtained in the layer above the current pyramid layer, Δu and Δv are coordinate differences set according to the image size, and AngleStep_{i+1} is the rotation step of the layer above the current pyramid layer; performing matching within the search range of the current pyramid layer to obtain the matching information of the current pyramid layer; and proceeding to step S106.
According to a second aspect of the present invention, there is provided an image matching apparatus based on edge information, the apparatus comprising:
An initialization module: configured to set the number of pyramid layers and to layer the template image and the target image to be detected into pyramids respectively;
A rotation module: configured to set the rotation step of each layer of the template image; to rotate the template image under each pyramid layer through the full circumferential range according to the rotation step, each rotation yielding one rotated image; and to obtain the rotated image set M_in of each pyramid layer; wherein i denotes the pyramid layer and n denotes the number of rotated images in the i-th layer;
An edge gradient acquisition module: configured to acquire edge gradient information of each image in the rotated image set M_in;
A target edge gradient acquisition module: configured to acquire edge gradient information of each pyramid-layer image of the target image to be detected;
A first matching module: configured to take the topmost pyramid layer of the target image to be detected as the current pyramid layer, search for the template image in the current pyramid layer, and calculate the similarity between the template image and the target image to be detected to obtain first matching information;
A judging module: configured to, if the current pyramid layer is the bottommost pyramid layer, record the matching information of the current pyramid layer as the final matching position (u_final, v_final, θ_final) of the target and end the processing; otherwise, to set the next layer below the current pyramid layer as the current pyramid layer; wherein u_final, v_final, and θ_final are respectively the x coordinate of the final matching position, the y coordinate of the final matching position, and the rotation angle of the template image corresponding to the final matching position;
A similarity calculation module: configured to set the search range (u_{i+1} ± Δu, v_{i+1} ± Δv, θ_{i+1} ± AngleStep_{i+1}) of the current pyramid layer based on the result (u_{i+1}, v_{i+1}, θ_{i+1}) with the maximum similarity in the matching information of the layer above the current pyramid layer; wherein (u_{i+1}, v_{i+1}) is the coordinate of the matching point with the maximum similarity obtained in the layer above the current pyramid layer, θ_{i+1} is the angle obtained in the layer above the current pyramid layer, Δu and Δv are coordinate differences set according to the image size, and AngleStep_{i+1} is the rotation step of the layer above the current pyramid layer; to perform matching within the search range of the current pyramid layer to obtain the matching information of the current pyramid layer; and to trigger the judging module.
According to a third aspect of the present invention, there is provided an image matching system based on edge information, comprising:
a processor for executing a plurality of instructions;
a memory for storing a plurality of instructions;
Wherein the plurality of instructions are stored in the memory, and are loaded and executed by the processor to perform the image matching method based on edge information as described above.
According to a fourth aspect of the present invention, there is provided a computer-readable storage medium having a plurality of instructions stored therein; the instructions are configured to be loaded and executed by a processor to perform the image matching method based on edge information as described above.
According to the above scheme, the template image is described by edge gradient features, which reduces the amount of computation during image matching, shortens the matching time, and provides strong robustness against noise, occlusion, and the like. The method automatically calculates the number of image pyramid layers according to the image size and the number of feature points, and sets a different angular rotation step on each feature layer of the template image pyramid, which avoids the heavy computation caused by angle enumeration in existing matching methods and copes with various complex scenes more quickly and stably. The method introduces a coefficient b into the termination condition of the similarity measurement, which accommodates the matching-score requirements of the images at every layer of the image pyramid, thereby improving matching accuracy and stability. When determining the matching angle, the method obtains a sub-step angle value through polynomial fitting and records it as the final matching angle, further improving matching accuracy. Fast, stable, and high-precision matching and positioning can thus be achieved even when the target image to be detected is scaled, rotated, occluded, unevenly illuminated, or has a cluttered background.
The foregoing is merely an overview of the technical solution of the present invention. To make the technical means of the invention clearer and implementable in accordance with the description, preferred embodiments of the invention are described in detail below with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention, illustrate the invention and together with the description serve to explain the invention. In the drawings:
FIG. 1 is a schematic flow chart of an image matching method based on edge information according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of image matching based on edge information according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an image matching device based on edge information according to an embodiment of the present invention.
Detailed Description
First, an image matching method based on edge information according to an embodiment of the present invention is described with reference to fig. 1, and as shown in fig. 1-2, the image matching method based on edge information includes:
step S101: setting the number of pyramid layers, and respectively layering the template image and the target image to be tested;
Step S102: setting the rotation step of each layer of the template image; rotating the template image under each pyramid layer through the full circumferential range according to the rotation step, each rotation yielding one rotated image, so as to obtain the rotated image set M_in of each pyramid layer; wherein i denotes the pyramid layer and n denotes the number of rotated images in the i-th layer;
Step S103: acquiring edge gradient information of each image in the rotated image set M_in;
Step S104: acquiring edge gradient information of each pyramid layer image in the target image to be detected;
Step S105: taking the topmost pyramid layer of the target image to be detected as a current pyramid layer, searching a template image in the current pyramid layer, and calculating the similarity between the template image and the target image to be detected to obtain first matching information;
Step S106: if the current pyramid layer is the bottommost pyramid layer, recording the matching information of the current pyramid layer as the final matching position (u_final, v_final, θ_final) of the target, and ending the method; otherwise, setting the next layer below the current pyramid layer as the current pyramid layer; wherein u_final, v_final, and θ_final are respectively the x coordinate of the final matching position, the y coordinate of the final matching position, and the rotation angle of the template image corresponding to the final matching position;
Step S107: setting the search range (u_{i+1} ± Δu, v_{i+1} ± Δv, θ_{i+1} ± AngleStep_{i+1}) of the current pyramid layer based on the result (u_{i+1}, v_{i+1}, θ_{i+1}) with the maximum similarity in the matching information of the layer above the current pyramid layer; wherein (u_{i+1}, v_{i+1}) is the coordinate of the matching point with the maximum similarity obtained in the layer above the current pyramid layer, θ_{i+1} is the angle obtained in the layer above the current pyramid layer, Δu and Δv are coordinate differences set according to the image size, and AngleStep_{i+1} is the rotation step of the layer above the current pyramid layer; performing matching within the search range of the current pyramid layer to obtain the matching information of the current pyramid layer; and proceeding to step S106.
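The coarse-to-fine loop formed by steps S105–S107 can be sketched as follows. This is a minimal illustration, not the patented implementation: `search` is a hypothetical callback standing in for the constrained per-layer similarity search, the Δu = Δv = 5 defaults merely echo the example value given later in the description, and any mapping of coordinates between layer resolutions is omitted.

```python
def pyramid_match(search, num_layers, s=2, du=5, dv=5):
    # Step S105: full search on the topmost layer (layer num_layers - 1).
    u, v, theta = search(num_layers - 1, None)
    # Steps S106-S107: descend one layer at a time down to layer 0,
    # searching only inside (u ± Δu, v ± Δv, θ ± AngleStep of the layer above).
    for layer in range(num_layers - 2, -1, -1):
        step_above = s ** (layer + 1)            # AngleStep_{i+1} = s^(i+1)
        ranges = ((u - du, u + du), (v - dv, v + dv),
                  (theta - step_above, theta + step_above))
        u, v, theta = search(layer, ranges)
    return u, v, theta                           # (u_final, v_final, theta_final)
```

A fake `search` that simply returns the centre of the allowed range is enough to trace how the range narrows layer by layer.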
The step S101, in which pyramid layering is performed on the template image and the target image to be detected respectively and the required number of pyramid layers is obtained, comprises the following steps:
Step S1011: acquiring the length and width of the reduced image:

L_i = L / k^i (formula 1)

wherein i denotes the i-th pyramid layer; L denotes the initial length and width of the image, i.e., the length and width of the bottommost pyramid layer; L_i denotes the length and width of the i-th pyramid layer; and k denotes the image reduction factor, with k ≥ 2;
Step S1012: acquiring the required number of pyramid layers i_c according to the reduced length and width of the image:

The feature points of each pyramid layer are extracted with the Sobel operator. If the number of feature points of the (i_c+1)-th layer image in the template image pyramid is less than or equal to a preset feature-point-count threshold, the number of feature points of the i_c-th layer image is greater than that threshold, and the i_c-th layer image has the fewest feature points among layers 0 through i_c, then the number of pyramid layers required by the template image is determined to be i_c; the number of pyramid layers of the target image to be detected is likewise set to i_c.
For example, if the number of feature points of the layer-4 image in the template image pyramid is smaller than the preset threshold of 20 and the number of feature points of the layer-3 image is larger than 20, the number of layers of the template image pyramid is determined to be 3. Likewise, the pyramid of the target image to be detected has 3 layers.
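The layer-count rule above can be illustrated with a short sketch. The function names are ours, `feature_counts[i]` is assumed to hold the number of edge feature points found on pyramid layer i (e.g. by a Sobel detector), and feature counts are assumed to decrease with the layer index, as in the example.

```python
def pyramid_size(L, i, k=2):
    """Length/width of pyramid layer i: L_i = L / k^i (formula 1)."""
    return L / k ** i

def choose_num_layers(feature_counts, threshold):
    """Return i_c: the highest layer whose feature count still exceeds the
    preset threshold, while layer i_c + 1 (if present) falls at or below it."""
    i_c = 0
    for i, count in enumerate(feature_counts):
        if count > threshold:
            i_c = i          # this layer still has enough feature points
        else:
            break            # layer i is the first that drops to/below threshold
    return i_c
```

With the counts of the worked example (layer 3 above 20 points, layer 4 below), the function reproduces the choice of 3 layers.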
Further, before step S101, the template image and the target image to be detected are preprocessed. The preprocessing is as follows: according to the imaging quality, a suitable filtering method, such as Gaussian filtering, is selected; filtering smooths the image edges and removes noise from the image background, reducing the influence of interference factors and thereby improving matching efficiency and precision.
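As one concrete, purely illustrative instance of the Gaussian filtering mentioned above, the following pure-Python sketch smooths the interior of a small grey image with a 3×3 Gaussian kernel; a real implementation would call an optimized library routine instead.

```python
KERNEL = [[1, 2, 1],
          [2, 4, 2],
          [1, 2, 1]]          # 3x3 Gaussian approximation; weights sum to 16

def gaussian_smooth(img):
    """Filter the interior of a 2-D list of grey values; borders are kept."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    acc += KERNEL[dy + 1][dx + 1] * img[y + dy][x + dx]
            out[y][x] = acc // 16          # normalize by the kernel sum
    return out
```

A uniform region passes through unchanged, while an isolated bright pixel is spread out and attenuated, which is exactly the noise-suppression effect the preprocessing step relies on.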
The step S102: setting the rotation step of each layer of the template image; rotating the template image under each pyramid layer through the full circumferential range according to the rotation step, each rotation yielding one rotated image; and obtaining the rotated image set M_in of each pyramid layer, wherein:
the rotation step AngleStep_i of the i-th layer of the template image is:

AngleStep_i = s^i (formula 2)

wherein s ≥ 2, and the value of s corresponds to the image reduction factor k, so as to better adapt to changes in image scale and rotation. For example, taking s = 2, the rotation step of each pyramid layer is: AngleStep_0 = 1, AngleStep_1 = 2, AngleStep_2 = 4.
Rotating each pyramid layer of the template image according to the rotation step length, including:
Each pyramid layer of the template image is rotated clockwise or anticlockwise according to the rotation step, and each pyramid layer generates template images at multiple angles. For example: the rotation step of the layer-0 template image is 1, so the original 0° template is rotated clockwise by 1°, 2°, …, 358°, 359° in turn, and the layer-0 pyramid then contains template images at 360 angles. The rotation step of the layer-1 template image is 2, so the original 0° template is rotated clockwise by 2°, 4°, …, 356°, 358°, and the layer-1 pyramid then contains template images at 180 angles. The rotation step of the layer-2 template image is 4, so the original 0° template is rotated clockwise by 4°, 8°, …, 352°, 356°, and the layer-2 pyramid then contains template images at 90 angles.
In each pyramid layer, the rotated template images at the different angles are recorded as the rotated image set M_in of that layer, where n denotes the number of template images at different angles in the i-th layer's rotated image set, num denotes the element index within the i-th layer's rotated image set, num = 0, 1, 2, …, ceil((360 / AngleStep_i) − 1), and ceil denotes the ceiling function. For example: when i = 0, num = 0, 1, 2, …, 359, and the layer-0 rotated image set is {M_0,0, M_0,1, …, M_0,359}; when i = 1, num = 0, 1, 2, …, 179, and the layer-1 rotated image set is {M_1,0, M_1,1, …, M_1,179}; when i = 2, num = 0, 1, 2, …, 89, and the layer-2 rotated image set is {M_2,0, M_2,1, …, M_2,89}.
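The per-layer step and rotated-set sizes in the example above can be reproduced with a few lines (a sketch; the function names are ours):

```python
import math

def angle_step(i, s=2):
    """Rotation step of layer i: AngleStep_i = s^i (formula 2)."""
    return s ** i

def num_rotations(i, s=2):
    """Number of rotated templates in layer i: ceil(360 / AngleStep_i)."""
    return math.ceil(360 / angle_step(i, s))

def rotation_angles(i, s=2):
    """Angles of the rotated image set M_in of layer i: 0, s^i, 2*s^i, ..."""
    step = angle_step(i, s)
    return [num * step for num in range(num_rotations(i, s))]
```

With s = 2 this gives 360, 180, and 90 template angles on layers 0, 1, and 2, matching the worked example.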
The step S103: acquiring the edge gradient information of each image in the rotated image set M_in, wherein the edge gradient information comprises the gradient in the x direction and the gradient in the y direction at each edge point, and (x_j, y_j) denotes the coordinates of the j-th edge point in the template image. In this embodiment, the Sobel operator is used to extract the edge gradient information.
The step S104: acquiring the edge gradient information of each pyramid-layer image of the target image to be detected, wherein the edge gradient information comprises the gradient in the x direction and the gradient in the y direction, and (x, y) denotes the coordinates of an edge point in the target image to be detected. In this embodiment, the Sobel operator is used to extract the edge gradient information.
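A minimal pure-Python sketch of the Sobel gradient extraction used in steps S103–S104. The 3×3 kernels are the standard Sobel operator; everything else (function names, the list-of-lists image format) is illustrative rather than taken from the patent.

```python
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # responds to vertical edges
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # responds to horizontal edges

def sobel_at(img, x, y, kernel):
    """3x3 correlation of `kernel` with `img` centred at (x, y)."""
    return sum(kernel[dy + 1][dx + 1] * img[y + dy][x + dx]
               for dy in (-1, 0, 1) for dx in (-1, 0, 1))

def edge_gradients(img):
    """Return (gx, gy) gradient maps for the interior pixels of img."""
    h, w = len(img), len(img[0])
    gx = [[0] * w for _ in range(h)]
    gy = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx[y][x] = sobel_at(img, x, y, SOBEL_X)
            gy[y][x] = sobel_at(img, x, y, SOBEL_Y)
    return gx, gy
```

On a vertical step edge the x gradient responds strongly while the y gradient stays zero, which is the directional information the similarity measure below relies on.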
In step S105, writing Tx_inum/Ty_inum for the template gradients and Sx_i/Sy_i for the target-image gradients, the similarity calculation formula (formula 3) is:

Score_inum(u, v) = (1/m) · Σ_{j=1}^{m} [ Tx_inum(x_j, y_j) · Sx_i(u+x_j, v+y_j) + Ty_inum(x_j, y_j) · Sy_i(u+x_j, v+y_j) ] / [ |T_inum(x_j, y_j)| · |S_i(u+x_j, v+y_j)| ]

wherein m denotes the total number of feature points of the template image at a given angle of the i-th pyramid layer; Tx_inum(x_j, y_j) and Ty_inum(x_j, y_j) respectively denote the gradient values along the x and y directions at (x_j, y_j) of the template image with index num in the i-th layer's rotated image set; Sx_i(u+x_j, v+y_j) and Sy_i(u+x_j, v+y_j) respectively denote the gradient values of the i-th layer target image to be detected along the x and y directions at (u+x_j, v+y_j); |T_inum(x_j, y_j)| denotes the modulus of the gradient of the template with index num at (x_j, y_j), i.e. |T_inum(x_j, y_j)| = sqrt(Tx_inum(x_j, y_j)² + Ty_inum(x_j, y_j)²); and |S_i(u+x_j, v+y_j)| denotes the modulus of the gradient of the i-th layer target image at (u+x_j, v+y_j), i.e. |S_i(u+x_j, v+y_j)| = sqrt(Sx_i(u+x_j, v+y_j)² + Sy_i(u+x_j, v+y_j)²).
The similarity between the template images at all angles of the pyramid layer and the target image to be detected is calculated, and the maximum similarity value is obtained.
In formula 3, Score_inum denotes the similarity between the template image and the search region of the target image to be detected, and its value ranges from 0 to 1. Score_inum = 1 means that the template image matches the search region of the target image exactly; the larger the Score_inum value, the higher the degree of matching, i.e., the more similar the template image is to the search region of the target image to be detected. Because the similarity measure is normalized, it is invariant to illumination changes.
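The normalized measure of formula 3 can be sketched as follows (pure Python; the data layout and names are our own). Each feature point contributes the dot product of template and target gradients divided by the product of their moduli, so a uniform gain on the target image cancels out, which is the illumination invariance noted above.

```python
import math

def similarity(template_pts, target_gx, target_gy, u, v):
    """Formula-3-style score at offset (u, v).

    template_pts: list of feature points (x_j, y_j, tx, ty), where (tx, ty)
    is the template gradient at (x_j, y_j); target_gx/target_gy are the
    target image's gradient maps indexed as [row][col].
    """
    m = len(template_pts)
    score = 0.0
    for x, y, tx, ty in template_pts:
        sx = target_gx[v + y][u + x]
        sy = target_gy[v + y][u + x]
        denom = math.hypot(tx, ty) * math.hypot(sx, sy)   # |T| * |S|
        if denom > 0:
            score += (tx * sx + ty * sy) / denom          # normalized dot product
    return score / m
```

Identical gradient directions give a per-point contribution of 1, orthogonal directions give 0, and rescaling the target gradients leaves the score unchanged.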
Further, an acceleration termination condition is set for the similarity measurement. If the template image at a specific angle of a pyramid layer has m feature points in total, the partial similarity sum Score_inuma over the first a (a ≤ m) feature points is calculated, and the acceleration termination condition (formula 6) is:

Score_inuma < min( b·s_min − 1 + (g·b·s_min·a)/m , (b·s_min·a)/m )

wherein Score_inuma denotes the sum of the similarity of the first a (a ≤ m) feature points; a denotes the number of template feature points already evaluated; s_min denotes the set minimum matching score; g denotes the set greediness, ranging from 0 to 1; and b denotes the coefficient applied to the set minimum matching score s_min, also ranging from 0 to 1. The partial sum Score_inuma (formula 7) accumulates the same normalized gradient dot-product summand as formula 3 over the first a feature points:

Score_inuma = (1/m) · Σ_{j=1}^{a} (normalized dot product of the template and target edge gradients at feature point j, as in formula 3)

If Score_inuma satisfies the acceleration termination condition of the similarity measure, the similarity of the remaining (m − a) feature points is no longer calculated, and the similarity measurement at the current position is terminated.
Conventionally, the minimum matching score s_min is set only once in the termination condition; in an actual scene, however, using one fixed score s_min for the similarity measurement at every layer of the image pyramid is not reasonable. At the higher pyramid levels, the similarity between the template image and the target image to be detected is not high and the resulting score may be small, so it is more reasonable to set the minimum matching score at each pyramid layer to b·s_min. The coefficient b accommodates the matching-score requirements of the images at each layer of the different image pyramids to be detected, which can further improve matching accuracy and stability.
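Assuming a Steger-style stopping bound with the per-layer coefficient b applied to s_min (an assumption on the exact shape of the patent's formula 6, consistent with the parameters it lists), the early-termination check can be sketched as:

```python
def should_terminate(partial_score, a, m, s_min, g, b):
    """Return True when the partial sum over the first `a` of `m` feature
    points can no longer reach the per-layer minimum score b * s_min.

    partial_score: Score over the first `a` points, normalized by m;
    g: greediness in [0, 1]; b: per-layer coefficient on s_min in [0, 1].
    """
    smin = b * s_min                                   # per-layer minimum score
    bound = min(smin - 1 + g * smin * a / m,           # "safe" lower bound
                smin * a / m)                          # "greedy" lower bound
    return partial_score < bound
```

A candidate position whose early feature points score near zero is abandoned long before all m points are evaluated, while a strong partial score keeps the evaluation running.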
Step S105: taking the topmost pyramid layer of the target image to be detected as a current pyramid layer, searching a template image at the current pyramid layer, and calculating the similarity between the template image and the target image to be detected to obtain first matching information, wherein:
When the topmost layer is matched, every rotated template image produces a matching result; the single result (u_{i+1}, v_{i+1}, θ_{i+1}) with the largest similarity is screened out, and the search range of the target image to be detected on the next-to-top pyramid layer is set to (u_{i+1} ± Δu, v_{i+1} ± Δv, θ_{i+1} ± AngleStep_{i+1}), wherein (u_{i+1}, v_{i+1}) is the matching-point coordinate obtained in the topmost pyramid layer, θ_{i+1} is the angle obtained in the topmost pyramid layer, Δu and Δv are pixel differences that can be set manually according to the image size (for example, 5 pixels), and AngleStep_{i+1} is the rotation step in the topmost pyramid layer.
Step S106: if the current pyramid layer is the bottommost pyramid layer, marking the matching information of the current pyramid layer as the final matching position (u final,vfinal,θfinal) of the target, and ending the method; otherwise, the next layer of the current pyramid layer is set as the current pyramid layer, wherein:
To obtain a more accurate matching angle, the matching similarity scores at the two angles θ_final + AngleStep_1 and θ_final − AngleStep_1 can be calculated from the matching information (u_final, v_final, θ_final) of the bottommost layer. The three matching angles θ_final, θ_final + AngleStep_1, θ_final − AngleStep_1 and their corresponding matching scores are then fitted with the quadratic curve y = a_1·x² + a_2·x + a_3, where θ corresponds to x and the matching score corresponds to y. Solving for the angle at which the similarity score on the quadratic curve reaches its maximum yields a sub-step angle value, which is recorded as the final matching angle θ_final.
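The sub-angle refinement amounts to taking the vertex of the parabola through the three (angle, score) samples, which has a closed form (a sketch with our own function name; the three-point parabolic-interpolation formula below is standard, not quoted from the patent):

```python
def refine_angle(theta, step, score_mid, score_plus, score_minus):
    """Vertex of the parabola through (theta - step, score_minus),
    (theta, score_mid), (theta + step, score_plus)."""
    denom = score_minus - 2 * score_mid + score_plus
    if denom == 0:                 # flat or degenerate fit: keep theta
        return theta
    # standard three-point parabolic interpolation; offset lies in [-step, step]
    return theta + 0.5 * step * (score_minus - score_plus) / denom
```

For scores sampled from y = −(x − 11)² around θ = 10 with step 1, the vertex lands exactly at 11, i.e. between the enumerated angles.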
In this embodiment, sorting and screening are involved in the matching process. A suitable sorting algorithm can be selected according to the data volume; when the data volume is large, quicksort can be used to improve processing speed.
The embodiment of the invention further provides an image matching device based on edge information, as shown in fig. 3. The device comprises:
an initialization module: configured to set the number of pyramid layers and to perform pyramid layering on the template image and the target image to be tested, respectively;
a rotation module: configured to set the rotation step of each layer of the template image; to rotate the template image of each pyramid layer through a full circle according to the rotation step, one rotated image being obtained per rotation; and to obtain the rotated image set M_in of each pyramid layer, wherein i denotes the pyramid layer and n denotes the number of rotated images in the i-th layer;
an edge gradient acquisition module: configured to acquire the edge gradient information of each image in the rotated image set M_in;
a target edge gradient acquisition module: configured to acquire the edge gradient information of each pyramid-layer image of the target image to be tested;
a first matching module: configured to take the topmost pyramid layer of the target image to be tested as the current pyramid layer, search for the template image in the current pyramid layer, and calculate the similarity between the template image and the target image to be tested to obtain first matching information;
a judging module: configured, if the current pyramid layer is the bottommost pyramid layer, to record the matching information of the current pyramid layer as the final matching position (u_final, v_final, θ_final) of the target and end the method; otherwise, to set the layer below the current pyramid layer as the current pyramid layer; wherein u_final, v_final and θ_final are respectively the x coordinate of the final matching position, the y coordinate of the final matching position, and the rotation angle of the template image at the final matching position;
a similarity calculation module: configured to set the search range (u_{i+1} ± Δu, v_{i+1} ± Δv, θ_{i+1} ± AngleStep_{i+1}) of the current pyramid layer based on the result (u_{i+1}, v_{i+1}, θ_{i+1}) with the maximum similarity in the matching information of the layer above the current pyramid layer; wherein (u_{i+1}, v_{i+1}) are the coordinates of the matching point with the maximum similarity found in the layer above the current pyramid layer, θ_{i+1} is the angle found in the layer above the current pyramid layer, Δu and Δv are coordinate offsets set according to the image size, and AngleStep_{i+1} is the rotation step of the layer above the current pyramid layer; to perform matching within the search range of the current pyramid layer to obtain the matching information of the current pyramid layer; and to trigger the judging module.
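Taken together, the modules above describe a coarse-to-fine pyramid search: exhaustive matching on the topmost (smallest) layer, then refinement of (u, v, θ) within a small window on each lower layer. A minimal Python sketch under stated assumptions (a k = 2 block-mean pyramid, and a hypothetical `score_fn` standing in for the edge-gradient similarity of the similarity calculation module):

```python
import numpy as np

def build_pyramid(img, levels, k=2):
    """Level 0 is the original (bottom) image; each level above is a k x k block mean."""
    pyr = [img.astype(float)]
    for _ in range(levels - 1):
        h, w = pyr[-1].shape
        pyr.append(pyr[-1][:h - h % k, :w - w % k]
                   .reshape(h // k, k, w // k, k).mean(axis=(1, 3)))
    return pyr

def coarse_to_fine(score_fn, pyr_shapes, angle_step_top, du=2, dv=2):
    """Exhaustive search on the top layer, then per-layer refinement of (u, v, theta).

    score_fn(level, u, v, theta) -> similarity; a hypothetical callable, not the
    patent's own similarity measure. pyr_shapes[i] is the (h, w) of layer i."""
    top = len(pyr_shapes) - 1
    h, w = pyr_shapes[top]
    step = angle_step_top
    # first matching module: full search on the topmost layer
    best = max((score_fn(top, u, v, th), u, v, th)
               for u in range(w) for v in range(h)
               for th in range(0, 360, step))
    for level in range(top - 1, -1, -1):
        _, u, v, th = best
        u, v = u * 2, v * 2              # map the match down one layer (k = 2)
        step = max(step // 2, 1)         # finer angular step on finer layers
        # search only (u +/- du, v +/- dv, th +/- step) on this layer
        best = max((score_fn(level, uu, vv, tt), uu, vv, tt)
                   for uu in range(u - du, u + du + 1)
                   for vv in range(v - dv, v + dv + 1)
                   for tt in range(th - step, th + step + 1))
    _, u, v, th = best
    return u, v, th
```

The ±Δu, ±Δv, ±AngleStep window shrinks the per-layer search from the full image to a few dozen candidates, which is where the pyramid's speedup comes from.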
The embodiment of the invention further provides an image matching system based on edge information, which comprises:
a processor for executing a plurality of instructions;
a memory for storing a plurality of instructions;
Wherein the plurality of instructions are adapted to be stored by the memory and to be loaded and executed by the processor to perform the image matching method based on edge information as described above.
The embodiment of the invention further provides a computer-readable storage medium in which a plurality of instructions are stored; the plurality of instructions are adapted to be loaded and executed by the processor to perform the image matching method based on edge information as described above.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
In the several embodiments provided by the present invention, it should be understood that the disclosed systems, devices and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative: the division into units is merely a logical functional division, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Furthermore, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections via interfaces, devices or units, and may be electrical, mechanical or of other form.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated units may be implemented in hardware or as hardware plus software functional units.
The integrated units implemented in the form of software functional units may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a physical server, a network cloud server, etc., on which a Windows or Windows Server operating system needs to be installed) to execute part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The above description is only of the preferred embodiments of the present invention, and is not intended to limit the present invention in any way, but any simple modification, equivalent variation and modification made to the above embodiments according to the technical substance of the present invention still fall within the scope of the technical solution of the present invention.
Claims (7)
1. An image matching method based on edge information, the method comprising:
step S101: setting the number of pyramid layers, and performing pyramid layering on the template image and on the target image to be tested, respectively;
Step S102: setting the rotation step of each layer of the template image; rotating the template image of each pyramid layer through a full circle according to the rotation step, one rotated image being obtained per rotation; and obtaining the rotated image set M_in of each pyramid layer, wherein i denotes the pyramid layer and n denotes the number of rotated images in the i-th layer;
Step S103: acquiring the edge gradient information of each image in the rotated image set M_in;
Step S104: acquiring the edge gradient information of each pyramid-layer image of the target image to be tested;
Step S105: taking the topmost pyramid layer of the target image to be tested as the current pyramid layer, searching for the template image in the current pyramid layer, and calculating the similarity between the template image and the target image to be tested to obtain first matching information;
Step S106: if the current pyramid layer is the bottommost pyramid layer, recording the matching information of the current pyramid layer as the final matching position (u_final, v_final, θ_final) of the target, and ending the method; otherwise, setting the layer below the current pyramid layer as the current pyramid layer; wherein u_final, v_final and θ_final are respectively the x coordinate of the final matching position, the y coordinate of the final matching position, and the rotation angle of the template image at the final matching position;
Step S107: setting the search range (u_{i+1} ± Δu, v_{i+1} ± Δv, θ_{i+1} ± AngleStep_{i+1}) of the current pyramid layer based on the result (u_{i+1}, v_{i+1}, θ_{i+1}) with the maximum similarity in the matching information of the layer above the current pyramid layer; wherein (u_{i+1}, v_{i+1}) are the coordinates of the matching point with the maximum similarity found in the layer above the current pyramid layer, θ_{i+1} is the angle found in the layer above the current pyramid layer, Δu and Δv are coordinate offsets set according to the image size, and AngleStep_{i+1} is the rotation step of the layer above the current pyramid layer; performing matching within the search range of the current pyramid layer to obtain the matching information of the current pyramid layer; and proceeding to step S106;
Said step S101, in which pyramid layering is performed on the template image and on the target image to be tested respectively and the required number of pyramid layers is obtained, comprises:
Step S1011: acquiring the length and width of the reduced image:
l_i = L / k^i (Formula 1)
wherein i denotes the i-th pyramid layer; L denotes the initial length and width of the image, i.e. the length and width of the bottommost pyramid layer; l_i denotes the length and width of the i-th pyramid layer; k denotes the image reduction factor, k ≥ 2;
Step S1012: acquiring the required number of pyramid layers i_c from the reduced image size:
extracting the feature points of each pyramid layer with the Sobel operator; if, in the pyramid of the template image, the number of feature points of the (i_c + 1)-th layer image is smaller than or equal to a preset feature-point threshold, the number of feature points of the i_c-th layer image is larger than the preset threshold, and the i_c-th layer image has the fewest feature points among layers 0 to i_c, then the number of pyramid layers required by the template image is determined to be i_c; the number of pyramid layers of the target image to be tested is likewise set to i_c.
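Step S1012's layer-count criterion can be sketched as follows: downsample repeatedly and stop once the next layer would no longer carry enough Sobel edge points. A simplified illustration (the 3×3 Sobel kernels are standard; the gradient threshold, minimum point count, and the simplified stop rule are assumptions, not the patent's exact i_c test):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def conv2_valid(img, k):
    """Plain 'valid'-mode 2-D correlation with a 3x3 kernel."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * img[dy:dy + h - 2, dx:dx + w - 2]
    return out

def sobel_feature_count(img, grad_thresh=100.0):
    """Count pixels whose Sobel gradient magnitude exceeds the threshold."""
    gx = conv2_valid(img, SOBEL_X)
    gy = conv2_valid(img, SOBEL_Y)
    return int(np.count_nonzero(gx * gx + gy * gy > grad_thresh ** 2))

def required_layers(template, k=2, min_points=32):
    """Stop adding pyramid layers once the next layer would have too few
    Sobel edge points (a simplified reading of the i_c criterion)."""
    layers, img = 0, template.astype(float)
    while True:
        h, w = img.shape
        nxt = (img[:h - h % k, :w - w % k]
               .reshape(h // k, k, w // k, k).mean(axis=(1, 3)))
        if nxt.shape[0] < 3 or nxt.shape[1] < 3 or sobel_feature_count(nxt) <= min_points:
            return layers  # last layer that still carries enough edge points
        layers, img = layers + 1, nxt
```

The rationale: each halving divides the edge-point count roughly in half, and matching on a layer with too few feature points becomes unreliable, so the top layer is the smallest one still above the threshold.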
2. The image matching method based on edge information according to claim 1, wherein in said step S102 the rotation step of each layer of the template image is set; the template image of each pyramid layer is rotated through a full circle according to the rotation step, one rotated image being obtained per rotation; and the rotated image set M_in of each pyramid layer is obtained, wherein:
the rotation step AngleStep_i of the i-th layer of the template image is:
AngleStep_i = s^i (Formula 2)
wherein s ≥ 2.
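Reading Formula 2 as AngleStep_i = s^i (by analogy with Formula 1; this reading is an assumption, since the original equation image is lost), the angular sampling per layer and the size n of each rotated image set M_in follow directly:

```python
import math

def rotation_steps(num_layers, s=2):
    """AngleStep_i = s**i (Formula 2 as read here, s >= 2): fine angular
    sampling on the bottom layer, coarser toward the top of the pyramid."""
    return [s ** i for i in range(num_layers)]

def rotated_image_counts(num_layers, s=2):
    """n for each layer: rotated images needed to cover the full circle."""
    return [math.ceil(360 / step) for step in rotation_steps(num_layers, s)]
```

With s = 2 and four layers this gives steps of 1°, 2°, 4°, 8°, so the top layer needs only 45 rotated templates while the bottom layer keeps full 1° resolution.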
3. The image matching method based on edge information according to claim 2, wherein in said step S105 the similarity is calculated as:
Score_{i,num}(u, v) = (1/m) · Σ_{j=1}^{m} (Gx^T_{i,num}(x_j, y_j) · Gx^S_i(u + x_j, v + y_j) + Gy^T_{i,num}(x_j, y_j) · Gy^S_i(u + x_j, v + y_j)) / (‖G^T_{i,num}(x_j, y_j)‖ · ‖G^S_i(u + x_j, v + y_j)‖)
wherein m denotes the total number of feature points of the template image at a given angle of the i-th pyramid layer; Gx^T_{i,num}(x_j, y_j) and Gy^T_{i,num}(x_j, y_j) respectively denote the gradient values along the x and y directions at (x_j, y_j) of the template image with subscript num in the i-th layer rotated image set; Gx^S_i(u + x_j, v + y_j) and Gy^S_i(u + x_j, v + y_j) respectively denote the gradient values of the i-th layer of the target image to be tested along the x and y directions at (u + x_j, v + y_j); ‖G^T_{i,num}(x_j, y_j)‖ denotes the modulus of the gradient of the template with subscript num of the i-th layer rotated image set at (x_j, y_j); ‖G^S_i(u + x_j, v + y_j)‖ denotes the modulus of the gradient of the i-th layer of the target image at (u + x_j, v + y_j), i.e.
‖G^T_{i,num}(x_j, y_j)‖ = sqrt(Gx^T_{i,num}(x_j, y_j)² + Gy^T_{i,num}(x_j, y_j)²) and ‖G^S_i(u + x_j, v + y_j)‖ = sqrt(Gx^S_i(u + x_j, v + y_j)² + Gy^S_i(u + x_j, v + y_j)²).
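The claim-3 score is a normalized dot product of template and target edge gradients, averaged over the m feature points. A sketch (the array layout and function names are illustrative, not the patent's):

```python
import numpy as np

def edge_similarity(tmpl_pts, tmpl_grads, target_gx, target_gy, u, v, eps=1e-12):
    """Normalized edge-gradient dot-product score at candidate position (u, v).

    tmpl_pts:   list of template feature points (x_j, y_j)
    tmpl_grads: list of template gradients (Gx, Gy) at those points
    target_gx, target_gy: gradient images of the current target layer
    Returns a score in [-1, 1]; 1.0 means every pair of gradients aligns.
    """
    m = len(tmpl_pts)
    total = 0.0
    for (x, y), (tx, ty) in zip(tmpl_pts, tmpl_grads):
        sx = float(target_gx[v + y, u + x])  # target gradient at (u+x_j, v+y_j)
        sy = float(target_gy[v + y, u + x])
        norm = np.hypot(tx, ty) * np.hypot(sx, sy)
        if norm > eps:  # skip points where either gradient vanishes
            total += (tx * sx + ty * sy) / norm
    return total / m
```

Dividing each term by the two gradient moduli makes the score depend only on gradient direction, which is what gives the method its robustness to illumination changes.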
4. The image matching method based on edge information according to claim 3, wherein an acceleration termination condition is set for the similarity measurement: if the template image at a given angle of a given pyramid layer has m feature points in total, the partial similarity sum Score_{i,num,a} of the first a feature points is calculated, a ≤ m, and the acceleration termination condition is:
Score_{i,num,a} < min(s_min − 1 + a/m, b · g · s_min · a/m)
wherein Score_{i,num,a} denotes the partial sum of the similarities of the first a feature points; a denotes the number of template feature points already evaluated; s_min denotes the set minimum match score; g denotes the set greediness, with value range 0–1; b denotes the coefficient of the set minimum match score s_min, with value range 0–1; the partial sum Score_{i,num,a} of the first a feature points is:
Score_{i,num,a} = (1/m) · Σ_{j=1}^{a} (Gx^T_{i,num}(x_j, y_j) · Gx^S_i(u + x_j, v + y_j) + Gy^T_{i,num}(x_j, y_j) · Gy^S_i(u + x_j, v + y_j)) / (‖G^T_{i,num}(x_j, y_j)‖ · ‖G^S_i(u + x_j, v + y_j)‖)
If Score_{i,num,a} satisfies the acceleration termination condition, the similarities of the remaining (m − a) feature points are no longer calculated, and the similarity measurement is terminated at the current position.
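The early-termination idea: after a of m points, the partial sum already bounds the best achievable final score, so a hopeless candidate position can be abandoned without evaluating the remaining points. The original threshold images are lost from claim 4; the combined bound below, built from the listed parameters s_min, g and b, is one plausible reading, not the patent's verbatim formula:

```python
def early_stop_score(term_scores, s_min=0.8, g=0.9, b=1.0):
    """Accumulate per-point similarity terms with early termination.

    term_scores: the m per-feature-point similarity terms, each in [-1, 1]
    (the summands of the claim-3 score before the 1/m normalization).
    Returns (partial_score, points_evaluated). The threshold combining
    s_min, g and b is an assumption; see the lead-in above.
    """
    m = len(term_scores)
    partial = 0.0
    for a, s in enumerate(term_scores, start=1):
        partial += s / m
        # "hard" bound: even if every remaining point scored 1.0, the
        # final score could no longer reach s_min
        hard = s_min - 1 + a / m
        # "greedy" bound, scaled by greediness g and coefficient b
        greedy = g * b * s_min * a / m
        if partial < min(hard, greedy):
            return partial, a  # terminated after a of m points
    return partial, m
```

With g = 0 only the hard bound is active (no match above s_min can ever be missed); raising g trades a small risk of missing matches for a much earlier stop at poor positions.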
5. An image matching apparatus based on edge information, the apparatus comprising:
an initialization module: configured to set the number of pyramid layers and to perform pyramid layering on the template image and the target image to be tested, respectively;
a rotation module: configured to set the rotation step of each layer of the template image; to rotate the template image of each pyramid layer through a full circle according to the rotation step, one rotated image being obtained per rotation; and to obtain the rotated image set M_in of each pyramid layer, wherein i denotes the pyramid layer and n denotes the number of rotated images in the i-th layer;
an edge gradient acquisition module: configured to acquire the edge gradient information of each image in the rotated image set M_in;
a target edge gradient acquisition module: configured to acquire the edge gradient information of each pyramid-layer image of the target image to be tested;
a first matching module: configured to take the topmost pyramid layer of the target image to be tested as the current pyramid layer, search for the template image in the current pyramid layer, and calculate the similarity between the template image and the target image to be tested to obtain first matching information;
a judging module: configured, if the current pyramid layer is the bottommost pyramid layer, to record the matching information of the current pyramid layer as the final matching position (u_final, v_final, θ_final) of the target and end the method; otherwise, to set the layer below the current pyramid layer as the current pyramid layer; wherein u_final, v_final and θ_final are respectively the x coordinate of the final matching position, the y coordinate of the final matching position, and the rotation angle of the template image at the final matching position;
a similarity calculation module: configured to set the search range (u_{i+1} ± Δu, v_{i+1} ± Δv, θ_{i+1} ± AngleStep_{i+1}) of the current pyramid layer based on the result (u_{i+1}, v_{i+1}, θ_{i+1}) with the maximum similarity in the matching information of the layer above the current pyramid layer; wherein (u_{i+1}, v_{i+1}) are the coordinates of the matching point with the maximum similarity found in the layer above the current pyramid layer, θ_{i+1} is the angle found in the layer above the current pyramid layer, Δu and Δv are coordinate offsets set according to the image size, and AngleStep_{i+1} is the rotation step of the layer above the current pyramid layer; to perform matching within the search range of the current pyramid layer to obtain the matching information of the current pyramid layer; and to trigger the judging module;
wherein performing pyramid layering on the template image and on the target image to be tested respectively and obtaining the required number of pyramid layers comprises:
acquiring the length and width of the reduced image:
l_i = L / k^i (Formula 1)
wherein i denotes the i-th pyramid layer; L denotes the initial length and width of the image, i.e. the length and width of the bottommost pyramid layer; l_i denotes the length and width of the i-th pyramid layer; k denotes the image reduction factor, k ≥ 2;
acquiring the required number of pyramid layers i_c from the reduced image size:
extracting the feature points of each pyramid layer with the Sobel operator; if, in the pyramid of the template image, the number of feature points of the (i_c + 1)-th layer image is smaller than or equal to a preset feature-point threshold, the number of feature points of the i_c-th layer image is larger than the preset threshold, and the i_c-th layer image has the fewest feature points among layers 0 to i_c, then the number of pyramid layers required by the template image is determined to be i_c; the number of pyramid layers of the target image to be tested is likewise set to i_c.
6. An image matching system based on edge information, comprising:
a processor for executing a plurality of instructions;
a memory for storing a plurality of instructions;
Wherein the plurality of instructions are adapted to be stored by the memory and to be loaded and executed by the processor to perform the image matching method based on edge information according to any one of claims 1-4.
7. A computer-readable storage medium having a plurality of instructions stored therein, the plurality of instructions being adapted to be loaded and executed by a processor to perform the image matching method based on edge information according to any one of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111190561.9A CN114119952B (en) | 2021-10-13 | 2021-10-13 | Image matching method and device based on edge information |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114119952A CN114119952A (en) | 2022-03-01 |
CN114119952B true CN114119952B (en) | 2024-11-05 |
Family
ID=80375770
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111190561.9A Active CN114119952B (en) | 2021-10-13 | 2021-10-13 | Image matching method and device based on edge information |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115512415A (en) * | 2022-09-27 | 2022-12-23 | 深圳先进技术研究院 | Face recognition method and device based on visual template and pyramid strategy |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110197232A (en) * | 2019-06-05 | 2019-09-03 | 中科新松有限公司 | Image matching method based on edge direction and Gradient Features |
BR102019016252A2 (en) * | 2018-08-14 | 2020-02-18 | Canon Kabushiki Kaisha | IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2792147A4 (en) * | 2011-12-15 | 2015-08-05 | Intel Corp | Techniques for improving stereo block matching with the pyramid method |
CN110472674B (en) * | 2019-07-31 | 2023-07-18 | 苏州中科全象智能科技有限公司 | Template matching algorithm based on edge and gradient characteristics |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |