CN113159103A - Image matching method, image matching device, electronic equipment and storage medium - Google Patents
- Publication number
- CN113159103A (application number CN202110209671.9A)
- Authority
- CN
- China
- Prior art keywords
- template
- contour
- image
- edge
- searched
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/752—Contour matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/02—Affine transformations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20016—Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
Abstract
The embodiment of the invention discloses an image matching method and device, an electronic device and a storage medium. The method performs pyramid layering on the template image and creates multi-scale, multi-angle versions of the template image from which template edges are extracted, producing a multi-layer, multi-scale, multi-angle set of template edge contour points. Because the template edge contours of the template image carry scale information, the problems of scale information loss and lack of scale invariance in image shape matching are solved, and matching accuracy is improved. Meanwhile, considering that the data structure of template edge contour information containing scale features is complex to construct and matching is time-consuming, the image to be searched is layered with the same number of pyramid layers as the template image, and a coarse-to-fine similarity-matching performance strategy is applied at each pyramid layer. This reduces the amount of data to be matched at the lower layers and increases the speed of shape edge matching under multi-scale features while maintaining matching precision.
Description
Technical Field
The embodiment of the invention relates to the technical field of image processing, in particular to an image matching method, an image matching device, electronic equipment and a storage medium.
Background
With the development of computer vision, image shape matching is applied in a growing number of fields, such as target recognition, optical character recognition (OCR), image retrieval, medical image analysis and robot navigation.
Currently, shape matching methods fall into two main categories: the first computes the difference of image invariants under various transformations; the second minimizes the matching error by finding local correspondences between the image to be searched and the template image. The first category is suitable for global feature description but loses important shape information, such as scale features, resulting in low image matching accuracy. The second combines the whole shape and the local shape closely and is robust to image translation, rotation, scale change and slight geometric deformation, but its computational complexity is high, which limits image matching efficiency.
Disclosure of Invention
The embodiment of the invention provides an image matching method, an image matching device, electronic equipment and a storage medium, and aims to solve the problems of low matching precision and low matching efficiency in the shape matching process.
In a first aspect, an embodiment of the present invention provides an image matching method, where the method includes:
determining a template edge outline information set of a template image; the template edge contour information set represents the multi-scale multi-rotation-angle template edge contours extracted from the template image under different pyramid hierarchies;
determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents edge contours to be searched under different pyramid hierarchies extracted from the image to be searched according to the pyramid hierarchy number of the template image;
performing, for the top-down pyramid hierarchical structure and based on the current layer's contour matching search information, similarity matching of the current layer's template edge contour against the current layer's edge contour to be searched, to obtain the current layer's contour matching result;
determining, according to the current layer's contour matching result, the contour matching search information to be used when the next layer's template edge contour is similarity-matched against the next layer's edge contour to be searched, and using this information after jumping to the next layer for similarity matching, until the pyramid bottom layer is reached;
identifying a target shape indicated by a template image in the image to be searched according to a contour matching result when the pyramid bottom layer similarity matching is finished;
the contour matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the similarity matching is carried out on the edge contour of the template; and the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
In a second aspect, an embodiment of the present invention further provides an image matching apparatus, where the apparatus includes:
the template information determining module is used for determining a template edge outline information set of the template image; the template edge contour information set represents the multi-scale multi-rotation-angle template edge contours extracted from the template image under different pyramid hierarchies;
the information to be searched determining module is used for determining an edge contour information set to be searched of the image to be searched; the edge contour information set to be searched represents edge contours to be searched under different pyramid hierarchies extracted from the image to be searched according to the pyramid hierarchy number of the template image;
the contour similar matching module is used for matching the similarity of the edge contour of the template in the current hierarchy on the edge contour to be searched in the current hierarchy based on the contour matching search information in the current hierarchy aiming at the pyramid hierarchy from top to bottom to obtain the contour matching result in the current hierarchy;
the contour lower-layer matching mapping module is used for determining, according to the current layer's contour matching result, the contour matching search information used when the next layer's template edge contour is similarity-matched against the next layer's edge contour to be searched, this information being used after jumping to the next layer for similarity matching, until the pyramid bottom layer is reached;
the image to be searched identification module is used for identifying the target shape indicated by the template image in the image to be searched according to the contour matching result when the pyramid bottom layer similarity matching is finished;
the contour matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the similarity matching is carried out on the edge contour of the template; and the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
one or more processing devices;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processing devices, the one or more processing devices are caused to implement the image matching method according to any one of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processing device, implements the image matching method described in any one of the embodiments of the present invention.
According to the image matching method provided in the embodiment of the application, pyramid layering is performed on the template image, multi-scale, multi-angle template images are created and their template edges extracted, yielding a multi-layer, multi-scale, multi-angle set of template edge contour points. Because the template edge contours of the template image carry scale information, the problems of scale information loss and lack of scale invariance in image shape matching are solved, and image matching accuracy is improved. Meanwhile, considering that template edge contour matching with scale features involves a complex data structure for the template edge contour information and a long matching time, the image to be searched is layered with the same number of pyramid layers as the template image, and the image pyramid is comprehensively applied to realize coarse-to-fine similarity matching at each layer. This performance improvement strategy reduces the amount of data to be matched at the lower layers, lowers the algorithm complexity, and increases the speed of shape edge matching under multi-scale features while maintaining matching precision.
The above is merely an overview of the technical solutions of the present invention. In order that the technical means of the present invention may be more clearly understood and implemented in accordance with the content of the description, and that the above and other objects, features and advantages of the present invention may become more apparent, specific embodiments are set forth below.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flow chart of an image matching method provided in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a template edge contour information set including image scale features provided in an embodiment of the present invention;
FIG. 3 is a general flow diagram of image matching provided in an embodiment of the present invention;
fig. 4 is a flowchart of creating a template edge profile information set according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of edge extraction performed on a template image according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a template image direction difference processing provided in an embodiment of the present invention;
FIG. 7 is a schematic illustration of a gradient magnitude image of a template image provided in an embodiment of the invention;
FIG. 8 is a schematic diagram of the image gradient direction of a template image provided in an embodiment of the present invention;
FIG. 9 is a schematic illustration of non-maxima suppression of a template image according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of an adaptive hysteresis threshold processing provided in an embodiment of the present invention;
FIG. 11 is a schematic diagram illustrating an effect of adaptive hysteresis threshold processing on a template image according to an embodiment of the present invention;
fig. 12 is a schematic flowchart of a pyramid adaptive layering of template edge profile images according to an embodiment of the present invention;
fig. 13 is a flow chart of constructing an edge contour information set to be searched according to an embodiment of the present invention;
FIG. 14 is a flow chart of another image matching method provided in embodiments of the present invention;
fig. 15 is a schematic diagram of performing sliding traversal on an edge profile of a template in accordance with an edge profile to be searched according to an embodiment of the present invention;
fig. 16 is a schematic diagram of performing similarity matching between a template edge profile and an edge profile to be searched in each pyramid hierarchy according to an embodiment of the present invention;
fig. 17 is a schematic diagram illustrating a mapping of search information for contour matching from an upper layer to a lower layer of a pyramid according to an embodiment of the present invention;
fig. 18 is a schematic diagram illustrating comparison between forward and backward acceleration for similarity matching by using a performance enhancement policy according to an embodiment of the present invention;
fig. 19 is a structural diagram of an image matching apparatus provided in an embodiment of the present invention;
fig. 20 is a schematic structural diagram of an electronic device provided in an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations (or steps) can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Fig. 1 is a flowchart of an image matching method provided in an embodiment of the present invention. The technical solution of this embodiment is applicable to the case of matching shapes between images, and the method may be executed by an image matching apparatus, which may be implemented in a software and/or hardware manner and integrated on any electronic device with a network communication function. As shown in fig. 1, the image matching method in the embodiment of the present application may include the following steps:
s110, determining a template edge outline information set of the template image; the template edge contour information set represents the multi-scale multi-rotation-angle template edge contour under different pyramid hierarchies extracted from the template image.
Image shape matching usually measures the similarity between shapes according to some criterion, and the result of shape matching between two images can be represented by a numerical value called the shape similarity. The larger the value of the shape similarity, the more similar the shapes of the two images; the smaller, the less similar. The reported match corresponds to the maximum similarity value obtained during the matching process.
Referring to fig. 2, pyramid layering may be performed on a template image containing the target shape to obtain a plurality of layered template images; after layering, the template edge contour features of each layer's template image are described at multiple scales and multiple rotation angles, creating the template edge contour information set of the template image. By constructing multi-layer, multi-scale, multi-angle template edge contour features, the template edge contour carries scale information, solving the problem of scale invariance in the image matching process.
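The creation of a multi-scale, multi-rotation-angle template contour set can be sketched as follows. The point-set representation, function names and parameter values are illustrative assumptions, not the patent's data structure:

```python
import math

def transform_contour(points, scale, angle_deg):
    """Scale and rotate a contour point set about its centre of gravity.

    `points` is a list of (x, y) tuples; `scale` and `angle_deg` stand in
    for the patent's scale and rotation-angle ranges.
    """
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    out = []
    for x, y in points:
        dx, dy = (x - cx) * scale, (y - cy) * scale
        out.append((cx + dx * cos_a - dy * sin_a,
                    cy + dx * sin_a + dy * cos_a))
    return out

def build_template_set(points, scales, angles):
    """One transformed contour per (scale, angle) pair, mirroring the
    multi-scale multi-rotation-angle template edge contour set."""
    return {(s, a): transform_contour(points, s, a)
            for s in scales for a in angles}

square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
templates = build_template_set(square, scales=[0.5, 1.0, 2.0],
                               angles=[0, 90, 180, 270])
```

Such a set would be built once per pyramid layer, so each layer owns contours for every scale/angle combination it may need during matching.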
S120, determining an edge contour information set to be searched of the image to be searched; the edge contour information set to be searched represents edge contours to be searched under different pyramid hierarchies extracted from the image to be searched according to the pyramid hierarchy number of the template image.
Because image matching must be carried out between the image to be searched and the template edge contour information of the template image, after the image to be searched is acquired it is pyramid-layered, similarly to the template image, using the template image's number of pyramid layers, obtaining a plurality of layered images to be searched. Corresponding edge contour points to be searched can then be extracted from each layered image to construct the edge contour to be searched of each layer, so that the template image and the image to be searched are matched layer against layer using their edge contours.
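The shared pyramid layering of the template image and the image to be searched can be sketched as follows; a plain 2x2 block average stands in for the pyramid reduction filter, and all names are illustrative:

```python
def pyr_down(img):
    """Halve an image (list of row lists) by 2x2 block averaging --
    a simple stand-in for the pyramid reduction step."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2 * r][2 * c] + img[2 * r][2 * c + 1]
              + img[2 * r + 1][2 * c] + img[2 * r + 1][2 * c + 1]) / 4.0
             for c in range(w)] for r in range(h)]

def build_pyramid(img, levels):
    """Level 0 is the original image (pyramid bottom layer). The same
    `levels` value is applied to the template image and to the image to
    be searched, so matching always pairs images of the same layer."""
    pyramid = [img]
    for _ in range(levels - 1):
        pyramid.append(pyr_down(pyramid[-1]))
    return pyramid

img = [[float(c) for c in range(8)] for _ in range(8)]
pyr = build_pyramid(img, levels=3)
```

In practice the reduction would use a Gaussian-weighted filter rather than a plain average, but the layer bookkeeping is the same.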
S130, for the top-down pyramid hierarchical structure, performing, based on the current layer's contour matching search information, similarity matching of the current layer's template edge contour against the current layer's edge contour to be searched, to obtain the current layer's contour matching result.
The contour matching search information comprises the region to be matched, the scale range to be matched and the angle range to be matched when similarity matching is performed with the template edge contour. The region to be matched limits where in the same-layer edge contour to be searched the template edge contour is placed, avoiding wasted similarity-matching resources on areas outside the region. The scale range to be matched restricts which scales of the template edge contour are used, avoiding wasted matching resources on templates of inappropriate scale. Similarly, the angle range to be matched restricts which rotation angles of the template edge contour are used, avoiding wasted similarity-matching resources on templates at inappropriate angles.
When the template edge contour is similarity-matched against the edge contour to be searched, the similarity over their overlapping contour region is computed; the greater the similarity, the better the match, with a value of 1 on a perfect match. The contour matching result comprises the gravity-centre position, scale and rotation angle of the template edge contour at the best match. For the same-layer template edge contour and edge contour to be searched, matching the same edge contour to be searched with template edge contours of different scales and rotation angles yields a series of similarity measurement values; the position region, angle and scale corresponding to the maximum value are taken as the current layer's contour matching result.
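As a minimal illustration of such a similarity measure, the sketch below scores a template contour point set against the edge points of a search image by the fraction of template points that land on edge points (1.0 on a perfect match) and exhaustively searches a region for the best offset. The names and the overlap-fraction metric are simplifying assumptions, not the patent's exact measure:

```python
def similarity(template_pts, edge_set, dx, dy):
    """Fraction of template contour points that coincide with edge
    points of the search image at offset (dx, dy); equals 1.0 when the
    template lies exactly on the searched edge contour."""
    hits = sum((x + dx, y + dy) in edge_set for x, y in template_pts)
    return hits / len(template_pts)

def best_match(template_pts, edge_set, region):
    """Exhaustive search over the region to be matched; returns the
    maximum similarity measurement value and its offset."""
    return max((similarity(template_pts, edge_set, dx, dy), (dx, dy))
               for dx in range(*region[0]) for dy in range(*region[1]))

tmpl = [(0, 0), (1, 0), (0, 1), (1, 1)]
edges = {(5, 5), (6, 5), (5, 6), (6, 6)}
score, offset = best_match(tmpl, edges, region=((0, 10), (0, 10)))
```

A real implementation would additionally loop over the scale range and angle range to be matched, keeping the (position, scale, angle) triple of the overall maximum.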
And S140, determining, according to the current layer's contour matching result, the contour matching search information used when the next layer's template edge contour is similarity-matched against the next layer's edge contour to be searched, and using it after jumping to the next layer for similarity matching, until the pyramid bottom layer is reached.
Referring to fig. 3, when image matching is performed between the template image and the image to be searched, the image pyramid is introduced to accelerate contour matching between the template edge contour and the edge contour to be searched. In the top-down pyramid layering, the closer a layer is to the pyramid bottom, the more edge contour feature details the template edge contour and the edge contour to be searched contain, and the longer similarity matching takes. Therefore, proceeding from the top of the pyramid downward, a coarse similarity match between the template edge contour and the edge contour to be searched is first performed at the upper pyramid layers to obtain a coarse contour matching result.
Referring to fig. 3, after a contour matching result is obtained at an upper pyramid layer, it is mapped down to the next layer to derive new contour matching search information. When similarity matching between the template edge contour and the edge contour to be searched is then performed at the lower layer, the region to be matched, the scale range to be matched and the angle range to be matched are thereby effectively constrained, greatly reducing similarity matching at useless positions, angles and scales and accelerating matching at the lower pyramid layer.
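The upper-to-lower-layer mapping of contour matching search information can be sketched as follows. Coordinates double from one layer to the next finer layer; the tolerance values are illustrative assumptions, not figures from the patent:

```python
def map_to_lower_layer(result, pos_tol=2, scale_tol=0.1, angle_tol=5):
    """Turn an upper-layer contour matching result (gravity-centre
    position, scale, rotation angle) into the lower layer's contour
    matching search information: a small region, scale range and angle
    range around the mapped coarse match."""
    x, y = result["position"]
    return {
        # positions double between adjacent pyramid layers
        "region": ((2 * x - pos_tol, 2 * x + pos_tol),
                   (2 * y - pos_tol, 2 * y + pos_tol)),
        "scale_range": (result["scale"] - scale_tol,
                        result["scale"] + scale_tol),
        "angle_range": (result["angle"] - angle_tol,
                        result["angle"] + angle_tol),
    }

info = map_to_lower_layer({"position": (10, 7), "scale": 1.0, "angle": 30})
```

The point of the mapping is that the lower layer never searches the full image: only a small window of positions, scales and angles around the coarse result survives, which is what cuts the matching data quantity.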
S150, identifying the target shape indicated by the template image in the image to be searched according to the contour matching result when the pyramid bottom layer similarity matching is finished.
Referring to fig. 3, proceeding layer by layer from top to bottom, the template edge contour determined at the second-lowest pyramid layer is similarity-matched against that layer's edge contour to be searched to obtain a contour matching result, from which the contour matching search information used at the pyramid bottom layer is determined by downward mapping. Similarity matching of the bottom-layer template edge contour against the bottom-layer edge contour to be searched then yields the contour matching result at the end of bottom-layer matching.
The contour matching result can comprise the contour gravity-centre position, scale and rotation angle of the bottom-layer template edge contour when similarity matching against the bottom-layer edge contour to be searched is completed. Based on the gravity-centre position, scale and rotation angle indicated by this final contour matching result, the target shape indicated by the template image is drawn in the image to be searched, realising shape matching between the images.
According to the image matching method provided in the embodiment of the application, pyramid layering is performed on the template image, and multi-scale, multi-angle template images are created from which template edges are extracted, yielding a multi-layer, multi-scale, multi-angle set of template edge contour points. Because the template edge contours of the template image carry scale information, the problems of scale information loss and lack of scale invariance in image shape matching are solved, and image matching accuracy is improved. Meanwhile, considering that template edge contour matching with scale features involves a complex data structure for the template edge contour information and a long matching time, the image to be searched is layered with the same number of pyramid layers as the template image, and the image pyramid is used to perform coarse-to-fine similarity matching at each layer. This performance improvement strategy greatly reduces the amount of data to be matched at the lower layers, lowers the algorithm complexity, and increases the speed of shape edge matching under multi-scale features while maintaining matching precision.
Fig. 4 is a flow chart of creating a template edge contour information set provided in an embodiment of the present invention, and the technical solution of this embodiment is further optimized based on the above embodiment, and the technical solution of this embodiment may be combined with various alternatives in one or more of the above embodiments. As shown in fig. 4, the process of creating the template edge contour information set provided in the embodiment of the present application may include the following steps S410 to S430:
s410, carrying out pyramid self-adaptive layering on the template images to obtain a plurality of layered template images.
In an alternative of this embodiment, referring to fig. 5, performing pyramid adaptive layering on the template image to obtain a plurality of layered template images may include the following steps A1-A4:
A1, performing, according to the gradient amplitude and gradient direction of the template image, non-maximum suppression on the pixel points in the template image, to obtain a non-maximum-suppressed template image.
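A minimal sketch of step A1, assuming gradient magnitude and direction images are already available; the names and the 4-way direction quantisation are illustrative:

```python
import math

def non_max_suppress(mag, ang):
    """Keep a pixel's gradient magnitude only if it is a local maximum
    along its gradient direction, quantised to one of four directions
    (0, 45, 90, 135 degrees); other pixels are suppressed to 0."""
    h, w = len(mag), len(mag[0])
    out = [[0.0] * w for _ in range(h)]
    offs = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            d = (round(math.degrees(ang[r][c]) / 45.0) * 45) % 180
            dr, dc = offs[d]
            if (mag[r][c] >= mag[r + dr][c + dc]
                    and mag[r][c] >= mag[r - dr][c - dc]):
                out[r][c] = mag[r][c]
    return out

# a one-pixel-wide vertical ridge whose gradient points horizontally
mag = [[0.0, 0.0, 0.0], [1.0, 3.0, 1.0], [0.0, 0.0, 0.0]]
ang = [[0.0] * 3 for _ in range(3)]
thin = non_max_suppress(mag, ang)
```

The effect is to thin thick gradient ridges to one-pixel-wide edge candidates before the hysteresis thresholding of the later steps.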
Referring to fig. 3, after a template image is acquired, it is preprocessed using a separable Gaussian filter. The Gaussian filter is a linear filter that can effectively suppress noise and smooth an image: a template is generated from the Gaussian function, and the template is then convolved with the image to be processed. The two-dimensional Gaussian kernel function is:

G(x, y) = (1 / (2πσ²)) · e^(−(x² + y²) / (2σ²))    (1)

In the formula, x and y are the coordinates of a pixel point in the template image; e is the natural constant, approximately equal to 2.71828; σ is the standard deviation. The smaller σ is, the larger the center coefficient of the generated template and the smaller the peripheral coefficients, so the smoothing effect on the template image is weaker; conversely, the larger σ is, the smaller the differences among the template coefficients (similar to a mean template), and the more obvious the smoothing effect on the template image. The template image is preprocessed as follows:

L(x, y) = G(x, y) ∗ I(x, y)    (2)

In formula (2), I(x, y) is the template image, G(x, y) is the generated Gaussian kernel function, and L(x, y) is the template image after Gaussian kernel smoothing.
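As a concrete illustration of this preprocessing step, the following is a minimal sketch of separable Gaussian smoothing with NumPy; the function names and the 3σ kernel radius are illustrative choices, not part of the embodiment.

```python
import numpy as np

def gaussian_kernel_1d(sigma, radius=None):
    """Sampled 1-D Gaussian, normalized to sum to 1 (separable form of G(x, y))."""
    if radius is None:
        radius = int(3 * sigma)  # assumed truncation radius
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def gaussian_smooth(image, sigma):
    """Smooth by convolving rows then columns, exploiting separability of the 2-D kernel."""
    k = gaussian_kernel_1d(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image.astype(float))
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)
```

Because the kernel is normalized, flat regions are preserved while isolated peaks are spread out and attenuated.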
Referring to fig. 3 and 6, after the acquired template image is preprocessed with the separable Gaussian filter, image direction differencing may be performed on the preprocessed template image to obtain difference images of the template image in the x direction and the y direction. For example, Sobel edge detection is essentially first-order filtering; because the first derivative is sensitive to lines and noise, the image must be smoothed before Sobel edge detection to reduce the influence of noise on edge detection. The Sobel operator contains two 3 × 3 matrices, the x-direction and y-direction convolution kernels, respectively. Convolving the Gaussian-smoothed template image with the Sobel operator yields the difference images of the template image in the x and y directions.
Referring to fig. 3 and 7, the difference values in the x direction and the y direction at pixel coordinate (x, y) can be extracted from the two difference images, and the gradient amplitude F(x, y) is obtained by squaring, summing, and taking the square root:

F(x, y) = √(Gx(x, y)² + Gy(x, y)²)

In the formula, Gx(x, y) and Gy(x, y) are the difference values at pixel coordinate (x, y) of the x-direction and y-direction difference images, respectively, and F(x, y) is the gradient amplitude at the corresponding coordinate; the generated image is the gradient amplitude map. Meanwhile, the gradient direction θ can be obtained with the arctangent function atan2(y, x), and the calculation formula of the gradient direction is:

θ = atan2(Gy(x, y), Gx(x, y))

where Gx(x, y) and Gy(x, y) are the difference values at the corresponding pixel coordinate (x, y) of the x-direction and y-direction difference images, respectively, and θ is the gradient direction at the corresponding coordinate.
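The Sobel differencing and the gradient amplitude and direction formulas above can be sketched as follows; the correlation-style filtering and zero padding are illustrative assumptions.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def filter2d_same(image, kernel):
    """3x3 correlation with zero padding (the usual convention for Sobel filtering)."""
    h, w = image.shape
    img = np.pad(image.astype(float), 1)
    out = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * img[dy:dy + h, dx:dx + w]
    return out

def gradient_maps(image):
    gx = filter2d_same(image, SOBEL_X)
    gy = filter2d_same(image, SOBEL_Y)
    magnitude = np.sqrt(gx ** 2 + gy ** 2)  # F(x, y)
    direction = np.arctan2(gy, gx)          # theta = atan2(Gy, Gx), in radians
    return gx, gy, magnitude, direction
```

On a vertical step edge, the x response dominates, the y response vanishes, and the direction is 0.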
The edges detected by the Sobel operator are sometimes too thick to be used directly, so pixel points with insufficient gradient must be suppressed and only the local gradient maxima retained, achieving the purpose of thin edges. Pixels with insufficient gradient are likely transition points of an edge. By the definition of a local maximum of a binary function, if f(x, y) ≤ f(x₀, y₀) for all (x, y) in some neighborhood of the point (x₀, y₀), then f attains a maximum value f(x₀, y₀) at (x₀, y₀). Non-maximum suppression can thus be performed on the pixel points of the template image based on its gradient amplitude and gradient direction.
Referring to fig. 8, the non-maximum suppression processing on the template image can be implemented by the following method operations:
(1) According to the computed image gradient direction, the center pixel point can be assigned to one of four direction classes by angle, which are respectively: the horizontal gradient direction, the vertical gradient direction, the upper-right diagonal gradient direction, and the lower-right diagonal gradient direction. The horizontal gradient direction is the part between straight lines 3 and 4 in fig. 8, specifically (θ > −22.5 && θ < 22.5) || (θ < −157.5 || θ > 157.5); the vertical gradient direction is the part between straight lines 1 and 2 in fig. 8, specifically (θ > 67.5 && θ < 112.5) || (θ > −112.5 && θ < −67.5); the upper-right diagonal gradient direction is the part between lines 2 and 4 in fig. 8, specifically (θ > 22.5 && θ < 67.5) || (θ > −157.5 && θ < −112.5); the lower-right diagonal gradient direction is the part between lines 1 and 3 in fig. 8, specifically (θ > −67.5 && θ < −22.5) || (θ > 112.5 && θ < 157.5), with θ in degrees.
(2) After the center pixel point is assigned to one of the four directions by its gradient direction, the gradient values of the pixel points lying on the corresponding gradient line within its 8-neighborhood are compared. For example, in fig. 8, four gradient lines are defined by region: the horizontal gradient line, the vertical gradient line, the lower-left to upper-right 45° gradient line, and the upper-left to lower-right 45° gradient line. When the gradient value of the center point is greater than the gradient values of both end points on the gradient line, the gradient value of the center pixel point is the maximum within the 8-neighborhood and the pixel value of the center point is kept as a selected pixel; when the gradient value of the center point is less than or equal to the gradient value of either end point on the gradient line, the pixel value of the center point is set to 0. Fig. 9 shows the image after non-maximum suppression.
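The four-direction comparison described in (1) and (2) can be sketched as below; border pixels are simply left at 0, an illustrative simplification.

```python
import numpy as np

def non_max_suppress(magnitude, direction_deg):
    """Keep a pixel only if its gradient magnitude is the maximum along the
    gradient line through its 8-neighbourhood; otherwise set it to 0."""
    h, w = magnitude.shape
    out = np.zeros_like(magnitude)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            t = direction_deg[y, x]
            if (-22.5 <= t < 22.5) or t >= 157.5 or t < -157.5:    # horizontal gradient
                a, b = magnitude[y, x - 1], magnitude[y, x + 1]
            elif (67.5 <= t < 112.5) or (-112.5 <= t < -67.5):     # vertical gradient
                a, b = magnitude[y - 1, x], magnitude[y + 1, x]
            elif (22.5 <= t < 67.5) or (-157.5 <= t < -112.5):     # upper-right diagonal
                a, b = magnitude[y - 1, x + 1], magnitude[y + 1, x - 1]
            else:                                                  # lower-right diagonal
                a, b = magnitude[y + 1, x + 1], magnitude[y - 1, x - 1]
            if magnitude[y, x] > a and magnitude[y, x] > b:
                out[y, x] = magnitude[y, x]
    return out
```

A thick three-column ridge collapses to its single strongest column, which is the thin-edge effect described above.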
Step A2, adaptively determining the hysteresis thresholds of the template image according to the gradient amplitude of the template image.
In order to reduce the human error introduced by manually setting thresholds, the high and low thresholds are obtained adaptively from the image gradient amplitude. Referring to fig. 10, when the high and low thresholds are obtained adaptively, intra-class variance minimization can be introduced into the edge extraction process to select them, reducing the interference of factors such as manually chosen thresholds. The specific method is as follows: acquire the gradient amplitude map of the template image and quantize the gradient amplitude into L levels, with L = 256, so the 8-bit gray map is divided into levels [0, 1, …, 255] according to the gradient amplitude. All levels are then divided into three classes, C0, C1, and C2, where C0 holds non-edge pixel gradients, covering gradient amplitude levels [0, 1, …, k]; C1 holds weak-edge pixel gradients, covering gradient amplitude levels [k + 1, …, m]; and C2 holds strong-edge pixel gradients, covering gradient amplitude levels [m + 1, m + 2, …, L − 1]. The high and low thresholds (m, k) are found from the gradient amplitude histogram by intra-class variance minimization. For example, for the "Lena" image shown in fig. 7, the high and low hysteresis thresholds obtained by this method are 150 and 92.
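A brute-force sketch of the adaptive threshold selection, assuming "intra-class variance minimization" means minimizing the weighted sum of the three classes' variances over all splits (k, m) of the gradient-level histogram; the implementation details are illustrative.

```python
import numpy as np

def adaptive_hysteresis_thresholds(magnitude, levels=256):
    """Pick (low, high) = (k, m) splitting the gradient-magnitude histogram into
    non-edge [0..k], weak-edge [k+1..m], strong-edge [m+1..L-1] classes so that
    the summed weighted intra-class variance is minimized (Otsu-style, 3 classes)."""
    g = np.clip(magnitude, 0, levels - 1).astype(int)
    hist = np.bincount(g.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    idx = np.arange(levels, dtype=float)
    w = np.cumsum(p)               # class-weight prefix sums
    mu = np.cumsum(p * idx)        # first-moment prefix sums
    mu2 = np.cumsum(p * idx ** 2)  # second-moment prefix sums

    def var(lo, hi):  # weighted variance of levels [lo, hi]
        wgt = w[hi] - (w[lo - 1] if lo else 0.0)
        if wgt <= 0:
            return 0.0
        m1 = (mu[hi] - (mu[lo - 1] if lo else 0.0)) / wgt
        m2 = (mu2[hi] - (mu2[lo - 1] if lo else 0.0)) / wgt
        return wgt * (m2 - m1 ** 2)

    best, best_km = np.inf, (0, 1)
    for k in range(1, levels - 2):
        for m in range(k + 1, levels - 1):
            v = var(0, k) + var(k + 1, m) + var(m + 1, levels - 1)
            if v < best:
                best, best_km = v, (k, m)
    return best_km
```

On a magnitude map whose values form three well-separated clusters, the chosen (k, m) falls between the clusters.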
Step A3, performing edge point division on the non-maximum-suppressed template image according to the hysteresis thresholds of the template image, to obtain the template edge contour image of the template image.
Referring to fig. 10, hysteresis thresholding (double thresholding) assumes there are two kinds of edges in the image: among the edge points remaining after non-maximum suppression, those whose gradient value exceeds the high threshold are called strong edges, those whose gradient value is smaller than the high threshold and larger than the low threshold are called weak edges, and those whose gradient value is smaller than the low threshold are not edges. A strong edge is necessarily an edge point, so the high threshold must be set high enough that a pixel qualifies only if its gradient value is sufficiently large (the intensity changes strongly enough). A weak edge may be a true edge or noise present in the image. When a strong edge point exists in the 8-neighborhood of a weak edge point, that weak edge point is promoted to a strong edge point, thereby supplementing the strong edges. After the hysteresis thresholds are determined, edge point division is performed on the non-maximum-suppressed template image using them, as shown in the effect diagram of fig. 11.
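The edge point division with hysteresis can be sketched as a breadth-first promotion of weak edges from strong seeds; the queue-based traversal is an illustrative implementation choice.

```python
import numpy as np
from collections import deque

def hysteresis_link(nms_magnitude, low, high):
    """Mark strong edges (> high), then promote weak edges (low..high) that are
    8-connected to a strong edge; everything else is discarded."""
    strong = nms_magnitude > high
    weak = (nms_magnitude > low) & ~strong
    edges = strong.copy()
    q = deque(zip(*np.nonzero(strong)))
    h, w = nms_magnitude.shape
    while q:
        y, x = q.popleft()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and weak[ny, nx] and not edges[ny, nx]:
                    edges[ny, nx] = True
                    q.append((ny, nx))
    return edges
```

A weak pixel adjacent to a strong one survives, while an isolated weak pixel is dropped.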
Step A4, performing pyramid adaptive layering on the template image based on the template edge contour points in the template edge contour image of the template image, to obtain a plurality of layered template images.
In an alternative of this embodiment, referring to FIG. 12, pyramid adaptive layering of template images may include the following steps B1-B2:
Step B1, performing pyramid layering on the input template image, and counting the number of edge contour points after each layering.
Step B2, if the number of edge contour points after layering is smaller than a preset pyramid top-layer edge point threshold, stopping pyramid layering, taking the previous layer image as the pyramid top layer, and determining the number of pyramid layers.
Referring to fig. 12, a pyramid top-level edge point threshold is set, for example, the threshold is set to 20, the template image is pyramid-layered, and the number of edge contour points after layering is counted each time. And judging whether the number of the layered edge contour points is less than a preset edge point threshold value of the pyramid top layer of the image. And when the number of the edge contour points after layering is less than a preset threshold value, stopping layering, taking the previous layer of image as a pyramid top layer, determining the number of pyramid layers, and realizing pyramid self-adaptive layering of the image.
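The layering loop of steps B1-B2 can be sketched as follows, using a 2×2 any-pooling of the edge mask as an illustrative stand-in for the actual pyramid downsampling.

```python
import numpy as np

def adaptive_pyramid_levels(template_edges, min_top_points=20, max_levels=8):
    """Halve the edge map level by level; stop when the next level would hold
    fewer edge points than the top-layer threshold, and return the level count."""
    levels = 1
    current = template_edges.astype(bool)
    while levels < max_levels:
        h, w = current.shape
        down = current[: h - h % 2, : w - w % 2]
        # 2x2 max-pool of the edge mask: a cheap stand-in for downsampling
        down = down.reshape(h // 2, 2, w // 2, 2).any(axis=(1, 3))
        if down.sum() < min_top_points:
            break  # previous layer becomes the pyramid top
        current, levels = down, levels + 1
    return levels
```

For a 32×32 block of edge points inside a 64×64 image and a threshold of 20, the counts run 1024, 256, 64, 16, so layering stops after three usable layers.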
S420, performing scale configuration on each layered template image, and performing multi-rotation-angle configuration on the scaled template images.
Referring to fig. 3, given a scale interval and step length of shape matching, performing scale division processing on the pyramid-layered template image sets to obtain a multi-scale template image expression of each pyramid template image layer, that is, each pyramid template image layer includes a set from a minimum-scale image to a maximum-scale image of a current template image layer.
Referring to fig. 3, rotating the template serves to match non-orthogonal regions of the image to be searched; therefore, given the angle interval and step length for shape matching, a rotation-angle configuration is performed for each per-scale template image according to the configured angle interval and step length, and the template can be rotated to determine the matching angle. Considering that a digital image generally exists in matrix form, the rotation-angle configuration can be performed on the per-scale template images through image affine transformation, the specific flow of which is as follows:
(1) Constructing an affine transformation matrix
The image is converted from one two-dimensional coordinate system to another through image affine transformation, which covers rotation and translation of the image. From the perspective of the spatial three-dimensional coordinate system of the image, rotating the image is equivalent to rotating it around its Z axis while translating the rotation center point; finally, an affine transformation matrix containing the image rotation and translation is constructed. The corresponding affine transformation expression is:

[u]   [a1  b1  c1]   [x]
[v] = [a2  b2  c2] · [y]
                     [1]
In the above formula, (u, v) are the matrix coordinates after affine transformation of the image, and (x, y) are the image coordinates of the original template or of the image to be searched. (c1, c2) are the translation coordinates of the rotation center relative to the original template or the image to be searched, and (a1, a2, b1, b2) are the parameters forming the rotation part of the affine transformation matrix, which contain the rotation and scale-change information of the image. Because the image coordinate axes x and y are orthogonal, (a1, a2, b1, b2) satisfy a1² + a2² = b1² + b2² and a1·b1 + a2·b2 = 0.
(2) Calculating an affine transformation matrix
Because the rotation transformation of the image is performed around the Z axis of the image space coordinate system, the rotation matrix of the affine transformation is calculated from the image rotation angle information obtained in the first step and recorded as:

R = [cosθ  −sinθ]
    [sinθ   cosθ]
in the above equation, θ is a rotation angle around the Z-axis of the image space coordinate system. The rotation center of the image is defined as the coordinate center of the image, and half of the values of the image rows and columns are taken as (cols/2, rows/2). Finally, obtaining an affine transformation matrix containing image rotation and translation information, and recording as:
According to the angle starting point, range, and step length information of the first step, a series of image affine transformation matrix groups formed by the plurality of rotation angles can be obtained and recorded as:

(ui, vi)ᵀ = M(θi) · (xi, yi, 1)ᵀ,  i = 1, 2, …, n
In the above formula, i indexes the pixel points of the image to be rotated, i = 1, 2, …, n, where n is the number of pixel points of the image to be rotated; (xi, yi) is the coordinate position of pixel point i in the image to be rotated, θi is the rotation angle, and (ui, vi) is the coordinate position of the corresponding pixel point i after rotation.
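The affine matrix group construction can be sketched as below; rotation about the image center (cols/2, rows/2) follows the text, while the degree-based stepping is an illustrative convention.

```python
import numpy as np

def rotation_matrices(rows, cols, angle_start, angle_stop, angle_step):
    """Build the 2x3 affine matrix for each configured rotation angle,
    rotating about the image center (cols/2, rows/2)."""
    cx, cy = cols / 2.0, rows / 2.0
    mats = []
    for deg in np.arange(angle_start, angle_stop + 1e-9, angle_step):
        t = np.deg2rad(deg)
        a, b = np.cos(t), np.sin(t)
        # translation keeps the rotation center fixed: c = center - R @ center
        m = np.array([[a, -b, cx - a * cx + b * cy],
                      [b,  a, cy - b * cx - a * cy]])
        mats.append((deg, m))
    return mats
```

Applying the 90° matrix to a point one pixel to the right of the center moves it one pixel below the center, while the center itself stays fixed.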
S430, extracting corresponding template edge contour information from each layered template image set with multi-scale and multi-rotation angles respectively to construct a template edge contour information set of the template image.
Edge extraction obtains the template edge contour point information of each template image in the template image set containing image scale features; for the specific extraction, refer to the process of steps A1-A3, which is not repeated here. The edge contour information may include the position of the contour center of gravity, the pixel positions of the edge contour points relative to the contour center of gravity, the edge contour point gradient amplitudes, and the lateral and longitudinal gradients of the edge contour points. The center of gravity of the edge contour point set is calculated by statistically summing the row and column coordinates of all edge contour points and dividing by the counted number of edge contour points. The edge contour center of gravity (x̄, ȳ) is calculated with the following formula:

x̄ = (1/n) · Σᵢ₌₁ⁿ xi,  ȳ = (1/n) · Σᵢ₌₁ⁿ yi
In the formula, the row and column coordinates of all edge pixel points are summed, n is the number of edge pixel points, and the means of the row and column coordinates are obtained. The coordinates of the edge contour points relative to the contour center of gravity are then solved as:

(xi′, yi′) = (xi − x̄, yi − ȳ)
the gradients of the edge contour points in the x direction and the y direction are obtained by an x-direction gradient image and a y-direction gradient image which are generated by image direction difference. The gradient amplitudes of the edge contour points are obtained from the gradient amplitudes calculated previously.
After the corresponding template edge contour information is extracted from every template image in each layered, per-scale, multi-rotation-angle template image set, the information is stored in structure form, and all template edge contour information is organized in a linear table for convenient access, finally constructing the template edge contour information set of the template image. The constructed template edge contour information set is stored in advance, so it does not have to be recomputed for the template image every time an image to be searched is matched.
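A sketch of one entry of the template edge contour information set and its extraction, assuming the structure holds exactly the fields listed above; the dataclass layout and field names are illustrative.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class TemplateEdgeContour:
    """One entry of the template edge contour information set (one layer/scale/angle)."""
    level: int
    scale: float
    angle: float
    gravity: tuple           # contour center of gravity (row mean, col mean)
    rel_points: np.ndarray   # (n, 2) edge-point coordinates relative to the gravity
    grad_x: np.ndarray       # per-point x-direction gradient
    grad_y: np.ndarray       # per-point y-direction gradient
    magnitude: np.ndarray    # per-point gradient amplitude

def build_contour_info(edge_mask, gx, gy, level, scale, angle):
    ys, xs = np.nonzero(edge_mask)
    gravity = (ys.mean(), xs.mean())
    rel = np.stack([ys - gravity[0], xs - gravity[1]], axis=1)
    mag = np.sqrt(gx[ys, xs] ** 2 + gy[ys, xs] ** 2)
    return TemplateEdgeContour(level, scale, angle, gravity, rel,
                               gx[ys, xs], gy[ys, xs], mag)
```

A plain Python list of such entries then plays the role of the linear table holding the whole information set.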
Fig. 13 is a flow chart of constructing an edge contour information set to be searched, which is provided in an embodiment of the present invention, and the technical solution of this embodiment is further optimized based on the above embodiment, and may be combined with various alternatives in one or more of the above embodiments. As shown in fig. 13, the process for constructing the edge contour information set to be searched provided in the embodiment of the present application may include the following steps:
S1310, performing pyramid layering on the image to be searched according to the number of pyramid layers of the template image, to obtain a plurality of layered images to be searched.
S1320, extracting corresponding edge contour information to be searched from each layered image to be searched so as to construct an edge contour information set to be searched of the image to be searched.
The edge contour information comprises the gravity center position of the contour, the pixel position of the edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point, and the transverse gradient and the longitudinal gradient of the edge contour point.
In an alternative of this embodiment, extracting corresponding edge contour information to be searched from each layered image to be searched may include: according to the gradient amplitude and the gradient direction of each layered image to be searched, carrying out non-maximum suppression on pixel points in the image to be searched; and carrying out edge point division processing on the image to be searched with the non-maximum suppression to obtain an edge contour image to be searched of the image to be searched, so as to obtain corresponding edge contour information to be searched.
The difference between the construction of the edge contour information set to be searched provided in the embodiment of the present application and that of the template edge contour information set is that the set to be searched is built without scale division or multi-rotation-angle configuration; for technical details not described in this embodiment, refer to the construction process of the template edge contour information provided in any embodiment of the present application.
Fig. 14 is a flowchart of another image matching method provided in an embodiment of the present invention, and the technical solution of the present embodiment is further optimized based on the above embodiment, and may be combined with various alternatives in one or more of the above embodiments. As shown in fig. 14, the image matching method provided in the embodiment of the present application may include the following steps:
S1410, determining a template edge contour information set of the template image; the template edge contour information set represents the multi-scale, multi-rotation-angle template edge contours at different pyramid hierarchies extracted from the template image.
S1420, determining an edge outline information set to be searched of the image to be searched; and the edge contour information set to be searched represents edge contours to be searched under different pyramid hierarchies extracted from the image to be searched according to the pyramid hierarchy number of the template image.
S1430, for the pyramid layered structure from top to bottom, sliding-traversing the template edge contours of different scales and different rotation angles of the current hierarchy over the edge contour to be searched corresponding to the current hierarchy, based on the contour matching search information of the current hierarchy.
The contour matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the similarity matching is carried out on the edge contour of the template.
Referring to fig. 15 and 16, in the case that the current hierarchy is the pyramid top layer, the contour matching search information of the current hierarchy includes the entire contour region of the edge contour to be searched as the region to be matched, and the initially set scale range and angle range are respectively used as the scale range to be matched and the angle range to be matched. At this time, the edge contour to be searched of the pyramid top layer and the template edge contour containing the scale angle are respectively taken as input of pyramid top layer traversal matching, and since the position coordinates of the template edge contour point are relative to the gravity center of the template edge contour, the gravity center of the template edge contour moves in the edge contour to be searched in the pyramid top layer traversal matching process to calculate the similarity.
Referring to fig. 15 and 16, when the current hierarchy is not the pyramid top layer, the contour matching search information of the current hierarchy is determined by mapping the contour matching result in the edge contour to be searched of the previous hierarchy according to a preset matching search mapping manner. The region to be matched in the edge contour to be searched indicated by that information, and the template edge contours within the scale range and angle range to be matched indicated by that information, are taken as the inputs of the current hierarchy's traversal matching. Because the position coordinates of the template edge contour points are relative to the center of gravity of the template edge contour, the center of gravity of each template edge contour within the indicated scale and angle ranges is moved within the indicated region to be matched during the traversal to calculate the similarity.
S1440, calculating the similarity between the template edge contours of different scales and different rotation angles and the edge contour to be searched during the sliding traversal.
Referring to fig. 9, for each hierarchical similarity matching, the center of the black intersection point is the center of gravity of the template edge contour, the template edge contour is traversed from the upper left corner to the lower right corner of the edge contour image to be searched, and the similarity between the template edge contour and the contour corresponding to the edge image to be searched during each movement is counted, the greater the similarity is, the more similar the two are, and the similarity is 1 during complete matching. The template edge contour similarity metric function is as follows:
in the above formula, n is the number of edge points involved in the calculation, di' is an edge profile image to be searchedGradient of a certain edge point in, eq+p'Is the gradient, t ', of the corresponding edge contour points of the template edge contour image'iAndrespectively representing the gradient, u 'in the x direction of the edge contour point corresponding to the edge contour image to be searched and the edge contour image of the template'iAndand respectively representing the gradient of the edge contour point corresponding to the edge contour image to be searched and the gradient of the edge contour point corresponding to the template edge contour image in the y direction.
S1450, determining the contour matching result of the current hierarchy's template edge contour in the edge contour to be searched, based on the calculated similarities between the template edge contours of different scales and different rotation angles and the edge contour to be searched.
Referring to fig. 16, for different pyramid hierarchies, traversing template edge contours with different scales and different angles on an edge contour image to be searched according to contour matching search information of each hierarchy to obtain a series of similarities, and taking a position, an angle and a scale corresponding to a maximum value of the similarities as a contour matching result of the template edge contour in the current hierarchy on the edge contour to be searched. And the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
S1460, according to the contour matching result of the current hierarchy, determining the contour matching search information to be used when the template edge contour of the next hierarchy is similarity-matched against the corresponding edge contour to be searched, then jumping to the next hierarchy for similarity matching, until the pyramid bottom layer is reached.
Referring to fig. 17, when determining the contour matching result of the current hierarchy, the template edge contour of the next hierarchy may be determined by matching mapping according to a preset matching search mapping manner, and the contour matching search information used when performing similarity matching on the template edge contour of the next hierarchy corresponding to the edge contour to be searched is determined. And the matching search mapping mode is used for mapping the contour matching search information used for matching the similarity of the edge contour to be searched corresponding to the lower layer according to the contour matching result of the edge contour to be searched corresponding to the upper layer.
Based on the contour matching search information of the next layer, similarity matching is performed between the next layer's template edge contour and the next layer's edge contour to be searched, from top to bottom, until similarity matching has been performed on all pyramid layers. With pyramid layering, mapping the matched result of the upper layer down to the lower layer effectively narrows the lower layer's matching region, thereby accelerating lower-layer matching.
Referring to fig. 16 and 17, the matching search mapping manner includes mapping the region to be matched in the contour matching search information from the pyramid upper layer to the lower layer. The center of gravity of the template edge contour is traversal-matched along the upper-layer edge contour image to be searched to obtain the highest-scoring upper-layer matching position (x, y); its position at the lower layer is (2x, 2y), and the corresponding region to be matched at the lower layer is as follows:
in the above formula, (x ', y') is the coordinates of the upper left corner of the mapping region of the lower layer edge contour image to be searched, and (x ", y") is the coordinates of the lower right corner of the mapping region of the lower layer edge contour image to be searched, so that the position of the region to be matched corresponding to the contour matching search information at the lower layer of the pyramid can be determined.
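The region mapping can be sketched as below; the search radius opened around (2x, 2y) is an assumed parameter, introduced here only for illustration.

```python
def map_region_to_lower(x, y, radius=5):
    """Map the best upper-layer match (x, y) to the lower layer at (2x, 2y) and
    open a small search window around it; `radius` is an assumed margin."""
    top_left = (2 * x - radius, 2 * y - radius)       # (x', y')
    bottom_right = (2 * x + radius, 2 * y + radius)   # (x'', y'')
    return top_left, bottom_right
```

The returned corners bound the lower-layer region to be matched, so only a small window is traversed instead of the whole image.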
Referring to fig. 16 and 17, the matching search mapping manner further includes mapping the angle range to be matched in the contour matching search information from the upper layer of the pyramid to the lower layer of the pyramid. The calculation formula of the angle range which is mapped to the pyramid lower layer by the matching search mapping mode in the contour matching result determined by the pyramid upper layer is as follows:
the above formula is the angle mapping formula to be matched, anglenext_preIs mapped for the lower layerAngle of point, anglenext_aftAnd mapping an end point angle for the lower layer, wherein numLevels are the determined pyramid layering number of the image to be searched.
Referring to fig. 16 and 17, the matching search mapping manner further includes mapping the scale to be matched in the profile matching search information from the upper layer to the lower layer of the pyramid. The calculation formula of the scale mapped from the upper layer of the pyramid to the lower layer of the pyramid according to the matching search mapping mode in the contour matching result determined by the upper layer of the pyramid is as follows:
the above formula is a scale mapping formula to be matchednext_preScale for the lower layer mapping of the starting point scalenext_aftAnd mapping an end point scale for the lower layer, wherein scalestep is a given scale step size when the template is created.
S1470, identifying a target shape indicated by the template image in the image to be searched according to the contour matching result when the pyramid bottom layer similarity matching is finished.
On the basis of the foregoing embodiment, optionally, before identifying the target shape indicated by the template image in the image to be searched according to the contour matching result at the end of the pyramid bottom layer similarity matching, the method further includes:
When the next hierarchy is determined to be located at the second-to-bottom layer of the pyramid, the scale to be matched in the contour matching search information of the current hierarchy is directly taken as the scale to be matched in the contour matching search information of the next hierarchy, so that the scale mapping adjustment at the lower layers is cut off in advance.
Referring to fig. 3, the last two layers of the pyramid are used as the scale division boundary, and it is determined whether the next hierarchy below the current one lies in the last two layers of the pyramid. If it does not, the matching search mapping manner continues to be used to determine, by mapping, the scale to be matched in the contour matching search information used when the template edge contour is similarity-matched against the corresponding edge contour to be searched at the next hierarchy; that is, the scale to be matched at the next hierarchy is determined from the scale in the contour matching result of the previous hierarchy according to the preset matching search mapping. If the next hierarchy does lie in the last two layers of the pyramid, the scale to be matched in its contour matching search information is set directly to the scale in the contour matching result of the current hierarchy; that is, the scale information has already been determined before the last two layers of the pyramid, and the scale is cut off in advance. This early scale cut-off reduces the amount of scale-matching data during similarity matching of the next layer and improves the shape edge matching speed on the premise of ensuring the matching precision.
On the basis of the foregoing embodiment, optionally, before identifying the target shape indicated by the template image in the image to be searched according to the contour matching result at the end of the pyramid bottom layer similarity matching, the method further includes:
when the current hierarchy is determined to be at a non-top pyramid layer, correcting the angle range to be matched in the contour matching search information of the current hierarchy to a plurality of angles to be matched, selected from that range with an angle step that varies with the number of pyramid layers; and performing similarity matching of the template edge contour of the current hierarchy on the corresponding edge contour to be searched according to the angles to be matched indicated by the corrected contour matching search information.
Referring to fig. 3, after the angle range to be matched in the contour matching search information of the current hierarchy is determined, in order to increase the matching speed at the current hierarchy and reduce the number of angles to be matched, instead of traversal matching over the full angle range, the range is corrected to a plurality of angles to be matched selected with an angle step that varies with the number of pyramid layers. The variable-angle-step matching strategy is as follows:
In the above formula, numLevels is the determined number of pyramid layers, and angle_next_pre and angle_next_aft are determined from the angle range to be matched of the current hierarchy. Because the angle step varies with the number of pyramid layers, the current hierarchy only needs to be matched at three angles, namely angle_next_pre, angle_next_aft and angle, which reduces the number of angle matches.
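A sketch of the three-angle strategy follows. The exact step formula is not reproduced in the text above, so the rule used here (half-width of the angle range halved per descended level) is an assumption chosen only to illustrate a step that shrinks as the pyramid is descended:

```python
def candidate_angles(best_angle, angle_range, level, num_levels):
    """Return the three candidate angles matched at a non-top layer:
    the coarse-layer result (angle) plus one angle on each side
    (angle_next_pre, angle_next_aft), with a step that shrinks as
    finer pyramid layers are reached.

    best_angle:  rotation angle from the previous layer's matching result
    angle_range: (lo, hi) initial angle range to be matched
    level:       current layer index, 0 = pyramid bottom
    num_levels:  total number of pyramid layers (numLevels)
    """
    lo, hi = angle_range
    # assumed step rule: half the range per descended level
    step = (hi - lo) / (2.0 ** (num_levels - level))
    angle_next_pre = best_angle - step
    angle_next_aft = best_angle + step
    return [angle_next_pre, best_angle, angle_next_aft]
```

Only three similarity evaluations per scale are then needed at each non-top layer, instead of a full traversal of the angle range.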
On the basis of the foregoing embodiment, optionally, performing sliding traversal on the template edge profiles of different scales and different rotation angles of the current hierarchy on the edge profile to be searched corresponding to the current hierarchy may include:
and when the current hierarchy is determined to be positioned at the non-pyramid top layer, controlling the template edge profiles of the current hierarchy with different scales and different rotation angles, and performing interval sliding traversal on the edge profile to be searched corresponding to the current hierarchy according to the interval number of the edge profile points corresponding to the current hierarchy.
Referring to fig. 3, in order to reduce the matching time of the current layer, the edge contour points of the current layer are visited at fixed pixel intervals during matching while the number of edge contour points matched at the previous layer stays unchanged, which reduces the number of matched points and improves the matching speed. With the number of lower-layer edge contour points held at 100, the total number of current-layer template edge contour points P_total is obtained, and the interval between points when traversing and matching the edge contour points of the current layer is: P_delete_num = P_total / 100, where P_total is the total number of points of the current-layer template edge contour and P_delete_num is the edge-contour-point interval used in the matching calculation for the current layer. This edge contour point deletion strategy at non-top pyramid layers effectively reduces information redundancy and computational complexity.
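The point deletion strategy above can be sketched directly (the function name is an assumption; the target of 100 points follows the text):

```python
def decimated_contour(points, target=100):
    """Traverse the current layer's template edge contour at fixed intervals.

    Keeps roughly `target` points (the text fixes the lower-layer count at
    100), so the similarity sum is evaluated over fewer contour points.
    """
    p_total = len(points)
    # interval per the formula P_delete_num = P_total / 100, at least 1
    p_delete_num = max(1, p_total // target)
    return points[::p_delete_num]
```

A contour of 500 points is thus matched using only every fifth point, while a contour already at or below 100 points is kept whole.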
Referring to fig. 18, in the serial case, with the performance improvement strategies of this embodiment, the shape matching time with image scale features is reduced from 2.999 s to 200 ms, a speed improvement of 93.3%, while the difference between the two matching scores is only 0.0002. This shows that the performance improvement strategies can effectively increase the speed of shape matching with scale features while guaranteeing matching accuracy.
According to the image matching method provided by the embodiment of the application, the template image is pyramid-layered, and multi-angle template images are created at multiple scales to extract template edges, producing a multi-layer, multi-scale, multi-angle set of template edge contour points. Because the template edge contours of the template image carry scale information, the problems of scale information loss and lack of scale invariance in image shape matching are addressed, and image matching accuracy is improved. Meanwhile, considering that template edge contour matching with scale features involves a complex template edge contour data structure and a long matching process, the image to be searched is layered according to the number of pyramid layers of the template image, and a coarse-to-fine similarity matching strategy is applied at each layer of the image pyramid. This reduces the amount of matching data at the lower layers and the algorithm complexity, and improves the speed of shape edge matching under multi-scale features while guaranteeing matching accuracy.
Fig. 19 is a block diagram of an image matching apparatus provided in the embodiment of the present invention. The technical scheme of the embodiment can be suitable for the condition of matching the shapes of the images, and the device can be realized in a software and/or hardware mode and integrated on any electronic equipment with a network communication function. As shown in fig. 19, the image matching apparatus in the embodiment of the present application may include: a template information determination module 1910, a to-be-searched information determination module 1920, a contour similarity matching module 1930, a contour lower layer matching mapping module 1940, and a to-be-searched image identification module 1950. Wherein:
a template information determining module 1910, configured to determine a template edge contour information set of a template image; the template edge contour information set represents the multi-scale multi-rotation-angle template edge contours extracted from the template image under different pyramid hierarchies;
the information to be searched determining module 1920 is configured to determine an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents edge contours to be searched under different pyramid hierarchies extracted from the image to be searched according to the pyramid hierarchy number of the template image;
the contour similarity matching module 1930 is configured to, for a top-down pyramid hierarchical structure, perform similarity matching on a currently-layered template edge contour on a currently-layered edge contour to be searched based on currently-layered contour matching search information, and obtain a currently-layered contour matching result;
the contour lower-layer matching mapping module 1940 is configured to determine, according to the contour matching result of the current hierarchy, the contour matching search information used when the template edge contour of the next hierarchy is similarity-matched against the corresponding edge contour to be searched, the information being used after jumping to the next hierarchy for similarity matching, until the pyramid bottom layer is reached;
the image to be searched identification module 1950 is used for identifying the target shape indicated by the template image in the image to be searched according to the contour matching result when the pyramid bottom layer similarity matching is finished;
the contour matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the similarity matching is carried out on the edge contour of the template; and the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
On the basis of the foregoing embodiment, optionally, the template information determining module 1910 includes:
carrying out pyramid self-adaptive layering on the template images to obtain a plurality of layered template images; performing scale configuration on each layered template image, and performing multi-rotation-angle configuration on the scaled template images;
respectively extracting corresponding template edge contour information from each layered template image set with multi-scale and multi-rotation angles to construct a template edge contour information set of the template image;
the edge contour information comprises the gravity center position of the contour, the pixel position of the edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point, and the transverse gradient and the longitudinal gradient of the edge contour point.
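The edge contour information listed above maps naturally onto a small per-contour record. The following is a minimal sketch (the class and field names are assumptions, not from the patent):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class EdgeContourInfo:
    """Edge contour information stored per pyramid layer / scale / rotation."""
    center_of_gravity: Tuple[float, float]   # contour barycenter (x, y)
    rel_points: List[Tuple[float, float]]    # point positions relative to the barycenter
    magnitudes: List[float]                  # gradient magnitude per contour point
    grad_x: List[float]                      # transverse (horizontal) gradient per point
    grad_y: List[float]                      # longitudinal (vertical) gradient per point
```

Storing point positions relative to the barycenter lets the whole contour be translated during the sliding traversal by shifting only the barycenter, without rewriting every point.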
On the basis of the foregoing embodiment, optionally, performing pyramid adaptive layering on the template image to obtain a plurality of layered template images, including:
according to the edge gradient amplitude and the gradient direction of the template image, carrying out non-maximum suppression processing on pixel points in the template image; adaptively determining a hysteresis threshold of the template image according to the gradient amplitude of the template image;
according to the hysteresis threshold value of the template image, performing edge point division processing on the template image subjected to non-maximum suppression processing to obtain a template edge contour image of the template image;
and carrying out pyramid self-adaption layering on the template image based on template edge outline points in the template edge outline image of the template image to obtain a plurality of layered template images.
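The adaptive hysteresis threshold and edge-point division steps above can be sketched as follows. The ratio-of-maximum rule for deriving the thresholds is an assumption for illustration; the patent only states that the thresholds are determined adaptively from the gradient magnitudes:

```python
def adaptive_hysteresis_thresholds(magnitudes, high_ratio=0.3, low_ratio=0.5):
    """Derive hysteresis thresholds adaptively from the image's gradient
    magnitudes (assumed rule: high = fraction of the maximum magnitude,
    low = fraction of high)."""
    m_max = max(magnitudes)
    high = high_ratio * m_max
    low = low_ratio * high
    return low, high

def classify_edge_points(magnitudes, low, high):
    """Edge-point division after non-maximum suppression:
    2 = strong edge, 1 = weak edge (kept only if linked to a strong edge),
    0 = suppressed."""
    return [2 if m >= high else (1 if m >= low else 0) for m in magnitudes]
```

The strong/weak labeling is the standard Canny-style division; the subsequent linking of weak points to strong ones then yields the template edge contour image used for pyramid layering.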
On the basis of the foregoing embodiment, optionally, the to-be-searched information determining module 1920 includes:
carrying out pyramid layering on the image to be searched according to the pyramid layering number of the template image to obtain a plurality of layered images to be searched;
extracting corresponding edge contour information to be searched from each layered image to be searched so as to construct an edge contour information set to be searched of the image to be searched;
the edge contour information comprises the gravity center position of the contour, the pixel position of the edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point, and the transverse gradient and the longitudinal gradient of the edge contour point.
On the basis of the foregoing embodiment, optionally, extracting corresponding edge contour information to be searched from each layered image to be searched includes:
according to the gradient amplitude and the gradient direction of each layered image to be searched, carrying out non-maximum suppression on pixel points in the image to be searched;
and carrying out edge point division processing on the image to be searched with the non-maximum suppression to obtain an edge contour image to be searched of the image to be searched so as to obtain corresponding edge contour information to be searched.
On the basis of the foregoing embodiment, optionally, the contour similarity matching module 1930 includes:
based on the contour matching search information of the current hierarchy, respectively traversing the edge contours of the template with different scales and different rotation angles of the current hierarchy on the edge contour to be searched corresponding to the current hierarchy in a sliding manner;
calculating the similarity between the template edge profiles with different scales and different rotation angles and the edge profile to be searched during sliding of the template edge profiles with different scales and different rotation angles;
and determining the contour matching result of the edge contour of the template under the current layering in the edge contour to be searched based on the calculated similarity between the edge contour of the template with different scales and different rotation angles and the edge contour to be searched.
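A common similarity measure for this kind of edge-contour sliding traversal is the normalized gradient dot product; the sketch below uses it for illustration (the patent does not specify the exact score, so this formula, the function name, and the data layout are assumptions):

```python
import math

def gradient_similarity(template, search, offset):
    """Normalized gradient dot product between a template edge contour and
    the search image, with the template barycenter placed at `offset`.
    A score of 1.0 means all gradient directions coincide.

    template: list of (rel_x, rel_y, gx, gy) per contour point
    search:   dict mapping (x, y) -> (gx, gy) of search-image edge gradients
    offset:   (x, y) barycenter position in the search image
    """
    total, n = 0.0, 0
    for rx, ry, tgx, tgy in template:
        key = (offset[0] + rx, offset[1] + ry)
        if key not in search:
            continue
        sgx, sgy = search[key]
        denom = math.hypot(tgx, tgy) * math.hypot(sgx, sgy)
        if denom > 0:
            total += (tgx * sgx + tgy * sgy) / denom
            n += 1
    return total / n if n else 0.0
```

Because each term is normalized by the two gradient magnitudes, the score is insensitive to illumination changes; sliding the offset over the region to be matched and keeping the maximum gives the contour matching result for one scale and rotation angle.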
On the basis of the above embodiment, optionally, under the condition that the current hierarchy is a pyramid top layer, the contour matching search information of the current hierarchy includes that the whole contour region of the edge contour to be searched is used as a region to be matched, and a scale range and an angle range which are initially set are respectively used as a scale range to be matched and an angle range to be matched;
and in the case that the current hierarchy is a non-top pyramid layer, the contour matching search information of the current hierarchy is mapped and determined, according to a preset matching search mapping mode, from the contour matching result obtained on the edge contour to be searched of the previous hierarchy.
On the basis of the above embodiment, optionally, the matching search mapping manner is used to map the contour matching search information used for performing similarity matching on the edge contour to be searched corresponding to the lower layer according to the contour matching result of the edge contour to be searched corresponding to the upper layer.
On the basis of the foregoing embodiment, optionally, the apparatus further includes:
and when the next hierarchy is determined to be located at the second-to-bottom layer of the pyramid, directly taking the scale to be matched in the contour matching search information of the current hierarchy as the scale to be matched in the contour matching search information of the next hierarchy, so as to cut off the scale mapping adjustment of the lower layers in advance.
On the basis of the foregoing embodiment, optionally, the apparatus further includes:
when the current hierarchy is determined to be at a non-top pyramid layer, correcting the angle range to be matched in the contour matching search information of the current hierarchy to a plurality of angles to be matched, selected from that range with an angle step that varies with the number of pyramid layers.
On the basis of the foregoing embodiment, optionally, when it is determined that the current hierarchy is located at the non-pyramid top level, performing sliding traversal on the template edge profiles of different scales and different rotation angles of the current hierarchy on the edge profile to be searched corresponding to the current hierarchy, includes:
and controlling the edge profiles of the template with different scales and different rotation angles of the current hierarchy, and performing interval sliding traversal on the edge profile to be searched corresponding to the current hierarchy according to the interval number of the edge profile points corresponding to the current hierarchy.
The image matching device provided in the embodiment of the present application may perform the image matching method provided in any embodiment of the present application, and has a corresponding function and a beneficial effect of performing the image matching method.
Fig. 20 is a schematic structural diagram of an electronic device provided in an embodiment of the present invention. As shown in fig. 20, the electronic device provided in the embodiment of the present invention includes: one or more processors 2010 and storage 2020; the processor 2010 in the electronic device may be one or more, and one processor 2010 is taken as an example in fig. 20; the storage 2020 is used to store one or more programs; the one or more programs are executed by the one or more processors 2010, such that the one or more processors 2010 implement the image matching method of any of the embodiments of the present invention.
The electronic device may further include: an input device 2030 and an output device 2040.
The processor 2010, the storage device 2020, the input device 2030, and the output device 2040 in the electronic apparatus may be connected by a bus or other means, and fig. 20 illustrates an example of connection by a bus.
The storage 2020 in the electronic device serves as a computer-readable storage medium for storing one or more programs, which may be software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the image matching method provided in the embodiment of the present invention. The processor 2010 executes various functional applications and data processing of the electronic device by executing software programs, instructions and modules stored in the storage 2020, that is, implements the image matching method in the above-described method embodiment.
The storage 2020 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the electronic device, and the like. In addition, the storage 2020 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the storage 2020 may further include memory located remotely from the processor 2010, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 2030 may be used for receiving input numeric or character information and generating key signal inputs related to user settings and function control of the electronic apparatus. The output device 2040 may include a display device such as a display screen.
And, when the one or more programs included in the electronic device are executed by the one or more processors 2010, the programs perform the following operations:
determining a template edge outline information set of a template image; the template edge contour information set represents the multi-scale multi-rotation-angle template edge contours extracted from the template image under different pyramid hierarchies;
determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents edge contours to be searched under different pyramid hierarchies extracted from the image to be searched according to the pyramid hierarchy number of the template image;
aiming at a pyramid hierarchical structure from top to bottom, based on the currently-layered contour matching search information, performing similarity matching on the currently-layered template edge contour on the currently-layered edge contour to be searched to obtain a currently-layered contour matching result;
determining, according to the contour matching result of the current hierarchy, the contour matching search information used when the template edge contour of the next hierarchy is similarity-matched against the corresponding edge contour to be searched, the information being used after jumping to the next hierarchy for similarity matching, until the pyramid bottom layer is reached;
identifying a target shape indicated by a template image in the image to be searched according to a contour matching result when the pyramid bottom layer similarity matching is finished;
the contour matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the similarity matching is carried out on the edge contour of the template; and the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
Of course, it will be understood by those skilled in the art that when one or more programs included in the electronic device are executed by the one or more processors 2010, the programs may also perform related operations in the image matching method provided in any embodiment of the present invention.
An embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program for executing an image matching method when executed by a processor, the method including:
determining a template edge outline information set of a template image; the template edge contour information set represents the multi-scale multi-rotation-angle template edge contours extracted from the template image under different pyramid hierarchies;
determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents edge contours to be searched under different pyramid hierarchies extracted from the image to be searched according to the pyramid hierarchy number of the template image;
aiming at a pyramid hierarchical structure from top to bottom, based on the currently-layered contour matching search information, performing similarity matching on the currently-layered template edge contour on the currently-layered edge contour to be searched to obtain a currently-layered contour matching result;
determining, according to the contour matching result of the current hierarchy, the contour matching search information used when the template edge contour of the next hierarchy is similarity-matched against the corresponding edge contour to be searched, the information being used after jumping to the next hierarchy for similarity matching, until the pyramid bottom layer is reached;
identifying a target shape indicated by a template image in the image to be searched according to a contour matching result when the pyramid bottom layer similarity matching is finished;
the contour matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the similarity matching is carried out on the edge contour of the template; and the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
Optionally, the program, when executed by a processor, may be further adapted to perform an image matching method as provided in any of the embodiments of the invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a flash Memory, an optical fiber, a portable CD-ROM, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take a variety of forms, including, but not limited to: an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (14)
1. An image matching method, characterized in that the method comprises:
determining a template edge outline information set of a template image; the template edge contour information set represents the multi-scale multi-rotation-angle template edge contours extracted from the template image under different pyramid hierarchies;
determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents edge contours to be searched under different pyramid hierarchies extracted from the image to be searched according to the pyramid hierarchy number of the template image;
aiming at a pyramid hierarchical structure from top to bottom, based on the currently-layered contour matching search information, performing similarity matching on the currently-layered template edge contour on the currently-layered edge contour to be searched to obtain a currently-layered contour matching result;
determining, according to the contour matching result of the current hierarchy, the contour matching search information used when the template edge contour of the next hierarchy is similarity-matched against the corresponding edge contour to be searched, the information being used after jumping to the next hierarchy for similarity matching, until the pyramid bottom layer is reached;
identifying a target shape indicated by a template image in the image to be searched according to a contour matching result when the pyramid bottom layer similarity matching is finished;
the contour matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the similarity matching is carried out on the edge contour of the template; and the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
2. The method of claim 1, wherein determining a template edge contour information set for the template image comprises:
carrying out pyramid self-adaptive layering on the template images to obtain a plurality of layered template images; performing scale configuration on each layered template image, and performing multi-rotation-angle configuration on the scaled template images;
respectively extracting corresponding template edge contour information from each layered template image set with multi-scale and multi-rotation angles to construct a template edge contour information set of the template image;
the edge contour information comprises the gravity center position of the contour, the pixel position of the edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point, and the transverse gradient and the longitudinal gradient of the edge contour point.
3. The method of claim 2, wherein pyramid-adaptively layering the template image to obtain a plurality of layered template images, comprises:
according to the edge gradient amplitude and the gradient direction of the template image, carrying out non-maximum suppression processing on pixel points in the template image; adaptively determining a hysteresis threshold of the template image according to the gradient amplitude of the template image;
according to the hysteresis threshold value of the template image, performing edge point division processing on the template image subjected to non-maximum suppression processing to obtain a template edge contour image of the template image;
and carrying out pyramid self-adaption layering on the template image based on template edge outline points in the template edge outline image of the template image to obtain a plurality of layered template images.
4. The method of claim 1, wherein determining the edge contour information set to be searched of the image to be searched comprises:
performing pyramid layering on the image to be searched according to the number of pyramid layers of the template image, to obtain a plurality of layered images to be searched;
extracting the corresponding edge contour information to be searched from each layered image to be searched, so as to construct the edge contour information set to be searched of the image to be searched;
wherein the edge contour information comprises the centroid position of the contour, the pixel positions of the edge contour points relative to the contour centroid, the gradient magnitudes of the edge contour points, and the horizontal and vertical gradients of the edge contour points.
5. The method of claim 4, wherein extracting the corresponding edge contour information to be searched from each layered image to be searched comprises:
performing non-maximum suppression on the pixels of each layered image to be searched according to its gradient magnitudes and gradient directions;
and performing edge-point classification on the non-maximum-suppressed image to be searched, to obtain an edge contour image to be searched of the image to be searched and thereby the corresponding edge contour information to be searched.
6. The method of claim 1, wherein performing similarity matching of the template edge contour of the current layer against the edge contour to be searched of the current layer, based on the contour matching search information of the current layer, comprises:
sliding the template edge contours of the current layer, at their different scales and rotation angles, across the corresponding edge contour to be searched, based on the contour matching search information of the current layer;
computing, during the sliding traversal, the similarity between each scaled and rotated template edge contour and the edge contour to be searched;
and determining the contour matching result of the template edge contour within the edge contour to be searched at the current layer, based on the computed similarities.
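Since the contour information includes each point's horizontal and vertical gradients, one common similarity measure for this kind of sliding match is the normalized gradient dot product, as in edge-based template matching generally. A sketch under that assumption (the patent does not specify the exact score; `similarity` and `best_match` are hypothetical names):

```python
import numpy as np

def similarity(t_pts, t_grads, gx, gy, cx, cy):
    """Mean normalized gradient dot product of a template contour whose
    centroid is placed at (cy, cx) on the search-image gradient fields.
    t_pts: (dy, dx) offsets from the centroid; t_grads: (tgy, tgx) per point."""
    score, h, w = 0.0, gx.shape[0], gx.shape[1]
    for (dy, dx), (tgy, tgx) in zip(t_pts, t_grads):
        y, x = cy + dy, cx + dx
        if not (0 <= y < h and 0 <= x < w):
            continue
        num = tgx * gx[y, x] + tgy * gy[y, x]
        den = np.hypot(tgx, tgy) * np.hypot(gx[y, x], gy[y, x])
        if den > 0:
            score += num / den
    return score / max(len(t_pts), 1)

def best_match(t_pts, t_grads, gx, gy):
    """Slide the contour centroid over every position and keep the best score."""
    h, w = gx.shape
    best = (-1.0, None)
    for cy in range(h):
        for cx in range(w):
            s = similarity(t_pts, t_grads, gx, gy, cx, cy)
            if s > best[0]:
                best = (s, (cy, cx))
    return best

gx = np.zeros((5, 5)); gx[2, 2] = 1.0
gy = np.zeros((5, 5))
score, pos = best_match([(0, 0)], [(0.0, 1.0)], gx, gy)
```

The normalization by magnitude makes the score invariant to illumination changes, which is why this family of measures is standard for edge-gradient matching.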
7. The method of claim 6, wherein:
when the current layer is the pyramid top layer, the contour matching search information of the current layer takes the entire region of the edge contour to be searched as the region to be matched, and takes the initially configured scale range and angle range as the scale range to be matched and the angle range to be matched, respectively;
and when the current layer is not the pyramid top layer, the contour matching search information of the current layer is determined by mapping the contour matching result of the previous layer onto the edge contour to be searched of the current layer, according to a preset matching search mapping scheme.
8. The method of claim 7, wherein the matching search mapping scheme maps the contour matching result of an upper layer of the edge contour to be searched into the contour matching search information used for similarity matching at the layer below it.
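One plausible form of such a mapping scheme: coordinates double when descending one pyramid level, and the scale and angle ranges shrink to a neighborhood of the upper-layer result. The window and span values below are illustrative assumptions, not taken from the patent:

```python
def map_to_lower_layer(result, angle_span=10.0, scale_span=0.1, win=5):
    """Map an upper-layer match (cy, cx, scale, angle) into lower-layer
    contour matching search information. Illustrative only: the position
    doubles per level; the scale/angle ranges narrow around the result."""
    cy, cx, scale, angle = result
    return {
        "region": (2 * cy - win, 2 * cx - win, 2 * cy + win, 2 * cx + win),
        "scale_range": (scale - scale_span, scale + scale_span),
        "angle_range": (angle - angle_span, angle + angle_span),
    }

info = map_to_lower_layer((10, 20, 1.0, 30.0))
```

This is what makes the coarse-to-fine strategy fast: each lower (finer) layer searches only a small region, scale band, and angle band instead of the full space.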
9. The method of claim 1, further comprising:
when the next layer is determined to be the second-from-bottom pyramid layer, directly using the scale to be matched in the contour matching search information of the current layer as the scale to be matched in the contour matching search information of the next layer, thereby terminating the lower-layer scale mapping adjustment early.
10. The method of claim 1, wherein:
when the current layer is determined not to be the pyramid top layer, the angle range to be matched in the contour matching search information of the current layer is refined into a plurality of discrete angles to be matched, selected from the angle range with an angle step that varies with the pyramid layer number.
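A layer-dependent angle step might look like the following sketch; the doubling rule and `base_step` default are assumptions for illustration, since the patent only states that the step varies with the layer number:

```python
def angle_candidates(lo, hi, level, base_step=1.0):
    """Discrete angles to test at a given pyramid level. The step grows with
    the level, so coarse (higher) layers test fewer rotation angles."""
    step = base_step * (2 ** level)   # e.g. step doubles per level (assumed rule)
    n = int((hi - lo) / step) + 1
    return [lo + i * step for i in range(n)]

angs = angle_candidates(0.0, 360.0, level=3)   # 8-degree steps at level 3
```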
11. The method of claim 1, wherein, when the current layer is determined not to be the pyramid top layer, sliding the template edge contours of the current layer, at their different scales and rotation angles, across the corresponding edge contour to be searched comprises:
sliding the template edge contours of the current layer across the corresponding edge contour to be searched at intervals, according to the edge-contour-point interval number corresponding to the current layer.
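Interval sliding simply visits every `stride`-th centroid position instead of every pixel; a trivial sketch (the helper name and the idea that the stride equals the layer's interval number are assumptions):

```python
def traversal_positions(h, w, stride):
    """Centroid positions visited when sliding at an interval (stride)
    rather than exhaustively, cutting the number of evaluations by stride^2."""
    return [(y, x) for y in range(0, h, stride) for x in range(0, w, stride)]

pos = traversal_positions(10, 10, stride=2)
```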
12. An image matching apparatus, characterized in that the apparatus comprises:
a template information determining module, configured to determine a template edge contour information set of a template image, the set representing multi-scale, multi-rotation-angle template edge contours extracted from the template image at different pyramid layers;
an information-to-be-searched determining module, configured to determine an edge contour information set to be searched of an image to be searched, the set representing edge contours to be searched at different pyramid layers, extracted from the image to be searched according to the number of pyramid layers of the template image;
a contour similarity matching module, configured to, for each pyramid layer from top to bottom, perform similarity matching of the template edge contour of the current layer against the edge contour to be searched of the current layer based on the contour matching search information of the current layer, to obtain a contour matching result at the current layer;
a contour lower-layer matching mapping module, configured to determine, from the contour matching result of the current layer, the contour matching search information to be used when the template edge contour of the next layer is similarity-matched against the corresponding edge contour to be searched, until the pyramid bottom layer is reached;
and an image-to-be-searched identification module, configured to identify, in the image to be searched, the target shape indicated by the template image according to the contour matching result obtained when similarity matching at the pyramid bottom layer is completed;
wherein the contour matching search information comprises the region to be matched, the scale range to be matched, and the angle range to be matched used when performing similarity matching of the template edge contour; and the contour matching result comprises the centroid position, scale, and rotation angle of the template edge contour when it matches the edge contour to be searched.
13. An electronic device, comprising:
one or more processing devices; and
a storage device for storing one or more programs;
wherein the one or more programs, when executed by the one or more processing devices, cause the one or more processing devices to implement the image matching method of any one of claims 1-11.
14. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processing device, implements the image matching method of any one of claims 1-11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110209671.9A CN113159103B (en) | 2021-02-24 | 2021-02-24 | Image matching method, device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113159103A true CN113159103A (en) | 2021-07-23 |
CN113159103B CN113159103B (en) | 2023-12-05 |
Family
ID=76883883
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110209671.9A Active CN113159103B (en) | 2021-02-24 | 2021-02-24 | Image matching method, device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113159103B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113869441A (en) * | 2021-10-10 | 2021-12-31 | 青岛星科瑞升信息科技有限公司 | Multi-scale target positioning method based on template matching |
CN114792373A (en) * | 2022-04-24 | 2022-07-26 | 广东天太机器人有限公司 | Visual identification spraying method and system of industrial robot |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101872475A (en) * | 2009-04-22 | 2010-10-27 | 中国科学院自动化研究所 | Method for automatically registering scanned document images |
CN102073874A (en) * | 2010-12-29 | 2011-05-25 | 中国资源卫星应用中心 | Geometric constraint-attached spaceflight three-line-array charged coupled device (CCD) camera multi-image stereo matching method |
CN102654902A (en) * | 2012-01-16 | 2012-09-05 | 江南大学 | Contour vector feature-based embedded real-time image matching method |
CN105930858A (en) * | 2016-04-06 | 2016-09-07 | 吴晓军 | Fast high-precision geometric template matching method enabling rotation and scaling functions |
CN110378376A (en) * | 2019-06-12 | 2019-10-25 | 西安交通大学 | A kind of oil filler object recognition and detection method based on machine vision |
WO2021017361A1 (en) * | 2019-07-31 | 2021-02-04 | 苏州中科全象智能科技有限公司 | Template matching algorithm based on edge and gradient feature |
CN112396640A (en) * | 2020-11-11 | 2021-02-23 | 广东拓斯达科技股份有限公司 | Image registration method and device, electronic equipment and storage medium |
Non-Patent Citations (2)
Title |
---|
吴晓军; 邹广华: "High-performance template matching algorithm based on edge geometric features", Chinese Journal of Scientific Instrument (仪器仪表学报), no. 07 *
张石; 唐敏; 董建威: "Medical image registration based on wavelet pyramid and contour features", Computer Simulation (计算机仿真), no. 05 *
Also Published As
Publication number | Publication date |
---|---|
CN113159103B (en) | 2023-12-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7995055B1 (en) | Classifying objects in a scene | |
CN104915949B (en) | A kind of image matching method of combination point feature and line feature | |
WO2022179002A1 (en) | Image matching method and apparatus, electronic device, and storage medium | |
CN103218787B (en) | Multi-source heterogeneous remote sensing image reference mark automatic acquiring method | |
CN111598946B (en) | Object pose measuring method and device and storage medium | |
Marie et al. | The delta medial axis: a fast and robust algorithm for filtered skeleton extraction | |
CN110942515A (en) | Point cloud-based target object three-dimensional computer modeling method and target identification method | |
CN101650784B (en) | Method for matching images by utilizing structural context characteristics | |
CN113111212A (en) | Image matching method, device, equipment and storage medium | |
CN108388902B (en) | Composite 3D descriptor construction method combining global framework point and local SHOT characteristics | |
CN112396640A (en) | Image registration method and device, electronic equipment and storage medium | |
CN113838005B (en) | Intelligent identification and three-dimensional reconstruction method and system for rock mass fracture based on dimension conversion | |
CN114529837A (en) | Building outline extraction method, system, computer equipment and storage medium | |
CN113159103B (en) | Image matching method, device, electronic equipment and storage medium | |
CN111783722B (en) | Lane line extraction method of laser point cloud and electronic equipment | |
CN117292337A (en) | Remote sensing image target detection method | |
CN113111741A (en) | Assembly state identification method based on three-dimensional feature points | |
CN114283343B (en) | Map updating method, training method and device based on remote sensing satellite image | |
CN113420648B (en) | Target detection method and system with rotation adaptability | |
CN104268550A (en) | Feature extraction method and device | |
CN111552751B (en) | Three-dimensional landmark control point generation and application method, generation and application device | |
CN113435479A (en) | Feature point matching method and system based on regional feature expression constraint | |
Di Ruberto | Generalized hough transform for shape matching | |
CN117745780A (en) | Outdoor large scene 3D point cloud registration method based on isolated cluster removal | |
CN111695377A (en) | Text detection method and device and computer equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||