
CN109478326B - Image processing method, terminal equipment and computer storage medium - Google Patents


Info

Publication number
CN109478326B
CN109478326B (application CN201780028671.0A)
Authority
CN
China
Prior art keywords
pixel, image, pixels, contour, entropy
Prior art date
Legal status
Active
Application number
CN201780028671.0A
Other languages
Chinese (zh)
Other versions
CN109478326A (en)
Inventor
阳光
Current Assignee
Shenzhen A&E Intelligent Technology Institute Co Ltd
Original Assignee
Shenzhen A&E Intelligent Technology Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen A&E Intelligent Technology Institute Co Ltd
Publication of CN109478326A
Application granted
Publication of CN109478326B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis


Abstract

An image processing method, a terminal device and a computer storage medium, the method comprising: acquiring a first image and a second image (S11), wherein the first image and the second image have a corresponding relationship; performing edge calculation on the first image and the second image respectively to obtain a first contour line in the first image and a second contour line in the second image (S12); determining pixels in the second contour line corresponding to the pixels in the first contour line, so as to determine a correspondence relationship between the first contour line and the second contour line (S13); and using the correspondence relationship to determine pixels in the second image that correspond to pixels in the area of the first image outside the first contour line (S14). By implementing the method, the speed of image processing can be greatly improved.

Description

Image processing method, terminal equipment and computer storage medium
Technical Field
The present invention relates to the field of computer vision, and in particular, to an image processing method, a terminal device, and a computer storage medium.
Background
Binocular stereo vision, the identification of corresponding points on two images, is an important form of machine vision. Based on the parallax principle, it acquires two images of an object to be measured from different positions with imaging devices and computes the positional deviation between corresponding image points to obtain three-dimensional geometric information about the object. Binocular stereo vision is widely applied in robot navigation, precision industrial measurement, object recognition, virtual reality, scene reconstruction, surveying, and other fields.
Generally, when identifying correspondences between images, mutual information is computed between each pixel in one image and every pixel in the other image. This approach works, but the extremely large amount of computation greatly reduces the image processing speed.
Disclosure of Invention
The main technical problem solved by the present invention is to provide an image processing method, a terminal device, and a computer storage medium that can greatly improve the speed of image processing.
In order to solve the technical problems, the invention adopts the technical scheme that: provided is an image processing method including: acquiring a first image and a second image, wherein the first image and the second image have a corresponding relation; respectively carrying out edge calculation on the first image and the second image to determine a first contour line in the first image and a second contour line in the second image; determining pixels in the second contour line corresponding to pixels in the first contour line so as to determine the corresponding relation between the first contour line and the second contour line; the correspondence is used to determine pixels in the second image that correspond to pixels in areas of the first image that are outside the first contour.
In order to solve the above technical problem, the invention adopts the following technical scheme: there is provided a terminal device including a processor and a memory, wherein the memory stores computer operating instructions and data, and the processor executes the computer operating instructions to: acquire a first image and a second image from the memory, wherein the first image and the second image have a corresponding relationship; perform edge calculation on the first image and the second image respectively to determine a first contour line in the first image and a second contour line in the second image; determine pixels in the second contour line corresponding to pixels in the first contour line, so as to determine the correspondence between the first contour line and the second contour line; and use the correspondence to determine pixels in the second image that correspond to pixels in areas of the first image outside the first contour line.
In order to solve the technical problems, the invention adopts the technical scheme that: there is provided a computer storage medium having stored thereon program data executable to implement a method corresponding to the above.
The invention has the following beneficial effects. Unlike the prior art, during image processing the correspondence between the second contour line and the first contour line is determined first, and that correspondence is then used to determine the pixels in the second image corresponding to the pixels in the area of the first image outside the first contour line. By exploiting the correspondence between the first contour line and the second contour line, this scheme reduces the number of pixel points that must be compared when computing the corresponding pixels of the pixels outside the first contour line, thereby reducing the amount of computation and improving the image processing speed.
Drawings
FIG. 1 is a flow chart illustrating an embodiment of an image processing method according to the present invention;
FIG. 2 is a flowchart illustrating step S13 of one embodiment of the image processing method of the present invention;
FIG. 3 is a schematic diagram of an exemplary acquisition of epipolar lines in an embodiment of the image processing method of the present invention;
FIG. 4 is a flowchart illustrating step S14 of one embodiment of the image processing method of the present invention;
fig. 5 is a schematic diagram of an example of step S14 in an embodiment of the image processing method of the present invention;
FIG. 6 is a flowchart illustrating step S132 of one embodiment of the image processing method according to the present invention;
FIG. 7 is a flowchart illustrating step S132 of one embodiment of the image processing method according to the present invention;
FIG. 8 is a diagram illustrating an exemplary entropy calculation according to an embodiment of the image processing method of the present invention;
FIG. 9 is a flowchart illustrating step S132 of one embodiment of the image processing method according to the present invention;
FIG. 10 is a flowchart illustrating step S143 of one embodiment of the image processing method of the present invention;
FIG. 11 is a flowchart illustrating step S143 of an embodiment of the image processing method of the present invention;
FIG. 12 is a flowchart illustrating step S143 of one embodiment of the image processing method of the present invention;
FIG. 13 is a block diagram of a terminal device according to an embodiment of the present invention;
fig. 14 is a schematic diagram of a framework of another embodiment of the terminal device of the present invention.
Detailed Description
Referring to fig. 1, fig. 1 is a diagram illustrating an embodiment of an image processing method according to the present invention, including:
step S11: acquiring a first image and a second image;
it should be noted that there is a correspondence between the first image and the second image.
In one application scenario, the first image and the second image are two images respectively acquired by the two cameras of a binocular camera; the two images differ but have a corresponding relationship, and operations such as matching are performed on them based on that correspondence. In this case, the two images may be captured directly with a binocular camera.
Of course, in other application scenarios, the two images need not come from a binocular camera; they may be any two identical or similar images having a corresponding relationship. For example, a user may capture the same object from different angles with a terminal device that has a photographing function; note that the scenes in the two images must have an overlapping portion. In this case, the user may shoot the same object or scene from different angles with a camera or similar device to obtain a first image and a second image having a corresponding relationship, and then perform the image processing.
Step S12: respectively carrying out edge calculation on the first image and the second image to obtain a first contour line in the first image and a second contour line in the second image;
the method comprises the steps of carrying out edge calculation on an image, namely detecting the edge contour of an object in the image to obtain the contour line of the object in the image, and further dividing the image into different areas by utilizing the contour line. In this embodiment, the number of the first contour lines and the number of the second contour lines obtained may be one or more, and the area surrounded by the contour lines may be a closed area or an open area.
There are many types of edge detection methods, such as differential operator methods, template matching methods, wavelet detection methods, and neural network methods, and each type comprises different specific techniques. Edge detection based on differential operators is currently the most common approach, usually using first-order or second-order derivatives to detect edges. Differential operator methods include the Roberts, Sobel, Prewitt, Canny, Laplacian, and LoG (Laplacian of Gaussian) operators, and different operators can be selected according to the actual situation. The specific algorithm is not limited in the present invention.
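As an illustration of the differential operator approach, the sketch below applies the Sobel operator to a synthetic image. The image, threshold, and naive convolution loop are chosen only for this example (NumPy is assumed to be available); any of the operators named above could be substituted.

```python
import numpy as np

def sobel_edges(img, thresh):
    """Gradient-magnitude edge map using the Sobel differential operator."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Naive convolution over interior pixels, for clarity rather than speed.
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    return np.hypot(gx, gy) > thresh

img = np.zeros((32, 32))
img[8:24, 8:24] = 100.0          # bright square on a dark background
edges = sobel_edges(img, thresh=50.0)
# Edge pixels lie along the square's border, not in its flat interior.
print(edges[8, 16], edges[16, 16])  # True False
```

The boolean map plays the role of the contour lines of step S12: connected runs of True pixels partition each image into regions.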
Step S13: determining pixels in the second contour line corresponding to pixels in the first contour line so as to determine the corresponding relation between the first contour line and the second contour line;
in one application scenario, the first contour line is a line having a width of a first preset value, and the second contour line is a line having a width of a second preset value. The width of the first preset value is a width capable of covering a certain number of pixels, such as 5 pixels, 10 pixels, and the like, and is not particularly limited; likewise, the width is the second preset value.
The second preset value can be the same as or different from the first preset value; in an application scene, the second preset value is larger than the first preset value, so that when the corresponding relation between the first contour line and the second contour line is determined, pixels in a relatively larger range in the second image can be used for determining, and the accuracy of the corresponding relation is further improved.
It should be noted that, when determining the pixels in the second contour line corresponding to the pixels in the first contour line, any image correspondence identification technology may be used, and is not limited herein.
Step S14: the correspondence is used to determine pixels in the second image that correspond to pixels in areas of the first image that are outside the first contour.
In the image processing process, the corresponding relationship between the second contour line and the first contour line is determined, and then the pixels in the second image corresponding to the pixels in the region outside the first contour line in the first image are determined by using the corresponding relationship. According to the technical scheme, the corresponding relation between the first contour line and the second contour line is utilized, so that the number of pixel points needing to be compared when the corresponding pixels of the pixels except the first contour line are calculated is reduced, the calculation amount is reduced, and the image processing speed is improved.
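The four steps can be sketched end to end on a toy example. Everything here is a stand-in chosen for illustration, not taken from the patent: a rectified synthetic image pair with a known 3-pixel disparity, a trivial one-directional jump detector in place of the edge calculation of step S12, and nearest-expected-column selection in place of the mutual information comparison of step S13. Only the restriction of candidates to contour pixels on the same row (the epipolar line of a rectified pair) reflects the scheme described above.

```python
import numpy as np

def edges(img):
    """Toy edge map: horizontal intensity jumps (stand-in for step S12)."""
    e = np.zeros_like(img, dtype=bool)
    e[:, 1:] = np.abs(np.diff(img.astype(int), axis=1)) > 50
    return e

# Step S11: a rectified pair; the second image is the first shifted 3 px right.
img1 = np.zeros((16, 32), dtype=np.uint8)
img1[4:12, 8:20] = 200
img2 = np.roll(img1, 3, axis=1)

# Step S12: contour lines in both images.
c1, c2 = edges(img1), edges(img2)

# Steps S13/S14: for a contour pixel in image 1, candidates in image 2 are
# only the contour pixels on the same row, not all pixels of that row.
row, col = 6, 8                        # a pixel on the first contour line
candidates = np.flatnonzero(c2[row])   # a handful of columns out of 32
best = candidates[np.argmin(np.abs(candidates - col - 3))]
print(len(candidates), best)           # 2 11
```

Two candidates instead of thirty-two: this is the source of the speed-up the scheme claims, independent of which similarity measure is used.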
Referring to fig. 2, in one embodiment, step S13 includes: substep S131 and substep S132.
The substep S131, obtaining a first constraint line of a first pixel of the first contour line in the first image and a second constraint line corresponding to the first pixel in the second image, and obtaining an intersection point of the second constraint line and the second contour line;
in this embodiment, the constraint line may be an epipolar line, where the epipolar line refers to an intersection line of a polar plane and a plane where the two images are located in the epipolar geometry.
Referring to fig. 3, in the present embodiment, a first pixel J is selected in the first image, and the projection centers of the first image and the second image are known to be C1 and C2. The plane JC1C2 intersects the planes γ and σ in which the first image and the second image respectively lie; the intersection lines x and y are the epipolar lines of the first pixel J in the first image and the second image, i.e. the first constraint line and the second constraint line.
Once the first constraint line, the second constraint line, the first contour line, and the second contour line are obtained, the intersection of the second constraint line and the second contour line in the second image can easily be found.
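In epipolar geometry, the constraint line in the second image for a pixel p in the first image can be computed from the fundamental matrix F as l2 = F·p in homogeneous coordinates. The sketch below uses an illustrative F for a rectified (purely horizontal) stereo pair; a real F would come from camera calibration, which the patent does not specify.

```python
import numpy as np

def epipolar_line(F, p):
    """Epipolar line l2 = F @ p in the second image for a pixel p = (x, y)
    of the first image; returns (a, b, c) with a*x + b*y + c = 0."""
    x, y = p
    return F @ np.array([x, y, 1.0])

# Illustrative fundamental matrix of a rectified pair: corresponding
# points share the same image row.
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])

a, b, c = epipolar_line(F, (120.0, 45.0))
# The line is (0, -1, 45), i.e. y = 45 in the second image.
print(a, b, c)  # 0.0 -1.0 45.0
```

Intersecting this line with the second contour line yields the small line segment of candidate pixels discussed below.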
In the sub-step S132, mutual information is obtained for the first pixels and the pixels of the intersection one by one, and the second pixels in the second contour corresponding to the first pixels are determined in the second image, so as to determine the corresponding relationship between the first contour and the second contour.
It should be noted that, since the second contour line has a certain line width, the intersection of the second constraint line and the second contour line is in effect a small line segment of a certain length. Obtaining mutual information one by one for the first pixel and the pixels located at the intersection therefore means obtaining mutual information one by one for the first pixel and the pixels lying on this small line segment.
In this embodiment, the mutual information indicates a correlation degree between a certain pixel in the first image and a certain pixel in the second image, wherein the larger the mutual information is, the larger the correlation degree between the two pixels is.
Since the first pixel lies on the first contour line, it is necessarily located at the intersection of the first contour line and the first constraint line; by the correspondence, the second pixel in the second image corresponding to the first pixel must likewise lie on the small line segment formed by the intersection of the second contour line and the second constraint line. Therefore, mutual information only needs to be obtained between the first pixel and the pixels at this intersection, and the pixel in the second image yielding the maximum mutual information value is taken as the second pixel corresponding to the first pixel.
In this embodiment, when determining which pixels in the second image correspond to the pixels of the first contour line in the first image, the epipolar constraint means that mutual information need only be computed between the first pixel and the pixels covered by the small line segment where the second constraint line intersects the second contour line. This greatly reduces the amount of computation and further improves the image processing speed.
Referring to fig. 4, in one embodiment, step S14 includes: substep S141, substep S142 and substep S143.
The substep S141 is to obtain a third constraint line of a third pixel in a region outside the first contour line in the first image and a fourth constraint line corresponding to the third pixel in the second image, so as to obtain all first intersecting line segments obtained by intersecting the first contour line and the third constraint line and all second intersecting line segments obtained by intersecting the second contour line and the fourth constraint line;
the substep S142, obtaining a second intersecting line segment corresponding to the first intersecting line segment where the third pixel is located according to the corresponding relation, and obtaining an endpoint of the second intersecting line segment;
the corresponding relationship refers to a corresponding relationship between a first contour line in the first image and a second contour line in the second image. Since the corresponding relationship between the first contour line and the second contour line is determined, the second intersecting line segment in the second image corresponding to the first intersecting line segment where the first pixel is located in the first image can be determined according to the positional relationship between the first intersecting line segment and the first contour line and the positional relationship between the second intersecting line segment and the second contour line.
And a substep S143, obtaining mutual information one by one for the third pixel and the pixel of the corresponding second intersecting line segment from which the end point is removed, to obtain a fourth pixel corresponding to the third pixel in the second image, so as to obtain a pixel corresponding to a pixel in a region in the second image other than the first contour line in the first image.
Since the end points of the second intersecting line segments are located on the second contour line, the pixels on the second contour line correspond to the pixels on the first contour line in the first image according to the constraint principle, and therefore, when the pixels corresponding to the pixels in the region outside the first contour line are obtained, the pixels on the second contour line, that is, the pixels on the end points of the second intersecting line segments, do not need to be considered.
For example, referring to fig. 5, the first constraint line 212 of a first pixel M in the first image 21 and the corresponding second constraint line 222 in the second image 22 are obtained by an existing epipolar line calculation method. Then, according to steps S131 and S132 above, the corresponding pixels in the second image 22 of all pixels on the first contour line 211 in the first image 21 are found, yielding the contour line 223 in the second image 22 that corresponds to the first contour line 211 in the first image 21. The first intersecting line segments are K1G1, G1H1, H1I1, I1E1, E1F1, and F1L1; the second intersecting line segments are K2G2, G2H2, H2I2, I2E2, E2F2, and F2L2. Since the position of the first pixel M in the first image 21 has been determined to be on the first intersecting line segment E1F1 between the first contour lines 211a and 211b, the second pixel in the second image corresponding to M is correspondingly confined to the second intersecting line segment E2F2 between the contour lines 223a and 223b, which correspond to E1F1. Thus, when acquiring the second pixel corresponding to the first pixel M in the second image 22, mutual information only needs to be obtained between M and the pixels of the second intersecting line segment E2F2 with its end points removed, rather than between M and all pixels of E2F2.
In this way, when obtaining the pixel in the second image corresponding to a third pixel in the area outside the first contour line, the candidate pixels in the second image can be further restricted to the second intersecting line segment corresponding to the first intersecting line segment in which the third pixel is located, with the end points of that second intersecting line segment removed. This greatly reduces the amount of computation and noticeably improves the image processing speed.
Referring to fig. 6, in one embodiment, step S132 includes: substep S1321, substep S1322:
substep S1321, calculating a first edge entropy of the first pixel, a first edge entropy of a fifth pixel of the intersection point, and a first joint entropy of the first pixel and the fifth pixel by using all pixels in the first image and all pixels in the second image;
in the sub-step S1322, the mutual information between the first pixel and the fifth pixel is calculated according to the first edge entropy of the first pixel, the first edge entropy of the fifth pixel, and the first joint entropy of the first pixel and the fifth pixel, so as to obtain the second pixel in the second contour line in the second image corresponding to the first pixel, so as to obtain the corresponding relationship between the first contour line and the second contour line.
The mutual information of pixels x and y is computed as I(x, y) = H(x) + H(y) - H(x, y),
where I(x, y) is the mutual information of pixel x and pixel y, H(x) and H(y) denote the edge entropies of pixel x and pixel y respectively, and H(x, y) denotes the joint entropy of pixel x and pixel y. The entropy referred to here is information entropy: the greater the uncertainty of a variable, the greater its entropy.
In the present embodiment, when the edge entropy is calculated for the first pixel in the first image, for example, all pixels in the first image participate in the calculation; similarly, when the joint entropy is calculated for the first pixel and a certain pixel in the second image, all pixels in both the first image and the second image participate in the calculation.
According to the above method, after the mutual information is calculated for the pixels covered by the intersection of the first pixel in the first image and the second image, the pixel with the maximum mutual information value is the second pixel corresponding to the first pixel.
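A common way to estimate these entropies, sketched below, is from marginal and joint histograms of pixel-value samples; the bin count, value range, and random test patch are assumptions for the example, not values from the patent. The identity I(x, x) = H(x), which follows directly from the formula above since H(x, x) = H(x), gives a convenient check.

```python
import numpy as np

def entropy(values, bins=8, rng=(0, 256)):
    """Shannon entropy H of a sample of pixel values, via a histogram."""
    p, _ = np.histogram(values, bins=bins, range=rng)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(a, b, bins=8, rng=(0, 256)):
    """I(a, b) = H(a) + H(b) - H(a, b) over two equal-size pixel samples."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins,
                                 range=[rng, rng])
    pj = joint[joint > 0] / joint.sum()
    h_joint = -np.sum(pj * np.log2(pj))
    return entropy(a, bins, rng) + entropy(b, bins, rng) - h_joint

patch = np.random.default_rng(0).integers(0, 256, size=(7, 7))

# A sample compared with itself attains the upper bound I(x, x) = H(x).
print(np.isclose(mutual_information(patch, patch), entropy(patch)))  # True
```

In practice the samples a and b would be the pixel sets described in the embodiments (all pixels, contour pixels only, or contour pixels plus region means), and the candidate with the largest I is selected.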
Referring to fig. 7, in one embodiment, step S132 includes: substep S1323, substep S1324;
substep S1323: calculating a second edge entropy of the first pixel, a second edge entropy of a sixth pixel of the intersection point and a second joint entropy of the first pixel and the sixth pixel by using a first contour line pixel in the first image and a second contour line pixel in the second image;
substep S1324: and calculating the mutual information of the first pixel and the sixth pixel according to the second edge entropy of the first pixel, the second edge entropy of the sixth pixel and the second joint entropy of the first pixel and the sixth pixel to obtain a second pixel in a second contour line in the second image corresponding to the first pixel so as to obtain the corresponding relation between the first contour line and the second contour line.
It is easy to understand that the edge contour of an object refers to a part of the image where the local intensity changes most significantly, i.e. a region where the pixel values suddenly change, so that the pixel values in other regions except the contour line change very little and uniformly, and to some extent, can even be ignored.
In the present embodiment, when the edge entropy and the joint entropy are calculated in the course of obtaining mutual information, the regions enclosed by the contour lines are treated as blank; that is, the pixels in those regions are ignored. Referring to fig. 8, for example, the regions i, j, k enclosed by the first contour line in the first image 21 and the regions i', j', k' enclosed by the second contour line in the second image 22 may be regarded as blank. When the edge entropy of the first pixel N in the first image is obtained, only the pixels of the first contour line in the first image are used, and the pixels of the regions enclosed by the contour lines are not considered. Because those other regions contain a very large number of pixels, this greatly reduces the amount of computation for the entropy, indirectly reduces the amount of computation for the mutual information, and thus greatly increases the speed of image processing.
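The saving can be seen on a small synthetic example: restricting the entropy calculation to a boolean contour mask shrinks the sample from the full image to a thin ring of pixels. The square image and hand-built mask below are assumptions for illustration; in the method itself the mask would come from the edge calculation of step S12.

```python
import numpy as np

def entropy(values, bins=8, rng=(0, 256)):
    """Shannon entropy H of a sample of pixel values, via a histogram."""
    p, _ = np.histogram(values, bins=bins, range=rng)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

img = np.zeros((64, 64), dtype=np.uint8)
img[16:48, 16:48] = 200

# Boolean contour mask: the border ring of the bright square.
contour = np.zeros_like(img, dtype=bool)
contour[16:48, [16, 47]] = True
contour[[16, 47], 16:48] = True

# Entropy restricted to contour pixels: far fewer samples participate.
contour_values = img[contour]
h = entropy(contour_values)
print(contour_values.size, img.size)  # 124 4096
```

Here 124 contour samples replace 4096 image pixels in every entropy evaluation, which is the computational point of this embodiment.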
Referring to fig. 9, in one embodiment, step S132 includes: substep S1325, substep S1326, substep S1327;
substep S1325, obtaining respective mean values of pixels of the regions respectively enclosed by the first contour lines and the second contour lines in the first image and the second image, and setting all the pixels of each region respectively enclosed by the first contour lines and the second contour lines as single pixels respectively having the respective mean values;
substep S1326, calculating a third edge entropy of the first pixel, a third edge entropy of a seventh pixel of the intersection point, and a third joint entropy of the first pixel and the seventh pixel by using the first contour line pixels in the first image, the single pixels with respective mean values of the regions surrounded by all the first contour lines, the second contour line pixels in the second image, and the single pixels with respective mean values of the regions surrounded by all the second contour lines;
and a substep S1327 of calculating mutual information of the first pixel and the seventh pixel according to the third edge entropy of the first pixel, the third edge entropy of the seventh pixel, and the third joint entropy of the first pixel and the seventh pixel, and obtaining a second pixel in the second image corresponding to the first pixel to obtain a corresponding relationship between the first contour line and the second contour line.
The respective mean value of the pixels of each region enclosed by the first contour lines or the second contour lines is obtained by averaging the pixel values of all the pixels in that region.
A single pixel means that a region enclosed by the first contour line or the second contour line is treated as one pixel whose pixel value is the mean value of that enclosed region.
As in the above embodiment, the pixel values of the pixels in the other regions than the contour line vary very little and are uniform, and can be considered to be a plurality of pixels having the same pixel value to some extent.
Therefore, in the present embodiment, the pixels of each region enclosed by the contour lines are regarded as a single pixel, and only the contour-line pixels and these single pixels are used for the entropy calculations when obtaining mutual information, greatly reducing the amount of computation. Meanwhile, because the value of each single pixel is the mean of the pixel values of all pixels in the enclosed region, no large error is introduced into the mutual information calculation; that is, this embodiment greatly reduces the amount of computation and improves the image processing speed while preserving the accuracy of the mutual information calculation.
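The reduction to "single pixels" can be sketched as follows: the entropy sample keeps every contour pixel but collapses each enclosed region to one value, its mean. The 4x4 image, the ring-shaped contour, and the helper name are assumptions for illustration only.

```python
import numpy as np

def region_mean_samples(img, region_masks, contour_mask):
    """Sample for entropy calculation: all contour pixels, plus one
    'single pixel' per enclosed region holding that region's mean."""
    samples = list(img[contour_mask].astype(float))
    for mask in region_masks:
        samples.append(img[mask].mean())  # one single pixel per region
    return np.array(samples)

img = np.array([[10, 10, 10, 10],
                [10, 90, 92, 10],
                [10, 94, 96, 10],
                [10, 10, 10, 10]], dtype=float)

contour = np.ones((4, 4), dtype=bool)
contour[1:3, 1:3] = False            # the border ring is the "contour"
inner = ~contour                     # one enclosed region of 4 pixels

samples = region_mean_samples(img, [inner], contour)
# 12 contour pixels plus one single pixel of value (90+92+94+96)/4 = 93.
print(samples.size, samples[-1])  # 13 93.0
```

A histogram-based entropy, as in the earlier embodiments, would then be computed over `samples` instead of over all 16 pixels.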
Referring to fig. 10, in one embodiment, step S143 includes: substep S1431 and substep S1432:
substep S1431, calculating a fourth edge entropy of the third pixel, a fourth edge entropy of an eighth pixel of the corresponding second intersected line segment after removing the end point, and a fourth joint entropy of the third pixel and the eighth pixel by using all pixels in the first image and all pixels in the second image;
substep S1432, calculating the mutual information between the third pixel and the eighth pixel according to the fourth edge entropy of the third pixel, the fourth edge entropy of the eighth pixel, and the fourth joint entropy of the third pixel and the eighth pixel, to obtain a fourth pixel in the second image corresponding to the third pixel, so as to obtain a pixel in the second image corresponding to a pixel in a region outside the first contour line in the first image.
In this embodiment, the calculation method of mutual information, the relationship between the mutual information and the edge entropy and the joint entropy, and the beneficial effects of this embodiment are similar to those of the above embodiment, and details are please refer to the above embodiment, which is not described herein again.
Referring to fig. 11, in one embodiment, step S143 includes: substep S1433, substep S1434;
substep S1433: calculating a fifth edge entropy of a third pixel, a fifth edge entropy of a ninth pixel of a corresponding second intersected line segment after an end point is removed and a fifth joint entropy of the third pixel and the ninth pixel by using a first contour line pixel in the first image and a second contour line pixel in the second image;
sub-step S1434: and calculating the mutual information of the third pixel and the ninth pixel according to the fifth edge entropy of the third pixel, the fifth edge entropy of the ninth pixel and the fifth joint entropy of the third pixel and the ninth pixel to obtain a fourth pixel in the second image corresponding to the third pixel so as to obtain a pixel in the second image corresponding to a pixel in a region except the first contour line in the first image.
Referring to fig. 12, in one embodiment, step S143 includes: substep S1435, substep S1436, and substep S1437;
substep S1435, obtaining respective average values of pixels of the regions respectively enclosed by the first contour lines and the second contour lines in the first image and the second image, and setting all the pixels of each region respectively enclosed by the first contour lines and the second contour lines as single pixels respectively having the respective average values;
substep S1436, calculating a sixth edge entropy of the third pixel, a sixth edge entropy of a tenth pixel of a corresponding second intersecting line segment after removing the end point, and a sixth joint entropy of the third pixel and the tenth pixel by using the first contour line pixels in the first image, the single pixels with respective mean values of the regions surrounded by all the first contour lines, the second contour line pixels in the second image, and the single pixels with respective mean values of the regions surrounded by all the second contour lines;
substep S1437, calculating the mutual information between the third pixel and the tenth pixel according to the sixth edge entropy of the third pixel, the sixth edge entropy of the tenth pixel, and the sixth joint entropy of the third pixel and the tenth pixel, to obtain a fourth pixel in the second image corresponding to the third pixel, so as to obtain a pixel in the second image corresponding to a pixel in a region outside the first contour line in the first image.
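The region-flattening step described above (substep S1435) can be sketched as follows. The label-map representation of the contour-enclosed regions and all names are illustrative assumptions; the patent only specifies that every pixel of each enclosed region is replaced by that region's mean:

```python
import numpy as np

def flatten_regions(img, labels):
    """Replace the pixels of each contour-enclosed region with the
    region's mean value, so that each region behaves as a single pixel
    with that mean. `labels` is an integer map: 0 marks contour line /
    unenclosed pixels, and k > 0 marks the k-th enclosed region."""
    out = img.astype(float).copy()
    for k in np.unique(labels):
        if k == 0:
            continue                       # contour pixels keep their values
        mask = labels == k
        out[mask] = out[mask].mean()       # one mean per enclosed region
    return out
```

The flattened images would then feed the same entropy computations as before, with contour-line pixels unchanged and each region contributing only its mean intensity.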
Referring to fig. 13, a terminal device according to an embodiment of the present invention includes: a processor 31 and a memory 32, wherein the memory 32 is coupled to the processor 31.
The memory 32 stores computer operation instructions and data, and the processor 31 executes the computer operation instructions to: acquire a first image and a second image from the memory 32, the first image and the second image having a corresponding relationship; perform edge calculation on the first image and the second image respectively to determine a first contour line in the first image and a second contour line in the second image; determine pixels in the second contour line corresponding to pixels in the first contour line, so as to determine the correspondence between the first contour line and the second contour line; and use the correspondence to determine pixels in the second image corresponding to pixels in the region of the first image outside the first contour line.
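A minimal sketch of the edge calculation used to determine the contour lines, assuming a simple central-difference gradient with a magnitude threshold (the patent does not prescribe a particular edge operator, so this operator, the threshold, and the names are assumptions):

```python
import numpy as np

def contour_mask(img, thresh=50.0):
    """Crude edge calculation: central-difference gradient magnitude,
    thresholded to produce a boolean contour-line mask."""
    img = img.astype(float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0   # horizontal gradient
    gy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0   # vertical gradient
    return np.hypot(gx, gy) >= thresh                # True on contour pixels
```

In practice an operator such as Canny or Sobel with non-maximum suppression would give cleaner one-pixel-wide contours, but the thresholded gradient above shows the idea.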
In one embodiment, the processor 31 executes the computer operation instructions and is further configured to: acquire a first constraint line of a first pixel of the first contour line in the first image and a second constraint line corresponding to the first pixel in the second image, and obtain the intersection points of the second constraint line and the second contour line; and compute mutual information between the first pixel and the pixels of the intersection points one by one, and determine a second pixel in the second contour line of the second image corresponding to the first pixel, so as to determine the correspondence between the first contour line and the second contour line.
In an embodiment, the processor 31 using the correspondence to determine pixels in the second image corresponding to pixels in the region of the first image outside the first contour line includes: acquiring a third constraint line of a third pixel in a region outside the first contour line in the first image and a corresponding fourth constraint line in the second image, to obtain all first intersecting line segments formed by the first contour line intersecting the third constraint line and all second intersecting line segments formed by the second contour line intersecting the fourth constraint line; acquiring the second intersecting line segment corresponding to the first intersecting line segment where the third pixel is located according to the correspondence, and acquiring the end points of that second intersecting line segment; and computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersecting line segment with the end points removed, to obtain a fourth pixel in the second image corresponding to the third pixel, so as to obtain the pixels in the second image corresponding to the pixels in the region outside the first contour line in the first image.
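The matching along a constraint-line segment can be sketched as below: each candidate pixel of the corresponding second intersecting segment (end points already removed) is scored by mutual information between local windows, and the best-scoring candidate is taken as the fourth pixel. The window-based scoring, the row/column parametrization of the segment, and all names are illustrative assumptions:

```python
import numpy as np

def _mi(a, b, bins=8):
    # Mutual information of two equally sized windows via a joint histogram.
    j, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins,
                             range=[[0, 256], [0, 256]])
    p = j / j.sum()
    pa, pb = p.sum(axis=1), p.sum(axis=0)
    h = lambda q: float(-(q[q > 0] * np.log2(q[q > 0])).sum())
    return h(pa) + h(pb) - h(p)

def best_match_on_segment(img1, p, img2, row, cols, half=3, bins=8):
    """Score each candidate column `cols` on a second-image constraint-line
    segment (end points already removed) against the window around
    p = (r, c) in img1, and return the best-matching column."""
    r, c = p
    win1 = img1[r - half:r + half + 1, c - half:c + half + 1]
    scores = [_mi(win1,
                  img2[row - half:row + half + 1, cc - half:cc + half + 1],
                  bins)
              for cc in cols]
    return cols[int(np.argmax(scores))]
```

Restricting the candidates to one intersecting segment, rather than the whole constraint line, is what keeps the search consistent with the contour correspondence already established.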
In an embodiment, the processor 31 computing mutual information between the first pixel and the pixels of the intersection points one by one includes: calculating a first edge entropy of the first pixel, a first edge entropy of a fifth pixel of the intersection point, and a first joint entropy of the first pixel and the fifth pixel by using all pixels in the first image and all pixels in the second image; and calculating the mutual information of the first pixel and the fifth pixel according to the first edge entropy of the first pixel, the first edge entropy of the fifth pixel, and the first joint entropy of the first pixel and the fifth pixel.
In one embodiment, the processor 31 computing mutual information between the first pixel and the pixels of the intersection points one by one includes: calculating a second edge entropy of the first pixel, a second edge entropy of a sixth pixel of the intersection point, and a second joint entropy of the first pixel and the sixth pixel by using the first contour line pixels in the first image and the second contour line pixels in the second image; and calculating the mutual information of the first pixel and the sixth pixel according to the second edge entropy of the first pixel, the second edge entropy of the sixth pixel, and the second joint entropy of the first pixel and the sixth pixel.
In one embodiment, the processor 31 computing mutual information between the first pixel and the pixels of the intersection points one by one includes: respectively obtaining the average values of the pixels of the regions enclosed by the first contour lines and the second contour lines in the first image and the second image, and setting all the pixels of each region enclosed by the first contour lines and the second contour lines as single pixels having the respective average values; calculating a third edge entropy of the first pixel, a third edge entropy of a seventh pixel of the intersection point, and a third joint entropy of the first pixel and the seventh pixel by using the first contour line pixels in the first image, the single pixels with the respective mean values of the regions enclosed by all the first contour lines, the second contour line pixels in the second image, and the single pixels with the respective mean values of the regions enclosed by all the second contour lines; and calculating the mutual information of the first pixel and the seventh pixel according to the third edge entropy of the first pixel, the third edge entropy of the seventh pixel, and the third joint entropy of the first pixel and the seventh pixel.
In one embodiment, the processor 31 computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersecting line segment with the end points removed includes: calculating a fourth edge entropy of the third pixel, a fourth edge entropy of an eighth pixel of the corresponding second intersecting line segment with the end points removed, and a fourth joint entropy of the third pixel and the eighth pixel by using all pixels in the first image and all pixels in the second image; and calculating the mutual information of the third pixel and the eighth pixel according to the fourth edge entropy of the third pixel, the fourth edge entropy of the eighth pixel, and the fourth joint entropy of the third pixel and the eighth pixel.
In one embodiment, the processor 31 computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersecting line segment with the end points removed includes: calculating a fifth edge entropy of the third pixel, a fifth edge entropy of a ninth pixel of the corresponding second intersecting line segment with the end points removed, and a fifth joint entropy of the third pixel and the ninth pixel by using the first contour line pixels in the first image and the second contour line pixels in the second image; and calculating the mutual information of the third pixel and the ninth pixel according to the fifth edge entropy of the third pixel, the fifth edge entropy of the ninth pixel, and the fifth joint entropy of the third pixel and the ninth pixel.
In one embodiment, the processor 31 computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersecting line segment with the end points removed includes: respectively obtaining the average values of the pixels of the regions enclosed by the first contour lines and the second contour lines in the first image and the second image, and setting all the pixels of each region enclosed by the first contour lines and the second contour lines as single pixels having the respective average values; calculating a sixth edge entropy of the third pixel, a sixth edge entropy of a tenth pixel of the corresponding second intersecting line segment with the end points removed, and a sixth joint entropy of the third pixel and the tenth pixel by using the first contour line pixels in the first image, the single pixels with the respective mean values of the regions enclosed by all the first contour lines, the second contour line pixels in the second image, and the single pixels with the respective mean values of the regions enclosed by all the second contour lines; and calculating the mutual information of the third pixel and the tenth pixel according to the sixth edge entropy of the third pixel, the sixth edge entropy of the tenth pixel, and the sixth joint entropy of the third pixel and the tenth pixel.
Referring to fig. 14, in an embodiment, the terminal device further includes: a first camera 33 and a second camera 34; the first camera 33 and the second camera 34 are used for acquiring a first image and a second image, respectively, and storing the first image and the second image in the memory 32.
An embodiment of the computer storage medium of the present invention stores program data that can be executed to implement the method of any embodiment of the image processing method of the present invention.
The computer storage medium may be at least one of a floppy disk, a hard disk, a CD-ROM, a magneto-optical disk, a RAM used by a CPU, and the like.
For a detailed description of other related matters, please refer to the above method section, which is not described herein again.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (20)

1. An image processing method, comprising:
acquiring a first image and a second image, wherein the first image and the second image have a corresponding relation;
respectively carrying out edge calculation on the first image and the second image so as to determine a first contour line in the first image and a second contour line in the second image;
determining pixels in the second contour line corresponding to pixels in the first contour line to determine a corresponding relationship between the first contour line and the second contour line;
and utilizing the corresponding relation to determine pixels in the second image corresponding to pixels in the area outside the first contour line in the first image.
2. The method of claim 1, wherein determining pixels in the second contour corresponding to the pixels in the first contour to determine the correspondence between the first contour and the second contour comprises:
acquiring a first constraint line of a first pixel of the first contour line in the first image and a second constraint line corresponding to the first pixel in the second image, and obtaining an intersection point of the second constraint line and the second contour line;
and computing mutual information between the first pixel and the pixels of the intersection points one by one, and determining a second pixel in the second contour line of the second image corresponding to the first pixel, so as to determine the corresponding relationship between the first contour line and the second contour line.
3. The method of claim 1, wherein the utilizing the correspondence to determine pixels in the second image that correspond to pixels in an area of the first image outside of the first contour comprises:
acquiring a third constraint line of a third pixel in a region outside the first contour line in the first image and a corresponding fourth constraint line in the second image to acquire all first intersected line segments obtained by intersecting the first contour line and the third constraint line and all second intersected line segments obtained by intersecting the second contour line and the fourth constraint line;
acquiring the second intersected line segment corresponding to the first intersected line segment where the third pixel is located according to the corresponding relation, and acquiring an end point of the second intersected line segment;
and computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersected line segment with the end point removed, to obtain a fourth pixel in the second image corresponding to the third pixel, so as to obtain the pixels in the second image corresponding to the pixels in the region outside the first contour line in the first image.
4. The method of claim 2, wherein the computing mutual information between the first pixel and the pixels of the intersection points one by one comprises:
calculating a first edge entropy of the first pixel, a first edge entropy of a fifth pixel of the intersection point and a first joint entropy of the first pixel and the fifth pixel by using all pixels in the first image and all pixels in the second image;
and calculating mutual information of the first pixel and the fifth pixel according to the first edge entropy of the first pixel, the first edge entropy of the fifth pixel and the first joint entropy of the first pixel and the fifth pixel.
5. The method of claim 2, wherein the computing mutual information between the first pixel and the pixels of the intersection points one by one comprises:
calculating a second edge entropy of the first pixel, a second edge entropy of a sixth pixel of the intersection point and a second joint entropy of the first pixel and the sixth pixel by using the first contour line pixel in the first image and the second contour line pixel in the second image;
and calculating mutual information of the first pixel and the sixth pixel according to the second edge entropy of the first pixel, the second edge entropy of the sixth pixel and the second joint entropy of the first pixel and the sixth pixel.
6. The method of claim 2, wherein the computing mutual information between the first pixel and the pixels of the intersection points one by one comprises:
respectively obtaining respective average values of pixels of regions respectively enclosed by the first contour lines and the second contour lines in the first image and the second image, and setting all the pixels of each region respectively enclosed by the first contour lines and the second contour lines as single pixels respectively having the respective average values;
calculating a third edge entropy of the first pixel, a third edge entropy of a seventh pixel of the intersection point, and a third joint entropy of the first pixel and the seventh pixel by using the first contour line pixel in the first image, the single pixel with the respective mean value of the region surrounded by all the first contour lines, the second contour line pixel in the second image, and the single pixel with the respective mean value of the region surrounded by all the second contour lines;
and calculating mutual information of the first pixel and the seventh pixel according to the third edge entropy of the first pixel, the third edge entropy of the seventh pixel and the third joint entropy of the first pixel and the seventh pixel.
7. The method of claim 3, wherein the computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersected line segment with the end point removed comprises:
calculating a fourth edge entropy of the third pixel, a fourth edge entropy of an eighth pixel of the corresponding second intersected line segment after the endpoint is removed and a fourth joint entropy of the third pixel and the eighth pixel by using all pixels in the first image and all pixels in the second image;
and calculating mutual information of the third pixel and the eighth pixel according to a fourth edge entropy of the third pixel, a fourth edge entropy of the eighth pixel and a fourth joint entropy of the third pixel and the eighth pixel.
8. The method of claim 3, wherein the computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersected line segment with the end point removed comprises:
calculating a fifth edge entropy of the third pixel, a fifth edge entropy of a ninth pixel of the corresponding second intersecting line segment after the endpoint is removed, and a fifth joint entropy of the third pixel and the ninth pixel by using the first contour line pixel in the first image and the second contour line pixel in the second image;
and calculating mutual information of the third pixel and the ninth pixel according to a fifth edge entropy of the third pixel, a fifth edge entropy of the ninth pixel and a fifth joint entropy of the third pixel and the ninth pixel.
9. The method of claim 3, wherein the computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersected line segment with the end point removed comprises:
respectively obtaining respective average values of pixels of regions respectively enclosed by the first contour lines and the second contour lines in the first image and the second image, and setting all the pixels of each region respectively enclosed by the first contour lines and the second contour lines as single pixels respectively having the respective average values;
calculating a sixth edge entropy of the third pixel, a sixth edge entropy of a tenth pixel of the corresponding second intersecting line segment after the endpoint is removed, and a sixth joint entropy of the third pixel and the tenth pixel, by using the first contour line pixels in the first image, the single pixels with the respective mean values of the region surrounded by all the first contour lines, the second contour line pixels in the second image, and the single pixels with the respective mean values of the region surrounded by all the second contour lines;
and calculating mutual information of the third pixel and the tenth pixel according to a sixth edge entropy of the third pixel, a sixth edge entropy of the tenth pixel and a sixth joint entropy of the third pixel and the tenth pixel.
10. The image processing method according to any one of claims 1 to 9, wherein the first contour line is a line having a width of a first preset value, and the second contour line is a line having a width of a second preset value.
11. A terminal device, characterized in that the terminal device comprises: a processor and a memory coupled to the processor, wherein the memory stores computer operation instructions and data, and the processor executes the computer operation instructions to:
acquiring a first image and a second image from a memory, wherein the first image and the second image have a corresponding relation;
respectively carrying out edge calculation on the first image and the second image so as to determine a first contour line in the first image and a second contour line in the second image;
determining pixels in the second contour line corresponding to pixels in the first contour line to determine a corresponding relationship between the first contour line and the second contour line;
and utilizing the corresponding relation to determine pixels in the second image corresponding to pixels in the area outside the first contour line in the first image.
12. The terminal device of claim 11, wherein the processor determining pixels in the second contour line corresponding to the pixels in the first contour line to determine the correspondence between the first contour line and the second contour line comprises:
acquiring a first constraint line of a first pixel of the first contour line in the first image and a second constraint line corresponding to the first pixel in the second image, and obtaining an intersection point of the second constraint line and the second contour line;
and mutually obtaining information of the first pixels and the pixels of the intersection point one by one, and determining second pixels in the second contour line corresponding to the first pixels in the second image so as to determine the corresponding relation between the first contour line and the second contour line.
13. The terminal device of claim 11, wherein the processor utilizing the correspondence to determine pixels in the second image corresponding to pixels in the area of the first image outside the first contour line comprises:
acquiring a third constraint line of a third pixel in a region outside the first contour line in the first image and a corresponding fourth constraint line in the second image to acquire all first intersected line segments obtained by intersecting the first contour line and the third constraint line and all second intersected line segments obtained by intersecting the second contour line and the fourth constraint line;
acquiring the second intersected line segment corresponding to the first intersected line segment where the third pixel is located according to the corresponding relation, and acquiring an end point of the second intersected line segment;
and mutually obtaining information one by one for the third pixel and the pixel of the corresponding second intersecting line segment after the endpoint is removed to obtain a fourth pixel corresponding to the third pixel in the second image so as to obtain the pixel corresponding to the pixel in the region except the first contour line in the first image in the second image.
14. The terminal device of claim 12, wherein the processor computing mutual information between the first pixel and the pixels of the intersection points one by one comprises:
calculating a first edge entropy of the first pixel, a first edge entropy of a fifth pixel of the intersection point and a first joint entropy of the first pixel and the fifth pixel by using all pixels in the first image and all pixels in the second image;
and calculating mutual information of the first pixel and the fifth pixel according to the first edge entropy of the first pixel, the first edge entropy of the fifth pixel and the first joint entropy of the first pixel and the fifth pixel.
15. The terminal device of claim 12, wherein the processor computing mutual information between the first pixel and the pixels of the intersection points one by one comprises:
calculating a second edge entropy of the first pixel, a second edge entropy of a sixth pixel of the intersection point and a second joint entropy of the first pixel and the sixth pixel by using the first contour line pixel in the first image and the second contour line pixel in the second image;
and calculating mutual information of the first pixel and the sixth pixel according to the second edge entropy of the first pixel, the second edge entropy of the sixth pixel and the second joint entropy of the first pixel and the sixth pixel.
16. The terminal device of claim 12, wherein the processor computing mutual information between the first pixel and the pixels of the intersection points one by one comprises:
respectively obtaining respective average values of pixels of regions respectively enclosed by the first contour lines and the second contour lines in the first image and the second image, and setting all the pixels of each region respectively enclosed by the first contour lines and the second contour lines as single pixels respectively having the respective average values;
calculating a third edge entropy of the first pixel, a third edge entropy of a seventh pixel of the intersection point, and a third joint entropy of the first pixel and the seventh pixel by using the first contour line pixel in the first image, the single pixel with the respective mean value of the region surrounded by all the first contour lines, the second contour line pixel in the second image, and the single pixel with the respective mean value of the region surrounded by all the second contour lines;
and calculating mutual information of the first pixel and the seventh pixel according to the third edge entropy of the first pixel, the third edge entropy of the seventh pixel and the third joint entropy of the first pixel and the seventh pixel.
17. The terminal device of claim 13, wherein the processor computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersecting line segment with the end point removed comprises:
calculating a fourth edge entropy of the third pixel, a fourth edge entropy of an eighth pixel of the corresponding second intersected line segment after the endpoint is removed and a fourth joint entropy of the third pixel and the eighth pixel by using all pixels in the first image and all pixels in the second image;
and calculating mutual information of the third pixel and the eighth pixel according to the fourth edge entropy of the third pixel, the fourth edge entropy of the eighth pixel and the fourth joint entropy of the third pixel and the eighth pixel.
18. The terminal device of claim 13, wherein the processor computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersecting line segment with the end point removed comprises:
calculating a fifth edge entropy of the third pixel, a fifth edge entropy of a ninth pixel of the corresponding second intersecting line segment after the endpoint is removed, and a fifth joint entropy of the third pixel and the ninth pixel by using the first contour line pixel in the first image and the second contour line pixel in the second image;
and calculating mutual information of the third pixel and the ninth pixel according to a fifth edge entropy of the third pixel, a fifth edge entropy of the ninth pixel and a fifth joint entropy of the third pixel and the ninth pixel.
19. The terminal device of claim 13, wherein the processor computing mutual information, one by one, between the third pixel and the pixels of the corresponding second intersecting line segment with the end point removed comprises:
respectively obtaining respective average values of pixels of regions respectively enclosed by the first contour lines and the second contour lines in the first image and the second image, and setting all the pixels of each region respectively enclosed by the first contour lines and the second contour lines as single pixels respectively having the respective average values;
calculating a sixth edge entropy of the third pixel, a sixth edge entropy of a tenth pixel of the corresponding second intersecting line segment after the endpoint is removed, and a sixth joint entropy of the third pixel and the tenth pixel, by using the first contour line pixels in the first image, the single pixels with the respective mean values of the region surrounded by all the first contour lines, the second contour line pixels in the second image, and the single pixels with the respective mean values of the region surrounded by all the second contour lines;
and calculating mutual information of the third pixel and the tenth pixel according to a sixth edge entropy of the third pixel, a sixth edge entropy of the tenth pixel and a sixth joint entropy of the third pixel and the tenth pixel.
20. A computer storage medium, characterized in that the computer storage medium has stored therein program data executable to implement the method according to any one of claims 1-10.
CN201780028671.0A 2017-05-26 2017-05-26 Image processing method, terminal equipment and computer storage medium Active CN109478326B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/086100 WO2018214151A1 (en) 2017-05-26 2017-05-26 Image processing method, terminal device and computer storage medium

Publications (2)

Publication Number Publication Date
CN109478326A CN109478326A (en) 2019-03-15
CN109478326B true CN109478326B (en) 2021-11-05

Family

ID=64395197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780028671.0A Active CN109478326B (en) 2017-05-26 2017-05-26 Image processing method, terminal equipment and computer storage medium

Country Status (2)

Country Link
CN (1) CN109478326B (en)
WO (1) WO2018214151A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101770643A (en) * 2008-12-26 2010-07-07 富士胶片株式会社 Image processing apparatus, image processing method, and image processing program
CN103761708A (en) * 2013-12-30 2014-04-30 浙江大学 Image restoration method based on contour matching
CN104021568A (en) * 2014-06-25 2014-09-03 山东大学 Automatic registering method of visible lights and infrared images based on polygon approximation of contour
CN105957009A (en) * 2016-05-06 2016-09-21 安徽伟合电子科技有限公司 Image stitching method based on interpolation transition

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040101184A1 (en) * 2002-11-26 2004-05-27 Radhika Sivaramakrishna Automatic contouring of tissues in CT images
JP4490987B2 (en) * 2007-04-26 2010-06-30 株式会社東芝 High resolution device and method
CN101312539B (en) * 2008-07-03 2010-11-10 浙江大学 Hierarchical image depth extracting method for three-dimensional television
JP6015267B2 (en) * 2012-09-13 2016-10-26 オムロン株式会社 Image processing apparatus, image processing program, computer-readable recording medium recording the same, and image processing method


Also Published As

Publication number Publication date
CN109478326A (en) 2019-03-15
WO2018214151A1 (en) 2018-11-29

Similar Documents

Publication Publication Date Title
CN106558080B (en) Monocular camera external parameter online calibration method
CN109658454B (en) Pose information determination method, related device and storage medium
CN107507277B (en) Three-dimensional point cloud reconstruction method and device, server and readable storage medium
CN106408609A (en) Parallel mechanism end motion pose detection method based on binocular vision
CN108648194A (en) Based on the segmentation of CAD model Three-dimensional target recognition and pose measuring method and device
CN113393524B (en) Target pose estimation method combining deep learning and contour point cloud reconstruction
KR20170091496A (en) Method and apparatus for processing binocular image
CN107590444B (en) Method and device for detecting static obstacle and storage medium
Li et al. Co-planar parametrization for stereo-SLAM and visual-inertial odometry
CN107492107B (en) Object identification and reconstruction method based on plane and space information fusion
CN102750704A (en) Step-by-step video camera self-calibration method
CN111047634A (en) Scene depth determination method, device, equipment and storage medium
KR100792172B1 (en) Apparatus and method for estimating fundamental matrix using robust correspondence point
CN109478326B (en) Image processing method, terminal equipment and computer storage medium
US20230281862A1 (en) Sampling based self-supervised depth and pose estimation
KR102171203B1 (en) A method of matching a stereo image and an apparatus therefor
Hamzah et al. Development of stereo matching algorithm based on sum of absolute RGB color differences and gradient matching
CN105931231A (en) Stereo matching method based on full-connection random field combination energy minimization
CN112991372B (en) 2D-3D camera external parameter calibration method based on polygon matching
Lee et al. Visual odometry for absolute position estimation using template matching on known environment
Evans et al. Iterative roll angle estimation from dense disparity map
US9384417B1 (en) System and method for object dimension estimation
Ling et al. Probabilistic dense reconstruction from a moving camera
Kang et al. 3D urban reconstruction from wide area aerial surveillance video
Du et al. Optimization of stereo vision depth estimation using edge-based disparity map

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant