CN113902652B - Speckle image correction method, depth calculation method, device, medium, and apparatus - Google Patents
Speckle image correction method, depth calculation method, device, medium, and apparatus
- Publication number: CN113902652B
- Application number: CN202111502716.8A
- Authority
- CN
- China
- Prior art keywords
- speckle
- detection window
- image
- ellipse
- window
- Legal status: Active
Classifications
- G06T5/80: Image enhancement or restoration; geometric correction
- G06F17/16: Complex mathematical operations; matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
- G06F18/22: Pattern recognition; analysing; matching criteria, e.g. proximity measures
- G06T7/66: Image analysis; analysis of geometric attributes of image moments or centre of gravity
Abstract
A speckle image correction method, a depth calculation method, a device, a medium and an apparatus are provided. The speckle image correction method comprises: acquiring a speckle image and performing ellipse detection on it to obtain the ellipse parameters of each speckle in the speckle image; traversing each detection window in the speckle image and calculating, from the ellipse parameters of the speckles in the current detection window, a re-projection matrix corresponding to that window, wherein the detection window is the window used for matching between the speckle image and a reference image, and the re-projection matrix is the deformation matrix that the speckles undergo when converted from ellipses back into standard circles; and performing re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix, and correcting the current detection window into a new detection window according to the new coordinates obtained by the re-projection calculation. Because each window is corrected with its calculated re-projection matrix before matching, the image matching accuracy is improved.
Description
Technical Field
The invention relates to the technical field of image processing, and in particular to a speckle image correction method, a depth calculation method, and a corresponding device, medium and apparatus.
Background
The principle of a monocular speckle structured-light depth camera is similar to that of a binocular depth camera: a disparity is calculated and the depth value of each pixel is then obtained by triangulation to produce a depth map. A binocular camera relies on the texture information in the images taken by the left and right cameras for depth reconstruction, but it cannot do so in scenes where the texture is not apparent, such as solid-coloured objects. In a monocular speckle structured-light camera, fine, randomly positioned speckles are projected by a speckle projector, which in effect adds unique texture information to the environment. The density and randomness of the speckles ensure that several speckles fall within every window of the imaged picture and that each window's speckle distribution is unique across the whole picture, which greatly improves the accuracy and efficiency of window matching. Monocular speckle structured-light cameras are therefore receiving increasing attention in practical applications.
The monocular speckle structured-light depth camera performs window matching between the imaged picture and a reference image, i.e. it looks in the reference image for the window that matches a given window in the imaged picture. A window is usually N x N pixels, where N is a constant chosen according to the actual conditions. Once the matching window is found, the disparity of the window can be calculated from the relative position difference of the two windows, and the depth value can then be calculated by triangulation.
In a real environment, when the surface of the photographed object is at an angle to the calibration plane (i.e. not perpendicular to the projector), the speckle pattern projected onto the object surface is deformed, for example stretched or flattened. When the angle between the object plane and the calibration plane is small, the deformation is not obvious and hardly affects the accuracy of matching the object's window in the imaged picture against its matching window in the reference image. When the angle is large, the deformation is pronounced and clearly degrades that matching accuracy, leading to poor matches or no match at all, which in turn produces holes or outliers in the depth map.
Disclosure of Invention
In view of the above, it is necessary to provide a speckle image correction method, a depth calculation method, a device, a medium and an apparatus that solve the prior-art problem that speckle deformation in the captured image lowers the accuracy of matching against the reference image.
A speckle image correction method is applied to a monocular depth camera, and comprises the following steps:
acquiring a speckle image, and carrying out ellipse detection on the speckle image to obtain ellipse parameters of each speckle in the speckle image;
traversing each detection window in the speckle images, and calculating a re-projection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window, wherein the detection window is a window used for matching between the speckle images and a reference image, the re-projection matrix is a deformation matrix experienced by the speckles when the speckles are converted into a standard circle from an ellipse, and the reference image is an image obtained by shooting after speckle projection on a calibration plane which is a preset distance away from a shooting lens and is parallel to an imaging plane;
and carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix, and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation.
Further, the speckle image correction method, wherein the step of obtaining the speckle image and performing ellipse detection on the speckle image to obtain the ellipse parameters of each speckle in the speckle image comprises:
acquiring a speckle image, and performing gradient calculation on the speckle image to obtain a corresponding gradient image;
carrying out binarization processing on the gradient map to obtain a speckle contour image containing each speckle contour;
and carrying out ellipse detection on the speckle contour image to obtain ellipse parameters of all speckles in the speckle image.
Further, in the speckle image correction method, the step of performing ellipse detection on the speckle profile image to obtain an ellipse parameter of each speckle in the speckle image includes:
taking, in the speckle contour image, a calculation window of size Rmax * Rmax centred on the current pixel, where Rmax is the maximum speckle radius;
judging whether the current pixel is the center of the current speckle or not;
if yes, calculating the ellipse parameters of the current speckles according to the relative coordinates of each effective pixel in the calculation window, wherein the relative coordinates are coordinates relative to the current pixels, and the effective pixels are pixels on the speckle edges in the speckle contour image.
Further, in the speckle image correction method, the step of determining whether the current pixel is a speckle center includes:
determining relative coordinates of valid pixels in the calculation window;
judging whether the sum of the relative coordinates of each effective pixel is zero or not;
and if so, determining that the current pixel is the center of the speckle.
Further, in the speckle image correction method, the step of calculating the ellipse parameter of the current speckle according to the relative coordinates of each effective pixel in the calculation window includes:
substituting the relative coordinates of each pixel on the contour of the current speckle into the ellipse formula and performing a least-squares fit, wherein A, B and C are the ellipse parameters to be calculated;
computing the least-squares solution to obtain the ellipse parameters A, B and C of the current speckle, wherein (xi, yi) are the relative coordinates of each pixel and the ellipse formula is scaled by the average radius of all speckles in the reference image.
Further, the speckle image correction method described above, wherein the step of calculating the reprojection matrix corresponding to the current detection window according to the elliptical parameter of each speckle in the current detection window further includes:
respectively calculating corresponding deformation coefficients according to the elliptical parameters of the speckles in the current detection window;
calculating the average value of the deformation coefficients, and judging whether the average value exceeds a threshold range;
and if so, executing a step of calculating a re-projection matrix corresponding to the current detection window according to the elliptical parameters of the speckles in the current detection window.
Further, in the speckle image correction method, the step of calculating the reprojection matrix corresponding to the current detection window according to the elliptical parameters of the speckles in the current detection window includes:
averaging the elliptical parameters of all the speckles in the current detection window to obtain an elliptical parameter average value;
calculating a reprojection matrix corresponding to the current detection window according to the ellipse parameter mean values, wherein
the reprojection matrix F is determined by the rotation angle and the two scaling factors of the projection from the ellipse onto a standard circle:
the rotation angle is the angle through which the ellipse must be rotated when projected onto the standard circle, the two scaling factors are the multiples by which the x-axis and the y-axis of the ellipse are respectively scaled during that projection, and both the angle and the scaling factors are calculated from the mean values of the ellipse parameters.
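The matrix itself is rendered as an image in the published text. A consistent way of writing a reprojection matrix composed of a rotation by θ followed by axis scalings α and β is the sketch below; the ordering and sign conventions are an assumption, not the patent's exact formula.

```latex
F =
\begin{pmatrix} \alpha & 0 \\ 0 & \beta \end{pmatrix}
\begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}
=
\begin{pmatrix} \alpha\cos\theta & \alpha\sin\theta \\ -\beta\sin\theta & \beta\cos\theta \end{pmatrix}
```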
Further, the method for correcting a speckle image, wherein the step of performing ellipse detection on the speckle image further includes:
and carrying out distortion removal, filtering and normalization processing on the speckle images.
Further, the speckle image correction method described above, wherein the step of determining whether the average value exceeds a threshold range further includes:
carrying out ellipse detection on each speckle in a reference image to obtain an axial length parameter of each speckle in the reference image;
determining the maximum distortion coefficient and the minimum distortion coefficient of speckles in the reference image according to the obtained axial length parameter;
determining the threshold range according to the maximum distortion coefficient and the minimum distortion coefficient;
wherein the maximum distortion coefficient and the minimum distortion coefficient are calculated from the axial length parameters ai and bi, ai and bi being respectively the x-axis length and the y-axis length of the i-th ellipse in the reference image.
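The two expressions are images in the published text. Read together with the later description, in which the ratio of the x-axis length to the y-axis length of a reference speckle measures its distortion, an assumed but consistent form is:

```latex
d_{\max} = \max_i \frac{a_i}{b_i}, \qquad d_{\min} = \min_i \frac{a_i}{b_i}
```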
The invention also discloses a speckle image depth calculation method, which is applied to a monocular depth camera and comprises the following steps:
acquiring a speckle image, and carrying out ellipse detection on the speckle image to obtain ellipse parameters of each speckle in the speckle image;
traversing each detection window in the speckle images, and calculating a re-projection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window, wherein the detection window is a window used for matching between the speckle images and a reference image, the re-projection matrix is a deformation matrix experienced by the speckles when the speckles are converted into a standard circle from an ellipse, and the reference image is an image obtained by shooting after speckle projection on a calibration plane which is a preset distance away from a shooting lens and is parallel to an imaging plane;
carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix, and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation;
and matching the redetermined detection window in the speckle image with a calibration window in a reference image, and calculating a depth value according to the parallax of the detection window and the calibration window which are successfully matched.
The invention also discloses a speckle image correction device, which is applied to a monocular depth camera, and the speckle image correction device comprises:
the ellipse detection module is used for acquiring speckle images and carrying out ellipse detection on the speckle images so as to obtain ellipse parameters of all speckles in the speckle images;
the matrix calculation module is used for traversing each detection window in the speckle images, and calculating a reprojection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window, wherein the detection window is a window used for matching between the speckle images and a reference image, the reprojection matrix is a deformation matrix experienced by the speckles converted from an ellipse to a standard circle, and the reference image is an image obtained by projecting the speckles on a calibration plane which is a preset distance away from a shooting lens and is parallel to an imaging plane;
and the window calculation module is used for carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation.
Further, the speckle image correction device further includes:
the deformation coefficient calculation module is used for calculating corresponding deformation coefficients according to the ellipse parameters of the speckles in the current detection window;
and the judging module is used for calculating the average value of the deformation coefficient and judging whether the average value exceeds a threshold range, if so, the matrix calculating module executes the step of calculating a reprojection matrix corresponding to the current detection window according to the elliptical parameters of all the speckles in the current detection window.
Further, the speckle image correction device further includes:
and the preprocessing module is used for carrying out distortion removal, filtering and normalization processing on the speckle images.
The invention also discloses a speckle image depth calculating device, which is applied to a monocular depth camera, and the speckle image depth calculating device comprises:
the ellipse detection module is used for acquiring speckle images and carrying out ellipse detection on the speckle images so as to obtain ellipse parameters of all speckles in the speckle images;
the matrix calculation module is used for traversing each detection window in the speckle images, and calculating a reprojection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window, wherein the detection window is a window used for matching between the speckle images and a reference image, the reprojection matrix is a deformation matrix experienced by the speckles converted from an ellipse to a standard circle, and the reference image is an image obtained by projecting the speckles on a calibration plane which is a preset distance away from a shooting lens and is parallel to an imaging plane;
the window calculation module is used for carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation;
and the depth calculation module is used for matching the redetermined detection window in the speckle image with the calibration window in the reference image and calculating the depth value according to the parallax of the detection window and the calibration window which are successfully matched.
The invention also discloses an electronic device comprising a memory and a processor, wherein the memory stores a program, and the program realizes any one of the methods when being executed by the processor.
The invention also discloses a computer readable storage medium having a program stored thereon, which when executed by a processor implements any of the methods described above.
The invention estimates the ellipse parameters of the speckles in each window of the speckle image, calculates from these parameters the deformation matrix that maps the deformed window back to the window that would be observed on a surface perpendicular to the projector, corrects the window with this deformation matrix to obtain a corrected window, and matches the corrected window against the windows of the reference image. Matching accuracy is thereby improved, cases of missing or failed matches are reduced, and the depth-reconstruction quality of the depth camera is improved.
Drawings
FIG. 1 is a flow chart of a speckle image correction method according to a first embodiment of the present invention;
FIG. 2 is a diagram illustrating a reprojection effect according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a speckle image correction method according to a second embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating the determination of the centroid of speckles according to an embodiment of the present invention;
FIG. 5 is a flow chart of a speckle image depth calculation method according to a third embodiment of the present invention;
FIG. 6 is a block diagram of a speckle image correction apparatus according to a fourth embodiment of the present invention;
FIG. 7 is a block diagram of a speckle image depth calculating apparatus according to a fifth embodiment of the present invention
Fig. 8 is a schematic structural diagram of an electronic device in an embodiment of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
These and other aspects of embodiments of the invention will be apparent with reference to the following description and attached drawings. In the description and drawings, particular embodiments of the invention have been disclosed in detail as being indicative of some of the ways in which the principles of the embodiments of the invention may be practiced, but it is understood that the scope of the embodiments of the invention is not limited correspondingly. On the contrary, the embodiments of the invention include all changes, modifications and equivalents coming within the spirit and terms of the claims appended hereto.
In the image processing of the present invention, a pixel is usually processed together with the N x N neighbourhood centred on it; this neighbourhood is referred to as a window, and in the following description the window size is denoted N. The working principle of a monocular speckle structured-light camera is similar to that of a binocular depth camera: disparity is calculated by matching windows. A binocular depth camera matches by finding, in the left view, the window that matches a given window of the right view, or vice versa. A monocular speckle structured-light camera instead finds, in the reference image, the window that matches a given window of the speckle image. Windows are typically N x N pixels, i.e. of equal length and width, although the length and width of a window may also differ.
The windows contain texture information of the environment image; when a detection window of the speckle image and a window of the reference image contain the same texture information, the two windows are considered matched. Once a matching window is found in the reference image, the disparity of the window can be calculated from the relative position difference of the two windows, and the depth value is then calculated by triangulation.
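The disparity-to-depth step mentioned here is the standard triangulation relation. As a minimal sketch, with f the focal length in pixels, b the projector-to-camera baseline and d the disparity (these symbols are not named in the source and are introduced here only for illustration):

```latex
Z = \frac{f \cdot b}{d}
```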
The monocular speckle structure light camera comprises a speckle projector, and a speckle pattern emitted by the speckle projector is determined by a Diffraction Optical Element (DOE). Most DOEs design the speckle shape into a circle for process implementation reasons. In the embodiment, the circular speckle is also taken as an example to describe the embodiment. In order to ensure the effect achieved by the scheme, the selected or designed DOE element should ensure that the speckles are circular as much as possible, and the energy distribution of the speckles is concentrated. This is to enable the algorithm to extract the speckle edge profile more conveniently. The DOE optical device is reasonably designed, so that the distortion in the effective field range is small, and the speckle profile has good consistency.
When the surface of the object is not perpendicular to the speckle projector, the speckles on the object surface in the speckle image are deformed: they are stretched or compressed from their original circular shape into ellipses, and the major and minor axes of such an ellipse are generally not parallel to the x-axis and y-axis of the image, i.e. there is an included angle. The invention estimates the deformation matrix by calculating the ellipse parameters of the speckles, and thereby performs a reprojection correction of the window.
Referring to fig. 1, a speckle image correction method according to a first embodiment of the invention includes steps S11-S13.
Step S11, acquiring speckle images, and performing ellipse detection on the speckle images to obtain ellipse parameters of each speckle in the speckle images.
The speckle image is obtained by photographing an object while a speckle projector projects a speckle pattern onto it; the most direct application is the monocular speckle structured-light depth camera, i.e. the speckle image is the image of the object collected by such a camera.
Since depth-map reconstruction must run in real time and computational efficiency must be taken into account, a simplified, optimized form of ellipse detection is adopted in this embodiment. The common least-squares ellipse fit has 5 parameters. In this embodiment, speckle detection is performed with the current pixel as the centre, i.e. only ellipses centred on the current pixel need to be detected, so the centre offset is 0 and the ellipse formula can be simplified to one with only 3 parameters, A, B and C. This greatly reduces the amount of calculation and is simple and efficient.
And calculating the ellipse parameters of the speckle images by using the simplified ellipses to obtain the ellipse parameters of each speckle in the speckle images, wherein the ellipse parameters are A, B and C.
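The simplified formula itself is not typeset in this text. Given that the centre offset is zero and that the equation is later said to be scaled by the average speckle radius of the reference image, an assumed but consistent way to write it, with (x, y) the coordinates relative to the speckle centre and \bar{r} that average radius, is:

```latex
A x^2 + B x y + C y^2 = \bar{r}^2
```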
When ellipse detection is performed on the speckle image, speckle contour detection is first performed on the speckle image to obtain a speckle contour image, and ellipse detection is then performed on the speckle contour image to obtain the ellipse parameters of each speckle in the speckle image. There are many ellipse detection algorithms, the common ones including the Hough transform, optimization-based methods and arc-based methods.
Step S12, traversing each detection window in the speckle image, and calculating a reprojection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window.
The detection window is used for matching between the speckle image and the reference image. Each window is N x N pixels; the speckle image E(x, y) contains L windows, each containing k speckles. In particular, the reprojection calculation can be performed for every window, so that each window in the speckle image is corrected.
The reprojection matrix is the deformation matrix that the speckles undergo when projected from an ellipse back to a standard circle; that is, as shown in Fig. 2, the reprojection matrix projects the speckles from ellipses onto standard circles. Since re-projection is performed per detection window and a detection window contains k speckles, the average ellipse parameters of the k speckles in the window can be taken, to reduce the effect of calculation errors and noise, and used to calculate the reprojection matrix of that window, where
each average ellipse parameter is the arithmetic mean, over the current detection window, of Ai, Bi and Ci, the ellipse parameters of the i-th speckle in the window.
Specifically, the calculation process of the reprojection matrix is as follows:
it should be noted that, for the sake of calculation convenience, the elliptic formula is multiplied by the coefficientThe product isCan be the average radius of the speckle detected in the reference image, i.e.,Andthe length of the x-axis and the length of the y-axis of the ith ellipse in the reference image, respectively.
Projecting from the ellipse onto the standard circle requires first rotating by an angle θ and then scaling the x-axis by α and the y-axis by β; the corresponding transformation matrix, i.e. the reprojection matrix, is F.
Substituting the re-projected coordinates into the equation of the standard circle, expanding the expression and solving the resulting equations yields θ, α and β;
substituting these values into F gives the reprojection matrix F.
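The intermediate equations of this derivation are images in the source. One concrete way to obtain θ, α, β and F from the averaged parameters is to diagonalize the conic matrix of the mean ellipse; the sketch below (numpy) makes assumptions about conventions (the sign of θ, and which eigenvalue is assigned to α and which to β), so it is one consistent realisation rather than the patent's exact formulas.

```python
import numpy as np

def reprojection_matrix(A_mean, B_mean, C_mean):
    """Build a 2x2 reprojection matrix F mapping the mean ellipse
    A*x^2 + B*x*y + C*y^2 = r^2 onto the standard circle x'^2 + y'^2 = r^2.

    Any F with F^T F = M, where M = [[A, B/2], [B/2, C]], does this; the
    eigen-decomposition M = R^T diag(l1, l2) R gives
    F = diag(sqrt(l1), sqrt(l2)) @ R, i.e. a rotation followed by axis scalings.
    """
    M = np.array([[A_mean, B_mean / 2.0],
                  [B_mean / 2.0, C_mean]])
    eigvals, eigvecs = np.linalg.eigh(M)        # eigvecs: orthonormal columns
    if np.any(eigvals <= 0):
        raise ValueError("parameters do not describe an ellipse")
    R = eigvecs.T                               # rotation (up to a reflection)
    F = np.diag(np.sqrt(eigvals)) @ R
    theta = np.arctan2(R[0, 1], R[0, 0])        # rotation angle, one convention
    alpha, beta = np.sqrt(eigvals)              # axis scaling factors
    return F, theta, alpha, beta
```

Note that an undeformed speckle (A = C = 1, B = 0) gives M equal to the identity, so F is the identity and the window is left unchanged, consistent with the description above.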
And step S13, carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix, and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation.
For the speckle image E(x, y), the coordinates of each pixel in the current detection window are re-projected; these coordinates are understood to be relative coordinates with respect to the speckle centre. For each pixel coordinate in the current detection window, the new coordinate is calculated from the reprojection matrix, and a new detection window is generated from the new coordinates, the pixels of the new detection window being located at those new coordinates. It will be understood that, in the new detection window, the grey value of the pixel at each new coordinate can be the same as the grey value of the original pixel.
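As a small illustration of this step, the sketch below applies F to the relative pixel coordinates of a window and returns the corrected coordinates; the choice of the window centre as the reference point and the odd window size N are assumptions made only for the example.

```python
import numpy as np

def reproject_window(rel_coords, F):
    """Re-project relative pixel coordinates (x, y) of a detection window with F.

    rel_coords: (m, 2) array of coordinates relative to the reference point of
    the window. Each new coordinate keeps the grey value of its original pixel.
    """
    return np.asarray(rel_coords, dtype=float) @ F.T   # row form of (x', y') = F (x, y)

# Example: relative coordinates of an N x N window, N assumed odd
N = 11
ys, xs = np.mgrid[-(N // 2):N // 2 + 1, -(N // 2):N // 2 + 1]
rel = np.stack([xs.ravel(), ys.ravel()], axis=1)
# new_rel = reproject_window(rel, F)   # F from the reprojection-matrix sketch above
```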
Further, in order to improve the quality of the speckle images, in another embodiment of the present invention, before performing ellipse detection on the speckle images, the speckle images need to be subjected to distortion removal, filtering and normalization.
The undistortion processing corrects the whole image according to the optical distortion coefficients of the camera. Optical distortion is generally modelled with radial distortion (barrel and pincushion distortion) and tangential distortion. There are also many distortion-correction methods, such as the common Zhang Zhengyou calibration method using a checkerboard, which are not detailed here.
The normalization processing can effectively remove the influence of light reflected or absorbed by different surfaces in the environment and remove the background brightness, thereby more stably and reliably detecting the speckles. If the shot speckle image has obvious noise, in order to enable the speckle contour detection to be more stable, the image can be subjected to low-pass filtering processing and then normalized.
The method comprises the steps of estimating an ellipse parameter of speckles in each window in a speckle image, calculating a deformation matrix from the deformed window to the window with the surface perpendicular to a projector according to the ellipse parameter, correcting the window by using the deformation matrix to obtain a corrected window, and matching the corrected window with a window of a reference image. Therefore, the matching precision is improved, the condition that no matching exists or the matching fails is reduced, and the depth reconstruction effect of the depth camera is improved.
The monocular speckle structured-light camera is mainly based on the principle of searching the reference image for the matching window of every window of the captured speckle image, calculating the disparity from the window matches, and then calculating the depth values of the pixels where the windows are located. Therefore, in the speckle image correction method according to the second embodiment of the present invention, the reference image needs to be processed in advance before the captured imaging picture (i.e. the speckle image) is corrected.
The reference image is obtained by placing a large white board (the calibration plane) at a preset distance from the camera (generally within the working range, e.g. 1 m), parallel to the imaging plane of the camera, and photographing it after the speckles are projected onto it. The image is undistorted and normalized, and the processed image is stored as the reference image.
The reference image is first undistorted, using the generic radial and tangential distortion models, to obtain an undistorted image. Normalization is then performed on the undistorted image using a mean-removal normalization method: for each pixel of the image, the mean and the standard deviation of the N x N neighbourhood window centred on it are calculated, and the normalized value of the pixel is obtained from them. This yields the normalized image, i.e. the reference image used for matching with the speckle pattern.
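A minimal sketch of this local mean-removal normalization is given below; the exact expression is not reproduced in the text, so the usual (value minus local mean) divided by local standard deviation form is an assumption.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mean_removal_normalize(img, N=11, eps=1e-6):
    """Mean-removal normalization over an N x N neighbourhood, as described above."""
    img = img.astype(np.float64)
    mean = uniform_filter(img, size=N)                     # local mean
    sq_mean = uniform_filter(img * img, size=N)            # local mean of squares
    std = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))  # local standard deviation
    return (img - mean) / (std + eps)
```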
Further, if there is significant noise in the reference image, the image can first be low-pass filtered and then normalized, so that the speckle contour detection is more stable.
Finally, speckles are detected in the reference image; in this embodiment, speckle detection uses ellipse detection. There are many ellipse detection algorithms, the common ones including the Hough transform, optimization-based methods and arc-based methods. Because the reference image only needs to be processed once in advance and not during actual operation, an arc-based method can be used to obtain higher accuracy, and several iterations may even be run to reach still higher detection accuracy. For each detected ellipse, 3 parameters need to be saved: ai, the x-axis length of the ellipse; bi, the y-axis length of the ellipse; and their ratio, which can be used to evaluate the degree of distortion of the speckle circle. Since the calibration plane is perpendicular to the optical axis of the projector, the major and minor axes of the ellipse should coincide with the coordinate axes here. The ratio is 1 in the undistorted case, but it deviates from 1 owing to factors such as machining errors of the DOE, imaging errors and calculation errors of the detection algorithm.
The maximum and minimum of this ratio over all reference speckles describe the error range of speckle stretching and compression distortion.
Specifically, referring to fig. 3, the speckle image correction method in the second embodiment of the present invention includes steps S21 to S27.
And step S21, acquiring the speckle images, and performing gradient calculation on the speckle images to obtain corresponding gradient images.
An image of the environment is captured, and the captured image is undistorted, low-pass filtered and normalized to obtain the final speckle image E(x, y). Ellipse detection is performed on the processed speckle image E(x, y) by first calculating its gradient map G(x, y). Each pixel of the gradient map G(x, y) is expressed in terms of the grey-level gradients Gx(x, y) and Gy(x, y),
where Gx(x, y) denotes the grey-level gradient of the speckle image E(x, y) in the x-axis direction at pixel position (x, y), Gy(x, y) denotes the grey-level gradient in the y-axis direction at that position, and E(x, y) denotes the grey value of the speckle image at pixel (x, y).
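The expression combining the two gradients is an image in the source; the sketch below uses central differences and the Euclidean magnitude, which is an assumption about that missing expression.

```python
import numpy as np

def gradient_map(E):
    """Grey-level gradient map G(x, y) of the speckle image E(x, y)."""
    E = E.astype(np.float64)
    Gx = np.zeros_like(E)
    Gy = np.zeros_like(E)
    Gx[:, 1:-1] = (E[:, 2:] - E[:, :-2]) / 2.0   # central difference along x
    Gy[1:-1, :] = (E[2:, :] - E[:-2, :]) / 2.0   # central difference along y
    return np.hypot(Gx, Gy)                      # assumed magnitude combination
```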
And step S22, performing binarization processing on the gradient map to obtain a speckle profile image containing each speckle profile.
And step S23, carrying out ellipse detection on the speckle contour image to obtain ellipse parameters of each speckle in the speckle image.
The speckle contour image P(x, y) is obtained by binarizing the gradient map G(x, y) with a threshold; P(x, y) is the image of the contours of the speckles in the speckle image E(x, y). Because the speckle image has already been filtered and normalized, a global threshold is used for the binarization. In the binarized speckle contour map, every pixel value is either 0 or 1.
And carrying out ellipse detection according to the binarized speckle profile image to obtain an ellipse coefficient of the speckles so as to obtain each speckle and speckle parameter on the speckle image.
Specifically, the step of performing ellipse detection on the speckle profile image to obtain an ellipse parameter of each speckle includes:
taking, in the speckle contour image, a calculation window of size Rmax * Rmax centred on the current pixel, where Rmax is the maximum speckle radius;
judging whether the current pixel is the center of the current speckle or not;
if yes, calculating the ellipse parameters of the current speckles according to the relative coordinates of each effective pixel in the calculation window, wherein the relative coordinates are coordinates relative to the current pixels.
In this embodiment, a simplified ellipse formula is used during ellipse detection; it describes an ellipse centred on the current pixel. It is therefore necessary first to detect whether the current pixel is the centre of a speckle contour. To judge this, it suffices to take the coordinates, relative to the current pixel, of the contour points around it, accumulate them to obtain the centroid (which is the centre of the speckle contour), and then check whether the centroid coincides with the current pixel.
Take a pixel (x, y) of the binarized speckle contour image P(x, y) as the current pixel, and take the Rmax * Rmax window centred on it as the calculation window. Rmax is the maximum possible radius of a speckle, so the calculation window can be considered to contain at most one complete speckle; in the implementation, Rmax is taken as the maximum radius of the speckles detected in the reference image. The m valid pixels in the calculation window (pixels with value 1) have relative coordinates (xi, yi), i = 1, ..., m, with respect to the pixel (x, y).
According to the definition of the centroid, the centroid S, i.e. the centre of the speckle contour, has coordinates relative to the current pixel given by the mean of the relative coordinates (xi, yi) of the valid pixels. The simplified ellipse formula can only describe a contour centred on the current pixel, so it is first determined whether the current pixel is the centre of the ellipse.
As shown in Fig. 4, when the sum of the relative coordinates is zero, the speckle contour is complete and has the current pixel as its centroid, i.e. the (x, y) pixel is the speckle centre, and ellipse detection is carried out; when the sum is not zero, the contour is either incomplete or not centred on the current pixel, i.e. the (x, y) pixel is not the speckle centre, and ellipse detection is not needed, which avoids unnecessary calculation.
When the (x, y) pixel is the speckle centre, the ellipse coefficients of that speckle are calculated by substituting the relative coordinates of the contour pixels into the simplified ellipse formula and solving the least-squares fit.
This yields the ellipse parameters A, B and C of the current speckle; proceeding in this way, the ellipse parameters A, B and C of every speckle in the speckle image E(x, y) are obtained.
It should be noted that the ellipse equation above is multiplied by a coefficient related to the average speckle radius of the reference image in order to facilitate the later calculations.
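A compact sketch of this centroid test and simplified fit is given below. R_MAX, R_BAR and the model A*dx^2 + B*dx*dy + C*dy^2 = R_BAR^2 are assumptions introduced for the example; the patent's exact least-squares expressions are not reproduced in the text, and image-border handling is omitted.

```python
import numpy as np

R_MAX = 7     # assumed maximum speckle radius (pixels), taken from the reference image
R_BAR = 3.0   # assumed average speckle radius of the reference image

def fit_speckle_ellipse(P, x, y, r_max=R_MAX, r_bar=R_BAR):
    """Centroid test and simplified 3-parameter ellipse fit at pixel (x, y).

    P is the binarized contour image (values 0/1). Returns (A, B, C) when (x, y)
    is the centre of a complete speckle contour, otherwise None.
    """
    h = r_max // 2
    win = P[y - h:y + h + 1, x - h:x + h + 1]
    ys, xs = np.nonzero(win)                 # valid contour pixels in the window
    if len(xs) < 3:
        return None
    dx, dy = xs - h, ys - h                  # coordinates relative to (x, y)
    if dx.sum() != 0 or dy.sum() != 0:       # centroid does not coincide with (x, y)
        return None
    D = np.stack([dx * dx, dx * dy, dy * dy], axis=1).astype(float)
    rhs = np.full(len(dx), r_bar ** 2)
    (A, B, C), *_ = np.linalg.lstsq(D, rhs, rcond=None)   # least-squares fit
    return A, B, C
```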
Step S24, traversing each detection window in the speckle image, and calculating a corresponding deformation coefficient according to the ellipse parameter of each speckle in the current detection window. The detection window is used for matching the speckle image and the reference image.
Step S25, calculating an average value of the deformation coefficients, and determining whether the average value exceeds a threshold range.
And step S26, if yes, calculating a reprojection matrix corresponding to the current detection window according to the ellipse parameters of the speckles in the current detection window, wherein the reprojection matrix is a deformation matrix experienced by the speckles converted from ellipses to standard circles.
The deformation coefficient of the current speckle in the current detection window is calculated from its ellipse parameters; it characterizes the degree to which the speckle is stretched or compressed relative to a standard circle.
and judging the deformation degree of the window according to the average deformation coefficient of the speckles of each detection window on the speckle image, and determining whether the window needs to be re-projected.
Each detection window of the speckle image E(x, y) contains k speckles, and the average deformation coefficient of the current detection window is the mean of the deformation coefficients of these k speckles.
When the average deformation coefficient of the current detection window exceeds the threshold range, the current detection window is significantly distorted and reprojection correction is needed. It should be noted that the threshold range is determined from the maximum and minimum distortion coefficients defined on the reference image. In a specific implementation:
when the average deformation coefficient lies within the error range of the reference image (between the minimum and maximum distortion coefficients), the window is considered undeformed and the reprojection calculation is not needed;
when it lies only slightly outside that range, within bounds widened by the constants c1 and c2, the window has a certain deformation, but the effect of the deformation coefficient is within the allowable range and the reprojection calculation is still not needed;
when it falls outside even those widened bounds, the window is considered to be obviously distorted and the reprojection calculation is needed.
c1 is a constant less than 1.0 and c2 is a constant greater than 1.0; their values can be tuned to the actual conditions, for example c1 = 0.7 and c2 = 1.3.
It should be noted that in other embodiments of the present invention, the threshold range may also be an empirical value.
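The three conditions above are images in the source; the sketch below encodes one assumed reading of them, with the deformation coefficients treated as axis-ratio-like quantities compared against the reference-image bounds.

```python
def needs_reprojection(coeffs, d_min, d_max, c1=0.7, c2=1.3):
    """Decide whether a detection window needs reprojection correction.

    coeffs: deformation coefficients of the k speckles in the window (their exact
    formula is not reproduced in the text, so an axis-ratio-like measure is assumed).
    d_min, d_max: minimum / maximum distortion coefficients of the reference image.
    """
    rho = sum(coeffs) / len(coeffs)          # average deformation coefficient
    if d_min <= rho <= d_max:
        return False                         # within the reference error range
    if c1 * d_min <= rho <= c2 * d_max:
        return False                         # deformed, but within the allowed range
    return True                              # obvious distortion: reproject
```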
And step S27, carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix, and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation.
To reduce the influence of calculation errors and noise, the average ellipse parameters of the k speckles in the current detection window are taken to calculate the reprojection matrix of the current detection window, each average being the arithmetic mean of the corresponding parameter over the k speckles.
The specific calculation of the reprojection matrix can follow the first embodiment and is not repeated here.
Since the reference image is taken with the entire plane perpendicular to the projector, the speckles on the reference image are very close to ideal circles. In a working environment there are many objects, the angles between their surfaces and the projector differ, and some surfaces deviate severely from being perpendicular to the projector, so the speckles on those surfaces in the speckle image are deformed into ellipses, which affects the matching of speckle-image windows against reference-image windows. The method estimates the ellipse parameters of the speckles in each window of the speckle image, calculates from these parameters the deformation matrix that maps the deformed window back to the window that would be observed on a surface perpendicular to the projector, corrects the window with this deformation matrix to obtain a corrected window, and matches the corrected window against the windows of the reference image. Matching accuracy is thereby improved, cases of missing or failed matches are reduced, and the depth-reconstruction quality of the depth camera is improved.
Referring to fig. 5, the speckle image depth calculating method according to the third embodiment of the invention includes steps S31 to S34.
Step S31, acquiring speckle images, and carrying out ellipse detection on the speckle images to obtain ellipse parameters of each speckle in the speckle images;
step S32, traversing each detection window in the speckle image, and calculating a reprojection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window. The detection window is used for matching between the speckle image and the reference image, and the re-projection matrix is a deformation matrix which is experienced by the speckle image converted from an ellipse into a standard circle;
step S33, carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix, and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation;
step S34, matching the redetermined detection window in the speckle image with the calibration window in the reference image, and calculating a depth value according to the parallax of the detection window and the calibration window which are successfully matched.
In this embodiment, the re-projection calculation of the detection windows is performed on the collected speckle images of the object, so that each detection window is corrected. The method for correcting the detection window in the speckle pattern can refer to the method in the first embodiment or the second embodiment, which is not repeated herein.
And after the window with the size of N x N in the reference image is defined as a calibration window, matching the speckle image corrected by the reprojection matrix with the reference image, determining the calibration window matched with each detection window, calculating parallax, and then calculating the depth value corresponding to the pixel where the detection window of the speckle pattern is located. The window matching in this embodiment is the same as the window matching of ordinary binocular vision or the window matching principle of monocular speckle structured light, and the common matching algorithm includes a mean value removal normalized correlation coefficient, Census transformation, a cost calculation method, and the like. And will not be described in detail here. It should be noted that the new corrected window may no longer be square, and thus cannot correspond to each pixel of the window of the reference image one-to-one, and only the matching calculation needs to be performed on the overlapping portion of the two windows where corresponding pixels exist.
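As an illustration of matching only over the overlapping portion where corresponding pixels exist, the sketch below computes a zero-mean normalized cross-correlation (one of the common costs named above) restricted to a validity mask; it is a generic sketch, not the patent's exact matching cost.

```python
import numpy as np

def zncc_overlap(patch_a, patch_b, valid):
    """Zero-mean normalized cross-correlation over the overlapping valid pixels.

    After reprojection the corrected window is generally no longer square, so only
    the pixels where both windows have data (boolean mask `valid`) enter the score.
    """
    a = patch_a[valid].astype(np.float64)
    b = patch_b[valid].astype(np.float64)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```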
Referring to fig. 6, a speckle image correction apparatus applied to a monocular depth camera in a fourth embodiment of the present invention includes:
the ellipse detection module 41 is configured to acquire a speckle image and perform ellipse detection on the speckle image to obtain an ellipse parameter of each speckle in the speckle image;
a matrix calculation module 42, configured to traverse each detection window in the speckle image, and calculate a re-projection matrix corresponding to the current detection window according to an ellipse parameter of each speckle in the current detection window, where the detection window is a window used for matching between a speckle image and a reference image, the re-projection matrix is a deformation matrix experienced by the speckle image transformed from an ellipse to a standard circle, and the reference image is an image captured after performing speckle projection on a calibration plane that is parallel to an imaging plane and is a preset distance from a capture lens;
and the window calculation module 43 is configured to perform reprojection calculation on the coordinates of each pixel in the current detection window according to the reprojection matrix, and correct the current detection window to a new detection window according to new coordinates obtained through the reprojection calculation.
Further, the speckle image correction device further includes:
the deformation coefficient calculation module is used for calculating corresponding deformation coefficients according to the ellipse parameters of the speckles in the current detection window;
and the judging module is used for calculating the average value of the deformation coefficient and judging whether the average value exceeds a threshold range, if so, the matrix calculating module executes the step of calculating a reprojection matrix corresponding to the current detection window according to the elliptical parameters of all the speckles in the current detection window.
Further, the speckle image correction device further includes:
and the preprocessing module is used for carrying out distortion removal, filtering and normalization processing on the speckle images.
The implementation principle and the generated technical effect of the speckle image correction device provided by the embodiment of the invention are the same as those of the speckle image correction method embodiment, and for the sake of brief description, corresponding contents in the method embodiment can be referred to where the embodiment of the device is not mentioned.
Referring to fig. 7, a speckle image depth calculating apparatus according to a fifth embodiment of the present invention is applied to a monocular depth camera, and the apparatus includes:
the ellipse detection module 51 is configured to acquire a speckle image and perform ellipse detection on the speckle image to obtain an ellipse parameter of each speckle in the speckle image;
the matrix calculation module 52 is configured to traverse each detection window in the speckle images, and calculate a re-projection matrix corresponding to the current detection window according to an ellipse parameter of each speckle in the current detection window, where the detection window is a window used for matching between a speckle image and a reference image, the re-projection matrix is a deformation matrix experienced by the speckle image transformed from an ellipse to a standard circle, and the reference image is an image captured after performing speckle projection on a calibration plane that is a preset distance away from a capturing lens and is parallel to an imaging plane;
the window calculation module 53 is configured to perform reprojection calculation on the coordinates of each pixel in the current detection window according to the reprojection matrix, and correct the current detection window to a new detection window according to new coordinates obtained through the reprojection calculation;
and a depth calculating module 54, configured to match the redetermined detection window in the speckle image with a calibration window in a reference image, and calculate a depth value according to a parallax between the successfully matched detection window and calibration window.
The implementation principle and the generated technical effect of the speckle image depth calculating device provided by the embodiment of the invention are the same as those of the speckle image depth calculating method embodiment, and for the sake of brief description, corresponding contents in the method embodiment can be referred to where the embodiment of the device is not mentioned.
In another aspect of the present invention, an electronic device is provided, please refer to fig. 8, which includes a processor 10, a memory 20, and a computer program 30 stored in the memory and executable on the processor, wherein the processor 10 executes the computer program 30 to implement the method as described above.
The electronic device may be, but is not limited to, a computer, a server, and the like. Processor 10 may be, in some embodiments, a Central Processing Unit (CPU), controller, microcontroller, microprocessor or other data Processing chip that executes program code stored in memory 20 or processes data.
The memory 20 includes at least one type of readable storage medium, which includes a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a magnetic memory, a magnetic disk, an optical disk, and the like. The memory 20 may in some embodiments be an internal storage unit of the electronic device, for example a hard disk of the electronic device. The memory 20 may also be an external storage device of the electronic device in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the electronic device. Further, the memory 20 may also include both an internal storage unit and an external storage device of the electronic apparatus. The memory 20 may be used not only to store application software installed in the electronic device and various types of data, but also to temporarily store data that has been output or will be output.
Optionally, the electronic device may further comprise a user interface, a network interface, a communication bus, etc., the user interface may comprise a Display (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface may further comprise a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable, among other things, for displaying information processed in the electronic device and for displaying a visualized user interface. The network interface may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), typically used to establish a communication link between the device and other electronic devices. The communication bus is used to enable connection communication between these components.
It should be noted that the configuration shown in fig. 8 does not constitute a limitation of the electronic device, and in other embodiments the electronic device may include fewer or more components than shown, or some components may be combined, or a different arrangement of components.
The invention also proposes a computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as in the above-mentioned embodiments.
Those of skill in the art will understand that the logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be viewed as implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus (e.g., a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, and execute the instructions). For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium could even be paper or another suitable medium upon which the program is printed, since the program can be electronically captured, for instance via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-mentioned embodiments express only several implementations of the present invention, and their description is specific and detailed, but this should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and such variations and modifications fall within the scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (11)
1. A speckle image correction method applied to a monocular depth camera, characterized by comprising the following steps:
acquiring a speckle image, and carrying out ellipse detection on the speckle image to obtain ellipse parameters of each speckle in the speckle image;
traversing each detection window in the speckle images, and calculating a re-projection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window, wherein the detection window is a window used for matching between the speckle images and a reference image, the re-projection matrix is suitable for all speckles in the current detection window, the re-projection matrix is a deformation matrix used for converting each speckle in the current detection window from an ellipse into a standard circle, and the reference image is an image obtained by shooting after performing speckle projection on a calibration plane which is a preset distance away from a shooting lens and is parallel to an imaging plane;
and carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix, and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation.
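As an illustration of the window re-projection step in claim 1, the following Python sketch resamples one detection window through a 2x2 matrix applied about the window center. The helper name `reproject_window`, the use of NumPy/SciPy, the bilinear resampling, and the convention that the supplied matrix maps corrected coordinates back to the observed image are all assumptions for illustration, not the claimed implementation.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def reproject_window(image, top_left, size, M):
    """Resample one size-by-size detection window of `image` through a 2x2
    matrix M applied about the window center.  M is assumed to map corrected
    window coordinates back to the observed image (e.g. the inverse of an
    ellipse-to-circle matrix); all names here are illustrative."""
    r0, c0 = top_left
    half = (size - 1) / 2.0
    # Pixel coordinates of the corrected window, relative to its center.
    rows, cols = np.mgrid[0:size, 0:size]
    rel = np.stack([rows - half, cols - half]).astype(np.float64)  # (2, size, size)
    # Where each corrected pixel is fetched from in the observed image.
    src = np.tensordot(M, rel, axes=1)
    src[0] += r0 + half
    src[1] += c0 + half
    # Bilinear resampling yields the corrected ("new") detection window.
    return map_coordinates(image.astype(np.float32), src, order=1, mode='nearest')
```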
2. The speckle image correction method of claim 1, wherein the steps of obtaining a speckle image, and performing ellipse detection on the speckle image to obtain ellipse parameters of each speckle in the speckle image comprise:
acquiring a speckle image, and performing gradient calculation on the speckle image to obtain a corresponding gradient image;
carrying out binarization processing on the gradient map to obtain a speckle contour image containing each speckle contour;
and carrying out ellipse detection on the speckle contour image to obtain ellipse parameters of all speckles in the speckle image.
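A minimal sketch of the gradient-calculation and binarization steps of claim 2, assuming OpenCV's Sobel operator and a fixed global threshold; the kernel size and threshold value are illustrative choices, not taken from the patent.

```python
import cv2
import numpy as np

def speckle_contour_image(speckle_image, grad_thresh=30.0):
    """Gradient calculation followed by binarization, producing a contour
    image that keeps only speckle edges (kernel size and threshold are
    illustrative)."""
    img = speckle_image.astype(np.float32)
    gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)   # horizontal gradient
    gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)   # vertical gradient
    grad = cv2.magnitude(gx, gy)                     # gradient map
    _, contours = cv2.threshold(grad, grad_thresh, 255, cv2.THRESH_BINARY)
    return contours.astype(np.uint8)                 # speckle contour image
```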
3. The speckle image correction method of claim 2, wherein the step of performing ellipse detection on the speckle profile image to obtain ellipse parameters for each speckle in the speckle image comprises:
taking, in the speckle contour image and with the current pixel as the center, a calculation window of size R_max × R_max, where R_max is the maximum speckle radius;
judging whether the current pixel is the center of the current speckle or not;
if yes, calculating the ellipse parameters of the current speckle according to the relative coordinates of each effective pixel in the calculation window, wherein the relative coordinates are coordinates relative to the current pixel, and the effective pixels are pixels on the speckle edges in the speckle contour image.
4. The speckle image correction method of claim 3, wherein the step of calculating the ellipse parameters of the current speckle based on the relative coordinates of each valid pixel in the calculation window comprises:
substituting the relative coordinates of each pixel in the contour of the current speckle into an ellipse formula, and performing least square fitting to obtainWherein,In order to calculate the parameters of the ellipse,;
5. The speckle image correction method of claim 1, wherein, before the step of calculating a reprojection matrix corresponding to the current detection window according to the ellipse parameters of each of the speckles in the current detection window, the method further comprises:
respectively calculating corresponding deformation coefficients according to the elliptical parameters of the speckles in the current detection window;
calculating the average value of the deformation coefficients, and judging whether the average value exceeds a threshold range;
and if so, executing a step of calculating a re-projection matrix corresponding to the current detection window according to the elliptical parameters of the speckles in the current detection window.
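Claim 5 leaves the deformation coefficient unspecified. One plausible choice, sketched below, is the major-to-minor axis ratio of each fitted ellipse, averaged over the window and compared with a threshold; both the coefficient and the threshold value are assumptions for illustration.

```python
import numpy as np

def needs_correction(ellipse_params, ratio_threshold=1.1):
    """Average a simple deformation coefficient (major/minor axis ratio) over
    the speckles in a window and compare it with a threshold; the coefficient
    and the threshold value are illustrative assumptions."""
    ratios = []
    for A, B, C in ellipse_params:
        # Eigenvalues of [[A, B/2], [B/2, C]] are 1/a^2 and 1/b^2 for the
        # ellipse A*x^2 + B*x*y + C*y^2 = 1.
        evals = np.linalg.eigvalsh(np.array([[A, B / 2.0], [B / 2.0, C]]))
        if np.any(evals <= 0):            # not an ellipse; skip abnormal fits
            continue
        a, b = 1.0 / np.sqrt(evals)       # semi-axis lengths
        ratios.append(max(a, b) / min(a, b))
    return bool(ratios) and float(np.mean(ratios)) > ratio_threshold
```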
6. The speckle image correction method of claim 1, wherein the step of calculating a reprojection matrix corresponding to the current detection window according to the ellipse parameters of each of the speckles in the current detection window comprises:
averaging the elliptical parameters of all the speckles in the current detection window to obtain an elliptical parameter average value;
calculating a reprojection matrix F corresponding to the current detection window according to the ellipse parameter mean value.
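The exact formula used in claim 6 to compute F from the mean ellipse parameters is not given above. One way to build a 2x2 matrix that maps the mean ellipse A·x² + B·x·y + C·y² = 1 onto a standard circle is to rescale the ellipse's principal axes to a common radius, as sketched below; this construction is an assumption and not necessarily the claimed formula.

```python
import numpy as np

def reprojection_matrix(mean_A, mean_B, mean_C):
    """Build a 2x2 matrix that maps points on the mean ellipse
    A*x^2 + B*x*y + C*y^2 = 1 onto a circle of the same area
    (one plausible construction, not necessarily the claimed formula)."""
    M = np.array([[mean_A, mean_B / 2.0], [mean_B / 2.0, mean_C]])
    evals, evecs = np.linalg.eigh(M)          # eigenvalues are 1/a^2, 1/b^2
    a, b = 1.0 / np.sqrt(evals)               # semi-axis lengths
    r = np.sqrt(a * b)                        # radius of the target circle
    # Rescale each principal axis of the ellipse to the common radius r.
    S = np.diag([r / a, r / b])
    return evecs @ S @ evecs.T
```

With this choice the target circle has the same area as the mean ellipse, so the overall scale of the window is preserved.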
7. A speckle image depth calculation method applied to a monocular depth camera, characterized by comprising the following steps:
acquiring a speckle image, and carrying out ellipse detection on the speckle image to obtain ellipse parameters of each speckle in the speckle image;
traversing each detection window in the speckle images, and calculating a re-projection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window, wherein the detection window is a window used for matching between the speckle images and a reference image, the re-projection matrix is suitable for all speckles in the current detection window, the re-projection matrix is a deformation matrix used for converting each speckle in the current detection window from an ellipse into a standard circle, and the reference image is an image obtained by shooting after performing speckle projection on a calibration plane which is a preset distance away from a shooting lens and is parallel to an imaging plane;
carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix, and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation;
and matching the new detection window in the speckle image with a calibration window in the reference image, and calculating a depth value according to the disparity between the successfully matched detection window and calibration window.
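For the matching and depth step of claim 7, the sketch below compares the corrected window with reference windows taken from the same rows using a normalized cross-correlation score and converts the winning disparity into depth via depth = focal_length × baseline / disparity. The function name, the matching score, and the focal-length/baseline parameters are illustrative assumptions.

```python
import numpy as np

def depth_from_window(corrected_win, reference_strip, window_col,
                      focal_px, baseline_mm):
    """Match one corrected detection window against a reference-image strip
    taken from the same rows, then triangulate depth from the disparity.
    Parameter names and the matching score are illustrative choices."""
    h, w = corrected_win.shape
    win = corrected_win.astype(np.float64) - corrected_win.mean()
    win /= np.linalg.norm(win) + 1e-9
    best_score, best_col = -np.inf, window_col
    for x in range(reference_strip.shape[1] - w + 1):
        cand = reference_strip[:h, x:x + w].astype(np.float64)
        cand -= cand.mean()
        score = float(np.sum(win * cand) / (np.linalg.norm(cand) + 1e-9))
        if score > best_score:                # normalized cross-correlation
            best_score, best_col = score, x
    disparity = abs(window_col - best_col)    # in pixels
    # Standard triangulation for a projector/camera pair with known baseline.
    return focal_px * baseline_mm / max(disparity, 1e-6)
```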
8. A speckle image correction device applied to a monocular depth camera, the speckle image correction device comprising:
the ellipse detection module is used for acquiring speckle images and carrying out ellipse detection on the speckle images so as to obtain ellipse parameters of all speckles in the speckle images;
the matrix calculation module is used for traversing each detection window in the speckle images, calculating a re-projection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window, wherein the detection window is a window used for matching between the speckle image and a reference image, the re-projection matrix is suitable for all speckles in the current detection window, the re-projection matrix is a deformation matrix for converting each speckle in the current detection window from an ellipse to a standard circle, and the reference image is an image obtained by shooting after speckle projection on a calibration plane which is a preset distance away from a shooting lens and is parallel to an imaging plane;
and the window calculation module is used for carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation.
9. A speckle image depth calculation device applied to a monocular depth camera, characterized by comprising:
the ellipse detection module is used for acquiring speckle images and carrying out ellipse detection on the speckle images so as to obtain ellipse parameters of all speckles in the speckle images;
the matrix calculation module is used for traversing each detection window in the speckle images, calculating a re-projection matrix corresponding to the current detection window according to the ellipse parameters of each speckle in the current detection window, wherein the detection window is a window used for matching between the speckle image and a reference image, the re-projection matrix is suitable for all speckles in the current detection window, the re-projection matrix is a deformation matrix for converting each speckle in the current detection window from an ellipse to a standard circle, and the reference image is an image obtained by shooting after speckle projection on a calibration plane which is a preset distance away from a shooting lens and is parallel to an imaging plane;
the window calculation module is used for carrying out re-projection calculation on the coordinates of each pixel in the current detection window according to the re-projection matrix and correcting the current detection window into a new detection window according to new coordinates obtained by the re-projection calculation;
and the depth calculation module is used for matching the new detection window in the speckle image with a calibration window in the reference image and calculating the depth value according to the disparity between the successfully matched detection window and calibration window.
10. An electronic device comprising a memory and a processor, the memory storing a program that when executed by the processor implements the method of any of claims 1 to 7.
11. A computer-readable storage medium, on which a program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111502716.8A CN113902652B (en) | 2021-12-10 | 2021-12-10 | Speckle image correction method, depth calculation method, device, medium, and apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111502716.8A CN113902652B (en) | 2021-12-10 | 2021-12-10 | Speckle image correction method, depth calculation method, device, medium, and apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113902652A CN113902652A (en) | 2022-01-07 |
CN113902652B (en) | 2022-03-08
Family
ID=79025528
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111502716.8A Active CN113902652B (en) | 2021-12-10 | 2021-12-10 | Speckle image correction method, depth calculation method, device, medium, and apparatus |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113902652B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115294187B (en) * | 2022-10-08 | 2023-01-31 | 合肥的卢深视科技有限公司 | Image processing method of depth camera, electronic device and storage medium |
CN115294375B (en) * | 2022-10-10 | 2022-12-13 | 南昌虚拟现实研究院股份有限公司 | Speckle depth estimation method and system, electronic device and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013217826A (en) * | 2012-04-11 | 2013-10-24 | National Institute Of Advanced Industrial & Technology | Surface change detector by laser light |
WO2019205890A1 (en) * | 2018-04-28 | 2019-10-31 | Oppo广东移动通信有限公司 | Image processing method, apparatus, computer-readable storage medium, and electronic device |
CN111243002A (en) * | 2020-01-15 | 2020-06-05 | 中国人民解放军国防科技大学 | Monocular laser speckle projection system calibration and depth estimation method applied to high-precision three-dimensional measurement |
CN111540004A (en) * | 2020-04-16 | 2020-08-14 | 北京清微智能科技有限公司 | Single-camera polar line correction method and device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8172403B2 (en) * | 2009-05-21 | 2012-05-08 | Eastman Kodak Company | Projection with curved speckle reduction element surface |
US9119559B2 (en) * | 2011-06-16 | 2015-09-01 | Salient Imaging, Inc. | Method and system of generating a 3D visualization from 2D images |
CN109461181B (en) * | 2018-10-17 | 2020-10-27 | 北京华捷艾米科技有限公司 | Depth image acquisition method and system based on speckle structured light |
- 2021-12-10: CN application CN202111502716.8A / patent CN113902652B (en), status Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013217826A (en) * | 2012-04-11 | 2013-10-24 | National Institute Of Advanced Industrial & Technology | Surface change detector by laser light |
WO2019205890A1 (en) * | 2018-04-28 | 2019-10-31 | Oppo广东移动通信有限公司 | Image processing method, apparatus, computer-readable storage medium, and electronic device |
CN111243002A (en) * | 2020-01-15 | 2020-06-05 | 中国人民解放军国防科技大学 | Monocular laser speckle projection system calibration and depth estimation method applied to high-precision three-dimensional measurement |
CN111540004A (en) * | 2020-04-16 | 2020-08-14 | 北京清微智能科技有限公司 | Single-camera polar line correction method and device |
Non-Patent Citations (3)
Title |
---|
"Adapted Anisotropic Gaussian SIFT Matching Strategy for SAR Registration";F. Wang等;《IEEE Geoscience and Remote Sensing Letters》;20150131;第12卷(第1期);全文 * |
"Digital image correlation based on variable circle template in dual camera matching";Qihan Zhao等;《Optical Engineering》;20200130;第59卷(第1期);全文 * |
"基于散斑视觉测量的叶片模型重构";王涛等;《激光与光电子学进展》;20190131;第56卷(第01期);全文 * |
Also Published As
Publication number | Publication date |
---|---|
CN113902652A (en) | 2022-01-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8786700B2 (en) | Position and orientation measurement apparatus, position and orientation measurement method, and storage medium | |
US8755630B2 (en) | Object pose recognition apparatus and object pose recognition method using the same | |
JP7269874B2 (en) | How to process multiple regions of interest independently | |
CN111127422A (en) | Image annotation method, device, system and host | |
JP6570296B2 (en) | Image processing apparatus, image processing method, and program | |
CN113902652B (en) | Speckle image correction method, depth calculation method, device, medium, and apparatus | |
CN111080542B (en) | Image processing method, device, electronic equipment and storage medium | |
CN107592449B (en) | Three-dimensional model establishing method and device and mobile terminal | |
CN107871329B (en) | Method and device for quickly calibrating optical center of camera | |
CN104111038A (en) | Method for using phase fusion algorithm to repair phase error caused by saturation | |
US11417080B2 (en) | Object detection apparatus, object detection method, and computer-readable recording medium | |
CN109191516B (en) | Rotation correction method and device of structured light module and readable storage medium | |
Mei et al. | Radial lens distortion correction using cascaded one-parameter division model | |
CN111160233A (en) | Human face in-vivo detection method, medium and system based on three-dimensional imaging assistance | |
US11450140B2 (en) | Independently processing plurality of regions of interest | |
CN109587463A (en) | Calibration method, projector and the calibration system of projector | |
CN109902695B (en) | Line feature correction and purification method for image pair linear feature matching | |
CN111353945B (en) | Fisheye image correction method, device and storage medium | |
JP7533937B2 (en) | Image processing device, image processing method, and program | |
JP2014032628A (en) | Corresponding point search device, program thereof, and camera parameter estimation device | |
EP2953096B1 (en) | Information processing device, information processing method, system and carrier means | |
CN115797995B (en) | Face living body detection method, electronic equipment and storage medium | |
WO2024164633A1 (en) | Projection image correction method and apparatus, projection device, collection device, and medium | |
KR102666984B1 (en) | Apparatus and method for calibrating camera for distance measurement of object position within image | |
CN118822928A (en) | Method for detecting quality of lamp ring in VR handle and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |