
CN103793909B - Single vision global depth information getting method based on diffraction blurring - Google Patents

Single vision global depth information getting method based on diffraction blurring

Info

Publication number
CN103793909B
CN103793909B (application CN201410028369.3A)
Authority
CN
China
Prior art keywords: image, depth, degree, radiation, formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410028369.3A
Other languages
Chinese (zh)
Other versions
CN103793909A (en)
Inventor
魏阳杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeastern University China
Original Assignee
Northeastern University China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeastern University China filed Critical Northeastern University China
Priority to CN201410028369.3A priority Critical patent/CN103793909B/en
Publication of CN103793909A publication Critical patent/CN103793909A/en
Application granted granted Critical
Publication of CN103793909B publication Critical patent/CN103793909B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

A single-camera method for acquiring global depth information based on diffraction blur, belonging to the technical field of image processing. With the same camera, and with the focal length, image distance, numerical aperture, imaging wavelength, and the proportionality coefficient between the circle-of-confusion radius and the blur degree held fixed, a first image is captured; with these conditions unchanged, the distance between the camera and the object is then changed by Δs and a second image is captured. A diffraction blur model is established to describe the relation between scene depth and image blur level. Using the diffraction blur model and the two blurred images, the global depth information of the second image is determined, and a 3D image of the second image is built from the recovered depth. The invention incorporates the diffraction mechanism into traditional geometric-optics convex-lens blur imaging, establishes a mathematical model between diffraction blur and the 3D scene depth, and improves the accuracy of depth-from-defocus methods based on conventional geometric optics.

Description

Single vision global depth information getting method based on diffraction blurring
Technical field
The invention belongs to the technical field of image processing, and in particular relates to a single-camera global depth information acquisition method based on diffraction blur.
Background technology
Because optical observation is real-time, non-destructive, and highly adaptable to its environment, reconstructing three-dimensional scene depth information from two-dimensional optical images is a focus and frontier problem of high-accuracy observation research, with wide application prospects in fields such as microelectronics, semiconductors, new materials, energy, biomedicine, and precision manufacturing. Typical 3D vision observation methods fall into depth-from-focus and depth-from-defocus. The former uses a layered focal imaging mechanism to obtain scene depth information and realize three-dimensional scene reconstruction; the latter obtains scene depth information through a defocus estimation mechanism. By comparison, three-dimensional micro-vision methods based on defocus estimation need few sampled images, require little computation, and offer high resolution and strong adaptability, and are therefore of greater practical significance for realizing high-accuracy image-based three-dimensional observation.
Depth from defocus is a method that recovers depth information from the blur level of images with a limited depth of field. Traditional depth-from-defocus methods need either two cameras or a change of camera parameters to capture the two blurred images. In micro/nano observation, however, the observation space is very limited and only one camera can be installed; moreover, because micro/nano operation uses a camera with high magnification, any change of camera parameters damages the camera's imaging model. Traditional depth-from-defocus methods are therefore mostly applied at the macroscopic scale and are difficult to apply to micro/nano observation. The latest depth-from-defocus methods acquire depth from two blurred images captured by a single camera with fixed parameters, and attempts have been made to use them in micro/nano observation. Image blur, however, is caused by two factors: the change of viewing distance, and small-scale optical diffraction. Under fixed observation conditions, the blur level of an image is the joint effect of viewing-distance change and optical diffraction. Yet, up to now, no accurate mathematical model between optical diffraction and image blur level during blurred imaging has been available. Moreover, the above depth-from-defocus methods assume that no optical diffraction occurs during imaging and that image blur is caused only by scene depth change, which inevitably introduces error into the recovered depth. In micro/nano observation especially, the optical microscope magnification is high, diffraction is more evident, and the image blur it causes cannot be ignored. Studying the measurement and modeling of diffraction-induced image blur, and reconstructing micro/nano-scale scene depth information from blurred images, is therefore of great significance for advancing micro/nano observation technology and related fields of science and technology.
Summary of the invention
To address the deficiencies of the prior art, the object of the invention is to provide a single-camera global depth information acquisition method based on diffraction blur, so as to improve the accuracy of traditional depth-from-defocus methods.
The technical solution of the invention is achieved as follows: a single-camera global depth information acquisition method based on diffraction blur, comprising the following steps:
Step 1: with the same camera, and with the focal length, image distance, numerical aperture, imaging wavelength, and the proportionality coefficient between the circle-of-confusion radius and the blur degree held fixed, capture a first image; with these conditions unchanged, change the distance between the camera and the object by Δs and capture a second image;
Step 2: establish a diffraction blur model describing the relation between scene depth and image blur level;
Step 2.1: according to the Fresnel diffraction principle, establish the relation between the brightness distribution at an arbitrary point on the camera imaging plane and the scene depth:
$$I_P=\left|\,\frac{-i\,J_1\!\left(2\pi\lambda^{-1}y\sin u\right)}{\pi\lambda^{-1}y\sin u}+\frac{\lambda}{\pi\sin^2 u}\sum_{n=2}^{\infty}\left(-iy^{-1}\sin u\right)^{n}(lm)^{n-1}J_n\!\left(2\pi\lambda^{-1}y\sin u\right)\right|^{2}$$
where I_P is the brightness at an arbitrary point P on the camera imaging plane; J₁ is the first-order Bessel function; J_n is the n-th order Bessel function; λ is the imaging wavelength of the camera; R is the image distance of the camera; u is the aperture half-angle of the lens, with sin u = a/R; a is the effective radius of the lens; y is the distance from the arbitrary point P on the image plane to the image-plane center; l is the difference between the current object distance and the ideal imaging object distance; m is the axial magnification of the camera; i is the imaginary unit;
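For readers who want to reproduce this brightness profile numerically, the following Python sketch evaluates the truncated series with SciPy's Bessel functions. The function name, the truncation depth n_terms, and the sample values are illustrative assumptions, not part of the patent.

```python
import numpy as np
from scipy.special import jv  # Bessel function of the first kind, J_n(x)

def intensity_profile(y, lam, sin_u, l, m, n_terms=20):
    """Brightness I_P at distance y from the image-plane center (step 2.1).

    The infinite series is truncated at n_terms; y must be nonzero because
    both terms divide by y. All lengths must use the same unit (e.g. mm).
    """
    x = 2.0 * np.pi / lam * y * sin_u                   # common Bessel argument
    field = -1j * jv(1, x) / (np.pi / lam * y * sin_u)
    prefactor = lam / (np.pi * sin_u**2)
    for n in range(2, n_terms + 1):
        field = field + prefactor * (-1j * sin_u / y)**n * (l * m)**(n - 1) * jv(n, x)
    return np.abs(field)**2

# Example with the embodiment's values: lambda = 600 nm = 6e-4 mm, sin u = 0.5,
# axial magnification m = 700, in-focus scene (l = 0):
y = np.linspace(1e-5, 5e-3, 500)                        # mm, avoiding y = 0
I = intensity_profile(y, lam=6e-4, sin_u=0.5, l=0.0, m=700)
```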
Step 2.2: perform Gaussian fitting on the brightness distributions obtained in step 2.1 for the different scene depths, yielding the relation curve between scene depth and blur level;
Step 2.3: describe the relation obtained in step 2.2 with a quadratic curve, establishing the diffraction blur model:
$$\sigma = a l^{2} + b l + c$$
where σ is the blur level of each pixel; a, b, c are respectively the quadratic, linear, and constant coefficients of the quadratic curve; l is the difference between the current object distance and the ideal imaging object distance;
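A minimal Python sketch of steps 2.2 and 2.3: fit a Gaussian to each computed brightness profile to extract its blur kernel σ, then fit σ(l) with a quadratic. The use of curve_fit and polyfit, the helper names, and the initial guess are my own choices, not the patent's.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(y, A, sigma):
    # Per step 2.2, each profile peaks at the image-plane center (y = 0)
    return A * np.exp(-y**2 / (2.0 * sigma**2))

def blur_level(y, I):
    """Fit a Gaussian to one brightness profile and return its width sigma."""
    (A, sigma), _ = curve_fit(gaussian, y, I, p0=[I.max(), y.max() / 4.0])
    return abs(sigma)

def fit_diffraction_blur_model(l_values, sigma_values):
    """Step 2.3: least-squares quadratic fit sigma = a*l**2 + b*l + c."""
    a, b, c = np.polyfit(l_values, sigma_values, 2)
    return a, b, c
```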
Step 3: use the diffraction blur model of step 2 and the two blurred images of step 1 to determine the global depth information of the second image;
Step 3.1: initialize the depth of the second image with a two-dimensional array of the same height and width as the second image; set the energy threshold τ below which the current depth is considered to have stopped changing, the size of τ being inversely proportional to the accuracy of the result; set the normalization parameter α and the optimization parameter k that keep the current depth bounded, with α > 0 and k > 0; set the iteration step β, whose order of magnitude is determined by the camera's ideal object distance and is inversely proportional to the number of iterations;
Step 3.2: compute the relative blur from the current depth using the method of step 2.3, and determine the sign of the radiation coefficient from the sign of the relative blur, the relative blur being:
$$(\Delta\sigma)^{2}=\sigma_2^{2}-\sigma_1^{2}$$
where Δσ is the relative blur; σ₁ is the blur level of each image pixel at the current depth; σ₂ is the blur level of each image pixel when the current depth changes by Δs; (Δσ)² > 0 means that the radiation from the image at the current depth to the image at the depth changed by Δs is positive radiation and the radiation coefficient is positive, while (Δσ)² < 0 means that the radiation is negative and the radiation coefficient is negative;
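Under the quadratic blur model, the relative blur of step 3.2 can be evaluated per pixel as in the sketch below (the function and argument names are assumptions; the depth map and object distances share one unit):

```python
def relative_blur_sq(depth, delta_s, a, b, c, ideal_dist):
    """Per-pixel relative blur (Delta sigma)^2 = sigma_2^2 - sigma_1^2 (step 3.2).

    The sign of the returned array fixes the sign of the radiation
    coefficient: positive means positive (blurring) radiation,
    negative means negative radiation.
    """
    l1 = depth - ideal_dist          # current defocus l
    l2 = l1 + delta_s                # defocus after the camera moves by Delta s
    sigma1 = a * l1**2 + b * l1 + c
    sigma2 = a * l2**2 + b * l2 + c
    return sigma2**2 - sigma1**2
```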
Step 3.3: with the first image as the radiation source and the second image as the radiation target, establish the heat-radiation equation system for each pixel:
$$\begin{cases}\dot u(y,z,t)=\nabla\cdot\big(\varepsilon(y,z)\,\nabla u(y,z,t)\big), & t\in(0,\infty)\\ u(y,z,0)=E_1(y,z)\\ u(y,z,\Delta t)=E_2(y,z)\end{cases}$$
where ε(y, z) is the radiation coefficient of pixel (y, z), y and z being the horizontal and vertical directions of the imaging plane; E₁(y, z) is the first image; E₂(y, z) is the second image; ∇ is the gradient operator and ∇· is the divergence operator; t denotes time; Δt is the radiation time from the first image to the second image, given by:
$$\Delta t=\frac{(\Delta\sigma)^{2}}{2\,\varepsilon(y,z)}$$
where the sign of the radiation coefficient ε(y, z) is the same as the sign of (Δσ)²;
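Since Δt = (Δσ)²/(2ε) and ε carries the sign of (Δσ)², the product ε·Δt equals (Δσ)²/2, so the first image can simply be diffused with a per-pixel coefficient of (Δσ)²/2 over unit time. The explicit finite-difference evolution below is one possible discretization under that assumption, not the patent's own numerical scheme:

```python
import numpy as np

def diffuse(E1, coeff, n_steps=50):
    """Evolve u_t = div(coeff * grad u) from E1 over unit time (step 3.3).

    coeff is the per-pixel product eps * dt = (Delta sigma)^2 / 2, so the
    radiation time is absorbed into the coefficient. An explicit scheme
    like this needs n_steps large enough for numerical stability.
    """
    u = E1.astype(float).copy()
    tau = 1.0 / n_steps
    for _ in range(n_steps):
        gy, gz = np.gradient(u)                          # grad u
        div = (np.gradient(coeff * gy, axis=0)
               + np.gradient(coeff * gz, axis=1))        # div(coeff * grad u)
        u += tau * div
    return u
```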
Step 3.4: using the solution of the radiation equation of step 3.3, compute the global energy difference between the second image and the radiation solution; if the energy difference is greater than the energy threshold τ, update the depth of each pixel by the step β and return to step 3.2 to continue iterating; if the energy is less than or equal to the threshold τ, stop iterating;
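Putting steps 3.1 to 3.4 together gives an iteration of the following shape, reusing relative_blur_sq and diffuse from the sketches above. This is only a sketch: the patent specifies the step size β and the stopping test but not the descent direction, so the residual-sign update below is an illustrative placeholder, and the regularization terms α‖∇s‖² and αk‖s‖² of the embodiment's energy are omitted for brevity.

```python
def recover_depth(E1, E2, delta_s, a, b, c, ideal_dist,
                  tau=2.0, beta=50e-6, max_iter=100):
    """Iterative global depth recovery, steps 3.1-3.4 (data term only).

    Lengths are in mm, so beta = 50e-6 mm is the embodiment's 50 nm step.
    """
    depth = np.full(E2.shape, ideal_dist, dtype=float)   # step 3.1: flat init
    for _ in range(max_iter):
        dsig2 = relative_blur_sq(depth, delta_s, a, b, c, ideal_dist)  # 3.2
        u = diffuse(E1, dsig2 / 2.0)                     # step 3.3: radiate
        residual = u - E2.astype(float)
        energy = np.sum(residual**2)                     # data term of F(s)
        if energy <= tau:                                # step 3.4: converged
            break
        # Placeholder descent direction (an assumption): nudge each pixel's
        # depth by the step beta so as to shrink its local residual.
        depth -= beta * np.sign(residual * dsig2)
    return depth
```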
Step 4: build the 3D image of the second image from the global depth information obtained in step 3.
Beneficial effects of the invention: the invention incorporates the diffraction mechanism into traditional geometric-optics convex-lens blur imaging, establishes a mathematical model between diffraction blur and the 3D scene depth, and improves the accuracy of depth-from-defocus methods based on conventional geometric optics.
Accompanying drawing explanation
Fig. 1 is a schematic of the 3D scan of the standard grid used in the embodiment of the invention;
Fig. 2 is the flow chart of the diffraction-blur-based single-camera global depth information acquisition method of the embodiment;
Fig. 3 shows the two images captured by the camera in the embodiment, where (a) is the first captured color image after grayscale extraction and (b) is the second captured color image after grayscale extraction;
Fig. 4 is a schematic of the light-intensity distribution at an arbitrary point P on the image plane as the scene depth changes in the embodiment;
Fig. 5 is a schematic of the fitted Gaussian curve for a fixed l in the embodiment;
Fig. 6 is a schematic of the relation between scene depth and blur level in the embodiment;
Fig. 7 is a schematic of the quadratic fit of the relation between scene depth and blur level in the embodiment;
Fig. 8 shows the 3D depth images computed in the embodiment, where (a) is the computed 3D image and (b) is, under identical experimental conditions, the 3D image of a depth-from-defocus method that assumes no diffraction;
Fig. 9 shows the 3D error maps computed in the embodiment, where (a) is the error map of the diffraction-aware depth-from-defocus method and (b) is the error map of the method that assumes no optical diffraction;
Fig. 10 is a comparison of an arbitrary depth section of the embodiment against the depth-from-defocus method that assumes no optical diffraction.
Detailed description of the invention
The embodiments of the invention are described in further detail below with reference to the accompanying drawings.
This embodiment uses the diffraction-blur-based depth-from-defocus method to perform a global depth acquisition experiment on a standard nanometer grid of height 500 nm and width 1500 nm, with an error within 3%; the 3D image of the grid obtained by AFM scanning is shown in Fig. 1. A HIROX-7700 microscope is used as the camera, magnifying the standard nanometer grid 7000 times.
This embodiment uses the diffraction-blur-based single-camera global depth information acquisition method, whose flow is shown in Fig. 2 and which comprises the following steps:
Step 1: with the same camera, and with the focal length f = 0.357 mm, image distance R = 0.399 mm, numerical aperture D = 2, imaging wavelength λ = 600 nm, and the proportionality coefficient γ = 0.002 between the circle-of-confusion radius and the blur degree held fixed, capture a first image. With these conditions unchanged, change the distance between the camera and the object; in this embodiment the distance change is set to Δs = 5 μm, and a second image is captured. If the captured images are color images, the two images must be converted to grayscale. The grayscale images may be 8-bit or 12-bit; this embodiment uses 8-bit grayscale images, i.e. pixel brightness values 0-255. The result is shown in Fig. 3, where (a) is the grayscale image converted from the first color image and (b) is the grayscale image converted from the second color image.
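The grayscale conversion of step 1 can be done, for example, with OpenCV; the file names below are placeholders:

```python
import cv2

img1 = cv2.imread("image1.png")                 # first captured color image
img2 = cv2.imread("image2.png")                 # second image, camera moved by 5 um
E1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)     # 8-bit gray, brightness 0-255
E2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)
```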
Step 2: establish the diffraction blur model describing the relation between scene depth and image blur level.
Step 2.1: according to the Fresnel diffraction principle, establish the relation between the brightness distribution at an arbitrary point on the camera imaging plane and the scene depth:
$$I_P=\left|\,\frac{-i\,J_1\!\left(2\pi\lambda^{-1}y\sin u\right)}{\pi\lambda^{-1}y\sin u}+\frac{\lambda}{\pi\sin^2 u}\sum_{n=2}^{\infty}\left(-iy^{-1}\sin u\right)^{n}(lm)^{n-1}J_n\!\left(2\pi\lambda^{-1}y\sin u\right)\right|^{2}$$
where I_P is the brightness at an arbitrary point P on the camera imaging plane; J₁ is the first-order Bessel function; J_n is the n-th order Bessel function; the imaging wavelength of the camera is λ = 600 nm; the image distance of the camera is R = 0.399 mm; u is the aperture half-angle of the lens, with sin u = a/R = 0.5; the effective radius of the lens is a = 0.178 mm; y is the distance from the arbitrary point P on the image plane to the image-plane center; l is the difference between the current object distance and the ideal imaging object distance, the ideal imaging object distance being 3.4 mm; the axial magnification of the camera is m = 700; i is the imaginary unit.
The relation between the brightness distribution at an arbitrary point on the imaging plane and the scene depth is shown in Fig. 4, where the horizontal axis is the distance from P to the image-plane center in mm and the vertical axis is the brightness I_P. When l is fixed, each I_P curve is close to a Gaussian, with its peak at the center of the camera image plane.
Step 2.2: perform Gaussian fitting on the brightness distributions obtained in step 2.1 for the different scene depths. Taking the depth change l = 0 as an example, the fitting result is shown in Fig. 5, where the asterisks are the actual computed values and the solid curve is the Gaussian fit.
For each object-distance change l, the Gaussian fit yields a fitted curve like that of Fig. 5 together with a blur kernel σ representing the image blur level; the resulting relation curve between scene depth and blur level is shown in Fig. 6, where the horizontal axis is the scene depth relative to the ideal object distance of 3.4 mm and the vertical axis is the blur level.
Step 2.3: describe the relation of Fig. 6 (step 2.2) with a quadratic curve, establishing the diffraction blur model:
$$\sigma = a l^{2} + b l + c$$
where σ is the blur level of each pixel; a, b, c are respectively the quadratic, linear, and constant coefficients of the quadratic curve, with fitted values a = 1.2146e5, b = -0.19461, and c = 0.00015614; l is the difference between the current object distance and the ideal imaging object distance. The fitted quadratic curve is shown in Fig. 7, the quadratic relation between scene depth and image blur level under diffraction blur (the asterisks are the actual computed values; the curve is the quadratic fit).
Step 3: use the diffraction blur model of step 2 and the two blurred images of step 1 to determine the global depth information of the second image.
Step 3.1: initialize the depth of the second image with a two-dimensional array of the same height and width as the second image; for simplicity this array s₂(y, z) can be set equal to 3.4 mm at every pixel. Set the energy threshold below which the current depth is considered to have stopped changing to τ = 2, the size of τ being inversely proportional to the accuracy of the result; set the normalization parameter α and the optimization parameter k that keep the current depth bounded, here α = 0.6 and k = 2; set the iteration step β, whose order of magnitude is determined by the camera's ideal object distance and is inversely proportional to the number of iterations, here β = 50 nm.
Step 3.2: compute the relative blur from the current depth using the method of step 2.3, and determine the sign of the radiation coefficient from the sign of the relative blur. The relative blur is:
$$(\Delta\sigma)^{2}=\sigma_2^{2}-\sigma_1^{2}$$
where Δσ is the relative blur; σ₁ is the blur level of each image pixel at the current depth; σ₂ is the blur level of each image pixel when the current depth changes by Δs; (Δσ)² > 0 means that the radiation from the image at the current depth to the image at the depth changed by Δs is positive radiation and the radiation coefficient is positive, while (Δσ)² < 0 means that the radiation is negative and the radiation coefficient is negative.
Step 3.3: with image 1 as the radiation source and image 2 as the radiation target, establish the heat-radiation equation system for each pixel:
$$\begin{cases}\dot u(y,z,t)=\nabla\cdot\big(\varepsilon(y,z)\,\nabla u(y,z,t)\big), & t\in(0,\infty)\\ u(y,z,0)=E_1(y,z)\\ u(y,z,\Delta t)=E_2(y,z)\end{cases}$$
where ε(y, z) is the radiation coefficient of pixel (y, z), y and z being the horizontal and vertical directions of the imaging plane; E₁(y, z) is image 1; E₂(y, z) is image 2; ∇ is the gradient operator and ∇· is the divergence operator; t denotes time; Δt is the radiation time from the first image to the second image, given by:
$$\Delta t=\frac{(\Delta\sigma)^{2}}{2\,\varepsilon(y,z)}$$
where the sign of the radiation coefficient ε(y, z) is the same as the sign of (Δσ)².
Step 3.4: using the solution of the radiation equation of step 3.3, compute the global energy difference F(s) between the second image and the radiation solution:
$$F(s)=\iint\big(u(y,z,\Delta t)-E_2(y,z)\big)^{2}\,dy\,dz+\alpha\left\lVert\nabla s\right\rVert^{2}+\alpha k\left\lVert s\right\rVert^{2}$$
If the energy difference is greater than the energy threshold τ, update the depth of each pixel by the step β and return to step 3.2 to continue iterating; if the energy is less than or equal to the threshold τ, stop iterating.
In this embodiment the iteration runs at most 100 times; once this iteration count is reached, the iteration is exited automatically and the loop ends.
Step 4: build the 3D image of the second image from the global depth information s₂ obtained in step 3. The computed 3D depth image is shown in Fig. 8, where (a) is the 3D image computed by the method of this embodiment and (b) is, under identical experimental conditions, the 3D image of a depth-from-defocus method that assumes no diffraction.
Besides 3D reconstruction of nanometer-scale images, the method of this embodiment applies equally to other scales, such as micrometers and millimeters.
To verify the accuracy of the depth acquisition method of this embodiment more intuitively, the following error estimation is performed:
(1) Compute the relative-error surface Φ between the true depth value S₂ and the estimated depth s₂, the true depth being obtained from Fig. 1:
$$\Phi = s_2 / S_2 - 1$$
The result is shown in Fig. 9, where (a) is the 3D error map of the diffraction-aware depth-from-defocus method of this embodiment and (b) is the error map of the depth-from-defocus method that assumes no optical diffraction; the vertical axis represents the 3D height in mm.
(2) Then, since the exact height of the standard grid is known to be 500 nm, compute the mean error over the 500 estimated points:
$$E_{ave}=\frac{1}{n}\sum_{k=1}^{n}\bigl|H_k-\tilde H_k\bigr|$$
where n is the number of sampled points; H_k is the AFM-scanned height of the k-th point and H̃_k its estimated height.
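Both error measures are one-liners in NumPy; the function and argument names below are placeholders for the true and estimated depth or height maps:

```python
import numpy as np

def relative_error_surface(s2_est, S2_true):
    """Phi = s2 / S2 - 1, elementwise over the depth maps."""
    return s2_est / S2_true - 1.0

def mean_height_error(H_afm, H_est):
    """E_ave = (1/n) * sum_k |H_k - H~_k| over the n sampled points."""
    return float(np.mean(np.abs(H_afm - H_est)))
```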
The error analysis shows that, under the no-diffraction assumption, the mean error of the depth-from-defocus method is 161 nm, whereas the mean error of the diffraction-blur-based depth acquisition method proposed in this embodiment is only 69 nm.
An arbitrary depth section of the two methods is compared in Fig. 10: the solid line is the result obtained under the assumption that diffraction blur does not exist, and the dotted line is the diffraction-blur result. The results show that the depth obtained by this embodiment is closer to the ideal object distance. This is because image blur in this embodiment is treated as the joint effect of optical diffraction and scene depth change, whereas traditional depth-from-defocus assumes that optical diffraction does not exist and that image blur is caused only by scene depth change. For the same blurred image, this embodiment therefore yields a smaller depth change relative to the ideal object distance, i.e. a higher accuracy.
Although specific embodiments of the invention have been described above, those skilled in the art should understand that they are merely illustrative and that various changes or modifications may be made to them without departing from the principle and essence of the invention. The scope of the invention is limited only by the appended claims.

Claims (1)

1. A single-camera global depth information acquisition method based on diffraction blur, characterized by comprising the following steps:
Step 1: with the same camera, and with the focal length, image distance, numerical aperture, imaging wavelength, and the proportionality coefficient between the circle-of-confusion radius and the blur degree held fixed, capture a first image; with these conditions unchanged, change the distance between the camera and the object by Δs and capture a second image;
Step 2: establish a diffraction blur model describing the relation between scene depth and image blur level;
Step 2.1: according to the Fresnel diffraction principle, establish the relation between the brightness distribution at an arbitrary point on the camera imaging plane and the scene depth:
$$I_P=\left|\,\frac{-i\,J_1\!\left(2\pi\lambda^{-1}y\sin u\right)}{\pi\lambda^{-1}y\sin u}+\frac{\lambda}{\pi\sin^2 u}\sum_{n=2}^{\infty}\left(-iy^{-1}\sin u\right)^{n}(lm)^{n-1}J_n\!\left(2\pi\lambda^{-1}y\sin u\right)\right|^{2}$$
where I_P is the brightness at an arbitrary point P on the camera imaging plane; J₁ is the first-order Bessel function; J_n is the n-th order Bessel function; λ is the imaging wavelength of the camera; R is the image distance of the camera; u is the aperture half-angle of the lens, with sin u = a/R; a is the effective radius of the lens; y is the distance from the arbitrary point P on the image plane to the image-plane center; l is the difference between the current object distance and the ideal imaging object distance; m is the axial magnification of the camera; i is the imaginary unit;
Step 2.2: perform Gaussian fitting on the brightness distributions obtained in step 2.1 for the different scene depths, yielding the relation curve between scene depth and blur level;
Step 2.3: describe the relation obtained in step 2.2 with a quadratic curve, establishing the diffraction blur model:
$$\sigma = a l^{2} + b l + c$$
where σ is the blur level of each pixel; a, b, c are respectively the quadratic, linear, and constant coefficients of the quadratic curve; l is the difference between the current object distance and the ideal imaging object distance;
Step 3: use the diffraction blur model of step 2 and the two blurred images of step 1 to determine the global depth information of the second image;
Step 3.1: initialize the depth of the second image with a two-dimensional array of the same height and width as the second image; set the energy threshold τ below which the current depth is considered to have stopped changing, the size of τ being inversely proportional to the accuracy of the result; set the normalization parameter α and the optimization parameter k that keep the current depth bounded, with α > 0 and k > 0; set the iteration step β, whose order of magnitude is determined by the camera's ideal object distance and is inversely proportional to the number of iterations;
Step 3.2: compute the relative blur from the current depth using the method of step 2.3, and determine the sign of the radiation coefficient from the sign of the relative blur, the relative blur being:
$$(\Delta\sigma)^{2}=\sigma_2^{2}-\sigma_1^{2}$$
where Δσ is the relative blur; σ₁ is the blur level of each image pixel at the current depth; σ₂ is the blur level of each image pixel when the current depth changes by Δs; (Δσ)² > 0 means that the radiation from the image at the current depth to the image at the depth changed by Δs is positive radiation and the radiation coefficient is positive, while (Δσ)² < 0 means that the radiation is negative and the radiation coefficient is negative;
Step 3.3: with the first image as the radiation source and the second image as the radiation target, establish the heat-radiation equation system for each pixel:
$$\begin{cases}\dot u(y,z,t)=\nabla\cdot\big(\varepsilon(y,z)\,\nabla u(y,z,t)\big), & t\in(0,\infty)\\ u(y,z,0)=E_1(y,z)\\ u(y,z,\Delta t)=E_2(y,z)\end{cases}$$
where ε(y, z) is the radiation coefficient of pixel (y, z), y and z being the horizontal and vertical directions of the imaging plane; E₁(y, z) is the first image; E₂(y, z) is the second image; ∇ is the gradient operator and ∇· is the divergence operator; t denotes time; Δt is the radiation time from the first image to the second image, given by:
$$\Delta t=\frac{(\Delta\sigma)^{2}}{2\,\varepsilon(y,z)}$$
where the sign of the radiation coefficient ε(y, z) is the same as the sign of (Δσ)²;
Step 3.4: using the solution of the radiation equation of step 3.3, compute the global energy difference between the second image and the radiation solution; if the energy difference is greater than the energy threshold τ, update the depth of each pixel by the step β and return to step 3.2 to continue iterating; if the energy is less than or equal to the threshold τ, stop iterating;
Step 4: build the 3D image of the second image from the global depth information obtained in step 3.
CN201410028369.3A 2014-01-21 2014-01-21 Single vision global depth information getting method based on diffraction blurring Active CN103793909B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410028369.3A CN103793909B (en) 2014-01-21 2014-01-21 Single vision global depth information getting method based on diffraction blurring

Publications (2)

Publication Number Publication Date
CN103793909A CN103793909A (en) 2014-05-14
CN103793909B true CN103793909B (en) 2016-08-17

Family

ID=50669532

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410028369.3A Active CN103793909B (en) 2014-01-21 2014-01-21 Single vision global depth information getting method based on diffraction blurring

Country Status (1)

Country Link
CN (1) CN103793909B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104574369B (en) * 2014-12-19 2017-10-20 东北大学 The fuzzy depth acquisition methods of global diffusion based on thermal diffusion
CN108364274B (en) * 2018-02-10 2020-02-07 东北大学 Nondestructive clear reconstruction method of optical image under micro-nano scale
CN108805975B (en) * 2018-05-29 2021-03-16 常熟理工学院 Microscopic 3D reconstruction method based on improved iterative shrinkage threshold algorithm
JP6569157B1 (en) * 2018-06-27 2019-09-04 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Control device, imaging device, moving object, control method, and program
CN113077395B (en) * 2021-03-26 2023-10-24 东北大学 Deblurring method for large-size sample image under high-power optical microscope
CN113436120B (en) * 2021-07-20 2023-06-20 湖南圣洲生物科技有限公司 Image fuzzy value identification method and device
CN115580690B (en) * 2022-01-24 2023-10-20 荣耀终端有限公司 Image processing method and electronic equipment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0328527A1 (en) * 1986-10-02 1989-08-23 British Aerospace Public Limited Company Real time generation of stereo depth maps
CN102867297A (en) * 2012-08-31 2013-01-09 天津大学 Digital processing method for low-illumination image acquisition

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
WEI YangJie et al. "Global shape reconstruction of nano grid with singly fixed camera". Technological Sciences, vol. 54, no. 4, April 2011, pp. 1044-1047. *
Paolo Favaro et al. "Shape from Defocus via Diffusion". IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 3, March 2008, pp. 518-523. *
Liu Yinghua et al. "Research on a polynomial-based image defocus modeling method". Proceedings of the 2007 Conference on Instrumentation, Automation and Advanced Integration Technology (II), December 2007, pp. 511-514. *
Wei Yangjie et al. "Global depth-from-defocus recovery with fixed camera parameters". Journal of Image and Graphics, vol. 15, no. 12, December 2010, pp. 1811-1815. *

Also Published As

Publication number Publication date
CN103793909A (en) 2014-05-14

Similar Documents

Publication Publication Date Title
CN103793909B (en) Single vision global depth information getting method based on diffraction blurring
CN103971353B (en) Splicing method for measuring image data with large forgings assisted by lasers
Vasiljevic et al. Diode: A dense indoor and outdoor depth dataset
CN105205858B (en) A kind of indoor scene three-dimensional rebuilding method based on single deep vision sensor
US8471897B2 (en) Method and camera for the real-time acquisition of visual information from three-dimensional scenes
CN103530880B (en) Based on the camera marking method of projection Gaussian network pattern
CN105716539B (en) A kind of three-dimentioned shape measurement method of quick high accuracy
CN109166154A (en) Light-field camera calibration method for light field three dimensional particles image reconstruction
CN103308000B (en) Based on the curve object measuring method of binocular vision
CN104111485A (en) Stereo imaging based observation method for raindrop size distribution and other rainfall micro physical characteristics
CN103993548A (en) Multi-camera stereoscopic shooting based pavement damage crack detection system and method
CN108362469A (en) Size based on pressure sensitive paint and light-field camera and surface pressure measurement method and apparatus
CN109325981A (en) Based on the microlens array type optical field camera geometrical parameter calibration method for focusing picture point
AT509884A4 (en) Microscopy method and device
Thomason et al. Calibration of a microlens array for a plenoptic camera
CN115082446B (en) Method for measuring aircraft skin rivet based on image boundary extraction
Fu et al. Targetless extrinsic calibration of stereo, thermal, and laser sensors in structured environments
Knyaz et al. Joint geometric calibration of color and thermal cameras for synchronized multimodal dataset creating
CN108876825A (en) A kind of space non-cooperative target Relative Navigation three-dimensional matching estimation method
CN109443319A (en) Barrier range-measurement system and its distance measuring method based on monocular vision
Fahringer et al. The effect of grid resolution on the accuracy of tomographic reconstruction using a plenoptic camera
CN113808019A (en) Non-contact measurement system and method
CN104574369B (en) The fuzzy depth acquisition methods of global diffusion based on thermal diffusion
CN108364274A (en) The lossless clear reconstructing method of optical imagery under micro-nano-scale
CN109458929A (en) Cylinder measurement site rapid calibration device and method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant