CN110285788A - The design method of ToF camera and diffraction optical element - Google Patents
- Publication number
- CN110285788A (application CN201810225284.2A)
- Authority
- CN
- China
- Prior art keywords
- tof camera
- pixel
- light source
- sensor
- illuminating bundle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/22—Measuring arrangements characterised by the use of optical techniques for measuring depth
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0012—Optical design, e.g. procedures, algorithms, optimisation routines
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Optics & Photonics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The present invention relates to a ToF camera; to a projection system, robot, driving system and electronic device incorporating the ToF camera; and to a design method for the diffractive optical element of the ToF camera's array light source. The ToF camera comprises an array light source, a receiving lens system and a sensor. The array light source emits illumination beams arranged in a dot matrix toward a target object; the receiving lens system receives the dot-matrix illumination beams reflected after irradiating the target object, so as to image the target object onto the sensor; and the sensor senses the dot-matrix illumination beams the receiving lens system images onto it, to obtain depth image information of the target object. The sensor comprises a plurality of pixels in one-to-one correspondence with the dot-matrix illumination beams, each pixel sensing its corresponding illumination beam.
Description
Technical field
The present invention relates to a ToF camera; to a projection system, robot, driving system and electronic device incorporating the ToF camera; and to a design method for the diffractive optical element of the array light source used in the ToF camera.
Background art
In gesture-recognition and 3D-imaging applications of driving systems (such as autonomous driving systems), projection systems and electronic devices, the depth of a target object frequently needs to be measured. Common depth measurement methods are triangulation, structured-light measurement and ToF ranging, each with its own advantages and disadvantages. Triangulation is insensitive to ambient light and uses simple hardware, but because it uses multiple cameras it suffers synchronization problems between them, its image analysis is costly, and the product is difficult to miniaturize. Structured-light measurement offers high spatial resolution at low computational cost, but its results are strongly affected by the environment and its error for high-speed moving objects is large. Compared with triangulation and structured light, ToF ranging has a simple structure, yields a small product, measures high-speed moving objects well, and is little affected by ambient light; its ranging schematic is shown in Figure 1. In terms of measurement precision, triangulation is normally used for medium and long distances, while structured-light measurement and ToF ranging are used for short distances.
Specifically, ToF ranging falls into two classes. The first measures the echo time of a pulsed signal to determine distance; single-point rangefinders and scanning lidars generally work this way. Because the light energy is concentrated, the measurement range of such rangefinders and lidars extends from several meters to several kilometers. The second class is phase detection, which measures distance from the phase shift of a modulation signal carried on a continuous optical signal echo. The measurable distance of this method is limited by the modulation frequency, generally several hundred kHz to tens of MHz, and the effective measurement distance decreases as the modulation frequency increases. A ToF camera uses the latter ranging principle. As shown in Figure 2, the system consists of a light source 101, a measured object 102, a lens 103, a ToF depth sensor 104 and a hardware control and processing circuit 105. Light emitted by the light source 101 is amplitude-modulated with a sinusoidal or square wave before being emitted to irradiate the measured object 102. The lens 103 collects the reflected signal light and passes it to the ToF depth image sensor 104, which is a lock-in CCD or CMOS photosensitive array. Fig. 3 shows the structure of one kind of lock-in CCD; the region marked Light Opening is the effective photosensitive area, and light incident on other parts does not affect the result. In a lock-in CCD of this special structure, or through dynamic control of a CMOS device, the field distribution moves the electrons of photon-excited electron-hole pairs toward a specific region of the device, where they accumulate in that region's electron well. Owing to the limitations of CCD and CMOS devices, each image frame requires a long integration time, so high-speed sampling of the rapidly modulated optical signal (several hundred kHz to tens of MHz) cannot be achieved directly. However, a lock-in CCD or CMOS device can switch the field distribution on the device rapidly, synchronously with the modulation frequency, so that the energy within an equal-phase range of the modulation signal accumulates in the same electron well, thereby sampling the modulated signal at different phase points.
The computation of the measured object's distance is illustrated in Figure 4. The phase difference φ of the reflected light relative to the signal light emitted by the laser satisfies formula 1. Here A is the amplitude of the received signal and B is its DC component, proportional respectively to the number of signal photons and the number of ambient/noise photons received; A and B satisfy formulas 2 and 3. Further, the distance of the measured object satisfies formula 4.
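The bodies of formulas 1-4 were figures in the original and do not survive in this text. As a hedged sketch, the standard four-phase lock-in demodulation relations used by phase-ToF cameras, which formulas 1-4 most likely express, can be written as follows (the sampling convention c_k = B + A·cos(φ + kπ/2) is an assumption of this sketch):

```python
import math

def tof_demodulate(c0, c1, c2, c3, f_mod):
    """Standard four-phase lock-in demodulation for a phase-ToF pixel.

    c0..c3 are the electron counts accumulated in the four taps sampled
    at 0, 90, 180 and 270 degrees of the modulation period; f_mod is the
    modulation frequency in Hz.
    """
    C = 299_792_458.0                                   # speed of light, m/s
    phi = math.atan2(c3 - c1, c0 - c2) % (2 * math.pi)  # phase difference (cf. formula 1)
    A = 0.5 * math.hypot(c3 - c1, c0 - c2)              # signal amplitude (cf. formula 2)
    B = 0.25 * (c0 + c1 + c2 + c3)                      # DC component (cf. formula 3)
    L = C * phi / (4 * math.pi * f_mod)                 # distance (cf. formula 4)
    return phi, A, B, L
```

For example, at 10 MHz modulation a full 2π phase wrap corresponds to the unambiguous range c/(2f) ≈ 15 m, which is why the text notes that effective distance shrinks as modulation frequency rises.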
The error sources of ToF camera ranging are mainly ambient light, quantum noise and device noise. Ambient illuminance does not change with distance, so its contribution to the measurement result is independent of distance but related to the receiving angle of view. Quantum noise, also called shot noise, is determined by quantum effects in the photoelectric effect and does not vary with the environment. Device noise includes reset noise, flicker noise, amplifier current noise and dark-current noise; its intensity increases with temperature. The magnitude of the measurement error limits the ranging distance of a ToF camera, so reducing the error is precisely the means of increasing the camera's measurement distance. Because quantum noise does not vary with the device or the environment, it determines the maximum attainable ranging precision under ideal conditions.
Since ToF camera ranging computes distance from the phase difference between the emitted optical signal and the reflected optical signal, the error ΔL satisfies formula 5. Under ideal conditions the ambient light is much weaker than the signal light and one may approximate B = A/2; substituting into formula 5 yields formula 6. From formula 6 it can be deduced that the larger A is, the smaller the measurement error. However, the number of electrons each pixel of the ToF depth image sensor can accumulate is limited: once the accumulated electron count saturates, further increasing A cannot reduce the error and merely wastes light-source power. The value of A is therefore bounded and cannot grow without limit. With a pixel saturation electron count of 100000, the error can be controlled to 0.04% of the maximum measurement distance.
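Formula 6 is likewise not reproduced in this text. A commonly cited shot-noise-limited form (e.g. Lange's ΔL = (L_max/√8)·√B/A, with L_max = c/2f the unambiguous range) is used below as a stand-in; under the B ≈ A/2 approximation and a full well near 10^5 electrons it gives a relative error of the same order as the 0.04% quoted:

```python
import math

def shot_noise_range_error(A, B, f_mod):
    """Shot-noise-limited range error, using the commonly cited form
    dL = (L_max / sqrt(8)) * sqrt(B) / A -- a hypothetical stand-in for
    the patent's formula 6, which is not reproduced in this excerpt."""
    C = 299_792_458.0
    L_max = C / (2 * f_mod)   # unambiguous range of the phase measurement
    return (L_max / math.sqrt(8)) * math.sqrt(B) / A

# Pixel full well limits A to ~1e5 electrons; ideally B ~ A/2:
A = 1e5
L_max = 299_792_458.0 / (2 * 1.5e6)
rel = shot_noise_range_error(A, B=A / 2, f_mod=1.5e6) / L_max
# rel is ~0.08% here -- same order of magnitude as the 0.04% quoted.
```

Note that rel is independent of f_mod (both ΔL and L_max scale with c/2f), which matches the text's point that the error budget is set by the accumulated electron counts.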
The way to further reduce the measurement error when the value of A is bounded is to reduce the influence of ambient light on the measurement. Because ambient illuminance does not change with distance, the influence it produces is independent of distance but related to the receiving angle of view. The angle of view of the lens can therefore be narrowed by increasing the focal length of the receiving lens, weakening the influence of ambient light. Fig. 5(a) shows the simulated ambient optical power entering the lens as a function of distance for different lens focal lengths, and Fig. 5(b) the simulated measurement precision as a function of distance for different ambient photoelectron counts. The simulations show that, with ambient illuminance unchanged, narrowing the angle of view from 50 degrees to 15 degrees reduces the ambient light to 5% of its original level and, without affecting precision, quadruples the measurement distance. Likewise, the fewer the electrons generated by ambient light, the higher the measurement precision.
Further, the influence of device noise on the measurement result (i.e. the error ΔL) is expressed by formula 7, where Npseudo is the noise electron count introduced by the device.
If the laser used is a non-ideally modulated and demodulated light source, and A and B are each expressed in terms of the free electron counts they introduce, then formulas 8, 9 and 10 hold:
Beff = Nambient + Npseudo + PEopt (formula 8);
A = Cmod·Cdemod·PEopt (formula 9);
In a specific measurement application the device parameters and environmental influence are essentially constant, and the influence of device noise, and hence the error ΔL in the measurement process, can be reduced by the following three methods:
1) cooling, which reduces device noise;
2) increasing the number of received electrons, which weakens the influence of device noise. There are two ways to do this. The first is to enlarge the aperture: without changing the light-to-electron conversion, each doubling of distance requires a doubled aperture, at the cost of a larger structure and higher cost. The second is to increase the optical power: each doubling of distance requires four times the optical power;
3) using dynamic optical power: a light source matched to long range saturates the device at short range, and adapting the power reduces this kind of error.
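The aperture and power scalings in method 2 follow from a simple photon-budget model: for flood illumination of an extended diffuse scene, the per-pixel signal scales with source power over distance squared, times the aperture area. A minimal sketch, with a lumped constant k standing in for reflectivity, optics efficiency, quantum efficiency and integration time:

```python
def received_electrons(P_source, L, D_aperture, k=1.0):
    """Per-pixel signal electrons for flood illumination of an extended
    Lambertian scene: irradiance on the object falls as 1/L**2, and the
    lens collects in proportion to its aperture area D**2."""
    return k * P_source * D_aperture**2 / L**2

base = received_electrons(P_source=1.0, L=1.0, D_aperture=1.0)
# Doubling the range quarters the per-pixel signal...
half = received_electrons(1.0, L=2.0, D_aperture=1.0)
# ...which is restored either by doubling the aperture diameter
# or by quadrupling the optical power:
via_aperture = received_electrons(1.0, L=2.0, D_aperture=2.0)
via_power = received_electrons(4.0, L=2.0, D_aperture=1.0)
```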
Assume the following device and light-source parameters: light-source power 700 mW, beam divergence angle 50 degrees, pixel photosensitive area 12.5 × 14.5 μm², laser wavelength 630 nm, quantum efficiency 65%, integration (sampling) time 25 ms, lens efficiency 0.35, lens aperture 2.6 mm; and simulate the relationship between the incident photon count, the error and the environmental variables. Fig. 6 shows the relationship between the object's reflectivity and the incident photon count: the reflectivity of the object and the photon count received by the sensor are linearly related. On the other hand, to suppress ambient light, the receiving angle of view of the lens affects the measurement precision: simulation shows that narrowing the lens's receiving angle from 50 degrees to 15 degrees reduces the photon count generated by ambient light to 5% of its original level, which greatly improves measurement precision and thus yields a longer measurement distance. With the error reduced by setting the lens angle to 15° in this way, the simulated measurement results over the long-distance range are shown in Figure 7, with simulation parameters F = 1.5 MHz and object reflectivity 0.5. Fig. 7(a) plots, for different light-source powers, the electron counts generated in a pixel by signal light and by ambient light as functions of distance; Fig. 7(b) plots, for the same powers, the measurement error as a function of distance. The simulations show that long-range measurement requires a very large light-source power to guarantee measurement precision. Beating two high-frequency light sources together (the difference-frequency method) can extend the maximum unambiguous range of the phase method, but theoretical analysis shows that the difference-frequency method cannot increase the measurement precision.
Considering that the effective photosensitive area of the pixel structure on a ToF depth image sensor is very small relative to the total pixel area, generally 6%-15% of it, a further way to increase measurement precision is to raise the optical efficiency of the signal light. From the sensor-structure side, optical efficiency can be raised by adding a corresponding microlens on each pixel to further converge the incident light onto the photosensitive area. The problem with this method is that, while improving the signal-light efficiency, it equally raises the intensity of the ambient light, which is unfavorable for improving precision in strongly lit environments.
Summary of the invention
In view of the above technical problems, it is necessary to provide a ToF camera; a projection system, robot, driving system and electronic device incorporating the ToF camera; and a design method for the diffractive optical element of the array light source used in the ToF camera.
A ToF camera comprises an array light source, a receiving lens system and a sensor. The array light source emits illumination beams arranged in a dot matrix toward a target object; the receiving lens system receives the dot-matrix illumination beams reflected after irradiating the target object, so as to image the target object onto the sensor; and the sensor senses the dot-matrix illumination beams the receiving lens system images onto it to obtain depth image information of the target object. The sensor comprises a plurality of pixels in one-to-one correspondence with the dot-matrix illumination beams, each pixel sensing its corresponding illumination beam.
A projection system comprises a ToF camera. The ToF camera comprises an array light source, a receiving lens system and a sensor. The array light source emits illumination beams arranged in a dot matrix toward a target object; the receiving lens system receives the dot-matrix illumination beams reflected after irradiating the target object, so as to image the target object onto the sensor; and the sensor senses the dot-matrix illumination beams the receiving lens system images onto it to obtain depth image information of the target object. The sensor comprises a plurality of pixels in one-to-one correspondence with the dot-matrix illumination beams, each pixel sensing its corresponding illumination beam.
A robot comprises a ToF camera. The ToF camera comprises an array light source, a receiving lens system and a sensor. The array light source emits illumination beams arranged in a dot matrix toward a target object; the receiving lens system receives the dot-matrix illumination beams reflected after irradiating the target object, so as to image the target object onto the sensor; and the sensor senses the dot-matrix illumination beams the receiving lens system images onto it to obtain depth image information of the target object. The sensor comprises a plurality of pixels in one-to-one correspondence with the dot-matrix illumination beams, each pixel sensing its corresponding illumination beam.
In one embodiment, the robot is an automated guided robot.
A driving system comprises a steering device and a ToF camera. The ToF camera comprises an array light source, a receiving lens system and a sensor. The array light source emits illumination beams arranged in a dot matrix toward a target object; the receiving lens system receives the dot-matrix illumination beams reflected after irradiating the target object, so as to image the target object onto the sensor; and the sensor senses the dot-matrix illumination beams the receiving lens system images onto it to obtain depth image information of the target object. The sensor comprises a plurality of pixels in one-to-one correspondence with the dot-matrix illumination beams, each pixel sensing its corresponding illumination beam.
In one embodiment, the driving system is an autonomous driving system.
An electronic device comprises a ToF camera. The ToF camera comprises an array light source, a receiving lens system and a sensor. The array light source emits illumination beams arranged in a dot matrix toward a target object; the receiving lens system receives the dot-matrix illumination beams reflected after irradiating the target object, so as to image the target object onto the sensor; and the sensor senses the dot-matrix illumination beams the receiving lens system images onto it to obtain depth image information of the target object. The sensor comprises a plurality of pixels in one-to-one correspondence with the dot-matrix illumination beams, each pixel sensing its corresponding illumination beam.
In one embodiment, the electronic device is a mobile phone, a computer or an unmanned aerial vehicle.
A design method for the diffractive optical element of an array light source for a ToF camera is characterized by comprising the following steps:
calculating the effective photosensitive area distribution of the sensor of the ToF camera;
setting a typical distance for the far-field plane;
calculating the projection pattern of the sensor on the far-field plane;
calculating the phase map of the diffractive optical element.
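This excerpt does not detail how the final step computes the phase map. A common approach for far-field dot-pattern targets of this kind is an iterative Fourier (Gerchberg-Saxton) optimization; the sketch below is a hypothetical stand-in for the patent's own method, with the grid size and dot spacing chosen purely for illustration:

```python
import numpy as np

def doe_phase_gs(target_intensity, n_iter=50, seed=0):
    """Gerchberg-Saxton sketch for the phase-map step: find a phase-only
    DOE whose far-field (Fourier-plane) intensity approximates the desired
    dot-matrix projection pattern."""
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity)
    field = np.exp(1j * rng.uniform(0, 2 * np.pi, target_intensity.shape))
    for _ in range(n_iter):
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
        field = np.fft.ifft2(far)
        field = np.exp(1j * np.angle(field))           # phase-only DOE constraint
    return np.angle(field)                             # the DOE phase map

# Target: a sparse dot lattice standing in for the pixel grid's active areas
target = np.zeros((64, 64))
target[::8, ::8] = 1.0
phase = doe_phase_gs(target)
```

In practice the target pattern would be the sensor's effective-photosensitive-area projection computed in the earlier steps, not a synthetic lattice.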
Compared with the prior art, the ToF camera of the present invention uses an array light source and places the pixels in one-to-one correspondence with the dot-matrix illumination beams, so that each pixel of the sensor can sense its corresponding illumination beam and deliver a reliable sensing result, improving the reliability of the ToF camera. Meanwhile, at the same power, the photon count sensed by each pixel is increased relative to a uniform-illumination light source, while the photon count generated at each pixel by ambient light is essentially unchanged relative to uniform illumination. This not only improves the optical efficiency and the maximum measurement distance of the ToF camera, but also improves the signal-to-noise ratio of the received sensing signal, yielding better measurement precision.
Brief description of the drawings
Fig. 1 is a diagram of the ranging principle of ToF ranging.
Fig. 2 illustrates a ranging principle used by a ToF camera.
Fig. 3 is a structural schematic diagram of one kind of lock-in CCD.
Fig. 4 is a schematic diagram of the computation of the measured object's distance.
Fig. 5(a) is a schematic diagram of the simulated ambient optical power entering the lens as a function of distance, for different lens focal lengths.
Fig. 5(b) is a schematic diagram of the simulated measurement precision as a function of distance, for different ambient photoelectron counts.
Fig. 6 is a schematic diagram of the relationship between the object's reflectivity and the incident photon count.
Fig. 7(a) is a schematic diagram of the electron counts generated in a pixel by signal light and by ambient light as functions of distance, for different light-source powers.
Fig. 7(b) is a schematic diagram of the measurement error as a function of distance, for different light-source powers.
Fig. 8 is a structural schematic diagram of the ToF camera of a preferred embodiment of the present invention.
Fig. 9 is a schematic optical-path diagram of the modulation element of Fig. 8 diffracting the source light emitted by the light source.
Fig. 10 is a schematic diagram of the design method of the modulation element of Fig. 8.
Fig. 11(a) is a graph of the measurement precision of uniform illumination and of dot-matrix illumination at different powers, as a function of measurement distance, using 630 nm signal light.
Fig. 11(b) is a graph of the measurement precision of uniform illumination and of dot-matrix illumination at different powers, as a function of measurement distance, using 1550 nm signal light.
Fig. 12(a) is a graph of the measurement precision of uniform illumination and of dot-matrix illumination at different powers, as a function of measurement distance, using 630 nm signal light.
Fig. 12(b) is a graph of the measurement precision of uniform illumination and of dot-matrix illumination at different powers, as a function of measurement distance, using 1550 nm signal light.
Fig. 13(a) is a graph of the measurement precision of uniform illumination and of dot-matrix illumination at different powers, as a function of measurement distance.
Fig. 13(b) is a graph of the measurement precision of uniform illumination and of dot-matrix illumination at different powers, as a function of measurement distance, after the effective pixels are reduced to 1/4 of the original.
Main element symbol description
ToF camera 100
Array light source 107
Light receiving microscopy head system 102
Target object 103
Far field plane 103a
First area 103b
Sensor 101
Light source 104
Collimation lens 106
Diffraction optical element 105
Pixel 101a
The present invention will be further explained in the following detailed description with reference to the above drawings.
Specific embodiment
Addressing the characteristics of the sensor structure used in ToF ranging, the present invention proposes replacing the uniform illumination source with a dot-matrix illumination source, so that the optical efficiency of ToF ranging is improved and the measurement distance of the ToF camera increases under the same optical power.
Specifically, referring to Fig. 8, Fig. 8 is a structural schematic diagram of the ToF camera 100 of a preferred embodiment of the present invention. The ToF camera 100 comprises an array light source 107, a receiving lens system 102 and a sensor 101. The array light source 107 emits illumination beams arranged in a dot matrix toward a target object 103; the receiving lens system 102 receives the dot-matrix illumination beams reflected after irradiating the target object 103, so as to image the target object onto the sensor 101; and the sensor 101 senses the dot-matrix illumination beams the receiving lens system 102 images onto it to obtain depth image information of the target object 103. The sensor 101 comprises a plurality of pixels 101a in one-to-one correspondence with the dot-matrix illumination beams, each pixel 101a sensing its corresponding illumination beam.
Specifically, the array light source 107 comprises a light source 104, a collimating lens 106 and a modulation element 105. The light source 104 is a light-emitting diode or a laser light source and emits source light; the modulation element 105 modulates the source light into the dot-matrix illumination beams. The collimating lens 106 is arranged between the light source 104 and the modulation element 105, to collimate the source light emitted by the light source 104 before providing it to the modulation element 105.
In the present embodiment the modulation element 105 is a diffractive optical element. Referring to Fig. 9, Fig. 9 is a schematic optical-path diagram of the modulation element 105 diffracting the source light emitted by the light source 104. The modulation element 105 may comprise a plurality of diffraction units, each of which converts received source light into one illumination beam; after reflection by the target object 103, that beam is projected by the receiving lens system 102 onto the corresponding pixel 101a of the sensor 101.
Further, in the present embodiment the sensor 101 is a ToF chip, which may be a lock-in CCD or CMOS photosensitive array. The receiving lens system 102 may comprise a receiving optical lens, and the optical centre of the receiving lens system 102 may lie on one straight line with the centre of the target object 103 and the centre of the sensor 101. In one embodiment the target object 103 may be located in a plane in the far field formed by the light source 104 via the receiving lens system 102; in other words, for a clearer description of the optical-path principle of the ToF camera 100 and the design principle of its diffractive optical element, the target object 103 is taken to be located at a far-field plane 103a of that far field.
Specifically, each pixel 101a includes an effective photosensitive area 101b, and the sensor further includes non-photosensitive areas 101c; a non-photosensitive area 101c may lie between the effective photosensitive areas 101b of two neighboring pixels 101a. The illumination beam corresponding to a pixel 101a covers its effective photosensitive area 101b: that is, the spot size of the illumination beam matches the effective photosensitive area, so that the spot of the illumination beam falls exactly and only on the effective photosensitive area 101b.
The function of the receiving lens system 102 is to convert the space-angle distribution of the illumination beams of the array light source 107 reflected by the far-field plane 103a into a spatial-position distribution on the sensor 101. Any point on the far-field plane 103a maps to a corresponding point on the sensor 101. Through the receiving lens system 102, any effective photosensitive area 101b on the sensor 101 corresponds uniquely to a region on the far-field plane 103a (the first area 103b on the far-field plane 103a indicates such a corresponding region). In this way the effective photosensitive areas of the sensor 101 obtain the light-field signal formed on the far-field plane by the illumination of the array light source 107. It can be understood that the light-field signal is the beam signal of the dot-matrix illumination beams incident on the sensor 101 via the far-field plane 103a and the receiving lens system 102, and that the ToF camera can obtain the depth image information of the far-field plane 103a from the beam signal sensed by the sensor 101.
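The lens mapping described above, from a far-field direction to a unique sensor pixel, can be sketched with a pinhole model. The distance d1, pixel pitches and array size below are illustrative assumptions, not values from the patent:

```python
import math

def pixel_for_direction(theta_v, theta_h, d1, pitch_v, pitch_h, Nv, Nh):
    """Pinhole sketch of the receiving lens: a far-field direction, given as
    vertical/horizontal angles from the optical axis, lands at sensor
    coordinates d1*tan(angle), which index a unique pixel (row i, column j).
    d1 is the sensor-to-optical-centre distance."""
    y = d1 * math.tan(theta_v)
    x = d1 * math.tan(theta_h)
    i = int(round(y / pitch_v + (Nv - 1) / 2))
    j = int(round(x / pitch_h + (Nh - 1) / 2))
    return i, j

# Light along the optical axis lands at the middle of a 240 x 320 array:
centre = pixel_for_direction(0.0, 0.0, d1=5e-3,
                             pitch_v=14.5e-6, pitch_h=12.5e-6, Nv=240, Nh=320)
```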
As shown in Figure 8, since the distance between the light source 104 and the sensor 101 is negligible, the space angle corresponding to the center of each diffraction element 105a can be regarded as equal to the space angle corresponding to the center of the corresponding pixel 101a. The space angle corresponding to the center of each pixel (i, j) comprises the vertical angle θ_c^{i,j} between the line of sight of pixel (i, j) and the optical axis and the horizontal angle φ_c^{i,j} between the line of sight of pixel (i, j) and the optical axis, where i and j denote the row and column of pixel (i, j) in the pixel matrix on the sensor 101, i.e. pixel (i, j) is the pixel in the i-th row and j-th column of the sensor 101; FOV_v is the field-of-view angle in the vertical direction, FOV_h the field-of-view angle in the horizontal direction, N_v the number of pixels in the vertical direction, and N_h the number of pixels in the horizontal direction.
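As a rough illustration of the pixel-center angles just described, the following sketch divides the field of view uniformly among the pixels. The patent's formula images are not reproduced in the text, so the uniform-division formula and the function name are assumptions, not the patented expressions.

```python
def pixel_center_angles(i, j, fov_v, fov_h, n_v, n_h):
    """Angles (same unit as fov_v/fov_h) between the line of sight of
    pixel (i, j) and the optical axis. Assumes the field of view is split
    uniformly among the pixels with the sensor centre on the optical axis;
    this uniform split is an assumption, not the patent's exact formula."""
    theta = (i - (n_v + 1) / 2) * fov_v / n_v  # vertical angle
    phi = (j - (n_h + 1) / 2) * fov_h / n_h    # horizontal angle
    return theta, phi

# A pixel at the exact sensor centre lies on the optical axis (angle 0);
# an edge pixel approaches half the field of view.
theta, phi = pixel_center_angles(120.5, 160.5, 40.0, 50.0, 240, 320)
```

With a 240 × 320 grid and a 40° × 50° field of view, the fictitious centre index (120.5, 160.5) returns (0, 0), and row 240 returns just under the 20° vertical half-field.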
Further, for pixel (i, j), the space-angle half-widths θ_Δ^{i,j} and φ_Δ^{i,j} of the effective photosensitive area satisfy the following formulas, where d_h and d_v are the sizes of the effective photosensitive area 101b in the horizontal and vertical directions, and d_1 is the distance between the sensor 101 and the optical centre of the light-receiving lens system 102.
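A minimal sketch of the half-width computation, from the stated geometry (an area of size d_h × d_v at distance d_1 from the lens' optical centre). The arctan form is an assumption consistent with that geometry; the patent's formula image is not reproduced in the text.

```python
import math

def half_width_angles(d_h, d_v, d_1):
    """Space-angle half-widths subtended by an effective photosensitive
    area of horizontal size d_h and vertical size d_v, at distance d_1
    from the optical centre of the light-receiving lens system. The
    arctan form is an assumption, not the patent's reproduced formula."""
    theta_half = math.atan(d_v / (2.0 * d_1))  # vertical half-width, rad
    phi_half = math.atan(d_h / (2.0 * d_1))    # horizontal half-width, rad
    return theta_half, phi_half

# Example: a 3 um square effective area, 4 mm behind the lens centre.
th, ph = half_width_angles(3e-6, 3e-6, 4e-3)
```

For areas much smaller than d_1 the small-angle approximation θ_Δ ≈ d_v/(2·d_1) holds, which is why the spot half-width scales directly with the photosensitive-area size.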
According to the above analysis, the dot-matrix illumination pattern that the array light source 104 forms on the far-field plane 103a must coincide with the projection pattern of the sensor 101, through the light-receiving lens system 102, on the far-field plane 103a. The present invention therefore further provides a design method for the diffractive optical element (i.e. the modulation element 105) of the array light source 104 of the ToF camera 100, namely a way to obtain the phase distribution of a diffractive optical element matched to the sensor 101 and the light-receiving lens system 102. Here, the projection pattern of the sensor 101 on the far-field plane 103a is equivalent to the spatial solid-angle distribution of the effective photosensitive areas 101b. Since the distance between the light source 104 and the sensor 101 is negligible, the space angle corresponding to the center of each diffraction element 105a can be taken as equal to the space angle of the corresponding pixel center 101a. In the design method, computing the space angle of each pixel center 101a therefore yields the space angle of each diffraction element center 105a, and hence the phase distribution of the diffractive optical element. The space-angle distribution of the pixel centers 101a is determined by the size of the sensor 101, the pattern of the effective photosensitive areas 101b, and the focal length of the light-receiving lens system 102.
Specifically, according to the above principle and as shown in Figure 10, the design method includes the following steps S1-S4.
Step S1: calculate the effective-photosensitive-area distribution of the sensor 101 of the ToF camera. This distribution can be known from the parameters of the sensor 101 in use. Specifically, the effective-photosensitive-area pattern of the sensor 101 is determined by the pixel arrangement of the sensor 101 and by the position, size, and shape of the effective photosensitive area 101b within each pixel 101a.
Step S2: set the typical distance of the far-field plane 103a. This typical distance is the distance d2 from the far-field plane 103a to the light-receiving lens system 102; it can also be regarded as the detection distance of the ToF camera 100 to the far-field plane 103a and is related to the ranging capability of the ToF camera 100.
Step S3: calculate the projection pattern distribution of the far-field plane 103a on the sensor 101. From the focal length of the light-receiving lens system 102, the size of the sensor 101, the pitch of the pixels 101a (e.g. the width of the non-photosensitive area 101c), and the size of the effective photosensitive areas 101b, compute the space-angle distribution of the effective photosensitive areas of the pixels 101a (θ in Figure 8; see formulas 11-14, not repeated here) and the angular spacing between pixels 101a (θ1 in Figure 8).
Step S4: calculate the phase map of the diffractive optical element. Specifically, from the space-angle distribution of the pixel centers 101a of the sensor 101 and the design angle θ2 between the optical axis of the light source 104 and that of the light-receiving lens system 102, the phase-delay distribution of the diffractive optical element can be computed using diffractive-optics design methods and phase-hologram generation methods, and then used to design the element. Moreover, since the distance between the light source 104 and the sensor 101 is negligible compared with the measurement distance, θ2 can be approximated as 0 in the design; that is, the space-angle distribution projected by the sensor 101 (i.e. the space angle of each pixel center) can be taken directly as the space-angle distribution of the light emerging from the diffractive optical element (θ in Figure 8; see formulas 11-14, not repeated here).
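Steps S1-S4 can be sketched end to end as follows. Everything here (the function name, the uniform angle grid, treating θ2 as 0) is an illustrative assumption standing in for the patent's actual diffractive-optics and phase-hologram routines, which are not reproduced in the text.

```python
import math

def doe_target_spots(n_v, n_h, fov_v, fov_h, d_active, d_1):
    """Illustrative sketch of design steps S1-S4 (not the patented method).
    S1: effective photosensitive areas assumed on a uniform n_v x n_h grid.
    S2: the far-field distance only scales spot positions, so it is omitted.
    S3: per-pixel centre angles plus the spot angular half-width.
    S4: these target directions would feed a phase-retrieval routine
        (e.g. Gerchberg-Saxton) to produce the DOE phase map; with the
        source-sensor offset angle theta2 approximated as 0, the sensor's
        projected angles are used directly as the DOE output angles."""
    half = math.atan(d_active / (2.0 * d_1))           # spot half-width (rad)
    spots = []
    for i in range(1, n_v + 1):
        for j in range(1, n_h + 1):
            theta = (i - (n_v + 1) / 2) * fov_v / n_v  # vertical angle
            phi = (j - (n_h + 1) / 2) * fov_h / n_h    # horizontal angle
            spots.append((theta, phi, half))
    return spots

spots = doe_target_spots(24, 65, 20.0, 50.0, 3e-6, 4e-3)  # one spot per pixel
```

The design constraint of the text is visible in the output: the DOE's target spot grid is exactly the sensor's pixel-angle grid, one spot per pixel.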
Compared with the prior art, the ToF camera 100 of the present invention uses an array light source 107 and puts the pixels 101a in one-to-one correspondence with the dot-matrix illumination beams, so that each pixel 101a of the sensor 101 senses its own illumination beam and thus provides a reliable sensing result, improving the reliability of the ToF camera 100. Meanwhile, at the same power, the photon count sensed by each pixel 101a increases relative to a uniform-illumination source, while the photons generated by ambient light at each pixel 101a remain essentially unchanged. This not only improves the light efficiency and maximum measurement distance of the ToF camera 100 but also improves the signal-to-noise ratio of the received sensing signal, giving better measurement accuracy.
The present invention also provides an optical projection system for projecting a display image. The optical projection system includes a ToF camera and has a gesture-recognition function, which it realizes from the sensing signal of the ToF camera; the ToF camera is the ToF camera 100 of the above embodiment.
The present invention also provides a robot, which may be an automated guided vehicle (AGV) robot. The robot includes a ToF camera through which it senses external objects; the ToF camera is the ToF camera 100 of the above embodiment.
The present invention also provides a driving system, which may be an automated driving system. It comprises driving equipment (e.g. an autonomous vehicle) and a ToF camera that may be mounted on the driving equipment; the ToF camera is the ToF camera of the above embodiment.
It will further be appreciated that the ToF camera 100 of the above embodiments and its variants can also be used in electronic devices such as mobile phones, computers, and unmanned aerial vehicles, and is not limited to the above optical projection system, robot, and driving system.
Further, experiments verify that, assuming a light-source illumination efficiency of 70%, when uniform illumination and dot-matrix illumination (i.e. illumination by the dot-matrix beams provided by the array light source) operate at the same power, the photon count detected with dot-matrix illumination is about 10 times that of uniform illumination. Specifically, to verify the effect of using the array light source, ToF cameras using uniform-illumination and dot-matrix light sources were simulated, observing the curve of measurement precision versus measurement distance at different powers. The simulation used a sensor 101 with 65 × 24 resolution, a target-object reflectivity of 35%, and two runs at wavelengths of 630 nm and 1550 nm.
The simulation results are shown in Figure 11 for a sensor 101 (i.e. ToF chip) resolution of 65 × 24, that is, 65 × 24 pixels in a dot-matrix arrangement. Figure 11(a) plots measurement precision versus measurement distance for uniform and dot-matrix illumination at different powers with 630 nm signal light; Figure 11(b) does the same for 1550 nm signal light. The comparison shows that, to reach the same precision as a uniform-illumination source, a dot-matrix source needs only one tenth of the optical power. Meanwhile, the ranging precision at 1550 nm is higher, owing to the different atmospheric transmission spectra at different wavelengths.
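The roughly tenfold advantage of dot-matrix illumination can be reproduced with a toy photon-budget model. The model and its numbers (in particular a 10% effective-area fill factor) are illustrative assumptions, not the simulation parameters of the text.

```python
def photons_per_pixel(power, efficiency, n_pixels, fill_factor, dot_matrix):
    """Toy photon-budget model (illustrative assumptions, not the patent's
    simulation). Uniform illumination spreads power over the whole sensor
    field, so only the fraction landing on effective photosensitive areas
    (fill_factor) is sensed; a dot-matrix source concentrates all of its
    power onto those areas."""
    per_pixel = power * efficiency / n_pixels
    return per_pixel if dot_matrix else per_pixel * fill_factor

uniform = photons_per_pixel(1.0, 0.7, 65 * 24, 0.1, dot_matrix=False)
dots = photons_per_pixel(1.0, 0.7, 65 * 24, 0.1, dot_matrix=True)
ratio = dots / uniform  # 10x with an assumed 10% fill factor
```

Under this model the advantage is simply the reciprocal of the fill factor, which is why concentrating the light onto the effective photosensitive areas pays off most on sensors with small active areas per pixel.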
In addition, the resolution of the ToF camera 100 also affects the measurement error. To verify the influence of chip resolution, the above simulation was repeated with a sensor of 320 × 240 resolution; the results are shown in Figure 12. Figure 12(a) plots measurement precision versus measurement distance for uniform and dot-matrix illumination at different powers with 630 nm signal light; Figure 12(b) does the same for 1550 nm signal light.
Comparing the simulation results yields the following:
1. the required optical power increases proportionally with the number of pixels;
2. for high-resolution long-range ranging with a ToF camera, simply increasing the optical power of a point light source cannot meet the sensor's requirements at a 100 m range;
3. at 65 × 24 resolution, a precision within 1 m can be reached at a 100 m distance.
Summarizing the above analysis, two conclusions can be drawn.
First, compared with scanning lidar, the ToF camera scans quickly and has very high precision at short range, so it is well suited to AGV robots. AGV robot applications are characterized by slow movement and a limited operating area (warehouses, plants, etc.); 5 MHz modulation with a 30 m ranging range meets their ranging requirements. To verify the ranging precision of the ToF camera under these conditions, the simulated measurement distance was limited to within 30 m, with the other simulation parameters: Cmod = 1, (λ) = 0.65, klens = 0.35, Tint = 33 ms, λ = 630 nm, D = 2.6 mm, ρ = 0.2, Aimage = 4.6 mm², Apixel = 3.75 μm², τ = 0.8, Npix = 320 × 280, lensAngle = 50°. The results are shown in Figure 13: Figure 13(a) plots measurement precision versus measurement distance for uniform and dot-matrix illumination at different powers; Figure 13(b) plots the same after the number of effective pixels is reduced to 1/4 of the original.
From the simulation results in Figure 13, 7 W dot-matrix illumination can reach 3% precision at 30 m, at a sampling rate of 2.3 M samples/s (320 × 240 × 30 fps). If the number of pixels on the sensor is reduced to 1/4 of the original (fewer light-source lattice points, so each pixel is allotted more energy), then 7 W dot-matrix illumination can reach 1% precision at 30 m.
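The quoted sampling rate is a direct product of resolution and frame rate, assuming the "30" in the source denotes a 30 fps readout:

```python
# Arithmetic check of the sampling rate quoted above: a 320 x 240 pixel
# sensor read out at an assumed 30 frames per second gives about
# 2.3 million samples per second.
samples_per_second = 320 * 240 * 30
assert samples_per_second == 2_304_000  # ~2.3 M/s
```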
In conclusion although the principle of ToF camera limits its application in long range ranging, but from simulation result
It sees, can accomplish 1% to 3% precision after illuminating using dot matrix within 30m, meet the needs of low speed AGV enough, it is specific
Parameter are as follows: when 3%, can accomplish the rate of 2Mpps with the ToF chip of TI;When 1%, 700kpps rate;Its performance has surpassed
Cross most of AGV laser radar on the market.
The optical power for meeting error requirements needs when secondly, the ToF camera is as long range ranging is very high, leads to it
Cost and volume advantage it is weaker, therefore, the ToF camera can be applied to control loop, such as automated driving system, but phase
Compared with AGV robot, the ToF camera is more suitable for AGV robot.
Claims (17)
1. A ToF camera, characterised in that: the ToF camera comprises an array light source, a light-receiving lens system, and a sensor; the array light source is configured to emit illumination beams in a dot-matrix arrangement toward a target object; the light-receiving lens system is configured to receive the dot-matrix illumination beams reflected by the target object and image them onto the sensor; the sensor is configured to sense the dot-matrix illumination beams imaged onto it by the light-receiving lens system so as to obtain depth image information of the target object; wherein the sensor comprises a plurality of pixels in one-to-one correspondence with the dot-matrix illumination beams, each pixel sensing one corresponding illumination beam.
2. The ToF camera as claimed in claim 1, characterised in that: the array light source comprises a light source and a modulation element, the light source emits source light, and the modulation element is configured to modulate the source light into the dot-matrix illumination beams.
3. The ToF camera as claimed in claim 2, characterised in that: the modulation element is a diffractive optical element comprising a plurality of diffraction elements, each diffraction element converting received source light into one illumination beam that is projected via the light-receiving lens system onto a corresponding pixel on the sensor.
4. The ToF camera as claimed in claim 3, characterised in that: the space angle corresponding to the center of each diffraction element is equal to the space angle corresponding to the center of the corresponding pixel.
5. The ToF camera as claimed in claim 2, characterised in that: the light source is a light-emitting diode or a laser light source.
6. The ToF camera as claimed in claim 2, characterised in that: the array light source further comprises a collimation lens disposed between the light source and the modulation element, which collimates the source light emitted by the light source before providing it to the modulation element.
7. The ToF camera as claimed in any one of claims 1-6, characterised in that: each pixel comprises an effective photosensitive area, and the illumination beam sensed by each pixel covers the effective photosensitive area of that pixel.
8. The ToF camera as claimed in claim 7, characterised in that: the space angle corresponding to the center of each pixel (i, j) comprises the vertical angle θ_c^{i,j} between pixel (i, j) and the optical axis and the horizontal angle φ_c^{i,j} between pixel (i, j) and the optical axis, wherein i and j respectively represent the row and column of the pixel in the pixel matrix on the sensor, FOV_v is the field-of-view angle in the vertical direction, FOV_h the field-of-view angle in the horizontal direction, N_v the number of pixels in the vertical direction, and N_h the number of pixels in the horizontal direction.
9. The ToF camera as claimed in claim 8, characterised in that: the space-angle half-widths θ_Δ^{i,j} and φ_Δ^{i,j} of the effective photosensitive area of each pixel (i, j) satisfy the following formulas, wherein d_h and d_v are the sizes of the effective photosensitive area in the horizontal and vertical directions and d_1 is the distance between the sensor and the optical centre of the light-receiving lens system.
10. An optical projection system, characterised in that: the optical projection system comprises a ToF camera, the ToF camera being the ToF camera as claimed in any one of claims 1-9.
11. A robot, characterised in that: the robot comprises a ToF camera, the ToF camera being the ToF camera as claimed in any one of claims 1-9.
12. A driving system, characterised in that: the driving system comprises driving equipment and a ToF camera, the ToF camera being the ToF camera as claimed in any one of claims 1-9.
13. An electronic device, characterised in that: the electronic device comprises a ToF camera, the ToF camera being the ToF camera as claimed in any one of claims 1-9.
14. A design method of a diffractive optical element for an array light source of a ToF camera, characterised in that the method comprises the following steps:
Calculate effective photosensitive area distribution of the sensor of the ToF camera;
Set the plane typical range of far field plane;
Calculate projection pattern distribution of the far field plane on the sensor;
Calculate the phase diagram of the diffraction optical element.
15. The design method as claimed in claim 14, characterised in that: the diffractive optical element comprises a plurality of diffraction elements, each diffraction element converting received source light into one illumination beam that is projected via a light-receiving lens system onto a corresponding pixel on the sensor.
16. The design method as claimed in claim 15, characterised in that: the phase map of the diffractive optical element comprises the space angle corresponding to the center of each diffraction element, which equals the space angle corresponding to the center of the corresponding pixel; the space angle corresponding to the center of each pixel (i, j) comprises the vertical angle θ_c^{i,j} between pixel (i, j) and the optical axis and the horizontal angle φ_c^{i,j} between pixel (i, j) and the optical axis, and satisfies the following formulas, wherein i and j respectively represent the row and column of the pixel in the pixel matrix on the sensor, FOV_v is the field-of-view angle in the vertical direction, FOV_h the field-of-view angle in the horizontal direction, N_v the number of pixels in the vertical direction, and N_h the number of pixels in the horizontal direction.
17. The design method as claimed in claim 16, characterised in that: each pixel comprises an effective photosensitive area onto which the illumination beam sensed by that pixel is imaged, and the space-angle half-widths θ_Δ^{i,j} and φ_Δ^{i,j} of the effective photosensitive area satisfy the following formulas, wherein d_h and d_v are the sizes of the effective photosensitive area in the horizontal and vertical directions and d_1 is the distance between the sensor and the optical centre of the light-receiving lens system.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810225284.2A CN110285788B (en) | 2018-03-19 | 2018-03-19 | ToF camera and design method of diffractive optical element |
PCT/CN2018/113875 WO2019179123A1 (en) | 2018-03-19 | 2018-11-05 | Tof camera and design method for diffractive optical element |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810225284.2A CN110285788B (en) | 2018-03-19 | 2018-03-19 | ToF camera and design method of diffractive optical element |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110285788A true CN110285788A (en) | 2019-09-27 |
CN110285788B CN110285788B (en) | 2022-08-26 |
Family
ID=67988208
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810225284.2A Active CN110285788B (en) | 2018-03-19 | 2018-03-19 | ToF camera and design method of diffractive optical element |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110285788B (en) |
WO (1) | WO2019179123A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111025321A (en) * | 2019-12-28 | 2020-04-17 | 深圳奥比中光科技有限公司 | Variable-focus depth measuring device and measuring method |
CN111650681A (en) * | 2020-06-24 | 2020-09-11 | 欧菲微电子技术有限公司 | Diffractive optical element, TOF depth sensor, optical system, and device |
CN112804795A (en) * | 2019-11-14 | 2021-05-14 | 手持产品公司 | Apparatus and method for flicker control |
WO2022121879A1 (en) * | 2020-12-09 | 2022-06-16 | 华为技术有限公司 | Tof apparatus and electronic device |
US11536804B2 (en) * | 2018-08-29 | 2022-12-27 | Sense Photonics, Inc. | Glare mitigation in LIDAR applications |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113281767A (en) * | 2021-07-19 | 2021-08-20 | 上海思岚科技有限公司 | Narrow-window coaxial single-line laser scanning range finder |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09160364A (en) * | 1995-12-12 | 1997-06-20 | Ricoh Co Ltd | Image forming device |
US5808800A (en) * | 1994-12-22 | 1998-09-15 | Displaytech, Inc. | Optics arrangements including light source arrangements for an active matrix liquid crystal image generator |
JP2002031516A (en) * | 2000-07-18 | 2002-01-31 | Asahi Optical Co Ltd | Three-dimensional image input device |
US20120293625A1 (en) * | 2011-05-18 | 2012-11-22 | Sick Ag | 3d-camera and method for the three-dimensional monitoring of a monitoring area |
CN104483105A (en) * | 2014-12-25 | 2015-04-01 | 中国科学院半导体研究所 | Interpixel crosstalk detection system and method |
US20150163474A1 (en) * | 2013-12-05 | 2015-06-11 | Samsung Electronics Co., Ltd. | Camera for measuring depth image and method of measuring depth image using the same |
CN105100638A (en) * | 2014-05-19 | 2015-11-25 | 洛克威尔自动控制技术股份有限公司 | Optical area monitoring with spot matrix illumination |
WO2016093415A1 (en) * | 2014-12-09 | 2016-06-16 | 한화테크윈 주식회사 | Distance measuring apparatus and distance measuring method |
CN106093911A (en) * | 2016-07-25 | 2016-11-09 | 北京理工大学 | A kind of dot matrix emitting-receiving system for Non-scanning mode laser imaging |
DE102016219515A1 (en) * | 2015-10-30 | 2017-05-04 | pmdtechnologies ag | Time of flight camera system |
CN107515402A (en) * | 2017-08-21 | 2017-12-26 | 东莞市迈科新能源有限公司 | A kind of TOF three-dimensionals range-measurement system |
CN206946179U (en) * | 2017-07-11 | 2018-01-30 | 深圳市光峰光电技术有限公司 | Light supply apparatus and optical projection system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9841496B2 (en) * | 2014-11-21 | 2017-12-12 | Microsoft Technology Licensing, Llc | Multiple pattern illumination optics for time of flight system |
KR101792207B1 (en) * | 2016-03-17 | 2017-10-31 | 주식회사 미래컴퍼니 | Diffractive optical element and optical system |
-
2018
- 2018-03-19 CN CN201810225284.2A patent/CN110285788B/en active Active
- 2018-11-05 WO PCT/CN2018/113875 patent/WO2019179123A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
林勇: "用于激光光束整形的衍射光学元件设计", 《中国博士学位论文全文数据库信息科技辑》 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11536804B2 (en) * | 2018-08-29 | 2022-12-27 | Sense Photonics, Inc. | Glare mitigation in LIDAR applications |
CN112804795A (en) * | 2019-11-14 | 2021-05-14 | 手持产品公司 | Apparatus and method for flicker control |
US11792904B2 (en) | 2019-11-14 | 2023-10-17 | Hand Held Products, Inc. | Apparatuses and methodologies for flicker control |
CN112804795B (en) * | 2019-11-14 | 2024-01-05 | 手持产品公司 | Apparatus and method for flicker control |
US12089311B2 (en) | 2019-11-14 | 2024-09-10 | Hand Held Products, Inc. | Apparatuses and methodologies for flicker control |
CN111025321A (en) * | 2019-12-28 | 2020-04-17 | 深圳奥比中光科技有限公司 | Variable-focus depth measuring device and measuring method |
CN111650681A (en) * | 2020-06-24 | 2020-09-11 | 欧菲微电子技术有限公司 | Diffractive optical element, TOF depth sensor, optical system, and device |
WO2022121879A1 (en) * | 2020-12-09 | 2022-06-16 | 华为技术有限公司 | Tof apparatus and electronic device |
Also Published As
Publication number | Publication date |
---|---|
CN110285788B (en) | 2022-08-26 |
WO2019179123A1 (en) | 2019-09-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110285788A (en) | The design method of ToF camera and diffraction optical element | |
Horaud et al. | An overview of depth cameras and range scanners based on time-of-flight technologies | |
US11435446B2 (en) | LIDAR signal acquisition | |
CA3017819C (en) | Lidar based 3-d imaging with varying illumination intensity | |
US10000000B2 (en) | Coherent LADAR using intra-pixel quadrature detection | |
US7800739B2 (en) | Distance measuring method and distance measuring element for detecting the spatial dimension of a target | |
CN109557522A (en) | Multi-beam laser scanner | |
US20170176596A1 (en) | Time-of-flight detector with single-axis scan | |
CN110824490B (en) | Dynamic distance measuring system and method | |
CA3017811A1 (en) | Lidar based 3-d imaging with varying pulse repetition | |
US7495746B2 (en) | Optical method and device for measuring a distance from an obstacle | |
CN111025318A (en) | Depth measuring device and measuring method | |
CN111025321B (en) | Variable-focus depth measuring device and measuring method | |
CN110658529A (en) | Integrated beam splitting scanning unit and manufacturing method thereof | |
KR101145132B1 (en) | The three-dimensional imaging pulsed laser radar system using geiger-mode avalanche photo-diode focal plane array and auto-focusing method for the same | |
Grollius et al. | Concept of an automotive LiDAR target simulator for direct time-of-flight LiDAR | |
CN111025319B (en) | Depth measuring device and measuring method | |
CN110716190A (en) | Transmitter and distance measurement system | |
CN110716189A (en) | Transmitter and distance measurement system | |
Luo et al. | A low-cost high-resolution LiDAR system with nonrepetitive scanning | |
CN210835244U (en) | 3D imaging device and electronic equipment based on synchronous ToF discrete point cloud | |
CN113932908B (en) | Measuring system and measuring method for vibration parameters of MEMS scanning galvanometer | |
CN216211121U (en) | Depth information measuring device and electronic apparatus | |
KR101866764B1 (en) | Range Image Sensor comprised of Combined Pixel | |
Marszalec et al. | A photoelectric range scanner using an array of LED chips |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||