CN107783353B - Device and system for capturing three-dimensional image - Google Patents
- Publication number: CN107783353B
- Application number: CN201610737230.5A
- Authority: CN (China)
- Legal status: Active
Classifications
- G03B15/00—Special procedures for taking photographs; apparatus therefor
- G03B15/02—Illuminating scene
- G03B35/00—Stereoscopic photography
Abstract
The invention provides a device and a system for capturing stereoscopic images. The device comprises a time-of-flight image capturing module and a light-emitting module: the light-emitting module generates structured light and illumination light, and the time-of-flight image capturing module captures the first and second reflected light formed when the structured light and the illumination light are projected onto an object, so as to obtain stereoscopic image information of the object. By integrating time-of-flight (TOF) and structured-light techniques, the invention obtains stereoscopic image information with high depth accuracy and low noise, while reducing the overall size and power consumption of the device and system.
Description
Technical Field
The present invention relates to an apparatus and a system for capturing image information, and more particularly, to an apparatus and a system for capturing stereoscopic images.
Background
Stereoscopic scanning technology is widely used in industry and daily life, for example in 3D scanning, surveillance recognition, and depth cameras. It detects and analyzes the geometric structure and appearance of an object or environment, then performs three-dimensional computation on the measured data to simulate and build a digital model of that object or environment.
Among conventional stereoscopic scanning techniques, the time-of-flight (Time of Flight) method measures the time taken for light projected onto the surface of the object under test to be reflected back to a sensor. From this time, the distance between the object and the image capturing module is calculated, and stereoscopic image data of the object is obtained.
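The time-of-flight distance calculation described above can be sketched in a few lines. The patent gives no formulas or code; this is a minimal illustrative sketch, and the function and constant names are invented for the example.

```python
# Time-of-flight principle: depth is half the distance travelled by a
# light pulse during its measured round trip. Illustrative only; names
# are not taken from the patent.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_depth(round_trip_seconds: float) -> float:
    """Distance from the sensor to the object, given the measured
    round-trip time of the emitted light pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 6.67 nanoseconds corresponds to ~1 m.
```

Because the measured quantity is time rather than a geometric baseline, this method needs no separation between emitter and sensor, which is why it suits medium and long range.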
The structured-light technique, on the other hand, projects a specific pattern onto the object under test, captures the pattern on the object surface with a sensor, and finally applies the triangulation principle to compute the three-dimensional coordinates of the object. Although structured light achieves lower depth noise and higher image-data precision, computing the three-dimensional point cloud (3D Point Cloud) with structured light is relatively time-consuming.
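The triangulation principle invoked here can be written out explicitly. The patent does not state the formula; the following uses the standard triangle-height relation for a point observed from both ends of a known baseline, with all names invented for the sketch.

```python
# Standard triangulation (not from the patent): a surface point seen
# from the camera and the projector, separated by a known baseline,
# forms a triangle whose height above the baseline is the depth.
import math

def depth_from_angles(baseline_m: float,
                      cam_angle_deg: float,
                      proj_angle_deg: float) -> float:
    """Depth of the triangle apex above the baseline, given the two
    base angles measured at the camera and the projector."""
    a = math.radians(cam_angle_deg)
    b = math.radians(proj_angle_deg)
    return baseline_m * math.sin(a) * math.sin(b) / math.sin(a + b)

# Symmetric case: 1 m baseline, both angles 45 degrees -> depth 0.5 m.
```

The per-pixel angle recovery from the projected pattern is what makes the point-cloud computation comparatively expensive, as noted above.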
There is therefore still a need for a solution that combines the advantages of the time-of-flight method and structured light, providing a system for capturing stereoscopic images with high depth accuracy, small size, and low power consumption.
Disclosure of Invention
In order to solve the above technical problems, according to an embodiment of the present invention, an apparatus for capturing stereoscopic images includes a time-of-flight image capturing module and a light-emitting module. The time-of-flight image capturing module captures three-dimensional image information of at least one object. The light-emitting module is adjacent to the time-of-flight image capturing module and generates structured light and illumination light, both projected onto the object. The structured light is reflected by the object to form a first reflected light, and the illumination light is reflected by the object to form a second reflected light. The time-of-flight image capturing module captures the first reflected light and the second reflected light to obtain the three-dimensional image information of the object.
Furthermore, the light-emitting module comprises a light generating unit, a light diffraction element and a light diffusion element; the structured light generated by the light-emitting module is formed after conversion by the light diffraction element, and the illumination light generated by the light-emitting module is formed after conversion by the light diffusion element.
Furthermore, the light generating unit comprises a laser generator for generating a laser light source, and an optical element; the laser light source passes in sequence through the optical element and the light diffraction element to form the structured light, and passes in sequence through the optical element and the light diffusion element to form the illumination light.
Furthermore, the laser generator has at least one collimator, and the optical element includes a beam-splitting element and a reflective element; the laser light source is collimated by the at least one collimator, split by the beam-splitting element, and converted by the light diffraction element to form the structured light, or is collimated by the at least one collimator, split by the beam-splitting element, reflected by the reflective element, and converted by the light diffusion element to form the illumination light.
Furthermore, the light generating unit includes a laser generator and a light-emitting element; a laser light source generated by the laser generator is converted into the structured light by the light diffraction element, and a projection light source generated by the light-emitting element is converted into the illumination light by the light diffusion element.
Furthermore, the time-of-flight image capturing module, the light-diffracting element and the light-diffusing element are linearly arranged, and a light receiving surface of the time-of-flight image capturing module, a light emitting surface of the light-diffracting element and a light emitting surface of the light-diffusing element are all arranged along the same reference axis.
Furthermore, the light diffraction element and the light diffusion element are disposed on the same side of the time-of-flight image capture module, and the light diffusion element is disposed between the light diffraction element and the time-of-flight image capture module.
Furthermore, the light diffraction element and the light diffusion element are respectively disposed on two sides of the time-of-flight image capture module.
Furthermore, the time-of-flight image capturing module further comprises a switching module. The switching module comprises an infrared band-pass filter, a visible band-pass filter, and an optical switch. The optical switch guides the first reflected light and the second reflected light to the infrared band-pass filter, or guides the reflected light generated by ambient light illuminating the object to the visible band-pass filter. When the optical switch directs the first and second reflected light to the infrared band-pass filter, they pass through it to generate monochrome stereoscopic image information; when the optical switch directs the ambient-light reflection to the visible band-pass filter, it passes through the filter to yield color image information.
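The two routing states of the switching module can be summarized as a small dispatch function. This is a hypothetical sketch of the described behaviour, not an implementation from the patent; the mode names and return strings are invented.

```python
# Sketch of the switching module's two modes: the optical switch routes
# incoming light to one of two band-pass filters. Illustrative names only.

def route_light(mode: str) -> str:
    """Return the kind of image information produced in each mode."""
    if mode == "infrared":
        # First and second reflected light -> infrared band-pass filter.
        return "monochrome stereoscopic image information"
    if mode == "visible":
        # Ambient-light reflection -> visible band-pass filter.
        return "color image information"
    raise ValueError(f"unknown mode: {mode}")
```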
According to another embodiment of the present invention, a system for capturing stereoscopic images is provided for capturing image information of at least one object. The system includes a processor module, a time-of-flight processing module electrically connected to the processor module, a light-emitting module, and a time-of-flight image capturing module electrically connected to the time-of-flight processing module. The light-emitting module generates a structured light and an illumination light, both projected onto the object; the structured light is reflected by the object to generate structured-light information, and the illumination light is reflected by the object to generate illumination-light information. The time-of-flight image capturing module captures the structured-light information and the illumination-light information. The structured-light information is processed by the processor module to obtain a structured-light three-dimensional point cloud, and the illumination-light information is processed by the processor module to obtain a time-of-flight three-dimensional point cloud. The structured-light point cloud and the time-of-flight point cloud may be merged into a stereoscopic depth map providing the image information of the object.
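The merging step can be illustrated with a toy fusion rule. The patent does not specify how the two point clouds are combined; the following assumes a simple range-based preference (structured light for near points, time of flight otherwise), and the crossover distance, dict-based point-cloud representation, and function names are all assumptions for illustration.

```python
# Hypothetical fusion of the two depth sources: structured light is
# accurate at short range, time of flight covers longer range. Point
# clouds are modelled as {(x, y): depth_m} dicts for simplicity.

NEAR_LIMIT_M = 1.0  # assumed crossover distance, not from the patent

def merge_depth(sl_depth: dict, tof_depth: dict) -> dict:
    """Prefer structured-light depth for near pixels, fall back to
    time-of-flight depth elsewhere."""
    merged = {}
    for pixel in set(sl_depth) | set(tof_depth):
        sl = sl_depth.get(pixel)
        if sl is not None and sl <= NEAR_LIMIT_M:
            merged[pixel] = sl            # near: trust structured light
        elif pixel in tof_depth:
            merged[pixel] = tof_depth[pixel]  # far or missing: use TOF
        else:
            merged[pixel] = sl
    return merged
```

A real system would blend overlapping measurements rather than pick one, but the range-partitioned idea matches the stated division of labour between the two techniques.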
Furthermore, the light-emitting module includes a light generating unit, a light diffraction element and a light diffusion element; the structured light generated by the light-emitting module is formed after conversion by the light diffraction element, and the illumination light generated by the light-emitting module is formed after conversion by the light diffusion element.
Further, the system for capturing stereoscopic images further comprises a memory unit, wherein the structured light stereoscopic point cloud, the time-of-flight stereoscopic point cloud and the stereoscopic depth map are stored in the memory unit.
Furthermore, the system for capturing a stereoscopic image further comprises a computing unit electrically connected to the processor module for correcting the stereoscopic depth map.
The main technical means of the invention is to generate the structured light and the illumination light within a single image-capturing procedure, and to use the time-of-flight image capturing module to capture the first and second reflected light produced when the structured light and the illumination light reflect off the surface of the object under test. By combining the advantages of the time-of-flight method and the structured-light technique, stereoscopic image information of the object can be obtained with low depth noise (Depth Noise) and high depth accuracy. In addition, the device and system for capturing stereoscopic images have a smaller overall volume and lower power consumption.
For a better understanding of the features and technical content of the present invention, reference should be made to the following detailed description of the invention and accompanying drawings, which are provided for purposes of illustration and description only and are not intended to limit the invention.
Drawings
FIG. 1 is a diagram illustrating an apparatus for capturing stereoscopic images according to a first embodiment of the present invention;
FIG. 2 is a diagram illustrating an apparatus for capturing stereoscopic images according to a second embodiment of the present invention;
FIG. 3 is a diagram illustrating an apparatus for capturing stereoscopic images according to a third embodiment of the present invention;
FIG. 4 is a flowchart of a method for capturing stereoscopic images according to the present invention;
FIGS. 5A and 5B are schematic diagrams of a system for capturing stereoscopic images according to the present invention; and
FIG. 6 is a flowchart illustrating an operation of the system for capturing stereoscopic images according to the present invention.
Detailed Description
The following describes embodiments of the disclosed apparatus, method and system for capturing stereoscopic images; those skilled in the art will understand the advantages and effects of the invention from this disclosure. The invention may be practiced in other embodiments, and its details may be modified in various respects without departing from its spirit and scope. The drawings are for illustration only and are not drawn to scale. The embodiments below further detail the related technical content, but are not intended to limit the scope of the invention.
First embodiment
First, please refer to fig. 1. Fig. 1 is a schematic diagram of an apparatus for capturing stereoscopic images according to a first embodiment of the invention. The apparatus D for capturing a stereoscopic image according to the first embodiment of the present invention includes a time-of-flight image capturing module 1 and a light emitting module 2. The time-of-flight image capturing module 1 at least includes a lens module 13 and a time-of-flight sensing unit 14. The time-of-flight image capturing module 1 is used for capturing three-dimensional image information of at least one object O. The light emitting module 2 is adjacent to the time-of-flight image capturing module 1. The distance between the light emitting module 2 and the time-of-flight image capturing module 1 depends on the image resolution to be realized and the distance between the device D for capturing a stereoscopic image and the object O to be measured, which will be described in detail later.
In view of the above, the light-emitting module 2 includes a light generating unit 21, a light diffraction element 22 and a light diffusion element 23. The light generating unit 21 includes a light generator and may further include other optical elements, such as lenses. The light generating unit 21 generates coherent light (Coherent Light); as long as this purpose is achieved, the type of light generator is not limited, and the selection and arrangement of the unit's internal components may be adjusted. For example, the light generator of the light generating unit 21 may be a laser generator 211 producing a laser light source L11, such as an infrared laser. Alternatively, other light-emitting elements may serve as the light generator, for instance a light-emitting diode (LED) combined with a narrow band-pass filter (Narrow Band Pass Filter) that converts the LED output into coherent light. In addition, the laser generator 211 may include at least one collimator 2111 for collimating the laser light source L11.
As described above, the light diffraction element 22 and the light diffusion element 23 convert the coherent light generated by the light generating unit 21 into the structured light SL and the illumination light IL, respectively. In other words, the structured light SL generated by the light-emitting module 2 is formed after conversion by the light diffraction element 22, and the illumination light IL is formed after conversion by the light diffusion element 23. In addition, other optical elements, such as focusing lenses, may be placed between the light generating unit 21 and the light diffraction element 22 or the light diffusion element 23.
The light diffraction element 22, also called a diffractive optical element (Diffractive Optical Element, DOE), can be a hologram, a grating, or another suitable optical element; the microstructure on its surface shapes the light source into a two-dimensional coded pattern. In this embodiment, the light diffraction element 22 converts the coherent light generated by the light generating unit 21 into the structured light SL projected onto the surface of the object O (or an environment composed of a plurality of objects O). The light diffusion element 23 uniformly diffuses the coherent light generated by the light generating unit 21, producing the illumination light IL projected onto the surface of the object O under test; the illumination light IL can completely cover the object O or the environment, so that complete stereoscopic image information can be obtained.
It should be noted that, because the light generating unit 21 used in this embodiment has a relatively simple structure, it has a small size and volume and can be applied to compact electronic products such as mobile phones. Moreover, the light generating unit 21 only needs a simple electronic control system to drive the laser generator 211 to produce pulsed light signals; no complex panel or peripheral circuitry for controlling the light-emitting mode is required, which reduces volume, process complexity, and production cost.
In the first embodiment, the light generating unit 21 includes a single light generator (the laser generator 211). Therefore, in order to convert the laser light source L11 generated by the laser generator 211 into the structured light SL and the illumination light IL through the light diffraction element 22 and the light diffusion element 23 respectively, the light generating unit 21 further includes an optical element 212 for splitting the laser light source L11. In other words, as shown in FIG. 1, the light generating unit 21 includes the laser generator 211 and the optical element 212; the laser light source L11 passes in sequence through the optical element 212 and the light diffraction element 22 to form the structured light SL, and passes in sequence through the optical element 212 and the light diffusion element 23 to form the illumination light IL.
Specifically, the optical element 212 includes a light-splitting element 2121 and a light-reflecting element 2122. The light-splitting element 2121 is, for example, a half mirror (Half Mirror) with a specific reflectivity/transmittance ratio for splitting the laser light source L11, and the light-reflecting element 2122 is a light-guiding element such as a mirror. After the laser light source L11 is split, one portion of the beam passes through the light diffraction element 22 to form the structured light SL, while the other portion is guided by reflection off the light-reflecting element 2122 to the light diffusion element 23, generating the illumination light IL. The structured light SL projected onto the surface of the object O is reflected to form a first reflected light R1, and the illumination light IL projected onto the surface of the object O is reflected to form a second reflected light R2.
Referring to FIG. 1, the time-of-flight image capturing module 1 captures the first reflected light R1 and the second reflected light R2 to obtain the stereoscopic image information of the object O. In other words, the first reflected light R1 generated by the structured light SL is captured through the lens module 13 of the time-of-flight image capturing module 1 and yields stereoscopic image information of the object O through spatial modulation, while the second reflected light R2 generated by the illumination light IL is captured through the lens module 13 and yields stereoscopic image information through temporal modulation.
In the present invention, the arrangement of the time-of-flight image capturing module 1 and the light-emitting module 2 depends on the desired resolution and precision of the stereoscopic image information, and on the distance between the device D and the object O under test. Specifically, when stereoscopic image information is captured with the structured-light technique, triangulation is used: the distance between the time-of-flight image capturing module 1 and the emission point of the structured light SL (the baseline BL in the x direction shown in FIG. 1) and the angle between the first reflected light R1 and the baseline BL (the triangulation angle θ shown in FIG. 1) together determine the depth distance (the z direction in FIG. 1). Accordingly, as shown in FIG. 1, the time-of-flight image capturing module 1, the light diffraction element 22 and the light diffusion element 23 are linearly arranged, and the light receiving surface 11 of the time-of-flight image capturing module 1, the light emitting surface 221 of the light diffraction element 22 and the light emitting surface 231 of the light diffusion element 23 are all arranged along the same reference axis BA.
Based on the above considerations, generally, the greater the distance between the light diffraction element 22 and the time-of-flight image capturing module 1, the higher the accuracy of the resulting stereoscopic image information. For example, using a Gray code (Gray Code) algorithm with a 5-megapixel time-of-flight image capturing module 1, a baseline BL of 14 centimeters and a triangulation angle θ of 20 degrees yield a resolution of 0.1 millimeter (mm) at a depth of 20 centimeters. When a phase-shift algorithm, or a speckle (Speckle) algorithm combined with phase shifting, is used instead, resolution can be further improved according to the degree of bright-dark dynamic contrast. However, as the distance between the light diffraction element 22 and the time-of-flight image capturing module 1 increases, the overall volume of the device D for capturing stereoscopic images also grows. The relative distances among the time-of-flight image capturing module 1, the light diffraction element 22 and the light diffusion element 23 can therefore be adjusted according to the accuracy required and the intended size of the device.
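The accuracy-versus-baseline trade-off discussed above follows a standard relation from stereo and structured-light geometry, which the patent does not state explicitly: depth uncertainty grows with the square of the depth and shrinks with the baseline and focal length. The formula and all names below are illustrative assumptions, not taken from the patent.

```python
# Standard depth-error approximation for triangulation systems:
#   dz = z**2 * dd / (f * B)
# where z is depth, dd the matching error in pixels, f the focal
# length in pixels, and B the baseline. Illustrative only.

def depth_error(z_m: float, baseline_m: float,
                focal_px: float, disparity_err_px: float) -> float:
    """Approximate depth uncertainty at depth z_m."""
    return z_m ** 2 * disparity_err_px / (focal_px * baseline_m)

# Doubling the baseline halves the depth error at a given depth,
# which is why a longer baseline improves accuracy but enlarges
# the device, as the text explains.
```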
In summary, the apparatus D for capturing stereoscopic images according to the first embodiment obtains stereoscopic image information through both spatial modulation and temporal modulation in a total of two projection-shooting procedures: the structured light SL is projected through the light diffraction element 22 and its first reflected light R1 is captured by the time-of-flight image capturing module 1, then the illumination light IL is projected through the light diffusion element 23 and its second reflected light R2 is captured by the time-of-flight image capturing module 1.
By integrating the two stereoscopic scanning techniques, their respective advantages can be combined: structured light captures short-range, high-precision stereoscopic image information, while the time-of-flight method performs medium- and long-range stereoscopic scanning, at the same time improving the depth accuracy (Depth Accuracy) of short-range structured-light scanning.
Referring to FIG. 1, the time-of-flight image capturing module 1 of the first embodiment further includes a switching module 12, which is used for color texturing (Color Texturing) the obtained stereoscopic image information. Specifically, the switching module 12 includes an infrared band-pass (Band-pass) filter, a visible band-pass filter, and an optical switch. The optical switch guides the first reflected light R1 and the second reflected light R2 to the infrared band-pass filter, or guides ambient light to the visible band-pass filter. The optical switch may be, for example, a piezoelectric motor (Piezoelectric Motor), a voice coil motor (Voice Coil Motor, VCM), or an electromagnetic switch.
For example, in the infrared mode, the first reflected light R1 and the second reflected light R2 are guided to the infrared band-pass filter by the optical switch; wavelengths other than infrared are filtered out, and the 2D reflected images of the structured light SL and the illumination light IL are converted into monochrome stereoscopic image information of the object O through the spatial-modulation and temporal-modulation algorithms, respectively.
In other words, after the two projection-shooting procedures (performed with the structured light SL and the illumination light IL, respectively) are completed, the optical switch in the switching module 12 guides the reflected light produced by ambient light illuminating the object O to the visible band-pass filter, thereby completing a third, color, shooting procedure.
In summary, with the switching module 12 added to the time-of-flight image capturing module 1, color texture can be mapped onto the monochrome stereoscopic image information. The time-of-flight image capturing module 1 with the switching module 12 thus serves simultaneously as a time-of-flight camera (TOF Camera) and a color camera (Color Camera); because the two share the same pixels and the same world coordinates, fast and seamless texture mapping is achieved without an additional calibration procedure, eliminating the need for a separate color camera.
Second embodiment
Please refer to fig. 2. Fig. 2 is a schematic diagram of an apparatus for capturing stereoscopic images according to a second embodiment of the invention. The greatest difference between the second embodiment of the invention and the first embodiment is the design of the light emitting module 2. The differences between the second embodiment and the first embodiment will be described below, and the parts of the second embodiment that are substantially the same as the parts of the first embodiment will not be described again.
As shown in FIG. 2, in the second embodiment the light generating unit 21 includes a laser generator 211 and a light-emitting element 213. The laser light source L11 generated by the laser generator 211 is converted into the structured light SL by the light diffraction element 22, and the projection light source L12 generated by the light-emitting element 213 is converted into the illumination light IL by the light diffusion element 23. In other words, unlike the first embodiment, the second embodiment uses two independent light generators to produce the laser light source L11 and the projection light source L12, which generate the structured light SL and the illumination light IL respectively.
Specifically, since the structured light SL must be generated from coherent light, the second embodiment uses the laser generator 211 with the collimator 2111 to produce the laser light source L11. The type of light-emitting element 213 used to generate the illumination light IL, on the other hand, is not limited; for example, it may be an LED.
Referring to FIG. 2, the light diffraction element 22 and the light diffusion element 23 are disposed on the same side of the time-of-flight image capturing module 1, and the light diffusion element 23 is disposed between the light diffraction element 22 and the time-of-flight image capturing module 1. As described in the first embodiment, the time-of-flight image capturing module 1, the light diffraction element 22 and the light diffusion element 23 are linearly arranged; the light receiving surface 11 of the time-of-flight image capturing module 1, the light emitting surface 221 of the light diffraction element 22 and the light emitting surface 231 of the light diffusion element 23 are all arranged along the same reference axis BA, and the distance in the x (horizontal) direction between the time-of-flight image capturing module 1 and the light diffraction element 22 is the baseline BL (Baseline) used in triangulation.
Third embodiment
Please refer to fig. 3. Fig. 3 is a schematic diagram of an apparatus for capturing stereoscopic images according to a third embodiment of the invention. The greatest difference between the third embodiment and the second embodiment of the present invention is the arrangement of the components in the light emitting module 2. Differences between the third embodiment and the second embodiment will be described below, and parts of the third embodiment that are substantially the same as the second embodiment will not be described again.
As shown in FIG. 3, the light generating unit 21 includes a laser generator 211 and a light-emitting element 213; the laser light source L11 generated by the laser generator 211 is converted into the structured light SL by the light diffraction element 22, and the projection light source L12 generated by the light-emitting element 213 is converted into the illumination light IL by the light diffusion element 23. In addition, the light diffraction element 22 and the light diffusion element 23 are disposed on two opposite sides of the time-of-flight image capturing module 1. It should be noted that the relative positions of the light diffraction element 22, the light diffusion element 23 and the time-of-flight image capturing module 1 in the third embodiment and in the aforementioned second embodiment can be adjusted according to the desired precision of the stereoscopic image information and other parameters during manufacture of the device D.
In addition, the invention also provides a method for capturing a stereoscopic image. Referring to fig. 4, fig. 4 is a flowchart illustrating the method for capturing a stereoscopic image according to the present invention. The method comprises the following steps: generating a structured light and an illumination light directed toward an object by a light-emitting module (step S100); reflecting the structured light and the illumination light from the object to form a first reflected light and a second reflected light, respectively (step S102); and capturing the first reflected light and the second reflected light by a time-of-flight image capturing module to obtain stereoscopic image information of the object (step S104).
Referring to fig. 1 to 3, first, in step S100, the light emitting module 2 generates the structured light SL and the illumination light IL directed toward the object O. The light emitting module 2 may be any of the light emitting modules 2 of the first to third embodiments, and the details thereof are not repeated here.
Next, in step S102, the structured light SL and the illumination light IL are reflected by the surface of the object O to form a first reflected light R1 and a second reflected light R2, respectively. Finally, in step S104, the first reflected light R1 and the second reflected light R2 are captured by the time-of-flight image capturing module 1 to obtain the stereoscopic image information of the object O.
Specifically, the time-of-flight image capturing module 1 may further include a switching module 12, where the switching module 12 includes an infrared band-pass filter, a visible light band-pass filter, and an optical switch. The optical switch is configured to guide the first reflected light R1 and the second reflected light R2 to the infrared band-pass filter, or to guide the reflected light generated by ambient light irradiating the object O to the visible light band-pass filter. In the infrared light mode, the first reflected light R1 and the second reflected light R2 are both guided to the infrared band-pass filter by the switching of the optical switch; they pass through the infrared band-pass filter and then reach the time-of-flight sensing assembly 14 to generate black-and-white stereoscopic image information. In the visible light mode, the reflected light generated by the ambient light illuminating the object O is guided to the visible light band-pass filter by the switching of the optical switch; it passes through the visible light band-pass filter and reaches the time-of-flight sensing assembly 14 to obtain color image information. The black-and-white stereoscopic image information and the color image information are then computed and processed together to obtain color stereoscopic image information.
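The two-mode switching behavior can be summarized as a small state machine. The sketch below is illustrative only (the function and enum names are ours, not reference numerals from the patent):

```python
from enum import Enum

class Mode(Enum):
    INFRARED = "infrared"  # first/second reflected light (SL and IL reflections)
    VISIBLE = "visible"    # ambient-light reflection from the object

def select_filter(mode: Mode) -> str:
    """Return the band-pass filter the optical switch routes light to."""
    return "ir_bandpass" if mode is Mode.INFRARED else "visible_bandpass"

def sensed_output(mode: Mode) -> str:
    """What the time-of-flight sensing assembly produces in each mode."""
    if mode is Mode.INFRARED:
        return "black-and-white stereoscopic image information"
    return "color image information"

print(select_filter(Mode.INFRARED))   # ir_bandpass
print(sensed_output(Mode.VISIBLE))    # color image information
```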
The present invention further provides a system for capturing stereoscopic images, which is used for capturing stereoscopic image information of at least one object O. Please refer to fig. 5A and fig. 5B in combination with fig. 6. Fig. 5A and 5B are schematic views of a system for capturing stereoscopic images according to the present invention, and fig. 6 is a flowchart illustrating an operation of the system for capturing stereoscopic images according to the present invention. The system S for capturing stereoscopic images provided by the present invention comprises a memory unit 51, a processor module 52, a time-of-flight processing module 53, a light-emitting module 58 and a time-of-flight image capturing module 56.
First, please refer to fig. 5A in conjunction with fig. 1. The processor module 52 is electrically connected to the memory unit 51; the time-of-flight processing module 53 is electrically connected to the processor module 52; the light emitting module 58 is electrically connected to the time-of-flight processing module 53; the time-of-flight image capturing module 56 is electrically connected to the time-of-flight processing module 53. For example, the light emitting module 58 corresponds to the light emitting module 2 in the first embodiment. The time-of-flight image capturing module 56 may correspond to the time-of-flight image capturing module 1 in the first embodiment.
As shown in FIG. 5A, the light emitting module 58 can be controlled by the time-of-flight processing module 53 to generate the structured light SL and the illumination light IL projected toward the object O. The structured light SL and the illumination light IL respectively generate a piece of structured light information and a piece of illumination light information upon reflection by the object O, and the time-of-flight image capturing module 56 captures the structured light information and the illumination light information under the control of the time-of-flight processing module 53.
In light of the above, after the structured light information captured by the time-of-flight image capturing module 56 is calculated by the processor module 52, a structured light stereo point cloud of the object O is obtained; after the illumination light information captured by the time-of-flight image capturing module 56 is calculated by the processor module 52, a time-of-flight stereo point cloud is obtained. The structured light stereo point cloud and the time-of-flight stereo point cloud can then be merged, through calculation by the processor module 52, into a stereo depth map that provides the image information of the object O. However, the invention is not limited thereto.
Specifically, the time-of-flight processing module 53 is configured to modulate the illumination light IL generated by the light-emitting module 58 (i.e., to provide a modulation signal for time modulation), and simultaneously to demodulate (de-modulate) the sensor of the time-of-flight image capturing module 56 in accordance with the manner in which the illumination light IL of the light-emitting module 58 is modulated. The information captured by the time-of-flight image capturing module 56 and the information calculated by the processor module 52, including the structured light stereo point cloud, the time-of-flight stereo point cloud, and other information, are stored in the memory unit 51. In other words, the information captured by the time-of-flight image capturing module 56 may be temporarily stored in the memory unit 51 and then sent back to the processor module 52 for processing and calculation. In addition, the system S for capturing stereoscopic images may further include a computing unit 57 electrically connected to the processor module 52 for correcting the stereo depth map. The processor module 52 may also perform overall control of the time-of-flight image capturing module 56 and the light-emitting module 58 to avoid timing interference between these components.
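A common way to realize the modulate/demodulate pairing described above is continuous-wave four-phase demodulation. The patent does not fix a particular demodulation scheme, so the sketch below is an assumption for illustration: distance is recovered from four correlation samples taken at 0°, 90°, 180°, and 270° phase offsets of the modulation signal.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(c0: float, c1: float, c2: float, c3: float, f_mod_hz: float) -> float:
    """Distance from four correlation samples of a CW-modulated signal.

    phase = atan2(c3 - c1, c0 - c2)
    d     = c * phase / (4 * pi * f_mod)   (the 4*pi absorbs the round trip)
    """
    phase = math.atan2(c3 - c1, c0 - c2) % (2.0 * math.pi)
    return SPEED_OF_LIGHT * phase / (4.0 * math.pi * f_mod_hz)

# Hypothetical samples giving a 90-degree phase shift at 20 MHz modulation
print(round(tof_distance(1.0, 0.0, 1.0, 2.0, 20e6), 3))  # 1.874 (meters)
```

The unambiguous range of such a scheme is c / (2 · f_mod), which is why the modulation frequency trades maximum range against depth resolution.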
Referring to fig. 5B in conjunction with fig. 2 and 3, the system S for capturing stereoscopic images includes a memory unit 51, a processor module 52, a time-of-flight processing module 53, a first light-emitting module 54, a second light-emitting module 55, and a time-of-flight image capturing module 56. The processor module 52 is electrically connected to the memory unit 51; the time-of-flight processing module 53 is electrically connected to the processor module 52; the first light-emitting module 54 is electrically connected to the processor module 52; the second light-emitting module 55 is electrically connected to the time-of-flight processing module 53; and the time-of-flight image capturing module 56 is electrically connected to the time-of-flight processing module 53. The first light-emitting module 54 is controlled by the processor module 52 to generate the structured light SL directed toward the object O, and the structured light SL generates a piece of structured light information upon reflection by the object O. The second light-emitting module 55 is controlled by the time-of-flight processing module 53 to generate the illumination light IL directed toward the object O, and the illumination light IL generates a piece of illumination light information upon reflection by the object O. The time-of-flight image capturing module 56 captures the structured light information and the illumination light information under the control of the time-of-flight processing module 53.
For example, the first light emitting module 54 and the second light emitting module 55 correspond to the laser generator 211 and the light emitting element 213 shown in fig. 2 and 3, respectively. The time-of-flight image capturing module 56 may correspond to the time-of-flight image capturing module 1 in the first to third embodiments. In other words, the system for capturing stereoscopic images shown in fig. 5B can employ the apparatus for capturing stereoscopic images of fig. 2 and 3. The difference between the system for capturing stereoscopic images shown in fig. 5B and fig. 5A is that in the system for capturing stereoscopic images shown in fig. 5B, the first light-emitting module 54 can be switched On and Off (On/Off switching) directly through the processor module 52. In other words, the first light-emitting module 54 does not need to be time-modulated by the time-of-flight processing module 53. Other details of fig. 5B are substantially the same as those of fig. 5A, and will not be described again.
Please refer to fig. 6, together with fig. 1 to 3, fig. 5A, and fig. 5B as needed. The systems for capturing stereoscopic images shown in fig. 5A and 5B can be used to perform a stereoscopic image information capturing procedure. As shown in fig. 6, the system S for capturing stereoscopic images first operates in the black-and-white mode to obtain black-and-white image information (step S200). In other words, step S200 is performed with the system S for capturing stereoscopic images in an infrared (IR) mode. Step S200 includes a procedure for obtaining the structured light stereo point cloud by structured light (steps S2001, S2003, S2005) and a procedure for obtaining the time-of-flight stereo point cloud by the time-of-flight method (steps S2002, S2004, S2006). The order of the two procedures can be adjusted as required.
For example, the structured light SL projected toward the at least one object O is generated (step S2001); that is, the structured light SL is generated by the combination of the light generating unit 21 and the light diffraction element 22 (included in the light emitting module 58 or the first light emitting module 54). The structured light SL is reflected by the object O to form the first reflected light R1, and the time-of-flight image capturing module 56 captures the first reflected light R1 to generate the structured light information (step S2003). The structured light information captured by the time-of-flight image capturing module 56 is then calculated by the processor module 52 to obtain the structured light stereo point cloud (step S2005).
Next, the illumination light IL projected toward the at least one object O is generated (step S2002); that is, the illumination light IL is generated by the combination of the light generating unit 21 and the light diffusion element 23 (included in the light emitting module 58 or the second light emitting module 55). The illumination light IL is reflected by the object O to form the second reflected light R2, and the time-of-flight image capturing module 56 captures the second reflected light R2 to generate the illumination light information (step S2004). The illumination light information captured by the time-of-flight image capturing module 56 is then calculated by the processor module 52 to obtain the time-of-flight stereo point cloud (step S2006). The procedure of capturing a stereoscopic image of the object O by the time-of-flight method is thus completed.
Then, the structured light stereo point cloud and the time-of-flight stereo point cloud are merged into a black-and-white stereo depth map through the operation of the processor module 52 (step S2007). In this step, information obtained by short-range stereo scanning with structured light is integrated with information obtained by mid-range and long-range stereo scanning with the time-of-flight method. In this way, structured light provides high-precision stereoscopic image information at short range, the time-of-flight method covers mid-range and long-range stereo scanning, and the depth precision of short-range stereo scanning with structured light is improved at the same time.
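The range-based fusion of step S2007 can be illustrated with a simple per-pixel rule. The patent leaves the merge criterion open, so the 1 m cut-over distance and the flat-list data layout below are hypothetical:

```python
def merge_depths(sl_depths, tof_depths, near_limit_m=1.0):
    """Per-pixel merge of two depth maps of equal length.

    Prefer the structured-light value at short range; fall back to the
    time-of-flight value at mid/long range or where the structured-light
    point cloud has no sample (None).
    """
    merged = []
    for sl, tof in zip(sl_depths, tof_depths):
        if sl is not None and sl < near_limit_m:
            merged.append(sl)   # high-precision short-range sample
        else:
            merged.append(tof)  # mid/long-range time-of-flight sample
    return merged

print(merge_depths([0.5, None, 2.0], [0.6, 1.5, 2.1]))  # [0.5, 1.5, 2.1]
```

A production implementation would typically blend the two estimates near the cut-over (e.g., by confidence weighting) rather than switch hard, but the hard switch keeps the idea visible.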
As mentioned above, since the system S for capturing stereoscopic images operates in the infrared light mode, only the black-and-white stereo depth map can be obtained. Therefore, after step S200, steps S202 and S204 can be performed: the system S for capturing stereoscopic images operates in the color mode to obtain a piece of color image information, and the black-and-white stereo depth map is colored based on the pixel information in the color image information to obtain a color stereo depth map. It should be noted that, since the pixel information in the color image information corresponds pixel for pixel to the information obtained by the structured light SL or the illumination light IL, the black-and-white stereo depth map can be colored with high precision without external calibration. Step S202 can be completed by using the switching module 12 described with reference to fig. 1 to 3. Specifically, in step S202, ambient light is projected onto the object O, the reflected light from the surface of the object O is guided by the optical switch to the visible light band-pass filter, and the time-of-flight image capturing module 56 captures the filtered light to obtain the color image information. In step S204, the black-and-white stereo depth map obtained in step S200 is colored using the color image information obtained in step S202 to obtain the color stereo depth map.
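Because the color frame and the depth map come from the same time-of-flight sensor, coloring reduces to pairing values at the same pixel index. A minimal sketch (the flat-list data layout is our assumption, not the patent's):

```python
def colorize_depth_map(depth_map, color_image):
    """Pair each depth sample with the RGB value at the same pixel index.

    No extrinsic calibration or re-projection is needed because both
    frames share the same sensor and hence the same pixel grid.
    """
    return [(z, rgb) for z, rgb in zip(depth_map, color_image)]

print(colorize_depth_map([1.2, 0.8], [(255, 0, 0), (0, 255, 0)]))
# [(1.2, (255, 0, 0)), (0.8, (0, 255, 0))]
```

With a two-camera design, by contrast, this step would require warping the color image into the depth camera's frame using calibrated extrinsics, which is exactly the "external correction" the single-sensor design avoids.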
Advantages of the invention
In summary, the present invention has the advantages that by integrating the structured light and the time-of-flight technology into a single device or system for capturing stereoscopic images, the present invention can obtain stereoscopic image information with high depth precision and low noise, and simultaneously reduce the overall size and power consumption of the device and system.
In detail, the device and the system provided by the invention can generate the structured light SL with simple components, namely the small-volume laser generator 211 with its collimator 2111 and the light diffraction element 22 (both small-volume optical parts). The laser generator 211 only needs to be driven by pulse signals, without a complex control module, so the manufacturing cost can be greatly reduced.
In addition, by adding the switching module to the time-of-flight image capturing module, the device, the method, and the system provided by the invention achieve fast and seamless colorization without increasing the number of cameras or the size of the device and system.
The disclosure is only a preferred embodiment of the invention and should not be taken as limiting the scope of the invention, which is defined by the appended claims.
Claims (12)
1. An apparatus for capturing stereoscopic images, the apparatus comprising:
a time-of-flight image capture module for capturing a stereoscopic image of at least one object, the time-of-flight image capture module comprising a switching module, the switching module comprising:
an infrared band-pass filter;
a visible light band pass filter; and
an optical switch; and
a light emitting module adjacent to the time-of-flight image capture module, wherein the light emitting module generates a structured light and an illumination light directed toward the object;
the structured light is reflected by the object to form a first reflected light, the illumination light is reflected by the object to form a second reflected light, and the time-of-flight image capturing module captures the first reflected light and the second reflected light to obtain the stereoscopic image information of the object;
the optical switch is used for guiding the first reflected light and the second reflected light to the infrared band-pass filter or guiding reflected light generated by irradiating the object with ambient light to the visible band-pass filter;
when the first reflected light and the second reflected light are switched by the optical switch to be guided to the infrared band-pass filter, the first reflected light and the second reflected light pass through the infrared band-pass filter to generate black-and-white stereoscopic image information;
when the reflected light generated by the ambient light irradiating the object is switched by the optical switch to be guided to the visible light band-pass filter, the reflected light generated by the ambient light irradiating the object passes through the visible light band-pass filter to obtain a color image information.
2. The apparatus according to claim 1, wherein the light emitting module comprises a light generating unit, a light diffracting element and a light diffusing element, the structured light generated by the light emitting module is formed by conversion of the light diffracting element, and the illumination light generated by the light emitting module is formed by conversion of the light diffusing element.
3. The apparatus of claim 2, wherein the light generating unit comprises a laser generator for generating a laser source and an optical element, the laser source sequentially passes through the optical element and the light diffraction element to form the structured light, and the laser source sequentially passes through the optical element and the light diffusion element to form the illuminating light.
4. The apparatus of claim 3, wherein the laser generator has at least one collimator, the optical element comprises a beam splitter and a reflector, the laser source sequentially passes through collimation of the at least one collimator, beam splitting of the beam splitter and conversion of the light diffraction element to form the structured light, and the laser source sequentially passes through collimation of the at least one collimator, beam splitting of the beam splitter, reflection of the reflector and conversion of the light diffusion element to form the illumination light.
5. The apparatus according to claim 2, wherein the light generating unit comprises a laser generator and a light emitting element, a laser light source generated by the laser generator is converted into the structured light by the light diffraction element, and a projection light source generated by the light emitting element is converted into the illumination light by the light diffusion element.
6. The apparatus according to claim 3, wherein the time-of-flight image capturing module, the light-diffracting element and the light diffusing element are linearly disposed, and a light receiving surface of the time-of-flight image capturing module, a light emitting surface of the light-diffracting element and a light emitting surface of the light diffusing element are all aligned along a same reference axis.
7. The apparatus according to claim 3, wherein the light-diffracting element and the light-diffusing element are disposed on a same side of the time-of-flight image capture module, and the light-diffusing element is disposed between the light-diffracting element and the time-of-flight image capture module.
8. The apparatus according to claim 3, wherein the light diffraction element and the light diffusion element are respectively disposed at two sides of the time-of-flight image capturing module.
9. A system for capturing stereoscopic images, which is used for capturing image information of at least one object, the system for capturing stereoscopic images comprising:
a processor module;
a time-of-flight processing module electrically connected to the processor module;
the first light-emitting module is electrically connected with the processor module and used for generating structured light projected to the object, and the structured light is reflected by the object to generate structured light information;
the second light-emitting module is electrically connected with the processor module and used for generating illumination light projected to the object, and the illumination light is reflected by the object to generate illumination light information; and
a time-of-flight image capture module electrically connected to the time-of-flight processing module for capturing the structured light information and the illumination light information, the time-of-flight image capture module comprising a switching module, the switching module comprising:
an infrared band-pass filter;
a visible light band pass filter; and
an optical switch;
the structured light information captured by the flight time image capturing module is calculated by the processor module to obtain a structured light three-dimensional point cloud;
the illumination light information captured by the flight time image capturing module is calculated by the processor module to obtain a flight time three-dimensional point cloud;
wherein the structured light stereo point cloud and the time of flight stereo point cloud can be merged into a stereo depth map for providing the image information;
the structured light is reflected by the object to form first reflected light, and the illuminating light is reflected by the object to form second reflected light;
the optical switch is used for guiding the first reflected light and the second reflected light to the infrared band-pass filter or guiding reflected light generated by irradiating the object with ambient light to the visible band-pass filter;
when the first reflected light and the second reflected light are switched by the optical switch to be guided to the infrared band-pass filter, the first reflected light and the second reflected light pass through the infrared band-pass filter to generate black-and-white stereoscopic image information;
when the reflected light generated by the ambient light irradiating the object is switched by the optical switch to be guided to the visible light band-pass filter, the reflected light generated by the ambient light irradiating the object passes through the visible light band-pass filter to obtain a color image information.
10. The system of claim 9, wherein the light emitting module comprises a light generating unit, a light diffracting element and a light diffusing element, the structured light generated by the light emitting module is formed by conversion of the light diffracting element, and the illumination light generated by the light emitting module is formed by conversion of the light diffusing element.
11. The system for capturing stereoscopic imagery according to claim 9, further comprising a memory unit, wherein the structured light stereo point cloud, the time-of-flight stereo point cloud, and the stereo depth map are stored in the memory unit.
12. The system for capturing stereoscopic images as claimed in claim 9, further comprising a computing unit electrically connected to the processor module for correcting the stereoscopic depth map.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610737230.5A CN107783353B (en) | 2016-08-26 | 2016-08-26 | Device and system for capturing three-dimensional image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610737230.5A CN107783353B (en) | 2016-08-26 | 2016-08-26 | Device and system for capturing three-dimensional image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107783353A CN107783353A (en) | 2018-03-09 |
CN107783353B true CN107783353B (en) | 2020-07-10 |
Family
ID=61439738
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610737230.5A Active CN107783353B (en) | 2016-08-26 | 2016-08-26 | Device and system for capturing three-dimensional image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107783353B (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110278426B (en) | 2018-03-18 | 2024-02-13 | 宁波舜宇光电信息有限公司 | Depth information camera module, base assembly thereof, electronic equipment and preparation method |
TWI661232B (en) * | 2018-05-10 | 2019-06-01 | 視銳光科技股份有限公司 | Integrated structure of flood illuminator and dot projector |
WO2019218265A1 (en) * | 2018-05-16 | 2019-11-21 | Lu Kuanyu | Multi-spectrum high-precision method for identifying objects |
TWI734079B (en) * | 2018-05-29 | 2021-07-21 | 大陸商廣州印芯半導體技術有限公司 | Image sensing system and multi-function image sensor thereof |
AU2019295740B2 (en) | 2018-06-28 | 2022-02-03 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Depth processor and three-dimensional image device |
EP3824339A4 (en) * | 2018-08-24 | 2021-09-29 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Infrared projector, imaging device, and terminal device |
CN111083453B (en) * | 2018-10-18 | 2023-01-31 | 中兴通讯股份有限公司 | Projection device, method and computer readable storage medium |
CN111323991A (en) * | 2019-03-21 | 2020-06-23 | 深圳市光鉴科技有限公司 | Light projection system and light projection method |
US10585194B1 (en) | 2019-01-15 | 2020-03-10 | Shenzhen Guangjian Technology Co., Ltd. | Switchable diffuser projection systems and methods |
CN113253473A (en) * | 2019-01-25 | 2021-08-13 | 深圳市光鉴科技有限公司 | Switchable diffuser projection system and method |
CN113050112B (en) * | 2019-03-21 | 2024-06-04 | 深圳市光鉴科技有限公司 | System and method for enhancing time-of-flight resolution |
CN112887697B (en) * | 2021-01-21 | 2022-06-10 | 北京华捷艾米科技有限公司 | Image processing method and system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013138434A (en) * | 2011-12-27 | 2013-07-11 | Thomson Licensing | Device for acquisition of stereoscopic images |
CN103959089A (en) * | 2012-11-21 | 2014-07-30 | Lsi公司 | Depth imaging method and apparatus with adaptive illumination of an object of interest |
CN104769389A (en) * | 2012-11-05 | 2015-07-08 | 赫克斯冈技术中心 | Method and device for determining three-dimensional coordinates of an object |
CN104903677A (en) * | 2012-12-17 | 2015-09-09 | Lsi公司 | Methods and apparatus for merging depth images generated using distinct depth imaging techniques |
CN205193425U (en) * | 2015-08-07 | 2016-04-27 | 高准精密工业股份有限公司 | Optical device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107783353B (en) | Device and system for capturing three-dimensional image | |
US9967545B2 (en) | System and method of acquiring three-dimensional coordinates using multiple coordinate measurment devices | |
CN104634276B (en) | Three-dimension measuring system, capture apparatus and method, depth computing method and equipment | |
CN111623725B (en) | Tracking type three-dimensional scanning system | |
CN108718406B (en) | Variable-focus 3D depth camera and imaging method thereof | |
US10917626B2 (en) | Active illumination 3D imaging system | |
US11675056B2 (en) | Illumination for zoned time-of-flight imaging | |
US11067692B2 (en) | Detector for determining a position of at least one object | |
US20200192206A1 (en) | Structured light projector, three-dimensional camera module and terminal device | |
US20230199324A1 (en) | Projection unit and photographing apparatus comprising same projection unit, processor, and imaging device | |
Yang et al. | Optical MEMS devices for compact 3D surface imaging cameras | |
CN111721239A (en) | Depth data measuring device and structured light projection apparatus | |
WO2023207756A1 (en) | Image reconstruction method and apparatus, and device | |
US11326874B2 (en) | Structured light projection optical system for obtaining 3D data of object surface | |
CN219512497U (en) | Laser projection module, structured light depth camera and TOF depth camera | |
US9354606B1 (en) | Systems and methodologies related to generating projectable data for 3-D viewing | |
TWI630431B (en) | Device and system for capturing 3-d images | |
WO2022017441A1 (en) | Depth data measurement device and structured light projection apparatus | |
US20240127566A1 (en) | Photography apparatus and method, electronic device, and storage medium | |
EP4443379A1 (en) | Three-dimensional recognition apparatus, terminal, image enhancement method and storage medium | |
JP6232784B2 (en) | Pattern illumination device and distance measuring device | |
CN213091888U (en) | Depth measurement system and electronic device | |
CN212779132U (en) | Depth data measuring device and structured light projection apparatus | |
CN103697825A (en) | Super-resolution 3D laser measurement system and method | |
CN116783452A (en) | Method and intraoral scanner for detecting the surface topography of a semitransparent object, in particular a dental object |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 20230506 Address after: No. 25, spectrum West Road, Science City, Guangzhou high tech Industrial Development Zone, Guangzhou, Guangdong Patentee after: LUXVISIONS INNOVATION Ltd. Address before: 510730 Guangdong Guangzhou hi tech Industrial Development Zone, 25 West Road, science city. Patentee before: LITE-ON ELECTRONICS (GUANGZHOU) Ltd. Patentee before: Lite-On Technology Co.,Ltd. |
TR01 | Transfer of patent right |