
CN112363180A - Imaging distance measuring sensor, method, system and storage medium - Google Patents

Imaging distance measuring sensor, method, system and storage medium

Info

Publication number
CN112363180A
CN112363180A (application CN202011173365.6A)
Authority
CN
China
Prior art keywords
sub
pixels
pixel
infrared
imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011173365.6A
Other languages
Chinese (zh)
Inventor
张学勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority application: CN202011173365.6A
Publication: CN112363180A
Related PCT application: PCT/CN2021/115371, published as WO2022088913A1
Current legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/483: Details of pulse systems
    • G01S7/486: Receivers
    • G01S7/4861: Circuits for detection, sampling, integration or read-out
    • G01S7/4865: Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S7/4866: Time delay measurement by fitting a model or function to the received signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

This application belongs to the field of time-of-flight technology and provides an imaging ranging sensor, method, system, and storage medium. The imaging ranging sensor includes a number of pixels arranged in an array. Each pixel includes at least four sub-pixels arranged in an array, and each sub-pixel consists of a single-photon photosensitive unit and a filter covering that unit; the at least four sub-pixels include an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel. A single imaging ranging sensor can therefore acquire depth information and RGB information at the same time. The depth image obtained from the depth information and the RGB image obtained from the RGB information have no viewing-angle difference, so no alignment is needed when the two images are fused, which effectively saves computing resources and time.

Description

Imaging distance measuring sensor, method, system and storage medium
Technical Field
The present application relates to time-of-flight (TOF) technology, and in particular to an imaging ranging sensor, method, system, and storage medium.
Background
The time-of-flight ranging method continuously transmits optical pulse signals to a target, receives the optical signals reflected by the target, and obtains the distance to the target by detecting the round-trip flight time of the optical pulses. Single-photon ranging systems based on time-of-flight technology are widely used in consumer electronics, autonomous driving, virtual reality, augmented reality, and other fields.
Currently, most TOF devices are dedicated to distance measurement. In practical applications, an RGB (Red, Green, Blue) image sensor is often required alongside the TOF device: the TOF device measures depth information while the RGB image sensor obtains RGB information, and the depth image obtained from the depth information is fused with the RGB image obtained from the RGB information. This fusion requires aligning the two images through an algorithm, which is computationally complex and consumes computing resources and time.
Disclosure of Invention
In view of this, embodiments of the present application provide an imaging ranging sensor, a method, a system, and a storage medium, so as to solve the problems in the prior art that a depth image needs to be aligned with an RGB image through an algorithm, the calculation is complex, and computational resources and time are consumed.
A first aspect of the embodiments of the present application provides an imaging ranging sensor, including a plurality of pixels arranged in an array, where each of the pixels includes at least four sub-pixels arranged in an array, each of the sub-pixels includes a single photon photosensitive unit and a filter covering the single photon photosensitive unit, and each of the pixels includes an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel.
A second aspect of the embodiments of the present application provides an imaging ranging method, which is implemented by an imaging ranging sensor according to the first aspect of the embodiments of the present application, and the method includes:
acquiring a first light sensing signal output by each pixel and used for indicating the time of receiving infrared light reflected by an object;
acquiring a second light sensing signal output by each pixel and used for indicating the time of receiving the visible light reflected by the object;
acquiring depth information of the object according to the first light sensing signal;
and acquiring RGB information of the object according to the second light sensing signal.
A third aspect of an embodiment of the present application provides an imaging ranging system, including:
a light emitter for emitting infrared rays and visible rays to an object;
an imaging ranging sensor as described in the first aspect of the embodiments of the present application;
and the controller is respectively connected with the light emitter and the imaging ranging sensor and is used for executing the imaging ranging method according to the second aspect of the embodiment of the application.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, which when executed by a controller, implements the steps of the imaging range finding method according to the second aspect of embodiments of the present application.
The imaging distance measuring sensor provided by the first aspect of the embodiments of the present application includes a plurality of pixels arranged in an array. Each pixel includes at least four sub-pixels arranged in an array, and each sub-pixel consists of a single-photon photosensitive unit and a filter covering that unit; the at least four sub-pixels include an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel. A single imaging distance measuring sensor can therefore acquire depth information and RGB information at the same time. The depth image obtained from the depth information and the RGB image obtained from the RGB information have no viewing-angle difference, so no alignment is needed when the two images are fused, which effectively saves computing resources and time.
The imaging ranging method provided by the second aspect of the embodiments of the present application is implemented with the imaging ranging sensor of the first aspect. It acquires, from each pixel, a first light-sensing signal indicating the time at which infrared light reflected by an object is received, and a second light-sensing signal indicating the time at which visible light reflected by the object is received; the depth information of the object is then obtained from the first light-sensing signal and the RGB information from the second light-sensing signal. Because both signals come from the same imaging ranging sensor, the depth image obtained from the depth information and the RGB image obtained from the RGB information have no viewing-angle difference, no alignment is needed when the two images are fused, and computing resources and time are effectively saved.
It is to be understood that the beneficial effects of the third and fourth aspects may be found in the description of the first or second aspect, and are not repeated here.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for the embodiments or the prior-art description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a first structure of a pixel provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a second structure of a pixel provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a third structure of a pixel provided in this embodiment of the present application;
FIG. 4 is a schematic flow chart of a method for imaging range finding provided in an embodiment of the present application;
FIG. 5 is a table of correspondence provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of an imaging ranging system provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The embodiment of the application provides an imaging ranging sensor, which comprises a plurality of pixels arranged in an array form, wherein each pixel comprises at least four sub-pixels arranged in the array form, each sub-pixel comprises a single photon light-sensitive unit and a light filter covering the single photon light-sensitive unit, and each pixel comprises an infrared sub-pixel, a red sub-pixel, a green sub-pixel and a blue sub-pixel.
In application, the number of pixels in the imaging ranging sensor can be set according to actual needs; the more pixels, the higher the resolution. The pixels may be arranged in an array of any regular shape, for example a rectangular array, a circular array, or any other regular polygonal array. The number of sub-pixels per pixel can likewise be set according to actual needs, as long as each pixel includes an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel; the sensor can then acquire, for each pixel, the depth information and RGB information of the corresponding reflection point on the object. The depth information of each reflection point comprises its distance to the imaging ranging sensor, and the RGB information of each reflection point comprises its R, G, and B values, which may specifically be CIE spectral tristimulus values.
In application, each sub-pixel comprises a single-photon light-sensing unit and a filter covering it. The filters of the infrared, red, green, and blue sub-pixels are, respectively, an infrared filter passing only infrared light, a red filter passing only red light, a green filter passing only green light, and a blue filter passing only blue light; each filter may be a reflective or an absorptive filter, as needed.
In application, the single-photon photosensitive unit may be a Single-Photon Avalanche Diode (SPAD) or a photomultiplier tube. It responds to an incident single photon by outputting a light-sensing signal indicating the time at which the photon reached the unit; based on these signals, weak light signals can be collected and the flight time computed using Time-Correlated Single-Photon Counting (TCSPC).
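The TCSPC read-out described above can be sketched in a few lines. The following is a minimal illustration, not the patent's implementation: photon arrival times relative to each pulse are accumulated into a histogram, the peak bin gives the round-trip flight time, and distance follows as c * t / 2. All names and parameters here are hypothetical.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def estimate_distance_tcspc(timestamps_s, bin_width_s=1e-10):
    """Estimate target distance from single-photon arrival times (TCSPC sketch).

    timestamps_s: photon arrival times relative to each laser pulse emission.
    The histogram peak gives the round-trip time of flight.
    """
    n_bins = int(np.ceil(max(timestamps_s) / bin_width_s)) + 1
    hist, edges = np.histogram(timestamps_s, bins=n_bins,
                               range=(0.0, n_bins * bin_width_s))
    peak_bin = int(np.argmax(hist))
    tof = (edges[peak_bin] + edges[peak_bin + 1]) / 2.0  # bin centre
    return C * tof / 2.0

# A 10 ns round trip corresponds to ~1.5 m; add timing jitter and ambient noise.
rng = np.random.default_rng(0)
signal = rng.normal(10e-9, 0.1e-9, size=2000)   # echo photons, true ToF = 10 ns
noise = rng.uniform(0.0, 20e-9, size=500)        # uniformly distributed noise photons
d = estimate_distance_tcspc(np.concatenate([signal, noise]))
```

Even with a quarter of the photons being noise, the histogram peak stays on the echo, which is why TCSPC recovers weak signals well.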
As shown in fig. 1, a first structural diagram of an imaging distance measuring sensor is exemplarily shown;
each pixel 101 includes four infrared sub-pixels IR, three red sub-pixels R, six green sub-pixels G, and three blue sub-pixels B;
four infrared sub-pixels IR are arranged in a 2 × 2 array to form an infrared sub-pixel unit 10, and one red sub-pixel R, two green sub-pixels G and one blue sub-pixel B are arranged in a bayer array to form a color sub-pixel unit 20;
each pixel 101 comprises one infrared sub-pixel element 10 and three color sub-pixel elements 20.
In application, the arrangement of one infrared sub-pixel unit and three color sub-pixel units included in each pixel can be set according to actual needs, for example, the infrared sub-pixel units and the three color sub-pixel units are arranged in a 1 × 4, 4 × 1 or 2 × 2 array.
Fig. 1 schematically shows a structure of an imaging range sensor in which one infrared sub-pixel unit 10 and three color sub-pixel units 20 are arranged in a 2 × 2 array.
In the structure of the imaging ranging sensor shown in fig. 1, four infrared sub-pixels are arranged in a 2 × 2 array to form one infrared sub-pixel unit, so the four infrared sub-pixels can be combined (binned) into one super-pixel, for example by connecting them to one OR gate or one AND gate to form one infrared super-pixel; this can effectively improve the signal-to-noise ratio of the infrared sensing signal and thus the ranging accuracy. By arranging one red sub-pixel, two green sub-pixels, and one blue sub-pixel into one color sub-pixel unit in the Bayer array format (i.e., RGGB), the color information can be reconstructed with well-established Bayer-array demosaicing algorithms, giving high imaging accuracy.
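The OR-gate binning described above can be illustrated with a small sketch (hypothetical, not the patent's circuit): each sub-pixel contributes a stream of per-clock-slot detections, and the super-pixel registers an event whenever any of its four sub-pixels fires.

```python
def bin_spad_events(subpixel_events):
    """Combine the event streams of several sub-pixels into one super-pixel.

    Sketch of OR-gate binning: each stream is a list of 0/1 detections per
    clock slot; the super-pixel fires in a slot if any sub-pixel fired.
    """
    return [int(any(slot)) for slot in zip(*subpixel_events)]

# Four IR sub-pixels, five clock slots each; a weak echo seen by only one
# sub-pixel still registers in the combined super-pixel stream.
streams = [
    [0, 1, 0, 0, 0],
    [0, 0, 0, 1, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
super_pixel = bin_spad_events(streams)
```

The OR combination raises detection probability per slot, which is the signal-to-noise benefit the text describes (an AND gate would instead trade sensitivity for noise rejection).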
As shown in fig. 2, a second structural diagram of the imaging range sensor is exemplarily shown, wherein each pixel 102 includes one infrared sub-pixel IR, one red sub-pixel R, one green sub-pixel G and one blue sub-pixel B arranged in a 2 × 2 array.
The imaging ranging sensor structure shown in fig. 2 arranges four sub-pixels into a pixel in a 2 × 2 array form, so that the occupied area of a single pixel is small, and the imaging ranging sensor can accommodate more pixels in the same area, thereby effectively improving the resolution of the imaging ranging sensor and ensuring that the fusion result of depth information and RGB information is more accurate.
As shown in fig. 3, a third structural diagram of the imaging distance measuring sensor is exemplarily shown;
wherein each pixel 103 comprises M × M infrared subpixels IR, M × M red subpixels R, M × M green subpixels G, and M × M blue subpixels B;
the M × M infrared sub-pixels IR are arranged in an array form to form an infrared sub-pixel unit 1, the M × M red sub-pixels R are arranged in an array form to form a red sub-pixel unit 2, the M × M green sub-pixels G are arranged in an array form to form a green sub-pixel unit 3, and the M × M blue sub-pixels B are arranged in an array form to form a blue sub-pixel unit 4;
each pixel 103 includes an infrared sub-pixel unit 1, a red sub-pixel unit 2, a green sub-pixel unit 3, and a blue sub-pixel unit 4.
In application, M is an integer greater than or equal to 2, and the specific value of M can be set according to actual needs. The arrangement of one infrared sub-pixel unit, one red sub-pixel unit, one green sub-pixel unit and one blue sub-pixel unit included in each pixel can be set according to actual needs, for example, the infrared sub-pixel unit, the red sub-pixel unit, the green sub-pixel unit and the blue sub-pixel unit are arranged in a 1 × 4, 4 × 1 or 2 × 2 array.
Fig. 3 exemplarily shows a schematic structural diagram of an imaging ranging sensor when M is 2 and one infrared sub-pixel unit, one red sub-pixel unit, one green sub-pixel unit, and one blue sub-pixel unit are arranged in a 2 × 2 array.
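The Fig. 3 layout can be sketched as follows; this is an illustrative reconstruction (the function name and string labels are assumptions), building one pixel from four M × M sub-pixel units tiled in a 2 × 2 arrangement.

```python
import numpy as np

def build_pixel_mosaic(m=2):
    """Lay out one pixel as four m x m sub-pixel units in a 2 x 2 arrangement.

    Sketch of the Fig. 3 style layout: an IR unit, a red unit, a green unit,
    and a blue unit, each an m x m block of identical sub-pixels.
    """
    ir = np.full((m, m), "IR")
    r = np.full((m, m), "R")
    g = np.full((m, m), "G")
    b = np.full((m, m), "B")
    top = np.hstack([ir, r])        # IR unit | red unit
    bottom = np.hstack([g, b])      # green unit | blue unit
    return np.vstack([top, bottom])

pixel = build_pixel_mosaic(2)  # 4 x 4 array of sub-pixel labels for M = 2
```

Tiling such pixels across the sensor gives each pixel co-located depth (IR) and color (R, G, B) samples, which is what removes the viewing-angle difference.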
In the structure of the imaging ranging sensor shown in fig. 3, by arranging the M × M infrared sub-pixels into one infrared sub-pixel unit, they can be combined into one infrared super-pixel, for example by connecting them to one OR gate or one AND gate; this can effectively improve the signal-to-noise ratio of the infrared sensing signal and thus the ranging accuracy.
Likewise, combining the M × M red sub-pixels of a red sub-pixel unit into one red super-pixel can effectively improve the signal-to-noise ratio of the red sensing signal and the precision of the R value; combining the M × M green sub-pixels of a green sub-pixel unit into one green super-pixel can effectively improve the signal-to-noise ratio of the green sensing signal and the precision of the G value; and combining the M × M blue sub-pixels of a blue sub-pixel unit into one blue super-pixel can effectively improve the signal-to-noise ratio of the blue sensing signal and the precision of the B value.
Together, the red, green, and blue sub-pixel units in each pixel form a structure that improves the accuracy of the RGB information as a whole.
The imaging ranging sensor provided by the embodiment of the application enables each pixel to comprise an infrared sub-pixel, a red sub-pixel, a green sub-pixel and a blue sub-pixel, and enables each sub-pixel to be composed of a single photon light-sensitive unit and a light filter covering the single photon light-sensitive unit, so that one imaging ranging sensor can simultaneously obtain the depth information and the RGB information, a depth image obtained based on the depth information and an RGB image obtained based on the RGB information have no visual angle difference, alignment is not needed when the depth image and the RGB image are fused, and operation resources and time can be effectively saved.
As shown in fig. 4, based on the structure of the imaging ranging sensor in any of the embodiments, the embodiment of the present application further provides an imaging ranging method, including:
step S201, acquiring a first light sensing signal which is output by each pixel and used for indicating the time of receiving infrared light reflected by an object;
step S202, acquiring a second light sensing signal which is output by each pixel and used for indicating the time of receiving the visible light reflected by the object;
step S203, acquiring depth information of the object according to the first light sensing signal;
and step S204, acquiring RGB information of the object according to the second light sensing signal.
In application, the infrared sub-pixel in each pixel receives infrared light reflected by the corresponding reflection point on the object and outputs a first light-sensing signal when the infrared light is received. Based on the time at which the first light-sensing signal is received and the time at which the infrared light was emitted toward the object, the distance between each reflection point and the imaging ranging sensor is calculated by the time-of-flight method. The depth information of the object comprises the distances between all reflection points on the object and the imaging ranging sensor, and the depth image of the object is obtained from this depth information. When the imaging ranging sensor is used only to acquire depth information, the depth information can also be obtained from the second light-sensing signal: the distance between each reflection point and the sensor is calculated by the time-of-flight method from the time the second light-sensing signal is received and the time the visible light was emitted toward the object.
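The per-pixel depth computation described above reduces to an element-wise time-of-flight conversion. A minimal sketch with hypothetical names, assuming the emission and arrival times have already been measured:

```python
C = 299_792_458.0  # speed of light, m/s

def depth_map(arrival_times_s, emit_time_s=0.0):
    """Per-pixel depth from the first light-sensing signal (sketch).

    Each entry is the time the pixel's IR super-pixel reported an echo;
    depth = c * (arrival - emission) / 2, applied element-wise.
    """
    return [[C * (t - emit_time_s) / 2.0 for t in row]
            for row in arrival_times_s]

# 2 x 2 sensor: round-trip times of 10 ns and 20 ns map to ~1.5 m and ~3 m.
depths = depth_map([[10e-9, 10e-9], [20e-9, 20e-9]])
```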
It should be understood that since the depth information of the object includes the distance between each light reflection point on the object and the imaging ranging sensor, the imaging ranging sensor can be used for ranging in addition to obtaining the depth information and RGB information.
In application, the color sub-pixels in each pixel receive visible light reflected by the corresponding reflection point on the object and output a second light-sensing signal when the visible light is received. The RGB value (comprising the R, G, and B values) of each reflection point is obtained from the second light-sensing signal; the RGB information of the object comprises the RGB values of all reflection points, and the RGB image of the object is obtained from this RGB information. Similarly, the infrared image information of the object can be obtained from the first light-sensing signal, and the infrared image of the object from that information.
In one embodiment, step S203 includes:
acquiring the light intensity of infrared light reflected by the object according to the first light sensing signal;
acquiring infrared image information of the object according to the light intensity of the infrared light reflected by the object;
step S204 includes:
acquiring the light intensity of the visible light reflected by the object according to the second light sensing signal;
acquiring RGB information of the object according to the light intensity of the visible light reflected by the object;
in application, the infrared image information of the object is obtained based on the light intensity of the infrared light reflected by each reflection point on the object, and similarly, the RGB information of the object is obtained based on the light intensity of the visible light reflected by each reflection point on the object. According to the light intensity of the infrared light reflected by each point on the object, the brightness value corresponding to each reflection point in the infrared image information of the object can be obtained, the brightness value can be embodied as a gray value in the infrared image, and the infrared image obtained based on the infrared image information of the object is a gray image only containing the gray value corresponding to each reflection point. According to the light intensity of the visible light reflected by each reflecting point on the object, the brightness value corresponding to each reflecting point in the RGB information of the object can be obtained, and according to the brightness value of each reflecting point and the color of each color pixel in the corresponding pixel, the RGB value of each reflecting point can be obtained based on the calculation formula of the tristimulus values, so that the RGB information comprising the RGB value of each reflecting point of the object is obtained, and the RGB image of the object is further obtained.
In one embodiment, the acquiring the light intensity of the visible light reflected by the object according to the second light sensing signal includes:
acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor according to the second optical sensing signal;
acquiring the light intensity of the visible light reflected by the object according to a first corresponding relation between preset visible light intensity and total photon number or total peak signal frequency;
similarly, the acquiring the light intensity of the infrared light reflected by the object according to the first light sensing signal includes:
acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor according to the first light sensing signal;
and acquiring the light intensity of the infrared light reflected by the object according to a second corresponding relation between the preset infrared light intensity and the total photon number or the total peak signal frequency.
In application, since each single-photon photosensitive unit responds to an incident single photon by outputting a light-sensing signal indicating the photon's arrival time, the total number of visible-light photons received by the imaging ranging sensor can be obtained from the light-sensing signals output by all the color sub-pixels, i.e., the second light-sensing signal. A Time-to-Digital Converter (TDC) connected to the color sub-pixels in each pixel calculates the flight time from each sub-pixel's light-sensing signal and converts it into a time code. A histogram circuit connected to the TDC stores these time codes, generates a histogram from the codes accumulated over at least one emission period of the visible light, and reads off the number of occurrences of the peak signal in that histogram; summing these over the histograms of the color sub-pixels in every pixel gives the total peak-signal count. Similarly, the total number of infrared photons received by the imaging ranging sensor, and the peak-signal counts of the histograms generated from the infrared sub-pixels of each pixel, can be obtained to give the corresponding total peak-signal count. The first and second correspondences may be obtained in advance through extensive experiments using this method of measuring the total photon number and total peak-signal count.
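The histogram statistics described above (total photon number and peak-signal frequency) can be sketched from a list of TDC time codes; this is an illustrative reconstruction, not the histogram circuit itself.

```python
from collections import Counter

def photon_stats(time_codes):
    """Total photon count and peak-signal frequency from TDC time codes.

    Each detected photon yields one time code; the histogram's most
    frequent code is the peak signal, and its count is the peak-signal
    frequency (sketch of the histogram-circuit read-out).
    """
    hist = Counter(time_codes)
    peak_code, peak_count = hist.most_common(1)[0]
    return len(time_codes), peak_code, peak_count

codes = [12, 12, 12, 3, 7, 12, 9, 12]   # 8 photons over one emission period
total, peak, freq = photon_stats(codes)
```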
In one embodiment, before acquiring the total photon number or the total peak-signal number received by the imaging ranging sensor according to the second light-sensing signal, the method includes:
respectively acquiring the total photon number or the total peak-signal number received by the imaging ranging sensor under different light intensities of the visible light reflected by the object; and
acquiring a first correspondence between different visible light intensities and the total photon number or the total peak-signal number according to the total photon number or the total peak-signal number acquired under each light intensity of the visible light reflected by the object.
Similarly, before acquiring the total photon number or the total peak-signal number received by the imaging ranging sensor according to the first light-sensing signal, the method includes:
respectively acquiring the total photon number or the total peak-signal number received by the imaging ranging sensor under different light intensities of the infrared light reflected by the object; and
acquiring a second correspondence between different infrared light intensities and the total photon number or the total peak-signal number according to the total photon number or the total peak-signal number acquired under each light intensity of the infrared light reflected by the object.
In application, infrared light of different intensities and visible light of different intensities may be emitted toward the object, either separately or simultaneously, and the first correspondence between different visible light intensities and the total photon number or total peak-signal number, and the second correspondence between different infrared light intensities and the total photon number or total peak-signal number, are then obtained in the manner described above. Subsequently, when RGB information is needed, the intensity of the visible light can be obtained from the first correspondence and the total photon number or total peak-signal number derived from the second light-sensing signal; when infrared image information is needed, the intensity of the infrared light can be obtained from the second correspondence and the total photon number or total peak-signal number derived from the first light-sensing signal.
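One way such a calibration could be carried out is sketched below. The helper names and the intensity/count values are hypothetical, and the correspondence is assumed to be monotonic in light intensity so that the inverse lookup is well defined.

```python
import bisect

def build_correspondence(measurements):
    """Build a correspondence table from calibration measurements.

    `measurements` maps a known light intensity to the total photon
    number observed at that intensity. The table is sorted by count so
    the inverse lookup below can binary-search it."""
    return sorted((count, intensity) for intensity, count in measurements.items())

def lookup_intensity(table, total_count):
    """Invert the correspondence: return the calibrated intensity whose
    recorded total photon number is nearest the observed one."""
    counts = [c for c, _ in table]
    i = bisect.bisect_left(counts, total_count)
    candidates = table[max(0, i - 1):i + 1]
    return min(candidates, key=lambda ci: abs(ci[0] - total_count))[1]

# Hypothetical visible-light calibration: intensity (a.u.) -> total photon number
first_correspondence = build_correspondence({0.2: 1500, 0.5: 3900, 1.0: 8100})

# At runtime, an observed total of 4050 photons maps back to intensity 0.5
intensity = lookup_intensity(first_correspondence, 4050)
```

The second correspondence for infrared light would be built the same way from measurements taken under different infrared intensities.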
In application, the first correspondence and the second correspondence may be stored in the form of a correspondence table, for example a lookup table. As shown in fig. 5, a table of correspondences among light intensity, total photon number, and total peak-signal number is exemplarily illustrated.
The imaging ranging method provided by the embodiments of the present application can obtain the depth information and the RGB information of an object directly from the first and second light-sensing signals output by the imaging ranging sensor. The depth image obtained from the depth information and the RGB image obtained from the RGB information therefore have no viewing-angle difference, so no alignment is needed when the two images are fused, which effectively saves computing resources and time.
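Because the depth map and the RGB image come from the same pixel array, fusing them reduces to a per-pixel channel stack rather than a reprojection or registration step. A minimal sketch, with toy array shapes assumed:

```python
import numpy as np

def fuse_rgbd(rgb, depth):
    """Fuse same-resolution RGB and depth maps from a single sensor.

    Both maps originate from the same pixel grid, so there is no
    parallax to correct -- the depth channel is simply appended."""
    assert rgb.shape[:2] == depth.shape, "same sensor grid -> same resolution"
    return np.dstack([rgb, depth[..., None]])

rgb = np.zeros((4, 4, 3), dtype=np.float32)  # toy RGB frame
depth = np.ones((4, 4), dtype=np.float32)    # toy depth map (meters)
rgbd = fuse_rgbd(rgb, depth)                 # shape (4, 4, 4): R, G, B, D
```

With two separate cameras the same fusion would first require extrinsic calibration and image warping, which is exactly the cost the single-sensor design avoids.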
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
As shown in fig. 6, an embodiment of the present application provides an imaging ranging system 1000, including:
a light emitter 200 for emitting infrared rays and visible rays toward the object 2000;
an imaging range sensor 100;
and a controller 300 connected to the light emitter 200 and the imaging range sensor 100, respectively, for performing the imaging range finding method in the above-described embodiment.
In application, the imaging ranging system comprises at least a controller, a light emitter, and an imaging ranging sensor, and may further comprise a collimating optical element and a Diffractive Optical Element (DOE) covering the light emitter, a focusing lens or micro-lens array covering the imaging ranging sensor, and the like. The collimating optical element collimates the light emitted by the light emitter, and the diffractive optical element diffracts it. The focusing lens or micro-lens array focuses the light reflected by the object onto the photosensitive surface of the imaging ranging sensor. The controller turns the light emitter and the imaging ranging sensor on or off, and may also adjust the intensity of the light emitted by the light emitter. The object may be any object in free space that reflects light.
In application, the light emitter may be a laser, a Light-Emitting Diode (LED), a Laser Diode (LD), an Edge-Emitting Laser (EEL), or the like, according to actual needs. The laser may be a Vertical-Cavity Surface-Emitting Laser (VCSEL). The light emitter may be a tunable device.
In application, the controller may include signal amplifiers, Time-to-Digital Converters (TDCs), Analog-to-Digital Converters (ADCs), and the like, equal in number to the pixels or single-photon photosensitive units in the imaging ranging sensor. The TDC connected to each pixel is in turn connected to a histogram circuit.
In application, the controller may further include a Central Processing Unit (CPU), another general-purpose controller, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like. The general-purpose controller may be a microcontroller or any conventional controller.
The imaging distance measuring sensor, the imaging distance measuring method and the imaging distance measuring system provided by the embodiment of the application can be applied to terminal devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, Augmented Reality (AR) devices, Virtual Reality (VR) devices, notebook computers, netbooks and Personal Digital Assistants (PDAs).
An embodiment of the present application further provides a computer-readable storage medium storing a computer program; when the computer program is executed by a controller, the steps in the above imaging ranging method embodiments are implemented.
An embodiment of the present application further provides a computer program product which, when run on a terminal device, enables the terminal device to implement the steps in the above imaging ranging method embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment. In the embodiments provided in the present application, it should be understood that the disclosed system/terminal device and method may be implemented in other ways. For example, the system/terminal device embodiments described above are merely illustrative.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An imaging ranging sensor, characterized by comprising a plurality of pixels arranged in an array, each of the pixels comprising at least four sub-pixels arranged in an array, each of the sub-pixels comprising a single-photon photosensitive unit and a filter covering the single-photon photosensitive unit, wherein each of the pixels includes an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel.
2. The imaging ranging sensor of claim 1, wherein each of the pixels comprises four of the infrared sub-pixels, three of the red sub-pixels, six of the green sub-pixels, and three of the blue sub-pixels; the four infrared sub-pixels are arranged in a 2×2 array to form an infrared sub-pixel unit, and one red sub-pixel, two green sub-pixels, and one blue sub-pixel are arranged in a Bayer array to form a color sub-pixel unit; each of the pixels includes one infrared sub-pixel unit and three color sub-pixel units.
3. The imaging ranging sensor of claim 1, wherein each of the pixels comprises one infrared sub-pixel, one red sub-pixel, one green sub-pixel, and one blue sub-pixel arranged in a 2×2 array.
4. The imaging ranging sensor of claim 1, wherein each of the pixels comprises M×M infrared sub-pixels, M×M red sub-pixels, M×M green sub-pixels, and M×M blue sub-pixels; the M×M infrared sub-pixels are arranged in an array to form an infrared sub-pixel unit, the M×M red sub-pixels are arranged in an array to form a red sub-pixel unit, the M×M green sub-pixels are arranged in an array to form a green sub-pixel unit, and the M×M blue sub-pixels are arranged in an array to form a blue sub-pixel unit; each of the pixels includes one infrared sub-pixel unit, one red sub-pixel unit, one green sub-pixel unit, and one blue sub-pixel unit, where M is an integer greater than or equal to 2.
5. An imaging ranging method, implemented on the basis of the imaging ranging sensor of any one of claims 1 to 4, the method comprising: acquiring a first light-sensing signal output by each pixel and indicating the time at which infrared light reflected by an object is received; acquiring a second light-sensing signal output by each pixel and indicating the time at which visible light reflected by the object is received; acquiring depth information of the object according to the first light-sensing signal; and acquiring RGB information of the object according to the second light-sensing signal.
6. The imaging ranging method of claim 5, wherein acquiring the RGB information of the object according to the second light-sensing signal comprises: acquiring, according to the second light-sensing signal, the light intensity of the visible light reflected by the object; and acquiring the RGB information of the object according to the light intensity of the visible light reflected by the object.
7. The imaging ranging method of claim 6, wherein acquiring the light intensity of the visible light reflected by the object according to the second light-sensing signal comprises: acquiring, according to the second light-sensing signal, the total photon number or the total peak-signal number received by the imaging ranging sensor; and acquiring the light intensity of the visible light reflected by the object according to a preset first correspondence between visible light intensity and the total photon number or the total peak-signal number.
8. The imaging ranging method of claim 7, wherein before acquiring the total photon number or the total peak-signal number received by the imaging ranging sensor according to the second light-sensing signal, the method comprises: respectively acquiring, under different light intensities of the visible light reflected by the object, the total photon number or the total peak-signal number received by the imaging ranging sensor; and acquiring the first correspondence between different visible light intensities and the total photon number or the total peak-signal number according to the total photon number or the total peak-signal number acquired under each light intensity of the visible light reflected by the object.
9. An imaging ranging system, comprising: a light emitter for emitting infrared light and visible light toward an object; the imaging ranging sensor of any one of claims 1 to 4; and a controller connected to the light emitter and the imaging ranging sensor, respectively, and configured to execute the imaging ranging method of any one of claims 5 to 8.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a controller, implements the steps of the imaging ranging method of any one of claims 5 to 8.
CN202011173365.6A 2020-10-28 2020-10-28 Imaging distance measuring sensor, method, system and storage medium Pending CN112363180A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011173365.6A CN112363180A (en) 2020-10-28 2020-10-28 Imaging distance measuring sensor, method, system and storage medium
PCT/CN2021/115371 WO2022088913A1 (en) 2020-10-28 2021-08-30 Imaging distance measurement sensor, method, and system, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011173365.6A CN112363180A (en) 2020-10-28 2020-10-28 Imaging distance measuring sensor, method, system and storage medium

Publications (1)

Publication Number Publication Date
CN112363180A 2021-02-12

Family

ID=74511135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011173365.6A Pending CN112363180A (en) 2020-10-28 2020-10-28 Imaging distance measuring sensor, method, system and storage medium

Country Status (2)

Country Link
CN (1) CN112363180A (en)
WO (1) WO2022088913A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112738385A (en) * 2021-03-30 2021-04-30 北京芯海视界三维科技有限公司 Sensor and shooting module
CN112804438A (en) * 2021-03-30 2021-05-14 北京芯海视界三维科技有限公司 Sensor and shooting module
CN112799097A (en) * 2021-04-14 2021-05-14 深圳阜时科技有限公司 Acquisition method of depth map and grayscale image, depth camera, and electronic device
CN113055621A (en) * 2021-03-11 2021-06-29 维沃移动通信有限公司 Camera module and electronic equipment
CN113938664A (en) * 2021-09-10 2022-01-14 思特威(上海)电子科技股份有限公司 Signal acquisition method of pixel array, image sensor, equipment and storage medium
CN114119696A (en) * 2021-11-30 2022-03-01 上海商汤临港智能科技有限公司 Method, device and system for acquiring depth image and computer readable storage medium
WO2022088913A1 (en) * 2020-10-28 2022-05-05 Oppo广东移动通信有限公司 Imaging distance measurement sensor, method, and system, and storage medium
CN115022562A (en) * 2022-05-25 2022-09-06 Oppo广东移动通信有限公司 Image Sensors, Cameras and Electronics
CN115184956A (en) * 2022-09-09 2022-10-14 荣耀终端有限公司 TOF sensor system and electronics

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115049711A (en) * 2022-06-29 2022-09-13 维沃移动通信有限公司 Image registration method, device, electronic equipment and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1657873A (en) * 2004-02-17 2005-08-24 欧姆龙株式会社 Optical measuring device and optical measuring method
CN105049829A (en) * 2015-07-10 2015-11-11 北京唯创视界科技有限公司 Optical filter, image sensor, imaging device and three-dimensional imaging system
CN105895645B (en) * 2015-02-17 2019-09-03 豪威科技股份有限公司 Pixel Array and Image Sensing System
CN110574367A (en) * 2019-07-31 2019-12-13 华为技术有限公司 A kind of image sensor and method for image sensitivity
CN211429422U (en) * 2018-09-14 2020-09-04 深圳阜时科技有限公司 Image sensor, image acquisition device, identity recognition device and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110003696A (en) * 2009-07-06 2011-01-13 삼성전자주식회사 Optical filter array for single chip stereoscopic image sensor and manufacturing method thereof
KR102086509B1 (en) * 2012-11-23 2020-03-09 엘지전자 주식회사 Apparatus and method for obtaining 3d image
CN112363180A (en) * 2020-10-28 2021-02-12 Oppo广东移动通信有限公司 Imaging distance measuring sensor, method, system and storage medium


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022088913A1 (en) * 2020-10-28 2022-05-05 Oppo广东移动通信有限公司 Imaging distance measurement sensor, method, and system, and storage medium
CN113055621A (en) * 2021-03-11 2021-06-29 维沃移动通信有限公司 Camera module and electronic equipment
CN113055621B (en) * 2021-03-11 2024-04-09 维沃移动通信有限公司 Camera module and electronic equipment
CN112738385A (en) * 2021-03-30 2021-04-30 北京芯海视界三维科技有限公司 Sensor and shooting module
CN112804438A (en) * 2021-03-30 2021-05-14 北京芯海视界三维科技有限公司 Sensor and shooting module
CN112799097A (en) * 2021-04-14 2021-05-14 深圳阜时科技有限公司 Acquisition method of depth map and grayscale image, depth camera, and electronic device
CN112799097B (en) * 2021-04-14 2023-11-28 深圳阜时科技有限公司 Depth map and grayscale image acquisition methods, depth cameras, and electronic equipment
CN113938664A (en) * 2021-09-10 2022-01-14 思特威(上海)电子科技股份有限公司 Signal acquisition method of pixel array, image sensor, equipment and storage medium
CN114119696A (en) * 2021-11-30 2022-03-01 上海商汤临港智能科技有限公司 Method, device and system for acquiring depth image and computer readable storage medium
CN115022562A (en) * 2022-05-25 2022-09-06 Oppo广东移动通信有限公司 Image Sensors, Cameras and Electronics
WO2023226395A1 (en) * 2022-05-25 2023-11-30 Oppo广东移动通信有限公司 Image sensor, camera, and electronic device
CN115184956A (en) * 2022-09-09 2022-10-14 荣耀终端有限公司 TOF sensor system and electronics
CN115184956B (en) * 2022-09-09 2023-01-13 荣耀终端有限公司 TOF sensor system and electronic device

Also Published As

Publication number Publication date
WO2022088913A1 (en) 2022-05-05

Similar Documents

Publication Publication Date Title
CN112363180A (en) Imaging distance measuring sensor, method, system and storage medium
WO2022262332A1 (en) Calibration method and apparatus for distance measurement device and camera fusion system
US9058081B2 (en) Application using a single photon avalanche diode (SPAD)
WO2021051477A1 (en) Time of flight distance measurement system and method with adjustable histogram
WO2021051478A1 (en) Time-of-flight-based distance measurement system and method for dual-shared tdc circuit
CN206650757U (en) a device
WO2021120403A1 (en) Depth measurement device and method
CN109196378B (en) Optical systems for remote sensing receivers
WO2021051479A1 (en) Interpolation-based time of flight measurement method and system
WO2021120402A1 (en) Fused depth measurement apparatus and measurement method
WO2021072802A1 (en) Distance measurement system and method
WO2022011974A1 (en) Distance measurement system and method, and computer-readable storage medium
WO2022241942A1 (en) Depth camera and depth calculation method
WO2022017366A1 (en) Depth imaging method and depth imaging system
US11709271B2 (en) Time of flight sensing system and image sensor used therein
CN102685402A (en) Color sensor insensitive to distance variations
KR102819418B1 (en) Time-of-flight (TOF) sensor module and electronics
CN112596068A (en) Collector, distance measurement system and electronic equipment
WO2019041257A1 (en) Signal processing chip, image processing system and distance measurement system
US12345812B2 (en) Angle of rotation determination in scanning LIDAR systems
JPWO2018211831A1 (en) Photodetectors and portable electronics
TWI748460B (en) Time of flight device and time of flight method
CN111983630A (en) Single photon ranging system, method, terminal equipment and storage medium
WO2022110947A1 (en) Control method for electronic device, electronic device, and computer-readable storage medium
CN112105944B (en) Optical ranging system with multi-mode operation using short and long pulses

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210212)