Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually to mean "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
An embodiment of the present application provides an imaging ranging sensor comprising a plurality of pixels arranged in an array, wherein each pixel comprises at least four sub-pixels arranged in an array, each sub-pixel comprises a single photon light-sensitive unit and a light filter covering the single photon light-sensitive unit, and each pixel comprises an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel.
In application, the number of pixels included in the imaging ranging sensor can be set according to actual needs; the larger the number of pixels, the higher the resolution of the imaging ranging sensor. The pixels in the imaging ranging sensor may be arranged in an array of any regular shape, for example, a rectangular array, a circular array, or any other regular polygonal array. The number of sub-pixels included in each pixel can likewise be set according to actual needs, as long as each pixel includes an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel, so that when the imaging ranging sensor acquires the depth information and RGB information of an object, it can acquire the depth information and RGB information of the reflecting point on the object corresponding to each pixel. The depth information of each reflecting point on the object comprises the distance between the reflecting point and the imaging ranging sensor; the RGB information of each reflecting point comprises the R value, G value, and B value of the reflecting point, which may specifically be CIE spectral tristimulus values.
In application, each sub-pixel comprises a single photon light-sensitive unit and a filter covering the single photon light-sensitive unit. The filters of the infrared sub-pixel, the red sub-pixel, the green sub-pixel, and the blue sub-pixel are, respectively, an infrared filter that passes only infrared light, a red filter that passes only red light, a green filter that passes only green light, and a blue filter that passes only blue light; each filter may be a reflective filter or an absorptive filter according to actual needs.
In application, the single photon light-sensitive unit may be a Single Photon Avalanche Diode (SPAD) or a photomultiplier tube. It can respond to a single incident photon and output a light sensing signal indicating the time at which the photon reaches the single photon light-sensitive unit; based on the light sensing signals output by the single photon light-sensitive units, the collection of weak light signals and the calculation of the time of flight can be implemented using the Time-Correlated Single Photon Counting (TCSPC) method.
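As an illustration of the TCSPC principle just described, the following Python sketch (illustrative only; the bin width, window length, and timestamp data are assumptions, not part of the embodiment) accumulates single-photon arrival times over many emission periods into a histogram and takes the peak bin as the time-of-flight estimate:

    # Minimal TCSPC sketch: accumulate photon arrival timestamps into a
    # histogram and take the peak bin as the time-of-flight estimate.
    # Bin width, window length, and the test data are assumed values.
    import numpy as np

    BIN_WIDTH_PS = 100      # assumed TDC resolution: 100 ps per bin
    N_BINS = 1000           # covers a 100 ns measurement window

    def tof_from_timestamps(arrival_times_ps):
        """Estimate time of flight (ps) from photon arrival times."""
        bins = (np.asarray(arrival_times_ps) // BIN_WIDTH_PS).astype(int)
        hist = np.bincount(bins[bins < N_BINS], minlength=N_BINS)
        peak_bin = int(np.argmax(hist))         # signal peak over background
        return (peak_bin + 0.5) * BIN_WIDTH_PS  # bin center as the estimate

    # Example: returns clustered near 33,300 ps plus uniform ambient noise
    rng = np.random.default_rng(0)
    signal = rng.normal(33_300, 120, size=2_000)
    ambient = rng.uniform(0, 100_000, size=5_000)
    print(tof_from_timestamps(np.concatenate([signal, ambient])))  # ~33,300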
Fig. 1 exemplarily shows a first structural diagram of the imaging ranging sensor, wherein:
each pixel 101 includes four infrared sub-pixels IR, three red sub-pixels R, six green sub-pixels G, and three blue sub-pixels B;
four infrared sub-pixels IR are arranged in a 2 × 2 array to form an infrared sub-pixel unit 10, and one red sub-pixel R, two green sub-pixels G, and one blue sub-pixel B are arranged in a Bayer array to form a color sub-pixel unit 20;
each pixel 101 comprises one infrared sub-pixel unit 10 and three color sub-pixel units 20.
In application, the arrangement of the one infrared sub-pixel unit and three color sub-pixel units included in each pixel can be set according to actual needs, for example, the one infrared sub-pixel unit and the three color sub-pixel units are arranged in a 1 × 4, 4 × 1, or 2 × 2 array.
Fig. 1 schematically shows a structure of an imaging range sensor in which one infrared sub-pixel unit 10 and three color sub-pixel units 20 are arranged in a 2 × 2 array.
In the structure of the imaging ranging sensor shown in fig. 1, four infrared sub-pixels are arranged in a 2 × 2 array to form one infrared sub-pixel unit, so that the four infrared sub-pixels can be combined (binned) into one super-pixel, for example, by connecting the four infrared sub-pixels to one OR gate or one AND gate to form one infrared super-pixel, which can effectively improve the signal-to-noise ratio of the infrared sensing signal and thus the ranging accuracy (a binning sketch is given below). By arranging one red sub-pixel, two green sub-pixels, and one blue sub-pixel into one color sub-pixel unit in the Bayer array format (i.e., RGGB), the color information can be restored using well-established Bayer-array demosaicing algorithms, so the imaging accuracy is high.
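To make the binning concrete, the following sketch (the data layout is hypothetical; the embodiment realizes this in hardware with an OR gate or an AND gate) models how combining four SPAD sub-pixel outputs per time bin forms one super-pixel signal:

    # Illustrative OR/AND binning of four SPAD sub-pixels into a super-pixel.
    # Each row is one sub-pixel's detection stream (1 = photon detected in
    # that time bin); the layout is assumed for illustration.
    import numpy as np

    sub_pixels = np.array([      # 4 sub-pixels x 8 time bins
        [0, 0, 1, 0, 0, 0, 0, 0],
        [0, 0, 0, 1, 0, 0, 0, 0],
        [0, 0, 1, 0, 0, 0, 0, 1],
        [0, 0, 0, 0, 0, 0, 0, 0],
    ])

    # OR gate: the super-pixel fires whenever any sub-pixel fires, which
    # raises the detection rate and thus the signal-to-noise ratio.
    or_superpixel = np.any(sub_pixels, axis=0).astype(int)

    # AND gate: fires only when all sub-pixels fire, suppressing
    # uncorrelated dark counts at the cost of sensitivity.
    and_superpixel = np.all(sub_pixels, axis=0).astype(int)

    print(or_superpixel)   # [0 0 1 1 0 0 0 1]
    print(and_superpixel)  # [0 0 0 0 0 0 0 0]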
Fig. 2 exemplarily shows a second structural diagram of the imaging ranging sensor, wherein each pixel 102 includes one infrared sub-pixel IR, one red sub-pixel R, one green sub-pixel G, and one blue sub-pixel B arranged in a 2 × 2 array.
The imaging ranging sensor structure shown in fig. 2 arranges four sub-pixels into one pixel in a 2 × 2 array, so that a single pixel occupies a small area and the imaging ranging sensor can accommodate more pixels in the same area, which effectively improves the resolution of the imaging ranging sensor and makes the fusion of depth information and RGB information more accurate.
Fig. 3 exemplarily shows a third structural diagram of the imaging ranging sensor;
wherein each pixel 103 comprises M × M infrared sub-pixels IR, M × M red sub-pixels R, M × M green sub-pixels G, and M × M blue sub-pixels B;
the M × M infrared sub-pixels IR are arranged in an array form to form an infrared sub-pixel unit 1, the M × M red sub-pixels R are arranged in an array form to form a red sub-pixel unit 2, the M × M green sub-pixels G are arranged in an array form to form a green sub-pixel unit 3, and the M × M blue sub-pixels B are arranged in an array form to form a blue sub-pixel unit 4;
each pixel 103 includes an infrared sub-pixel unit 1, a red sub-pixel unit 2, a green sub-pixel unit 3, and a blue sub-pixel unit 4.
In application, M is an integer greater than or equal to 2, and the specific value of M can be set according to actual needs. The arrangement of one infrared sub-pixel unit, one red sub-pixel unit, one green sub-pixel unit and one blue sub-pixel unit included in each pixel can be set according to actual needs, for example, the infrared sub-pixel unit, the red sub-pixel unit, the green sub-pixel unit and the blue sub-pixel unit are arranged in a 1 × 4, 4 × 1 or 2 × 2 array.
Fig. 3 exemplarily shows a schematic structural diagram of an imaging ranging sensor when M is 2 and one infrared sub-pixel unit, one red sub-pixel unit, one green sub-pixel unit, and one blue sub-pixel unit are arranged in a 2 × 2 array.
In the structure of the imaging ranging sensor shown in fig. 3, by arranging the M × M infrared sub-pixels into one infrared sub-pixel unit in an M × M array, the M × M infrared sub-pixels can be combined into one infrared super-pixel, for example, by connecting them to one OR gate or one AND gate, so that the signal-to-noise ratio of the infrared sensing signal can be effectively improved and the ranging accuracy is improved;
by arranging the M × M red sub-pixels into one red sub-pixel unit in an M × M array, the M × M red sub-pixels can be combined into one red super-pixel in the same way, so that the signal-to-noise ratio of the red sensing signal can be effectively improved and the accuracy of the R value is improved;
by arranging the M × M green sub-pixels into one green sub-pixel unit in an M × M array, the M × M green sub-pixels can be combined into one green super-pixel in the same way, so that the signal-to-noise ratio of the green sensing signal can be effectively improved and the accuracy of the G value is improved;
by arranging the M × M blue sub-pixels into one blue sub-pixel unit in an M × M array, the M × M blue sub-pixels can be combined into one blue super-pixel in the same way, so that the signal-to-noise ratio of the blue sensing signal can be effectively improved and the accuracy of the B value is improved;
the structure of the red, green, and blue sub-pixel units in each pixel thus improves the accuracy of the RGB information as a whole.
The imaging ranging sensor provided by the embodiments of the present application makes each pixel comprise an infrared sub-pixel, a red sub-pixel, a green sub-pixel, and a blue sub-pixel, and makes each sub-pixel consist of a single photon light-sensitive unit and a light filter covering it. A single imaging ranging sensor can therefore acquire depth information and RGB information simultaneously, so that a depth image obtained from the depth information and an RGB image obtained from the RGB information have no viewing-angle difference, no alignment is needed when the depth image and the RGB image are fused, and computing resources and time are effectively saved.
As shown in fig. 4, based on the structure of the imaging ranging sensor in any of the embodiments, the embodiment of the present application further provides an imaging ranging method, including:
step S201, acquiring a first light sensing signal which is output by each pixel and used for indicating the time of receiving infrared light reflected by an object;
step S202, acquiring a second light sensing signal which is output by each pixel and used for indicating the time of receiving the visible light reflected by the object;
step S203, acquiring depth information of the object according to the first light sensing signal;
and step S204, acquiring RGB information of the object according to the second light sensing signal.
In application, the infrared sub-pixel in each pixel is used to receive the infrared light reflected by the reflecting point at the corresponding position on the object and to output a first light sensing signal when the infrared light is received. Based on the time at which the first light sensing signal is received and the time at which the infrared light was emitted toward the object, the distance between each reflecting point on the object and the imaging ranging sensor is calculated by the time-of-flight method; the depth information of the object comprises the distances between all reflecting points on the object and the imaging ranging sensor, and the depth image of the object is obtained from this depth information. When the imaging ranging sensor is used only to acquire depth information, the depth information of the object can also be acquired from the second light sensing signal, that is, the distance between each reflecting point on the object and the imaging ranging sensor can be calculated by the time-of-flight method based on the time at which the second light sensing signal is received and the time at which the visible light was emitted toward the object.
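A minimal sketch of the time-of-flight calculation described above, assuming per-pixel emission and reception timestamps in seconds (the array shapes and values are illustrative):

    # Per-pixel depth by the time-of-flight method: the light travels to
    # the object and back, so distance = c * (t_receive - t_emit) / 2.
    # The timestamp arrays are assumed inputs for illustration.
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def depth_map(t_emit_s, t_receive_s):
        """Return per-pixel distance (m) from emission/reception times."""
        return C * (np.asarray(t_receive_s) - np.asarray(t_emit_s)) / 2.0

    # Example: a 2 x 2 pixel array with returns ~33.3 ns after emission
    t_emit = np.zeros((2, 2))
    t_recv = np.full((2, 2), 33.3e-9)
    print(depth_map(t_emit, t_recv))  # ~4.99 m per pixel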
It should be understood that since the depth information of the object includes the distance between each reflecting point on the object and the imaging ranging sensor, the imaging ranging sensor can be used for ranging in addition to obtaining the depth information and RGB information.
In application, the color sub-pixels in each pixel are used to receive the visible light reflected by the reflecting point at the corresponding position on the object and to output a second light sensing signal when the visible light is received. The RGB value (including the R value, G value, and B value) of each reflecting point on the object is obtained from the second light sensing signal; the RGB information of the object comprises the RGB values of all reflecting points on the object, and the RGB image of the object is obtained from this RGB information. Similarly, the infrared image information of the object can be obtained from the first light sensing signal, and the infrared image of the object is obtained from the infrared image information.
In one embodiment, step S203 includes:
acquiring the light intensity of infrared light reflected by the object according to the first light sensing signal;
acquiring infrared image information of the object according to the light intensity of the infrared light reflected by the object;
step S204 includes:
acquiring the light intensity of the visible light reflected by the object according to the second light sensing signal;
acquiring RGB information of the object according to the light intensity of the visible light reflected by the object.
in application, the infrared image information of the object is obtained based on the light intensity of the infrared light reflected by each reflection point on the object, and similarly, the RGB information of the object is obtained based on the light intensity of the visible light reflected by each reflection point on the object. According to the light intensity of the infrared light reflected by each point on the object, the brightness value corresponding to each reflection point in the infrared image information of the object can be obtained, the brightness value can be embodied as a gray value in the infrared image, and the infrared image obtained based on the infrared image information of the object is a gray image only containing the gray value corresponding to each reflection point. According to the light intensity of the visible light reflected by each reflecting point on the object, the brightness value corresponding to each reflecting point in the RGB information of the object can be obtained, and according to the brightness value of each reflecting point and the color of each color pixel in the corresponding pixel, the RGB value of each reflecting point can be obtained based on the calculation formula of the tristimulus values, so that the RGB information comprising the RGB value of each reflecting point of the object is obtained, and the RGB image of the object is further obtained.
In one embodiment, the acquiring the light intensity of the visible light reflected by the object according to the second light sensing signal includes:
acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor according to the second light sensing signal;
acquiring the light intensity of the visible light reflected by the object according to a first corresponding relation between preset visible light intensity and total photon number or total peak signal frequency;
similarly, the acquiring the light intensity of the infrared light reflected by the object according to the first light sensing signal includes:
acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor according to the first light sensing signal;
and acquiring the light intensity of the infrared light reflected by the object according to a second corresponding relation between the preset infrared light intensity and the total photon number or the total peak signal frequency.
In application, since each single photon light-sensitive unit responds to a single incident photon and outputs a light sensing signal indicating the time at which the photon reaches it, the total number of photons of visible light received by the imaging ranging sensor can be obtained from the light sensing signals output by all the color sub-pixels, that is, from the second light sensing signal. A Time-to-Digital Converter (TDC) connected to each color sub-pixel converts the light sensing signal output by that color sub-pixel into a time code; a histogram circuit connected to the TDC stores the time codes, generates a histogram from the time codes accumulated over at least one emission period of the visible light, and the frequency of the peak signal in that histogram is obtained. By summing the peak signal frequencies of the histograms generated from the light sensing signals output by the color sub-pixels of each pixel, the total peak signal frequency is obtained. Similarly, the total number of photons of infrared light received by the imaging ranging sensor can be obtained, as can the frequency of the peak signal in each histogram generated from the light sensing signals output by the infrared sub-pixels, and hence the corresponding total peak signal frequency. The first corresponding relation and the second corresponding relation may be obtained in advance through extensive experiments using the above manner of obtaining the total photon number and the total peak signal frequency.
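The two statistics described above can be sketched as follows (the time-code data and histogram depth are assumptions; in the embodiment the histogram is built by a dedicated circuit): the total photon number is simply the number of time codes, and the total peak signal frequency is the sum of the peak-bin counts of the per-sub-pixel histograms:

    # Illustrative computation of the total photon number and the total
    # peak signal frequency from per-sub-pixel TDC time codes. The data
    # and the histogram depth are assumed for illustration.
    import numpy as np

    N_BINS = 1000  # assumed histogram depth (one bin per TDC code value)

    def photon_and_peak_totals(time_codes_per_subpixel):
        """time_codes_per_subpixel: list of 1-D arrays of TDC time codes."""
        total_photons = sum(len(codes) for codes in time_codes_per_subpixel)
        total_peak = 0
        for codes in time_codes_per_subpixel:
            hist = np.bincount(np.asarray(codes), minlength=N_BINS)
            total_peak += int(hist.max())  # this histogram's peak frequency
        return total_photons, total_peak

    rng = np.random.default_rng(1)
    codes = [rng.integers(0, N_BINS, size=n) for n in (800, 760, 790)]
    print(photon_and_peak_totals(codes))  # (2350, sum of peak counts)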
In one embodiment, before acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor according to the second light sensing signal, the method includes:
respectively acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor under different light intensities of the visible light rays reflected by the object;
acquiring a first corresponding relation between different visible light intensities and the total photon number or the total peak signal frequency according to the total photon number or the total peak signal frequency acquired under each light intensity of the visible light reflected by the object;
similarly, before acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor according to the first light sensing signal, the method includes:
respectively acquiring the total photon number or the total peak signal frequency received by the imaging ranging sensor under different light intensities of the infrared light reflected by the object;
and acquiring a second corresponding relation between different infrared light intensities and the total photon number or the total peak signal frequency according to the total photon number or the total peak signal frequency acquired under each light intensity of the infrared light reflected by the object.
In application, infrared light and visible light of different intensities can be emitted toward the object, either separately or simultaneously, and the first corresponding relation between different visible light intensities and the total photon number or total peak signal frequency, and the second corresponding relation between different infrared light intensities and the total photon number or total peak signal frequency, are then obtained in the manner described above. Thus, when RGB information is needed later, the light intensity of the visible light can be obtained from the first corresponding relation and the total photon number or total peak signal frequency derived from the second light sensing signal; when infrared image information is needed later, the light intensity of the infrared light can be obtained from the second corresponding relation and the total photon number or total peak signal frequency derived from the first light sensing signal.
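At run time, the pre-measured correspondence can be applied as a lookup with interpolation; the calibration points below are invented purely for illustration:

    # Illustrative use of a pre-measured correspondence between total
    # photon number and light intensity: interpolate the measured total
    # against calibration points. All values here are assumed.
    import numpy as np

    total_photons_cal = np.array([1_000, 4_000, 9_000, 15_000])
    intensity_cal = np.array([0.1, 0.4, 0.9, 1.5])  # arbitrary units

    def intensity_from_total(total_photons):
        """Look up the light intensity for a measured total photon number."""
        return float(np.interp(total_photons, total_photons_cal, intensity_cal))

    print(intensity_from_total(6_500))  # interpolated intensity = 0.65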
In application, the first corresponding relation and the second corresponding relation may exist in the form of a correspondence table, for example, a lookup table (LUT).
Fig. 5 exemplarily shows a table of correspondence among light intensity, total photon number, and total peak signal frequency.
The imaging ranging method provided by the embodiments of the present application can directly acquire the depth information and RGB information of an object from the first light sensing signal and the second light sensing signal output by the imaging ranging sensor, so that the depth image obtained from the depth information and the RGB image obtained from the RGB information have no viewing-angle difference, no alignment is needed when the depth image and the RGB image are fused, and computing resources and time are effectively saved.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
As shown in fig. 6, an embodiment of the present application provides an imaging ranging system 1000, including:
a light emitter 200 for emitting infrared rays and visible rays toward the object 2000;
an imaging range sensor 100;
and a controller 300 connected to the light emitter 200 and the imaging range sensor 100, respectively, for performing the imaging range finding method in the above-described embodiment.
In application, the imaging ranging system comprises at least a controller, a light emitter, and an imaging ranging sensor, and may further comprise a collimating optical element and a Diffractive Optical Element (DOE) covering the light emitter, a focusing lens or micro-lens array covering the imaging ranging sensor, and the like. The collimating optical element collimates the light emitted by the light emitter, and the diffractive optical element diffracts the light. The focusing lens or micro-lens array focuses the light reflected by the object onto the light-sensitive surface of the imaging ranging sensor. The controller controls the light emitter and the imaging ranging sensor to turn on or off and may also adjust the intensity of the light emitted by the light emitter. The object may be any object in free space that reflects light.
In application, the light emitter may be a laser, a Light Emitting Diode (LED), a Laser Diode (LD), an Edge-Emitting Laser (EEL), or the like, according to actual needs. The laser may be a Vertical-Cavity Surface-Emitting Laser (VCSEL). The light emitter may be a tunable device.
In application, the controller may include signal amplifiers, Time-to-Digital Converters (TDCs), Analog-to-Digital Converters (ADCs), and the like, in numbers equal to the number of pixels or single photon light-sensitive units in the imaging ranging sensor. The time-to-digital converter connected to each pixel is connected to a histogram circuit.
In application, the controller may further include a Central Processing Unit (CPU), another general-purpose controller, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like. The general-purpose controller may be a microcontroller or any conventional controller.
The imaging distance measuring sensor, the imaging distance measuring method and the imaging distance measuring system provided by the embodiment of the application can be applied to terminal devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, Augmented Reality (AR) devices, Virtual Reality (VR) devices, notebook computers, netbooks and Personal Digital Assistants (PDAs).
The embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a controller, implements the steps in the above-mentioned imaging ranging method embodiments.
The embodiments of the present application further provide a computer program product which, when run on a terminal device, causes the terminal device to implement the steps in the above-mentioned imaging ranging method embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment. In the embodiments provided in the present application, it should be understood that the disclosed system/terminal device and method may be implemented in other ways. For example, the system/terminal device embodiments described above are merely illustrative.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.