Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments. It is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, in order to facilitate understanding of the embodiments of the present invention, some terms or nouns referred to in the present invention will be explained as follows:
Digital Signal Processing (DSP): refers to the theory and techniques of representing and processing signals digitally.
Example 1
In accordance with an embodiment of the present invention, there is provided an embodiment of a method of controlling a night vision system. It should be noted that the steps illustrated in the flowchart of the figure may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different from that presented herein.
Fig. 1 is a flow chart of the steps of a method of controlling a night vision system according to an embodiment of the invention. As shown in Fig. 1, the method comprises the following steps:
step S102, determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens;
step S104, acquiring a first distance between the imaging plane of the first image acquisition device and a target object based on the second distance;
step S106, determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object;
and step S108, controlling the laser lens to adjust to the position corresponding to the target light-emitting angle.
In the embodiment of the invention, a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens is determined; a first distance between the imaging plane of the first image acquisition device and a target object is acquired based on the second distance; a target light-emitting angle of the laser lens is determined according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object; and the laser lens is controlled to adjust to the position corresponding to the target light-emitting angle. This achieves the aim of directing the laser spot to the center of the field of view of the low-illumination camera by controlling the rotation of the laser lens, attains the technical effect of improving the efficiency of laser illumination and fill lighting, and solves the technical problem that a night vision system in the prior art cannot adjust the angle of the laser lens in real time, resulting in inefficient laser illumination.
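For orientation, steps S102 to S108 can be summarized in the following minimal Python sketch; the fine_tuner interface and the arctangent relation used for step S106 are assumptions introduced here for illustration, not part of the original disclosure.

```python
import math

def control_night_vision(b, d, fine_tuner):
    """Steps S106-S108, given the results of S102 (b) and S104 (d).

    b: second distance between the camera optical axis and the
       laser lens optical axis (step S102)
    d: first distance between the camera imaging plane and the
       target object (step S104)
    fine_tuner: laser angle fine-tuning device (hypothetical interface)
    """
    # Step S106: target light-emitting angle. arctan(b / d) is one
    # plausible geometric reading, offered here as an assumption.
    theta = math.degrees(math.atan2(b, d))
    # Step S108: drive the laser lens to the target angle.
    fine_tuner.rotate_to(theta)
    return theta
```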
It should be noted that, optionally, the current light-emitting angle and the target light-emitting angle may each be any angle between 0° and 45°.
In an optional embodiment, the first image capturing device is configured to capture image data of the target object in a near infrared band, and the second image capturing device is configured to capture image data of the target object in a far infrared band.
Optionally, a laser lens of the laser device is fixedly disposed on the laser angle fine-tuning device, and the laser angle fine-tuning device may be connected to the controller, and is configured to adjust the laser lens to a position corresponding to the target light-emitting angle under the control of the controller, where the laser angle fine-tuning device may adjust a horizontal angle and a vertical angle of the laser lens.
As an alternative embodiment, the first image capturing device may be a low-illumination camera and may include, but is not limited to: a visible light telephoto lens. The low-illumination camera can be used for sensing the low-illumination visible light information of the environment and can provide images of the visible light band to the DSP high-speed fusion circuit under the control of the controller; the zoom and focus positions of the visible light telephoto lens on the low-illumination camera are controlled by the controller.
As an alternative embodiment, the second image capturing device may be an infrared camera. The infrared camera can be used for sensing the temperature field of the environment and imaging in an invisible light wave band (such as a far infrared wave band), and further the infrared camera can provide images in the far infrared wave band for the DSP high-speed fusion circuit under the control of the controller. The second image acquisition device further comprises an infrared zoom lens, wherein the zooming and focusing positions of the infrared zoom lens are controlled by the controller.
In an alternative embodiment, fig. 2 is a flow chart of steps of an alternative method of controlling a night vision system according to an embodiment of the invention, as shown in fig. 2, before the first distance and the second distance are obtained, the method further comprising:
step S202, detecting whether the optical axis of the first image acquisition device is parallel to the optical axis of the second image acquisition device;
step S204, if the detection result is negative, adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device;
in step S206, if the detection result is yes, the first image feature data collected by the first image collection device and the second image feature data collected by the second image collection device are extracted.
It should be noted that, it is detected whether the optical axis of the first image capturing device is parallel to the optical axis of the second image capturing device, that is, whether the first image capturing device and the second image capturing device are coaxial.
Based on the optional embodiments provided in steps S202 to S206, a precondition for controlling the laser lens to adjust to the position corresponding to the target light-emitting angle is that the optical axis of the first image capturing device is parallel to the optical axis of the second image capturing device. Therefore, when the optical axis of the first image acquisition device is detected to be not parallel to the optical axis of the second image acquisition device, the optical axis of the first image acquisition device is adjusted to be parallel to the optical axis of the second image acquisition device.
As an alternative embodiment, when night vision needs to be performed in the fusion mode, the controller in the night vision system may detect whether the optical axis of the preset visible light camera and the optical axis of the preset infrared camera are parallel; when they are parallel, the controller samples the horizontal angle and the pitch angle of the infrared angle fine-tuning device at the current moment. By comparing the sampled values with the expected values, the controller controls a stepping motor in the infrared angle fine-tuning device to rotate so that the infrared camera reaches the designated position.
After the infrared camera reaches the designated position, first image characteristic data acquired by the infrared camera and second image characteristic data acquired by the low-illumination camera are obtained, a target object present in both sets of characteristic data is selected, and the target object is marked in the first image characteristic data. The corresponding target object is then found in the laser imaging image through edge detection and feature point matching. As shown in Fig. 3, the energy barycenters Pr and Pl of the target object in the two images are calculated by the energy barycenter method, with coordinates x_r and x_l, respectively.
As also shown in Fig. 3: d is the first distance; f1 is the focal length of the lens of the first image acquisition device (the low-illumination camera), and f2 is the focal length of the lens of the second image acquisition device (the infrared camera); k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; x_r is the distance between the image of the energy center of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis, and x_r1 is the distance between that image and a first preset parallel line; x_l is the distance between the image of the energy center of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis, and x_l1 is the distance between that image and a second preset parallel line; O1 is the optical axis of the first image acquisition device (i.e., the visible light optical axis), and O2 is the optical axis of the second image acquisition device (the thermal imaging optical axis).
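As a hedged illustration of the energy barycenter method mentioned above, the following NumPy sketch computes an intensity-weighted centroid over a grayscale region of interest; extracting that region (via the edge detection and feature point matching described above) is assumed to have been done already.

```python
import numpy as np

def energy_barycenter(gray_roi):
    """Intensity-weighted centroid (energy barycenter) of a grayscale
    region of interest; returns (x, y) in pixel coordinates."""
    gray_roi = gray_roi.astype(np.float64)
    total = gray_roi.sum()
    if total == 0:
        raise ValueError("region contains no energy")
    ys, xs = np.indices(gray_roi.shape)
    x = (xs * gray_roi).sum() / total
    y = (ys * gray_roi).sum() / total
    return x, y
```

The x coordinates of the barycenters in the two imaging planes, measured from each device's optical axis, give x_r and x_l.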
In an alternative embodiment, the second distance b between the optical axis of the first image capturing device and the optical axis of the laser lens may be determined according to the following formula:
based on the second distance b, a first distance d between the imaging plane of the first image capturing device and the target object can be acquired:
it should be noted that, in the above alternative embodiments provided in the present application, only the focal length number f1 of the lens of the first image capturing device, the focal length number f2 of the lens of the second image capturing device, the second distance b between the optical axis of the first image capturing device and the optical axis of the laser lens, the distance k between the imaging plane of the first image capturing device and the imaging plane of the second image capturing device, and the distance x between the imaging plane of the first image capturing device and the optical axis of the energy center of gravity of the target object in the imaging plane of the first image capturing device are requiredrAnd the energy center of gravity of the target objectDistance x between the image of the imaging plane of the second image acquisition device and the optical axisl。
In an alternative embodiment, the image capturing devices in the present application may use either fixed focus lenses or zoom lenses. If fixed focus lenses are used, the focal length f1 of the visible light lens of the low-illumination camera and the focal length f2 of the thermal imaging lens of the infrared camera are fixed; if zoom lenses are used, the current focal length parameters of the visible light lens and the thermal imaging lens need to be read. The distance k between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device can be determined by, but is not limited to, measurement. With these quantities, the present application can derive the first distance d between the target object to be observed and the imaging plane of the visible light camera.
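Because the distance equation itself is given by the referenced figure and is not reproduced in this text, the following sketch shows a textbook parallel-axis triangulation model with unequal focal lengths, offered purely as an assumption of how d might be derived from f1, f2, x_r, x_l, and the spacing between the two optical axes; it is not the patent's own formula.

```python
def estimate_distance(f1, f2, x_r, x_l, axis_spacing):
    """Parallel-axis triangulation with unequal focal lengths.

    A textbook stereo model offered as an assumption: x_r / f1 and
    x_l / f2 are the angular offsets of the target's energy barycenter
    as seen from the two devices, and together they resolve the
    spacing between the optical axes. The offset k between the two
    imaging planes is neglected in this sketch.
    """
    disparity = x_r / f1 + x_l / f2
    if disparity <= 0:
        raise ValueError("target effectively at infinity")
    return axis_spacing / disparity
```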
As shown in fig. 4, the target light-emitting angle θ can be further determined based on the first distance d and the second distance b:
where θ is the target light-emitting angle.
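The equation referenced by Fig. 4 is likewise not reproduced in this text. One plausible geometric reading, offered as an assumption rather than the patent's verbatim formula: with the laser optical axis offset laterally by the second distance b from the camera optical axis, and the target at the first distance d, the laser must be tilted by

θ = arctan(b / d)

so that the laser spot lands on the camera's line of sight at the target's range.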
In addition, it should be noted that Fig. 4 includes target object 1 and target object 2: since the first distance d between the imaging plane of the first image capturing device and the target object differs between them, the target light-emitting angle varies with the first distance d. The laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object. The target light-emitting angle is the angle of the laser emitted by the laser lens, and the laser lens can be adjusted to the position corresponding to the target light-emitting angle by the laser angle fine-tuning device shown in Fig. 4.
In an optional embodiment, if the determination result is yes, that is, if the current light-emitting angle of the laser lens is the target light-emitting angle, the laser device is controlled to adjust the laser spot to a preset size.
Fig. 5 is a schematic diagram of an alternative laser spot according to an embodiment of the present invention. As shown in Fig. 5, the laser spot in the picture is not a complete circle; it is quite likely that only part of the circle appears in the picture. Since the horizontal calibration of the laser spot has already been completed at this point, only the size of the laser spot and its angle in the vertical direction need to be adjusted. The edge of the laser spot is extracted by an edge detection algorithm, the image data is then binarized, and the radius and center of the circle closest to the laser spot are fitted by a circle fitting algorithm.
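A minimal sketch of the spot-fitting pipeline described above (binarization, edge detection, circle fitting), using OpenCV and an algebraic least-squares circle fit; the threshold values and the choice of the Kasa fit are assumptions standing in for the unspecified algorithms.

```python
import cv2
import numpy as np

def fit_laser_spot(gray):
    """Fit a circle (center, radius) to the laser spot in a grayscale frame."""
    # Binarize: the spot is bright against a dark scene (threshold assumed).
    _, binary = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    # Extract the spot edge (Canny thresholds assumed).
    edges = cv2.Canny(binary, 50, 150)
    pts = cv2.findNonZero(edges)
    if pts is None:
        return None
    pts = pts.reshape(-1, 2).astype(np.float64)
    # Algebraic least-squares (Kasa) circle fit; unlike a minimum
    # enclosing circle, it recovers the full circle even when only
    # an arc of the spot is visible in the picture.
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    rhs = x * x + y * y
    (cx, cy, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = float(np.sqrt(c + cx * cx + cy * cy))
    return (float(cx), float(cy)), radius
```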
As an optional embodiment, the optimal size of the laser spot can be calculated from the relationship between the fitted center and radius and the size of the field of view; the controller then compares the fitted center and radius with the optimal laser spot size to derive the current control strategy. The apparatus controls the stepping motor on the laser lens according to this control strategy: the stepping motor drives the lens group in the laser lens to move along the spiral groove of the laser lens, adjusting the optical system in the laser lens to the proper position so that the laser spot is optimally placed and its size matches the target in the field of view.
In an alternative embodiment, the night vision system in the present application may calculate the zoom value and the focus value of the infrared zoom lens matching the visible light camera according to the current size of the field of view. In addition, the night vision system can calculate the included angle between the optical axis (main optical axis) of the infrared zoom lens and the optical axis (main optical axis) of the visible light zoom lens according to the size of the field of view and the first distance d. That is, based on the above alternative embodiments, the night vision system in the present application may determine the best matching focus value and zoom value of the infrared zoom lens and the best angle value of the infrared angle fine adjustment device. After this computation is finished, the DSP high-speed fusion circuit transmits the angle value of the included angle to the controller over a data line using the RS485 level protocol.
In an optional embodiment, the controller receives the angle value calculated by the DSP high-speed fusion circuit, and samples the focus value and the zoom value of the infrared zoom lens to determine the current light-emitting angle of the laser lens in the laser device. And comparing the current light-emitting angle with the target light-emitting angle in the sampling result to judge whether the current light-emitting angle of the laser lens is the target light-emitting angle, and controlling the stepping motor to rotate in the direction of reducing the difference between the current light-emitting angle and the target light-emitting angle under the condition that the current light-emitting angle is not the target light-emitting angle until the difference between the current light-emitting angle and the target light-emitting angle is within a set range.
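For illustration, the compare-and-step loop described above might look like the following sketch; the angle-sampling and motor-stepping interfaces are hypothetical names, and the tolerance stands in for the "set range".

```python
def adjust_to_target_angle(sample_angle, step_motor, target_deg, tol_deg=0.05):
    """Rotate the stepping motor until the sampled light-emitting angle
    is within tol_deg degrees of the target angle (interfaces hypothetical)."""
    while True:
        current = sample_angle()            # sample the current angle (deg)
        error = target_deg - current
        if abs(error) <= tol_deg:           # difference within the set range
            return current
        step_motor(1 if error > 0 else -1)  # step toward reducing the difference
```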
In an alternative embodiment, fig. 6 is a flowchart illustrating steps of an alternative method for controlling a night vision system according to an embodiment of the present invention, as shown in fig. 6, after extracting first image characteristic data acquired by the first image acquisition device and second image characteristic data acquired by the second image acquisition device, the method further includes:
step S302, judging whether the first image characteristic data is matched with the second image characteristic data;
step S304, if the first image characteristic data matches the second image characteristic data, a first distance between an imaging plane of the first image capturing device and a target object and a second distance between an optical axis of the first image capturing device and an optical axis of the laser lens are obtained.
As an alternative embodiment, when it is detected that the optical axis of the first image capturing device is parallel to the optical axis of the second image capturing device, the first image feature data captured by the first image capturing device and the second image feature data captured by the second image capturing device may be obtained, and when it is determined that the first image feature data matches the second image feature data, the first distance between the imaging plane of the first image capturing device and the target object and the second distance between the optical axis of the first image capturing device and the optical axis of the laser lens may be obtained.
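A hedged sketch of one way to judge whether the first and second image feature data match, using ORB descriptors with Hamming-distance matching; the minimum match count is an assumption, and matching visible against thermal imagery may in practice require modality-robust features.

```python
import cv2

def features_match(img1, img2, min_matches=20):
    """Return True if the two grayscale images share enough feature matches."""
    orb = cv2.ORB_create()
    _, d1 = orb.detectAndCompute(img1, None)
    _, d2 = orb.detectAndCompute(img2, None)
    if d1 is None or d2 is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(d1, d2)
    return len(matches) >= min_matches
```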
In an alternative embodiment, fig. 7 is a flowchart illustrating steps of an alternative method for controlling a night vision system according to an embodiment of the present invention, where, after controlling the laser lens to adjust to a position corresponding to the target light-emitting angle, as shown in fig. 7, the method further includes:
step S402, acquiring first image data acquired by the first image acquisition device and second image data acquired by the second image acquisition device;
step S404, an image fusion algorithm is adopted to perform fusion processing on the first image data and the second image data to obtain third image data; and outputting the third image data.
In an optional implementation manner, if the laser angle fine-tuning device has controlled the laser lens to adjust to the position corresponding to the target light-emitting angle, or the controller detects that the difference between the current light-emitting angle of the laser lens in the laser device and the target light-emitting angle is within the set range, the controller may send a trigger signal to the DSP high-speed fusion circuit, and the DSP high-speed fusion circuit reads each frame of image captured by the visible light camera and the infrared thermal imaging camera through the video decoder to obtain the first image data and the second image data.
As an alternative embodiment, the first image data and the second image data may be fused by a pyramid decomposition fusion algorithm or an odd-even segmentation fusion algorithm to obtain third image data, and the third image data may be displayed on a display.
Optionally, the image fusion algorithm may be a pyramid decomposition fusion algorithm, which is also called a tower decomposition fusion algorithm, or an odd-even segmentation fusion algorithm.
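A minimal sketch of Laplacian (tower) pyramid fusion as one instance of the pyramid decomposition fusion algorithm named above; the fusion rules (strongest detail, averaged residual), the level count, and the assumption of equally sized single-channel inputs are choices made here for illustration.

```python
import cv2
import numpy as np

def laplacian_pyramid(img, levels):
    """Decompose a single-channel image into detail levels plus a residual."""
    pyr, cur = [], img.astype(np.float32)
    for _ in range(levels):
        down = cv2.pyrDown(cur)
        up = cv2.pyrUp(down, dstsize=(cur.shape[1], cur.shape[0]))
        pyr.append(cur - up)   # band-pass detail at this level
        cur = down
    pyr.append(cur)            # low-frequency residual
    return pyr

def fuse_images(visible, thermal, levels=4):
    """Fuse two equally sized grayscale frames into a third image."""
    pa = laplacian_pyramid(visible, levels)
    pb = laplacian_pyramid(thermal, levels)
    # Detail levels: keep the stronger response; residual: average.
    fused = [np.where(np.abs(a) >= np.abs(b), a, b)
             for a, b in zip(pa[:-1], pb[:-1])]
    out = (pa[-1] + pb[-1]) / 2
    for detail in reversed(fused):
        out = cv2.pyrUp(out, dstsize=(detail.shape[1], detail.shape[0])) + detail
    return np.clip(out, 0, 255).astype(np.uint8)
```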
As an alternative embodiment, an alternative night vision fusion system with image-based intelligent optical axis adjustment, further provided in the present application, may include, but is not limited to: an infrared camera, an infrared zoom lens, an infrared angle fine adjustment device, a low-illumination camera, a visible light zoom lens, a laser driver, a laser lens, a laser angle fine adjustment device, a controller, a photosensitive controller, a DSP high-speed fusion circuit, a display, and the like.
In an alternative embodiment, the infrared zoom lens and the infrared camera are connected through a connecting ring and fixed on the infrared angle fine adjustment device through a lens bracket. The infrared zoom lens communicates with the control circuit through a position feedback potentiometer. The infrared camera is connected with the DSP high-speed fusion circuit through a signal wire and communicates with the control circuit through a signal wire. The infrared angle fine adjustment device is connected with the controller through a signal wire: the device captures the position of the optical axis of the infrared imaging system and transmits it back to the controller, and the controller controls the infrared angle fine adjustment device according to the returned angle.
In an alternative embodiment, the low-light camera and the visible light zoom lens are connected through a connecting ring and fixed on the bottom plate through a hoop. The visible light zoom lens is communicated with the controller through a position feedback potentiometer; the low-illumination camera is connected with the DSP high-speed fusion circuit through a signal line and communicated with the controller through the signal line.
In an alternative embodiment, the laser driver is connected with the laser through a power supply line, the laser output is led out through an adapter optical fiber, and the adapter optical fiber is connected to the laser lens through a flange; the optical system in the laser lens adjusts the size of the laser spot and distributes the laser energy uniformly over the spot to homogenize the laser. The laser lens is fixed on the laser angle fine adjustment device through a lens support. The laser driver is connected with the controller through a signal line, the laser angle fine adjustment device is connected with the controller through a signal line, and the laser lens communicates with the controller through a position feedback potentiometer.
In an alternative embodiment, the DSP high-speed fusion circuit may communicate with the controller, and may transmit the fused image data to a display for displaying through a signal line.
Example 2
An embodiment of the present invention further provides an apparatus for implementing the method for controlling a night vision system, fig. 8 is a schematic structural diagram of an apparatus for controlling a night vision system according to an embodiment of the present invention, and as shown in fig. 8, the apparatus for controlling a night vision system includes: a first determination module 10, an acquisition module 12, a second determination module 14, and a control module 16, wherein,
the first determining module 10 is configured to determine a second distance between the optical axis of the first image capturing device and the optical axis of the laser lens; the obtaining module 12 is configured to obtain a first distance between an imaging plane of the first image capturing device and the target object based on the second distance; the second determining module 14 is configured to determine a target light-emitting angle of the laser lens according to the first distance and the second distance, where the laser lens is configured to emit laser light for supplementing light to the first image capturing device when the first image capturing device captures a target object; and the control module 16 is used for controlling the laser lens to adjust to a position corresponding to the target light-emitting angle.
In the embodiment of the present invention, the first determining module 10 is configured to determine a second distance between the optical axis of the first image capturing device and the optical axis of the laser lens; the obtaining module 12 is configured to obtain a first distance between an imaging plane of the first image capturing device and the target object based on the second distance; the second determining module 14 is configured to determine a target light-emitting angle of the laser lens according to the first distance and the second distance, where the laser lens is configured to emit laser light for supplementing light to the first image capturing device when the first image capturing device captures a target object; the control module 16 is used for controlling the laser lens to adjust to a position corresponding to the target light-emitting angle, so that the purpose of irradiating laser spots to the center of the field of view of the low-illumination camera by controlling the laser lens to rotate is achieved, the technical effects of improving the efficiency and the light supplement rate of laser illumination are achieved, and the technical problem that the night vision system in the prior art cannot adjust the angle of the laser lens in real time to cause low efficiency of laser illumination is solved.
It should be noted that the above modules may be implemented by software or hardware; for the latter, this may be implemented as follows: the modules may all be located in the same processor, or the modules may be located in different processors in any combination.
It should be noted that the first determining module 10, the obtaining module 12, the second determining module 14 and the control module 16 correspond to steps S102 to S108 in embodiment 1, and the modules are the same as the corresponding steps in the implementation example and application scenario, but are not limited to the disclosure in embodiment 1. It should be noted that the modules described above may be implemented in a computer terminal as part of an apparatus.
It should be noted that, reference may be made to the relevant description in embodiment 1 for alternative or preferred embodiments of this embodiment, and details are not described here again.
The above-described arrangement for controlling a night vision system may further comprise a processor and a memory, the above-described first determining module 10, the acquiring module 12, the second determining module 14, the control module 16, etc. being stored in the memory as program elements, the processor executing the above-described program elements stored in the memory to implement the respective functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory; one or more kernels may be provided. The memory may include volatile memory in a computer readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
The embodiment of the application also provides a storage medium. Optionally, in this embodiment, the storage medium includes a stored program, and the device on which the storage medium is located is controlled to execute any one of the above methods for controlling a night vision system when the program runs.
Optionally, in this embodiment, the storage medium may be located in any one of computer terminals in a computer terminal group in a computer network, or in any one of mobile terminals in a mobile terminal group.
The embodiment of the application also provides a processor. Optionally, in this embodiment, the processor is configured to execute a program, where the program executes any one of the above methods for controlling a night vision system.
An embodiment of the present application provides a device comprising a processor, a memory, and a program stored on the memory and executable on the processor, wherein the processor implements the following steps when executing the program: determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance; determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object; and controlling the laser lens to adjust to the position corresponding to the target light-emitting angle.
Optionally, when the processor executes a program, it may further detect whether an optical axis of the first image capturing device is parallel to an optical axis of the second image capturing device; if the detection result is negative, adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device; and if the detection result is positive, extracting first image characteristic data acquired by the first image acquisition device and second image characteristic data acquired by the second image acquisition device.
Optionally, when the processor executes a program, it may further determine whether the first image feature data matches the second image feature data; and if the first image characteristic data is matched with the second image characteristic data, acquiring a first distance between an imaging plane of the first image acquisition device and a target object and a second distance between an optical axis of the first image acquisition device and an optical axis of the laser lens.
Optionally, when the processor executes a program, first image data acquired by the first image acquisition device and second image data acquired by the second image acquisition device may also be acquired; performing fusion processing on the first image data and the second image data by adopting an image fusion algorithm to obtain third image data; and outputting the third image data.
Optionally, when the processor executes a program, the first image capturing device is configured to capture image data of the target object in a near infrared band, and the second image capturing device is configured to capture image data of the target object in a far infrared band.
Optionally, when the processor executes the program, the first distance between the imaging plane of the first image capturing device and the target object may be obtained according to the following formula:
wherein d is the first distance; f1 is the focal length of the lens of the first image acquisition device, and f2 is the focal length of the lens of the second image acquisition device; k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; x_r is the distance between the image of the energy center of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis; and x_l is the distance between the image of the energy center of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis.
Optionally, when the processor executes the program, the target light-emitting angle may be determined according to the first distance and the second distance by the following formula:
where θ is the target light-emitting angle.
The present application further provides a computer program product which, when executed on a data processing device, is adapted to execute a program initializing the following method steps: determining a second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; acquiring a first distance between an imaging plane of the first image acquisition device and a target object based on the second distance; determining a target light-emitting angle of the laser lens according to the first distance and the second distance, wherein the laser lens is used for emitting laser for supplementing light for the first image acquisition device when the first image acquisition device shoots the target object; and controlling the laser lens to adjust to the position corresponding to the target light-emitting angle.
Optionally, when the computer program product executes a program, it may further detect whether an optical axis of the first image capturing device is parallel to an optical axis of the second image capturing device; if the detection result is negative, adjusting the optical axis of the first image acquisition device to be parallel to the optical axis of the second image acquisition device; and if the detection result is positive, extracting first image characteristic data acquired by the first image acquisition device and second image characteristic data acquired by the second image acquisition device.
Optionally, when the computer program product executes a program, it may further determine whether the first image feature data matches the second image feature data; and if the first image characteristic data is matched with the second image characteristic data, acquiring a first distance between an imaging plane of the first image acquisition device and a target object and a second distance between an optical axis of the first image acquisition device and an optical axis of the laser lens.
Optionally, when the computer program product executes a program, first image data acquired by the first image acquisition device and second image data acquired by the second image acquisition device may also be acquired; performing fusion processing on the first image data and the second image data by adopting an image fusion algorithm to obtain third image data; and outputting the third image data.
Optionally, when the computer program product executes a program, the first image capturing device is configured to capture image data of the target object in a near infrared band, and the second image capturing device is configured to capture image data of the target object in a far infrared band.
Optionally, when the computer program product executes a program, the first distance between the imaging plane of the first image capturing device and the target object may be obtained according to the following formula:
wherein d is the first distance; f1 is the focal length of the lens of the first image acquisition device, and f2 is the focal length of the lens of the second image acquisition device; k is the distance between the imaging plane of the first image acquisition device and the imaging plane of the second image acquisition device; b is the second distance between the optical axis of the first image acquisition device and the optical axis of the laser lens; x_r is the distance between the image of the energy center of gravity of the target object in the imaging plane of the first image acquisition device and the optical axis; and x_l is the distance between the image of the energy center of gravity of the target object in the imaging plane of the second image acquisition device and the optical axis.
Optionally, when the computer program product executes a program, the target light-emitting angle may be determined according to the first distance and the second distance by the following formula:
where θ is the target light-emitting angle.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those of ordinary skill in the art, various modifications and refinements can be made without departing from the principle of the present invention, and these modifications and refinements should also be regarded as falling within the protection scope of the present invention.