CN107563971A - A true-color high-definition night vision imaging method - Google Patents
Abstract
To solve the problem that existing night vision imaging methods have difficulty obtaining a true-color high-definition night vision image, the present invention discloses a true-color high-definition night vision imaging method, characterized in that: a high-resolution grayscale image and a low-resolution, high-signal-to-noise-ratio color image of the same scene are obtained, and the low-resolution high-signal-to-noise-ratio color image is used to colorize the high-resolution grayscale image, yielding a true-color high-definition night vision image. Compared with existing night vision imaging methods, the method of the invention can significantly reduce imaging system cost and improve color night vision image quality.
Description
Technical Field
The invention belongs to the technical field of night vision imaging, and particularly relates to a true-color high-definition night vision imaging method.
Background
Night vision imaging technology plays an important role in military, police, civil, and other fields. Current methods for realizing night vision imaging include: infrared thermal imaging night vision, low-light-level photomultiplier night vision, laser light-supplement night vision, and the like.
Infrared thermal imaging night vision realizes imaging by sensing the heat radiation of the scene. Its advantage is that it works at extremely low illumination, even in complete darkness; its disadvantages are high device cost, low resolution, and inability to acquire the surface texture information of objects. Low-light-level photomultiplier night vision imaging amplifies incoming photons by a factor of tens of thousands or even hundreds of thousands through a photomultiplier tube and forms the image on a fluorescent screen; its disadvantages are that a color image cannot be obtained and image noise is large. Laser light-supplement night vision uses a laser as a light source for long-range supplementary lighting and captures the image with an image sensor; its advantage is applicability to environments without ambient light, but its disadvantage is that it easily exposes the observer's own position.
For night vision imaging, the color of the target object is very important discrimination information. Research shows that compared with black and white images, the color images can improve the recognition speed by 30% and improve the recognition rate by 60%. Therefore, obtaining a true-color high-definition night vision image is a problem to be solved urgently in the technical field of night vision imaging at present.
At present, a Bayer image sensor is generally used for color imaging, the basic principle of which is shown in fig. 1, a Color Filter Array (CFA) composed of three color filters of r, g, and b is covered above a CCD or CMOS image sensor pixel array, each pixel senses a color component of one channel, and color components of other two channels are obtained by interpolating color components of adjacent pixels, so as to obtain an rgb color image. Because the optical filter is arranged at the front end of the pixel, each pixel can only receive photons in the current color wavelength range. And the pixel of the monochromatic image sensor is not provided with a color filter, so that all photons in the wavelength sensitive range (generally 0.5-0.9 um) can be obtained. Therefore, under the same illumination conditions, the Bayer pattern image sensor acquires fewer photons and generates more image noise than a monochrome image sensor. Theoretically, the sensitivity of a CCD or CMOS is closely related to the size of the pixel. Generally speaking, the larger the pixel size, the higher the sensitivity, the higher the image signal-to-noise ratio, and the better the imaging effect. Therefore, in order to improve the imaging capability of color images, the pixel size needs to be increased to increase the number of photons received and improve the signal-to-noise ratio of the images. However, increasing the pixel size increases the chip area of the high-resolution color image sensor, which significantly increases the manufacturing cost of the chip. Aiming at the problem, the high-resolution color night vision imaging method provided by the invention can greatly improve the color night vision imaging effect under the condition of not increasing the chip area.
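The shot-noise argument above can be checked numerically. The sketch below (illustrative only, not part of the patent text) simulates Poisson photon arrival for a small pixel and for a pixel with four times the photosensitive area; the signal-to-noise ratio scales as the square root of the mean photon count, so the larger pixel roughly doubles the SNR.

```python
import numpy as np

rng = np.random.default_rng(0)

# Photon arrival is Poisson-distributed: for a mean count of N photons,
# the shot-noise SNR is N / sqrt(N) = sqrt(N).  A pixel with 4x the
# photosensitive area collects ~4x the photons, doubling the SNR.
def shot_noise_snr(mean_photons: float, n_samples: int = 200_000) -> float:
    counts = rng.poisson(mean_photons, n_samples).astype(float)
    return counts.mean() / counts.std()

snr_small = shot_noise_snr(25.0)    # small pixel, ~25 photons per frame
snr_large = shot_noise_snr(100.0)   # 4x-area pixel, ~100 photons per frame

print(round(snr_small, 1), round(snr_large, 1))  # ~sqrt(25)=5 vs ~sqrt(100)=10
```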
At present, to realize color night vision imaging, application No. CN201410038964.5 proposes using a Philips prism to split the input light into three monochromatic paths R, G, and B, imaging each path with its own monochrome CCD, and finally fusing the three color images into one color image through a color combiner; application No. CN201410039525.6 proposes using three dichroic mirrors to split the input light into R, G, and B paths, imaging each with a monochrome CCD at its rear end, and fusing the three color images into a color image through a color combiner; application No. CN201510201154.1 splits the light with a motorized color wheel, images with an EMCCD and a monochrome CCD, and synthesizes the color image in the YUV color space. These patents share a common problem: after the input light is color-separated, monochrome sensors are used for imaging; a color image can be obtained, but since the wavelength range of each R, G, B channel is narrow, a large amount of photon information is lost and the image signal-to-noise ratio is low. The present invention adopts a different technical principle: high-resolution color night vision imaging can be realized with only one high-resolution monochrome image sensor and one low-resolution Bayer image sensor, so fewer image sensors are required and the cost is low.
Techniques closer to the present invention include application No. CN201410712735.7, which proposes a method, apparatus and terminal for photo noise reduction, comprising: starting a first camera and a second camera, where the two cameras are adjacent and located on the same shooting plane; when a shutter instruction is detected, triggering the first camera to shoot a color picture of the current scene and synchronously triggering the second camera to shoot a black-and-white picture of the same scene; and synthesizing the color picture and the black-and-white picture and outputting the result. The disadvantage of this method is: 1) directly using the black-and-white and color camera output images for synthesis increases the noise of the synthesized image. The reason is that the camera output image has already been processed by the ISP (image signal processor). Under low-illumination conditions, an image shot by a Bayer image sensor contains a large amount of noise, and the demosaicing step of the ISP introduces pixel noise from one channel into the interpolated pixels of the other channels, thereby reducing the signal-to-noise ratio of the demosaiced RGB color image. The method of the present invention uses the Bayer-format raw image directly, so there is no noise interference between channels.
In addition, the implementation device of the present invention also has near-infrared imaging capability, which can further improve the imaging quality of the black-and-white image. In the grayscale-and-color fusion method of the present invention, the grayscale image is used as a guide to denoise or upsample-refine the color image; by deliberately reducing the resolution of the color image and increasing its pixel size, the signal-to-noise ratio of the color image is improved at the source by physical means, further improving low-illumination color imaging quality without increasing the image sensor chip size. Application No. CN200810219843.5 proposes a method for fusing a low-resolution color image and a high-resolution black-and-white image, in which the three-channel color image is resampled and the result is multiplied by the luminance ratio of the grayscale image. The disadvantage of this method is that the detail information of the high-resolution grayscale image is not fully utilized: the color image is merely resampled, so it is difficult to obtain a high-resolution color image with clear details. The present invention differs in that more of the high-resolution grayscale image's detail is preserved, so the resulting high-resolution color image is richer in detail.
Disclosure of Invention
In order to solve the problem that the existing night vision imaging methods have difficulty obtaining a true-color high-definition night vision image, the invention discloses a true-color high-definition night vision imaging method, characterized by the following: a high-resolution grayscale image A and a low-resolution high-signal-to-noise-ratio color image B of the same scene are obtained, and the high-resolution grayscale image A is colorized using the low-resolution high-signal-to-noise-ratio color image B to obtain the true-color high-definition night vision image C.
The resolution of the true-color high-definition night vision image C is not less than 1,000,000 pixels (1 megapixel).
The methods for acquiring the high-resolution grayscale image A are as follows:
Method a1: imaging with a high-sensitivity monochrome image sensor, such as an EMCCD, ICCD, ICMOS, or sCMOS image sensor;
Method a2: imaging with a monochrome image sensor under added near-infrared auxiliary illumination; preferably, a near-infrared-enhanced monochrome image sensor is used.
The methods for acquiring the low-resolution high-signal-to-noise-ratio color image B comprise method b1 and method b2.
The method b1 is characterized in that: a Bayer image Q is shot with a high-sensitivity Bayer image sensor; according to the Bayer imaging pattern, taking 2 × 2 pixels as a processing unit, the r and b channel pixels and the average of the 2 g channel pixels within each 2 × 2 block of the Bayer image Q are taken to form an rgb color-space image P, whose resolution is 1/4 that of image Q; mean or Gaussian filtering and downsampling are applied to image P to obtain image D, where the mean filtering window is n1 × n1 pixels and n1 ranges from 3 to 100; image A is denoised to enhance its signal-to-noise ratio; mean or Gaussian filtering and downsampling are applied to image A to obtain an image E with the same resolution as image D; with image E as the guide image, the three channels {Dr, Dg, Db} of image D are denoised to obtain the low-resolution high-signal-to-noise-ratio color image B; the grayscale image A coincides with the field of view of the Bayer image Q.
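The 2 × 2 Bayer-cell averaging that produces image P can be sketched as follows; an RGGB cell layout is assumed here (the patent does not fix the pattern order).

```python
import numpy as np

def bayer_to_quarter_rgb(Q: np.ndarray) -> np.ndarray:
    """Collapse each 2x2 Bayer cell of raw image Q into one rgb pixel
    (method b1): r and b are taken directly, g is the mean of the two
    g samples.  An RGGB cell layout is assumed here."""
    r = Q[0::2, 0::2].astype(float)
    g = (Q[0::2, 1::2].astype(float) + Q[1::2, 0::2].astype(float)) / 2.0
    b = Q[1::2, 1::2].astype(float)
    return np.dstack([r, g, b])  # resolution is 1/4 of Q (half per axis)

Q = np.array([[10, 20, 10, 20],
              [30, 40, 30, 40],
              [10, 20, 10, 20],
              [30, 40, 30, 40]], dtype=np.uint8)
P = bayer_to_quarter_rgb(Q)
print(P.shape)   # (2, 2, 3)
print(P[0, 0])   # [10. 25. 40.]
```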
The methods for denoising the three channels {Dr, Dg, Db} of image D with image E as the guide image comprise method b1-a and method b1-b.
The method b1-a has the following operation steps:

Step b1-a-1:

Gaussian filtering is applied to the three channels of image D, with a Gaussian filtering window of s1 × s1 pixels, s1 in the range 3 to 111;

Step b1-a-2:

local smoothing filter weight coefficients are calculated from image E, and local smoothing filtering is applied to the three channels of image D:

take a pixel Ek in image E and the j-th pixel Ej in the neighborhood Ω of Ek, and calculate the local smoothing filter weight coefficient:

wj = exp(−||I(Ek) − I(Ej)||^a / ε)  (1)

where Ω is an s2 × s2 pixel neighborhood, s2 is in the range 3 to 111, I is the pixel value, a is the exponential coefficient with range 1 to 5, ε is an adjustment factor with range 0 to 100, || · || denotes the Euclidean distance between pixel values, k is the pixel index, and j is the pixel index within the neighborhood Ω;

the weight coefficients wj are normalized:

w′j = wj / Σ(j∈Ω) wj  (2)

where Σ(j∈Ω) wj denotes the sum of the weight coefficients of all pixels in the neighborhood Ω;

the normalized weight coefficient w′j is used as the local smoothing filter weight of pixel Dk in image D, and the r, g, b channel values of pixel Dk are locally smoothed:

D̂rk = Σ(j∈Ω) w′j · Drj  (3)
D̂gk = Σ(j∈Ω) w′j · Dgj  (4)
D̂bk = Σ(j∈Ω) w′j · Dbj  (5)

where "·" denotes multiplication, r, g, b denote the rgb color space, Drj, Dgj, Dbj denote the j-th pixel of the corresponding channel within the neighborhood Ω of image D, and the symbol "^" marks a filtered pixel value;

traversing image D yields the filtered result {D̂r, D̂g, D̂b}, which is taken as the low-resolution high-signal-to-noise-ratio color image B = {D̂r, D̂g, D̂b}, where D̂r, D̂g, D̂b are the filtered results of the r, g, b channels of image D.
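Method b1-a amounts to a joint (cross) bilateral filter: weights come from the guide image E, averaging is applied to the channels of D. A brute-force sketch, with the exponential weight form reconstructed from the variable definitions above, is:

```python
import numpy as np

def joint_bilateral_denoise(D, E, s2=3, a=2.0, eps=10.0):
    """Local smoothing of each channel of color image D using grayscale
    guide E (a sketch of method b1-a; the exact weight kernel is an
    assumption reconstructed from the patent's variable definitions)."""
    h, w, _ = D.shape
    r = s2 // 2
    Dp = np.pad(D.astype(float), ((r, r), (r, r), (0, 0)), mode="edge")
    Ep = np.pad(E.astype(float), r, mode="edge")
    out = np.zeros((h, w, 3), dtype=float)
    for y in range(h):
        for x in range(w):
            patchE = Ep[y:y + s2, x:x + s2]
            # weight from guide-image similarity: w_j = exp(-|dI|^a / eps)
            wgt = np.exp(-np.abs(patchE - Ep[y + r, x + r]) ** a / eps)
            wgt /= wgt.sum()                      # normalization, eq. (2)
            patchD = Dp[y:y + s2, x:x + s2, :]
            out[y, x] = (patchD * wgt[..., None]).sum(axis=(0, 1))
    return out

E = np.full((4, 4), 50.0)            # flat guide -> plain box averaging
D = np.zeros((4, 4, 3)); D[..., 0] = 90.0
B = joint_bilateral_denoise(D, E)
print(np.allclose(B[..., 0], 90.0))  # True: constant channel stays constant
```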
The method b1-b comprises the following steps:

Step b1-b-1:

calculate the local neighborhood mean of image E:

μE = fm(E)  (6)

calculate the local neighborhood means of the three color channels Dr, Dg, Db of image D:

μDc = fm(Dc), c ∈ {r, g, b}  (7)-(9)

Step b1-b-2:

calculate the local neighborhood variance of image E:

σE = fm(E .* E) − μE .* μE  (10)

calculate the local neighborhood covariance between image E and the corresponding pixels of the three color channels Dr, Dg, Db of image D:

σE,Dc = fm(E .* Dc) − μE .* μDc, c ∈ {r, g, b}  (11)-(13)

Step b1-b-3:

calculate the linear transformation coefficients:

ac = σE,Dc ./ (σE + τc), c ∈ {r, g, b}  (14)-(16)
bc = μDc − ac .* μE, c ∈ {r, g, b}  (17)-(19)

where τr, τg, τb are penalty coefficients with value range (0-10);

Step b1-b-4:

calculate the local neighborhood means of the linear transformation coefficients:

μac = fm(ac), μbc = fm(bc), c ∈ {r, g, b}  (20)-(25)

Step b1-b-5:

apply the linear transformation to the three color channels Dr, Dg, Db of image D:

D̂c = μac .* E + μbc, c ∈ {r, g, b}  (26)-(28)

where fm(·) is image mean filtering with window size s3 × s3, s3 in the range 3 to 111; .* is the dot-product operation, i.e. multiplication of same-position elements of two matrices; ./ is the dot-division operation, i.e. division of same-position elements of two matrices;

the image {D̂r, D̂g, D̂b} is taken as the low-resolution high-signal-to-noise-ratio color image B.
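Method b1-b is the classic guided-filter formulation. A minimal single-channel sketch, with fm(·) implemented as a simple box mean, might look like this:

```python
import numpy as np

def box_mean(img, s):
    """fm(.): an s x s mean filter with edge-replicated borders."""
    r = s // 2
    p = np.pad(img, r, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = p[y:y + s, x:x + s].mean()
    return out

def guided_denoise_channel(Dc, E, s3=3, tau=0.01):
    """One channel of method b1-b: in each local window, channel Dc is
    approximated by a linear model of the guide image E."""
    mu_E, mu_D = box_mean(E, s3), box_mean(Dc, s3)
    var_E = box_mean(E * E, s3) - mu_E * mu_E           # eq. (10)
    cov_ED = box_mean(E * Dc, s3) - mu_E * mu_D         # eqs. (11)-(13)
    a = cov_ED / (var_E + tau)                          # eqs. (14)-(16)
    b = mu_D - a * mu_E                                 # eqs. (17)-(19)
    return box_mean(a, s3) * E + box_mean(b, s3)        # eqs. (26)-(28)

E = np.tile(np.arange(6, dtype=float), (6, 1))          # ramp guide image
Dc = E.copy()                                           # noise-free channel
out = guided_denoise_channel(Dc, E, tau=1e-6)
print(np.allclose(out, Dc, atol=1e-3))                  # True: guide is exact
```

With a tiny penalty coefficient the filter reproduces the channel exactly wherever the guide explains it; a larger tau smooths more aggressively.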
The method b2 is characterized in that: without increasing the area of the Bayer image sensor chip, the image resolution is reduced and the pixel size is increased, improving the light sensitivity of the Bayer image sensor and yielding a high-signal-to-noise-ratio Bayer image Q′, whose resolution is 1/f that of image A, with f in the range 4 to 100000; according to the Bayer imaging pattern, taking 2 × 2 pixels as a processing unit, the r and b channel pixels and the average of the 2 g channel pixels within each 2 × 2 block of image Q′ are taken to form the low-resolution high-signal-to-noise-ratio color image B; the grayscale image A coincides with the field of view of the Bayer image Q′.
The methods for colorizing the high-resolution grayscale image A with the low-resolution high-signal-to-noise-ratio color image B to obtain the true-color high-definition night vision image C comprise method c1 and method c2.
The method c1 comprises the following steps:

Step c1-1:

image A is denoised to enhance its signal-to-noise ratio; a color space λ that separates color from luminance is selected, and a three-channel blank image with the same resolution as image A is created: F = {Fλ1, Fλ2, Fλ3}, where Fλ1 is the luminance component and {Fλ2, Fλ3} are the color components; image B is converted into color space λ to obtain the color image Bλ = {Bλ1, Bλ2, Bλ3}, where Bλ1 is the luminance component and {Bλ2, Bλ3} are the color components;

Step c1-2:

image A is substituted into the luminance component of image F: Fλ1 = A; the color components {Bλ2, Bλ3} of image Bλ are upsampled and refined to obtain color components {B′λ2, B′λ3} with the same resolution as image F; the color components {B′λ2, B′λ3} are filled into the color components of image F: {Fλ2 = B′λ2, Fλ3 = B′λ3};

Step c1-3:

image F is converted into an rgb color-space image as the true-color high-definition night vision image C; the luminance information in image C is derived from image A and the color information from image B.
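Method c1 can be sketched in the YUV space (any of the color-luminance-separating spaces listed later would serve equally); nearest-neighbour upsampling stands in here for the refinement methods d1/d2, and the BT.601 matrix is a standard assumption:

```python
import numpy as np

# RGB<->YUV (BT.601) conversion matrices
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.147, -0.289,  0.436],
                    [ 0.615, -0.515, -0.100]])
YUV2RGB = np.linalg.inv(RGB2YUV)

def colorize(A, B):
    """Method c1 sketch: keep the high-resolution luminance A, borrow the
    chroma of the low-resolution color image B (upsampled here by a
    simple nearest-neighbour repeat; an integer scale is assumed)."""
    k = A.shape[0] // B.shape[0]
    B_up = np.repeat(np.repeat(B, k, axis=0), k, axis=1)
    yuv = B_up @ RGB2YUV.T
    yuv[..., 0] = A                      # F_lambda1 = A
    return yuv @ YUV2RGB.T               # back to rgb: image C

A = np.full((4, 4), 0.5)                 # high-res luminance
B = np.zeros((2, 2, 3)); B[..., 0] = 1.0 # low-res pure-red color image
C = colorize(A, B)
print(C.shape)                           # (4, 4, 3)
```

Recomputing the luminance of C returns exactly A, since only the chroma rows of the YUV vector came from B.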
The method c2 comprises the following steps:
Step c2-1:

a color space λ that separates color from luminance is selected, and a three-channel blank image with the same resolution as image A is created: F = {Fλ1, Fλ2, Fλ3}, where Fλ1 is the luminance component and {Fλ2, Fλ3} are the color components; image B is converted into color space λ to obtain the color image Bλ = {Bλ1, Bλ2, Bλ3}, where Bλ1 is the luminance component and {Bλ2, Bλ3} are the color components;

Step c2-2:

pixel interpolation is applied to the luminance component Bλ1 of image Bλ to obtain a luminance component B′λ1 with the same resolution as image F;

Step c2-3:

edge-preserving multi-scale decomposition is applied to image A to obtain a base image Abase and detail images {A1d, …, And}, where (n + 1) is the number of decomposition layers, in the range 1 to 10;

the base image is obtained by edge-preserving smoothing filtering:

Abase = f0smooth(A)  (29)

each layer of the detail images is obtained by differencing a smoothed image of some scale with the base image:

Az1d = fz1smooth(A) − Abase  (30)

a detail image can also be obtained by differencing two smoothed images of different scales:

Az1d = fz1smooth(A) − fz2smooth(A)  (31)

in formulas (29) to (31), fsmooth(·) is image edge-preserving smoothing filtering; the superscripts 0, z1, z2 are labels of the filter scale layers, a larger label denotes a smaller scale, and label 0 denotes the maximum filtering scale;

Step c2-4: the base image Abase, the detail images {A1d, …, And} and the luminance component B′λ1 are fused by weighting and filled into the luminance component Fλ1 of image F:

Fλ1 = ω0 · Abase + ω1 · A1d + … + ωn · And + ωn+1 · B′λ1  (32)

where {ω0, ω1, …, ωn+1} are weighting coefficients in the range 0 to 1;

Step c2-5:

the color components {Bλ2, Bλ3} of image Bλ are upsampled and refined to obtain color components {B′λ2, B′λ3} with the same resolution as image F, and {B′λ2, B′λ3} are filled into the color components of image F: {Fλ2 = B′λ2, Fλ3 = B′λ3};

Step c2-6:

image F is converted into an rgb color-space image as the true-color high-definition night vision image C; the luminance information in image C is derived from image B and image A, and the color information from image B.
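The decomposition and fusion of steps c2-3 and c2-4 can be sketched as follows; a plain box blur stands in for the edge-preserving smoothing filter, and the default weight values are illustrative assumptions:

```python
import numpy as np

def smooth(img, s):
    """Stand-in for f_smooth(.): a simple s x s box blur (the patent's
    actual choices -- WLS, local Laplacian, median -- preserve edges)."""
    r = s // 2
    p = np.pad(img, r, mode="edge")
    return np.array([[p[y:y + s, x:x + s].mean()
                      for x in range(img.shape[1])]
                     for y in range(img.shape[0])])

def fuse_luminance(A, B1_up, scales=(5, 3), weights=(1.0, 0.8, 0.8, 0.2)):
    """Method c2 sketch: base = coarsest smoothing of A (eq. 29), details
    = differences between successive smoothing scales (eq. 31), then a
    weighted sum with the upsampled low-res luminance B1_up (eq. 32).
    The default scales and weights are illustrative assumptions."""
    levels = [smooth(A, s) for s in scales] + [A]   # coarse -> fine
    base = levels[0]
    details = [levels[i + 1] - levels[i] for i in range(len(scales))]
    F = weights[0] * base
    F = F + sum(w * d for w, d in zip(weights[1:-1], details))
    return F + weights[-1] * B1_up

A = np.random.default_rng(1).random((8, 8))
F = fuse_luminance(A, np.zeros((8, 8)))
print(F.shape)  # (8, 8)
```

With all base/detail weights set to 1 the decomposition telescopes and A is reproduced exactly, which is a quick sanity check of the layering.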
The color and brightness separation color space lambda comprises YUV, YCbCr, HSI, Lab and CIEXYZ color spaces.
In method c2, the image edge-preserving smoothing filter fsmooth(·) includes weighted-least-squares (WLS) edge-preserving smoothing, local-extrema-based edge-preserving smoothing, local-Laplacian-pyramid edge-preserving smoothing, median filtering, and weighted median filtering.
In methods c1 and c2, the methods for upsampling and refining the color components {Bλ2, Bλ3} of image Bλ to obtain color components {B′λ2, B′λ3} with the same resolution as image F comprise method d1 and method d2.

Method d1 is characterized by: the color components {Bλ2, Bλ3} are upsampled by sub-pixel interpolation to obtain the color components {B′λ2, B′λ3}.
The sub-pixel interpolation method comprises nearest neighbor interpolation, bilinear interpolation, Gaussian interpolation, B-spline interpolation and RBF interpolation.
Method d2 is characterized by: with the luminance component Fλ1 of image F as a guide, the color components {Bλ2, Bλ3} are upsampled and refined; the specific methods comprise method d2-a and method d2-b.
The method d2-a operates as follows:

Step d2-a-1: the color components {Bλ2, Bλ3} are upsampled by linear interpolation to obtain color components {B̃λ2, B̃λ3} with the same resolution as image F;

Step d2-a-2: Gaussian filtering is applied to the color components {B̃λ2, B̃λ3}, with a filtering window of s4 × s4 pixels, s4 in the range 3 to 111;

Step d2-a-3: local smoothing filter weight coefficients are calculated from the luminance component Fλ1, and local smoothing filtering is applied to the color components {B̃λ2, B̃λ3}:

take pixel Fλ1,k′ in the luminance component Fλ1 and the j′-th pixel Fλ1,j′ in the neighborhood Ω′ of Fλ1,k′, and calculate the local smoothing filter weight coefficient:

w″j′ = exp(−||I(Fλ1,k′) − I(Fλ1,j′)||^a / ε)  (33)

where Ω′ is an s5 × s5 pixel neighborhood, s5 is in the range 3 to 111, I is the pixel value, a is the exponential coefficient with range 1 to 5, ε is an adjustment factor with range 0 to 100, || · || denotes the Euclidean distance between pixel values, and k′ is the pixel index;

the weight coefficients w″j′ are normalized:

w‴j′ = w″j′ / Σ(j′∈Ω′) w″j′  (34)

where Σ(j′∈Ω′) w″j′ denotes the sum of the weight coefficients w″j′ of all pixels in the neighborhood Ω′;

the normalized weight coefficient w‴j′ is used as the local smoothing filter weight of pixels B̃λ2,k′ and B̃λ3,k′, and the refined color component pixels are obtained by local smoothing filtering:

B′λ2,k′ = Σ(j′∈Ω′) w‴j′ · B̃λ2,j′  (35)
B′λ3,k′ = Σ(j′∈Ω′) w‴j′ · B̃λ3,j′  (36)

where B̃λ2,j′ and B̃λ3,j′ are the j′-th pixels within the neighborhood Ω′;

traversing the color components {B̃λ2, B̃λ3} yields the filtered result {B′λ2, B′λ3} as the upsampling-refinement result of the color components {Bλ2, Bλ3}.
The method d2-b operates as follows:

Step d2-b-1:

the color components {Bλ2, Bλ3} are upsampled by linear interpolation to obtain color components {B̃λ2, B̃λ3} with the same resolution as image F;

Step d2-b-2:

calculate the local neighborhood mean of the luminance component Fλ1 of image F:

μF = fm(Fλ1)  (37)

calculate the local neighborhood means of the color components {B̃λ2, B̃λ3}:

μλc = fm(B̃λc), c ∈ {λ2, λ3}  (38)-(39)

Step d2-b-3:

calculate the local neighborhood variance of Fλ1:

σF = fm(Fλ1 .* Fλ1) − μF .* μF  (40)

calculate the local neighborhood covariance between Fλ1 and the corresponding pixels of {B̃λ2, B̃λ3}:

σF,λc = fm(Fλ1 .* B̃λc) − μF .* μλc, c ∈ {λ2, λ3}  (41)-(42)

Step d2-b-4:

calculate the linear transformation coefficients:

aλc = σF,λc ./ (σF + τλc), c ∈ {λ2, λ3}  (43)-(44)
bλc = μλc − aλc .* μF, c ∈ {λ2, λ3}  (45)-(46)

where τλ2, τλ3 are penalty coefficients with value range (0-10);

Step d2-b-5:

calculate the local neighborhood means of the linear transformation coefficients:

μa,λc = fm(aλc), μb,λc = fm(bλc), c ∈ {λ2, λ3}  (47)-(50)

Step d2-b-6:

apply the linear transformation to the color components {B̃λ2, B̃λ3}:

B′λc = μa,λc .* Fλ1 + μb,λc, c ∈ {λ2, λ3}  (51)-(52)

In method d2-b, fm(·) is image mean filtering with window size s6 × s6, s6 in the range 3 to 111.
The methods for denoising image A include: Gaussian filtering, median filtering, NLM, BM3D, and KSVD.
Finally, white balance correction is applied to the true-color high-definition night vision image C; nonlinear transformations such as Gamma correction and tone mapping are applied to image C to improve its brightness, contrast, and dynamic range; and color enhancement is applied to image C with a CCM (color correction matrix) method.
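A sketch of this finishing chain; the gray-world gain model for white balance and the identity default matrix for the CCM are illustrative assumptions, not mandated by the patent:

```python
import numpy as np

def postprocess(C, gamma=2.2, ccm=None):
    """Finishing chain: gray-world white balance, Gamma correction, then
    a 3x3 CCM color-enhancement matrix applied to each rgb pixel."""
    C = C.astype(float)
    # gray-world white balance: scale each channel toward the global mean
    gains = C.mean() / C.mean(axis=(0, 1))
    C = np.clip(C * gains, 0.0, 1.0)
    # Gamma correction lifts shadows and compresses highlights
    C = C ** (1.0 / gamma)
    # CCM: each output channel is a linear mix of the input channels
    M = np.eye(3) if ccm is None else np.asarray(ccm)
    return np.clip(C @ M.T, 0.0, 1.0)

C = np.random.default_rng(2).random((4, 4, 3))
out = postprocess(C)
print(out.shape)  # (4, 4, 3), values clipped to [0, 1]
```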
When image A and image B are acquired by method a2 and method b2, the following processing flow may also be adopted:

The first step: image B is upsampled with the bilinear interpolation of method d1 to obtain an image Bu with the same resolution as image A; the superscript u denotes that Bu is an upsampled image;

The second step: referring to method b1-a or b1-b, with image A as the guide image, local smoothing filtering is applied to the three channels of image Bu to obtain an image Bu,s; the superscripts u and s denote that Bu,s is the locally smoothed filtering result of the upsampled image Bu;

The third step: referring to method c2, in a selected color space λ the image Bu,s is converted into an image Bu,s,λ; the superscript λ denotes that Bu,s,λ is the image Bu,s converted into color space λ; the luminance component of Bu,s,λ and image A are fused using the image edge-preserving multi-scale decomposition method, and the fusion result is converted into an rgb color-space image as the true-color high-definition night vision image C.
When image A is a near-infrared image, method C3 may also be adopted to transfer the color information of image B onto image A, obtaining the true-color high-definition night vision image C:
Step C3-1: the brightness of image A and image B is adjusted so that the brightness of image B becomes similar to that of image A, and image B is upsampled by method d1 or d2 to obtain an image B′ with the same resolution as image A;

the method for adjusting the brightness of image A and image B is as follows: calculate the mean ma and standard deviation σa of image A; convert image B into a grayscale image Gb and calculate its mean mGb and standard deviation σGb; then apply a linear transformation to all pixels of image B:

I′c = (Ic − mGb) .* (σa / σGb) + ma, c ∈ {r, g, b}  (53)

where Ic is the c-channel pixel value of image B, .* denotes multiplication and / denotes division;
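The mean-and-deviation matching of step C3-1 can be sketched as follows; the rgb-to-gray weights are a standard luminance assumption:

```python
import numpy as np

def match_brightness(B, A):
    """Step C3-1 sketch: shift/scale every rgb pixel of B so that the
    gray version of B matches the mean and standard deviation of the
    near-infrared image A (eq. 53 as reconstructed in the text)."""
    w = np.array([0.299, 0.587, 0.114])          # assumed gray weights
    Gb = B @ w                                   # grayscale image of B
    m_a, s_a = A.mean(), A.std()
    m_gb, s_gb = Gb.mean(), Gb.std()
    return (B - m_gb) * (s_a / s_gb) + m_a

A = np.random.default_rng(3).random((4, 4)) * 0.8 + 0.1
B = np.random.default_rng(4).random((4, 4, 3)) * 0.2
B2 = match_brightness(B, A)
G2 = B2 @ np.array([0.299, 0.587, 0.114])
print(np.isclose(G2.mean(), A.mean()), np.isclose(G2.std(), A.std()))
```

Because the gray weights sum to one, the same shift applied to all three channels shifts the gray image identically, so the match is exact.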
Step C3-2: an input vector VA,B′ is formed from the pixels of image A and image B′, and the pixel values of image C, IC = {IrC, IgC, IbC}, are calculated by polynomial regression; the combination forms of VA,B′ include 1st order, 2nd order, 3rd order, and so on, where the 1st-order combination is:

VA,B′ = [IA, IrB′, IgB′, IbB′]T  (54)

the 2nd-order combination (55) appends the pairwise products of the 1st-order terms, and the 3rd-order combination (56) further appends the 3rd-order product terms;

VA,B′ is an m-dimensional vector: m = 4 for the 1st-order combination, m = 10 for the 2nd-order combination, and m = 19 for the 3rd-order combination;

the polynomial regression model is:

IC = X VA,B′  (57)

where X is a polynomial coefficient matrix of dimension 3 × m (58);

Step C3-3: with image A as the guide image, local neighborhood smoothing filtering is applied to image C by method b1-a or b1-b to eliminate the image noise of the polynomial regression estimate, yielding the true-color high-definition night vision image C.
In step C3-2, the coefficient matrix X is calculated as follows:

a color-card color image C′ is shot under visible-light illumination; a color-card near-infrared image N is shot under near-infrared light; a color-card low-illumination color image C″ is shot under low illumination without near-infrared supplementary lighting;

n pixels are taken from each of Y color-patch areas of image C′ to form the output matrix I′C; I′C has dimension 3 × Y′, where Y′ = Y × n and Y ranges from 1 to 50;

n pixels are taken from the same Y color-patch areas of images N and C″ and assembled, according to the selected combination form, into the input matrix V′A,B′ of dimension m × Y′;
estimating a coefficient matrix X by using least square regression:
X = (V′A,B′ · V′A,B′T)−1 (V′A,B′ · I′CT)  (59)
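The least-squares fit of eq. (59) can be sketched as follows; the training data here are synthetic, and the solve is written so that IC = X V of eq. (57) holds:

```python
import numpy as np

def fit_coefficients(V, I_C):
    """Least-squares estimate of the 3 x m polynomial coefficient matrix
    X from training pairs (eq. 59): X^T = (V V^T)^{-1} (V I_C^T)."""
    return np.linalg.solve(V @ V.T, V @ I_C.T).T   # 3 x m

rng = np.random.default_rng(5)
# 1st-order combination, m = 4: [I_A, I_r, I_g, I_b] per training pixel
V = rng.random((4, 50))          # m x Y' input matrix (Y' = 50 pixels)
X_true = rng.random((3, 4))      # hypothetical ground-truth coefficients
I_C = X_true @ V                 # noiseless training outputs, 3 x Y'
X = fit_coefficients(V, I_C)
print(np.allclose(X, X_true))    # True: the coefficients are recovered
```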
the method for acquiring the image A and the Bayer Q, Bayer image Q' comprises a method 1 to a method 11.
Method 1: two lenses are used for imaging; a monochrome image sensor is placed at the rear end of lens 1 to shoot the grayscale image A, and a Bayer image sensor is placed at the rear end of lens 2 to shoot the Bayer image Q or Q′; the coordinate mapping relation between the images shot through lens 1 and lens 2 is found by a binocular stereo-vision calibration method, and coordinate transformation is applied to the shot image A and image Q or Q′ so that they have the same field of view; preferably, the optical axes of lens 1 and lens 2 are placed in parallel with a spacing of less than 50 mm; preferably, lens 1 adopts a near-infrared anti-reflection lens, a near-infrared light source is added for auxiliary illumination, and a near-infrared cut-off filter is placed at the front end of the Bayer image sensor.
Method 2: two lenses are used for imaging. A spectroscope is arranged at the front end of lens 1 at 45 degrees to the optical axis of lens 1; lens 1 receives the light transmitted by the spectroscope, lens 2 is arranged on the reflection light path of the spectroscope and receives the reflected light, and the optical axes of lens 1 and lens 2 are perpendicular. A monochrome image sensor at the rear end of lens 1 shoots the image A, and a Bayer image sensor at the rear end of lens 2 shoots the image Q or Q′. The positions of lens 1, lens 2 and the spectroscope are adjusted so that the image A and the image Q or Q′ have the same field of view and the imaging is clear.
Method 3: two lenses are used for imaging. A spectroscope is arranged at the front end of lens 1 at 45 degrees to the optical axis of lens 1; a reflecting mirror is arranged on the reflection light path of the spectroscope with its mirror surface parallel to that of the spectroscope, and lens 2 is arranged on the reflection light path of the reflecting mirror. Lens 1 receives the light transmitted by the spectroscope, and lens 2 receives the light reflected by the spectroscope and the reflecting mirror. A monochrome image sensor at the rear end of lens 1 shoots a grayscale image A, and a Bayer image sensor at the rear end of lens 2 shoots a Bayer image Q or Q′. The positions of lens 1, lens 2, the spectroscope and the reflecting mirror are adjusted so that the image A and the image Q or Q′ have the same field of view and the imaging is clear.
Method 4: two lenses are used for imaging. A dichroic mirror for separating visible and near-infrared light is arranged at the front end of lens 1 at 45 degrees to the optical axis of lens 1; lens 1 receives the light transmitted by the dichroic mirror, and lens 2, arranged on its reflection light path, receives the reflected light. When a dichroic mirror that reflects near-infrared light and transmits visible light is adopted, a Bayer image sensor at the rear end of lens 1 shoots the Bayer image Q or Q′ and a monochrome image sensor at the rear end of lens 2 shoots the grayscale image A; when a dichroic mirror that reflects visible light and transmits near-infrared light is adopted, a monochrome image sensor at the rear end of lens 1 shoots the grayscale image A and a Bayer image sensor at the rear end of lens 2 shoots the Bayer image Q or Q′. The positions of lens 1, lens 2 and the dichroic mirror are adjusted so that the Bayer image Q or Q′ and the grayscale image A have the same field of view and the imaging is clear. Preferably, a near-infrared light source is added for auxiliary illumination.
Method 5: two lenses are used for imaging. A dichroic mirror for separating visible and near-infrared light is arranged at the front end of lens 1 at 45 degrees to the optical axis of lens 1. A reflecting mirror is arranged on the reflection light path of the dichroic mirror with its mirror surface parallel to that of the dichroic mirror; lens 2 is arranged on the reflection light path of the reflecting mirror, with its optical axis parallel to that of lens 1. When a dichroic mirror that reflects near-infrared light and transmits visible light is adopted, a Bayer image sensor at the rear end of lens 1 shoots the Bayer image Q or Q′ and a monochrome image sensor at the rear end of lens 2 shoots the grayscale image A; when a dichroic mirror that reflects visible light and transmits near-infrared light is adopted, a Bayer image sensor at the rear end of lens 2 shoots the Bayer image Q or Q′ and a monochrome image sensor at the rear end of lens 1 shoots the grayscale image A. The positions of lens 1, lens 2, the dichroic mirror and the reflecting mirror are adjusted so that the Bayer image Q or Q′ and the grayscale image A have the same field of view and the imaging is clear. Preferably, a near-infrared light source is added for auxiliary illumination.
Method 6: one lens is used for imaging, with a dichroic mirror for separating visible and near-infrared light arranged at the rear end of the lens at 45 degrees to the lens optical axis. When a dichroic mirror that reflects near-infrared light and transmits visible light is adopted, a Bayer image sensor on the transmission light path shoots the Bayer image Q or Q′ and a monochrome image sensor on the reflection light path shoots the grayscale image A; the Bayer image sensor plane is perpendicular to the lens optical axis and the monochrome image sensor plane is parallel to it. When a dichroic mirror that reflects visible light and transmits near-infrared light is adopted, a Bayer image sensor on the reflection light path shoots the Bayer image Q or Q′ and a monochrome image sensor on the transmission light path shoots the grayscale image A; the Bayer image sensor plane is parallel to the lens optical axis and the monochrome image sensor plane is perpendicular to it. The positions of the Bayer image sensor, the monochrome image sensor and the dichroic mirror are adjusted so that the fields of view of the shot images coincide and the imaging is clear. Preferably, a near-infrared light source is added for auxiliary illumination, and a near-infrared anti-reflection lens is adopted.
Method 7: one lens is used for imaging. A dichroic mirror for separating visible and near-infrared light is arranged at the rear end of the lens at 45 degrees to the lens optical axis, and a reflecting mirror is arranged on its reflection light path with the mirror surface parallel to that of the dichroic mirror. When a dichroic mirror that reflects near-infrared light and transmits visible light is adopted, a Bayer image sensor on the transmission light path of the dichroic mirror shoots the Bayer image Q or Q′ and a monochrome image sensor on the reflection light path of the reflecting mirror shoots the grayscale image A; when a dichroic mirror that transmits near-infrared light and reflects visible light is adopted, a monochrome image sensor on the transmission light path of the dichroic mirror shoots the grayscale image A and a Bayer image sensor on the reflection light path of the reflecting mirror shoots the Bayer image Q or Q′. In both cases the Bayer image sensor plane and the monochrome image sensor plane are perpendicular to the lens optical axis. The positions of the Bayer image sensor, the monochrome image sensor, the dichroic mirror and the reflecting mirror are adjusted so that the fields of view of the shot images coincide and the imaging is clear. Preferably, a near-infrared light source is added for auxiliary illumination, and a near-infrared anti-reflection lens is adopted.
Method 8: one lens is used for imaging, with a spectroscope arranged at the rear end of the lens at 45 degrees to the lens optical axis. A Bayer image sensor on the transmission light path of the spectroscope shoots the Bayer image Q or Q′, its plane perpendicular to the lens optical axis; a monochrome image sensor on the reflection light path of the spectroscope shoots the grayscale image A, its plane parallel to the lens optical axis. The Bayer image sensor thus receives the transmitted light of the spectroscope and the monochrome image sensor its reflected light. The positions of the Bayer image sensor, the monochrome image sensor and the spectroscope are adjusted so that the fields of view of the shot images coincide and the imaging is clear.
Method 9: the monochrome image sensor and the Bayer image sensor in method 8 are interchanged.
Method 10: one lens is used for imaging, with a spectroscope arranged at the rear end of the lens at 45 degrees to the lens optical axis. A Bayer image sensor on the transmission light path of the spectroscope shoots the Bayer image Q or Q′, its plane perpendicular to the lens optical axis. A reflecting mirror is arranged on the reflection light path of the spectroscope with its mirror surface parallel to that of the spectroscope, and a monochrome image sensor on the reflection light path of the reflecting mirror shoots the grayscale image A, its plane perpendicular to the lens optical axis. The Bayer image sensor receives the transmitted light of the spectroscope and the monochrome image sensor the reflected light. The positions of the Bayer image sensor, the monochrome image sensor, the spectroscope and the reflecting mirror are adjusted so that the fields of view of the shot images coincide and the imaging is clear.
Method 11: the monochrome image sensor and the Bayer image sensor in method 10 are interchanged.
In methods 1 to 11, the spectroscope, the dichroic mirror and the reflecting mirror can each be implemented as a prism or a plane mirror. The requirement that the imaging fields of view of the grayscale image A and the Bayer image Q or Q′ coincide means that the mean positional offset between corresponding (same-named) pixels does not exceed 100 pixels.
In method 7, as shown in fig. 20, an equivalent optical path can be realized with an L-shaped prism having 5 light input and output end faces O1, O2, O3, O4 and O5. The O1 end face faces the lens, is perpendicular to the lens optical axis, and is coated with a near-infrared antireflection film. The O2 end face lies behind the O1 end face at 45 degrees to it and is coated with a film that transmits visible light and reflects near-infrared light, serving as the equivalent dichroic mirror. The O3 end face is parallel to the O2 end face, lies on the reflection light path of the O2 end face, and is coated with a near-infrared reflection film, serving as the equivalent reflecting mirror. The O4 end face lies on the reflection light path of the O3 end face and is coated with a near-infrared antireflection film. The O5 end face lies on the transmission light path of the O2 end face and is coated with a visible-light antireflection film. The O4, O5 and O1 end faces are parallel, and the optical path lengths from the O1 end face to the O4 and O5 end faces are the same; these optical path lengths are determined by the back focal length of the lens. A monochrome image sensor is attached to the O4 end face and a Bayer image sensor to the O5 end face, and their positions are adjusted so that the fields of view of the shot images coincide.
In method 7, an L-shaped prism having 5 light input and output end faces O1, O2, O3, O4 and O5 can likewise realize the equivalent optical path for the other dichroic variant. The O1 end face faces the lens, is perpendicular to the lens optical axis, and is coated with a near-infrared antireflection film. The O2 end face lies behind the O1 end face at 45 degrees to it and is coated with a film that transmits near-infrared light and reflects visible light, serving as the equivalent dichroic mirror. The O3 end face is parallel to the O2 end face, lies on the reflection light path of the O2 end face, and is coated with a visible-light reflection film, serving as the equivalent reflecting mirror. The O4 end face lies on the reflection light path of the O3 end face and is coated with a visible-light antireflection film. The O5 end face lies on the transmission light path of the O2 end face and is coated with a near-infrared antireflection film. The O4, O5 and O1 end faces are parallel and perpendicular to the lens optical axis; the optical path lengths from the O1 end face to the O4 and O5 end faces are the same and are determined by the back focal length of the lens. A Bayer image sensor is attached to the O4 end face and a monochrome image sensor to the O5 end face, and their positions are adjusted so that the fields of view of the shot images coincide.
In method 10, an L-shaped prism having 5 light input and output end faces O1, O2, O3, O4 and O5 can realize the equivalent optical path. The O1 end face faces the lens, is perpendicular to the lens optical axis, and is coated with a near-infrared antireflection film. The O2 end face lies behind the O1 end face at 45 degrees to it and is coated with a semi-transmissive, semi-reflective film, serving as the equivalent spectroscope. The O3 end face is parallel to the O2 end face, lies on the reflection light path of the O2 end face, and is coated with a visible-light and near-infrared reflection film, serving as the equivalent reflecting mirror. The O4 end face lies on the reflection light path of the O3 end face and is coated with a visible-light and near-infrared antireflection film. The O5 end face lies on the transmission light path of the O2 end face and is coated with a visible-light antireflection film. The O4, O5 and O1 end faces are parallel, and the optical path lengths from the O1 end face to the O4 and O5 end faces are the same, selected according to the back focal length of the lens. A Bayer image sensor is attached to the O4 end face and a monochrome image sensor to the O5 end face; their positions are adjusted so that the fields of view of the shot images coincide. The Bayer image sensor and monochrome image sensor locations may be interchanged.
The invention has the beneficial effects that:
1) Provides a low-cost true-color high-definition night vision imaging method
The invention makes full use of the mechanism that human vision is sensitive to brightness changes and insensitive to color changes, and uses the color information of a low-resolution, high-signal-to-noise-ratio color image to color a high-resolution grayscale image, obtaining a high-resolution color night vision image. This imaging method retains the high-frequency brightness information of the grayscale image while reducing the required color image resolution. The advantage brought by the reduced resolution requirement is that, after the color image resolution is greatly reduced, larger pixels can be made on a chip wafer of the same area, improving the light sensitivity and imaging quality of the color image sensor. In the simulation example of fig. 2, fig. 2(a) is the original high-resolution RGB three-channel color image with a resolution of 4000 × 3000 pixels; fig. 2(b) is its luminance component Y after conversion to YUV space, also 4000 × 3000 pixels; and fig. 2(c) is the RGB three-channel image obtained by down-sampling fig. 2(a) by 1/(32 × 32), with a resolution of 125 × 94 pixels. Fig. 2(c) is converted to YUV space, its UV components are up-sampled 32 × 32 times by bilinear interpolation to a 4000 × 3000 UV image, this is superposed with the Y component of fig. 2(b) to give a 4000 × 3000 YUV image, and the result is converted back to the RGB three-channel image of fig. 2(d) at 4000 × 3000 resolution. Figs. 2(a) and 2(d) show no obvious difference in brightness detail or color. This simulation demonstrates that a high-resolution color image can be obtained by fusing a low-resolution color image with a high-resolution black-and-white image.
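The fig. 2 simulation described above can be reproduced in miniature. The sketch below assumes a BT.601-style RGB/YUV conversion (the patent only requires some luma/chroma-separating space) and uses nearest-neighbor chroma up-sampling for brevity instead of bilinear interpolation:

```python
import numpy as np

# BT.601-style RGB<->YUV matrices (an assumption; any luma/chroma space works).
RGB2YUV = np.array([[0.299, 0.587, 0.114],
                    [-0.147, -0.289, 0.436],
                    [0.615, -0.515, -0.100]])
YUV2RGB = np.linalg.inv(RGB2YUV)

def fuse(gray_hi, rgb_lo, block):
    """Color a high-resolution luminance image with chroma from a
    low-resolution RGB image: upsample U,V by `block` (nearest neighbor
    here for brevity) and keep the high-resolution Y unchanged."""
    yuv_lo = rgb_lo @ RGB2YUV.T
    u = np.kron(yuv_lo[..., 1], np.ones((block, block)))
    v = np.kron(yuv_lo[..., 2], np.ones((block, block)))
    return np.stack([gray_hi, u, v], axis=-1) @ YUV2RGB.T

# Flat-color scene: the fusion should reproduce the original color exactly.
rgb = np.full((8, 8, 3), [0.6, 0.3, 0.1])
gray = (rgb @ RGB2YUV.T)[..., 0]   # high-resolution luma component
rgb_lo = rgb[::4, ::4]             # 1/(4 x 4) down-sampled color image
out = fuse(gray, rgb_lo, 4)
print(np.allclose(out, rgb))       # True
```

On real images the reconstruction is only approximate near color edges, which is exactly why the patent's guided up-sampling variants (methods d2-a/d2-b) exist.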
In the above simulation example, when the color image resolution is 1/(32 × 32) of that of the black-and-white image, its light-sensing capability is 32 × 32 times that of a Bayer sensor with the same resolution as the black-and-white image. Achieving the same light-sensing capability directly with one image sensor chip would require increasing the chip area by a factor of 32 × 32, which is difficult to achieve. The invention therefore realizes high-resolution color imaging with only one high-resolution black-and-white image sensor and one low-resolution Bayer image sensor, which is of great significance for reducing the cost of high-resolution color night vision imaging.
2) Provides two color image acquisition methods with high signal-to-noise ratio
The invention provides two methods for acquiring a color image with a high signal-to-noise ratio: method b1 and method b2. Method b1 improves the color image signal-to-noise ratio in software and contains two innovations. First, the Bayer image is processed in units of 2 × 2 pixels, extracting the r and b pixels and the mean of the g pixels to obtain a three-channel rgb image; this avoids the crosstalk among the r, g and b channels caused by CFA interpolation in the traditional ISP processing flow and reduces the image noise level. Second, the grayscale image is used as the guide image to denoise the three channels of the color image, fully exploiting the higher signal-to-noise ratio of the grayscale image to further improve that of the color image. Method b2 improves the color image signal-to-noise ratio in hardware: by reducing the image resolution and increasing the pixel size, the single-pixel sensitivity under night vision conditions is improved without increasing the chip area of the image sensor.
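The first innovation of method b1, collapsing each 2 × 2 Bayer cell into one RGB pixel without CFA interpolation, can be sketched as follows (an RGGB layout is assumed; other CFA orders merely permute the indices):

```python
import numpy as np

def bayer_to_rgb_2x2(bayer):
    """Collapse each 2x2 RGGB cell into one RGB pixel: take the R and B
    samples directly and average the two G samples. No demosaicking is
    performed, so no inter-channel crosstalk is introduced."""
    r  = bayer[0::2, 0::2]
    g1 = bayer[0::2, 1::2]
    g2 = bayer[1::2, 0::2]
    b  = bayer[1::2, 1::2]
    return np.stack([r, (g1 + g2) / 2.0, b], axis=-1)

# One 2x2 test cell: R=10, G=20 and 30, B=40 -> single pixel (10, 25, 40).
cell = np.array([[10.0, 20.0],
                 [30.0, 40.0]])
print(bayer_to_rgb_2x2(cell).tolist())  # [[[10.0, 25.0, 40.0]]]
```

The output has half the linear resolution of the Bayer mosaic, which matches the patent's premise that chroma resolution can be sacrificed for noise performance.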
3) Provides three efficient gray level image coloring methods
The invention provides three grayscale image coloring methods: c1, c2 and c3. Method c1 works in a color space where color and brightness are separated: the high-resolution grayscale image is used as the guide image to up-sample and refine the color components of the color image, which are then superposed on the high-resolution grayscale image; the computation is small and easy to implement on embedded platforms such as FPGAs and DSPs, making it suitable for fusing a visible-light grayscale image with color. Method c2 extracts the detail information of the grayscale image through edge-preserving multi-scale decomposition and fuses it with the color information of the color image in a color-and-brightness-separated color space; although the computation is larger, it better avoids the color cast that method c1 produces when the grayscale image is a near-infrared image, and is therefore suitable for fusing a near-infrared image with a color image. As shown in fig. 6, the low-resolution noisy color image (fig. 6.a) is denoised and up-sampled with the near-infrared image (fig. 6.b) as the guide; then, in a luminance-and-color-separated color space, an edge-preserving multi-scale decomposition yields a base image and multi-layer detail images of the near-infrared image, and superposing the near-infrared detail layers on the luminance component of the color image gives the fused color image (fig. 6.c). The fused image combines the detail information of the near-infrared image (fig. 6.b) with the color information of the low-resolution color image (fig. 6.a), with no color cast.
Method c3 addresses the coloring of near-infrared and color images by fusing the detail information of the near-infrared image with the color image through polynomial regression. Like method c2, it avoids the color cast that arises in method c1's fusion; compared with method c2, it does not require repeated image edge-preserving smoothing filtering, which reduces the computation and benefits real-time coloring.
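The polynomial-regression coloring of method c3 can be sketched as below. The exact combination form of V_{A,B′} is defined elsewhere in the patent, so the feature set here (constant, NIR gray value, two chroma values, and their 2nd-order products) is purely illustrative:

```python
import numpy as np

def features(a, u, v, order=1):
    """Per-pixel polynomial feature vector from the NIR gray value `a` and
    the chroma components (u, v) of the up-sampled color image.
    NOTE: this feature set is an illustrative stand-in, not the patent's
    exact combination form."""
    f = [np.ones_like(a), a, u, v]
    if order >= 2:
        f += [a * a, u * u, v * v, a * u, a * v, u * v]
    return np.stack(f, axis=0)             # m x N feature matrix

def colorize(a, u, v, X, order=1):
    """Apply the learned coefficient matrix X (m x 3): I = X^T V."""
    return X.T @ features(a, u, v, order)  # 3 x N predicted colors

# Toy example: identity-like coefficients pass features straight through.
a = np.array([0.5]); u = np.array([0.1]); v = np.array([-0.2])
X = np.eye(4, 3)
print(colorize(a, u, v, X).ravel().tolist())  # [1.0, 0.5, 0.1]
```

Raising `order` enlarges m, i.e. the number of rows of V_{A,B′}, which corresponds to the 1st-, 2nd- and 3rd-order combination forms exercised in embodiments 6 to 8.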
4) Provides three effective color component up-sampling refinement methods
The invention provides three color-component up-sampling refinement methods: method d1, method d2-a and method d2-b. Method d1 up-samples and refines the low-resolution color components through image sub-pixel interpolation, obtaining high-resolution color components at the same resolution as the grayscale image; the computation is simple and easy to implement in embedded hardware such as FPGAs and DSPs. Methods d2-a and d2-b both use the grayscale image as the guide image to up-sample and refine the color components, so that the refined color components are finer in regions such as object boundaries, avoiding the obvious blocking artifacts produced when low-resolution color components are up-sampled to high resolution. The color image resolution requirement can therefore be further reduced by using methods d2-a and d2-b.
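Guided refinement in the spirit of methods d2-a/d2-b can be sketched with a minimal guided filter (He et al.), applied per color component with the grayscale image as the guide. This is an illustrative stand-in, not the patent's exact d2 formulation:

```python
import numpy as np

def box(x, r):
    """Mean filter of radius r via an integral image (edge-padded)."""
    p = np.pad(x, r, mode='edge')
    c = np.cumsum(np.cumsum(p, 0), 1)
    c = np.pad(c, ((1, 0), (1, 0)))          # leading zero row/column
    k = 2 * r + 1
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def guided_filter(guide, src, r=2, eps=1e-4):
    """Edge-preserving smoothing of one color component `src`, guided by
    the grayscale image `guide`: locally fit src ~ a*guide + b."""
    mI, mp = box(guide, r), box(src, r)
    a = (box(guide * src, r) - mI * mp) / (box(guide * guide, r) - mI * mI + eps)
    b = mp - a * mI
    return box(a, r) * guide + box(b, r)

# Sanity check: a constant component stays constant regardless of the guide.
rng = np.random.default_rng(1)
guide = rng.random((12, 12))
flat = np.full((12, 12), 0.7)
out = guided_filter(guide, flat)
print(np.allclose(out, 0.7))  # True
```

Because the local linear model ties the output to the guide's edges, chroma up-sampled this way snaps to object boundaries in the grayscale image instead of showing block artifacts.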
5) The invention uses a near-infrared light source for supplementary lighting, which markedly enhances the imaging quality of the high-resolution grayscale image; with the near-infrared image as the guide, denoising or up-sampling refinement of the color image markedly improves its signal-to-noise ratio. Fig. 5.a shows a noisy color image, fig. 5.b a near-infrared image, and fig. 5.c the result of denoising the noisy image (fig. 5.a) by method b1-b with the near-infrared image (fig. 5.b) as the guide image. The denoising result shows that using the near-infrared image as the guide reduces the noise of the color image.
The true-color high-definition night vision imaging method disclosed by the invention can be used in video monitoring; automotive imaging, including vehicle-mounted night vision devices, driving recorders and rear-view devices; police applications such as night patrol, night capture, individual combat, investigation and guard duty; unmanned aerial vehicle visual navigation and aerial photography; helicopter-assisted take-off and landing and helicopter aerial photography; night operation of engineering machinery including cranes, tower cranes, pump trucks and fire trucks; night outdoor exploration including hiking and touring; night outdoor entertainment including live-action CS night confrontation games; scientific investigation including night wildlife detection; and all application fields involving night color imaging.
Drawings
FIG. 1 is a Bayer format image schematic;
FIG. 2 is an example of a fusion of a high resolution grayscale image with a low resolution color image, wherein (a) a full-color original image, (b) a high resolution grayscale image, (c) a low resolution color image, and (d) a composite color image;
FIG. 3 is an example of grayscale image-guided color image denoising in which (a) the grayscale image, (b) the noise contaminates the color image, (c) method b1-a the denoising result, (d) method b1-b the denoising result;
fig. 4 grayscale image-guided color image upsampling refinement examples, in which (a) a grayscale image, (b) a noise-containing low-resolution color image, (c) method d1 upsampling results, (d) method d2-a upsampling refinement results, (e) method d2-b upsampling refinement results;
FIG. 5 is an example of near-infrared image guided color image denoising, where (a) is a noisy color image, (b) is a near-infrared image, and (c) is the denoising result using method b1-b with the near-infrared image as a guide;
fig. 6 shows an example of fusion of a near-infrared image with a low-resolution noisy color image, wherein (a) the low-resolution noisy color image, (b) the near-infrared image, and (c) the fused image;
FIG. 7 is a schematic diagram of a dual lens imaging apparatus;
FIG. 8 is a schematic diagram of a near-infrared light-compensating dual-lens imaging device;
FIG. 9 is a schematic view of a dual lens coaxial imaging device employing a beam splitter;
FIG. 10 is a schematic diagram of a dual-lens imaging apparatus using a beam splitter for near-infrared light compensation;
FIG. 11 is a schematic view of a dual-lens coaxial imaging device using a beam splitter and a reflector;
FIG. 12 is a schematic diagram of a dual-lens coaxial imaging device using a spectroscope and a reflector for near-infrared light supplement;
FIG. 13 is a schematic diagram of a coaxial dual-lens imaging device using dichroic mirrors for near-infrared light compensation;
FIG. 14 is a schematic diagram of a dual-lens coaxial imaging device using dichroic mirrors and reflectors;
FIG. 15 is a schematic view of a single lens coaxial imaging device employing a beam splitter;
FIG. 16 is a schematic diagram of a single-lens coaxial imaging device using a beam splitter for near-infrared light supplement;
FIG. 17 is a schematic view of a single-lens coaxial imaging device using a spectroscope and a reflector;
fig. 18 is a schematic diagram of a single-lens coaxial imaging device using a dichroic mirror for near-infrared light supplement;
FIG. 19 is a schematic diagram of a single-lens coaxial imaging apparatus using a dichroic mirror and a reflecting mirror for near-infrared light compensation;
FIG. 20 is a schematic diagram of an implementation of coaxial imaging based on an L-shaped prism;
In the figures: 1, near-infrared anti-reflection lens; 2, monochrome image sensor; 3, visible light lens; 4, Bayer image sensor; 5, optical axis of the near-infrared anti-reflection lens; 6, optical axis of the visible light lens; 7, near-infrared light source; 8, near-infrared cut-off filter; 9, spectroscope; 10, shading box; 11, light entrance window; 12, reflecting mirror; 13, dichroic mirror; 14, camera housing; 15, L-shaped prism; O1-O5, light input and output end faces of the L-shaped prism.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and examples.
Example 1
The grayscale image A was acquired by method a1, using an EMCCD monochrome image sensor with an imaging resolution of 2 million pixels; the color image B was acquired by method b1, with a Bayer image sensor of 2-million-pixel imaging resolution. In method b1, a KSVD image denoising algorithm is used to denoise the grayscale image A; then method b1-a is used, with the grayscale image as the guide image, to denoise the three channel images of the color image D, obtaining a color image B with a high signal-to-noise ratio and a resolution of 500,000 pixels. In method b1-a, s1 = 5, a = 2, and ε = 3. Method c1 is used to color the denoised grayscale image A with the color information of the color image B, obtaining a high-resolution color night vision image C with a resolution of 2 million pixels. In method c1, the YUV color space is selected as the color-and-luminance-separated color space λ; method d1 is used to up-sample and refine the color components {B_λ2, B_λ3} of the color image B_λ, obtaining color components {B′_λ2, B′_λ3} with the same resolution as the grayscale image A; B-spline interpolation is adopted as the sub-pixel interpolation method.
Example 2
The difference from embodiment 1 is that in method b1, the b1-b variant is adopted, with the denoised grayscale image used as the guide image to denoise the three channel images of the color image D. In method b1-b, s3 = 5, τ_r = 0.01, τ_g = 0.01, τ_b = 0.01.
Example 3
The differences from embodiment 1 are: the grayscale image A is acquired by an sCMOS monochrome image sensor with an imaging resolution of 2 million pixels; and the color image B is acquired by method b2, with a Bayer image sensor whose chip area equals that of the sCMOS monochrome image sensor and whose image resolution is 125,000 pixels.
Example 4
The difference from the embodiment 1 is that: the grayscale image a is acquired by a CMOS monochrome image sensor, and the imaging resolution is 200 ten thousand pixels.
Example 5
The differences from embodiment 1 are: the grayscale image A is acquired by a CMOS monochrome image sensor with an imaging resolution of 2 million pixels, and near-infrared illumination with a light-source spectral range of 800 nm to 900 nm is added, so the shot grayscale image A is a near-infrared image. Method c2 is used to color the denoised grayscale image A with the color information of the color image B, obtaining a high-resolution color night vision image C with a resolution of 2 million pixels. In method c2, the YUV color space is selected as the color-and-luminance-separated color space λ; the number of image edge-preserving multi-scale decomposition layers is 5; the image detail fusion weighting coefficients are {ω_0, ω_1, …, ω_{n+1}} = [0, 0.2, 0.2, 0.2, 0.2, 0.2]; the image edge-preserving smoothing filtering adopts the weighted-least-squares edge-preserving smoothing method; and the color components {B_λ2, B_λ3} of the color image B_λ are up-sampled and refined by method d2-b with s6 = 5. As shown in fig. 6, fig. 6.a is the low-resolution noisy color image, fig. 6.b is the near-infrared image, and fig. 6.c is the fused image.
Example 6
The difference from embodiment 5 is that the color information of the color image B is colored onto the near-infrared image A by method c3, where V_{A,B′} adopts the 1st-order combination form.
Example 7
The difference from example 6 is that VA,B' adopts the 2nd-order combination form.
Example 8
The difference from example 6 is that VA,B' adopts the 3rd-order combination form.
Example 9
The difference from embodiment 1 is that HSV color space is selected as the color space λ.
Example 10
The difference from embodiment 1 is that the color space λ selects the YCbCr color space.
Example 11
The difference from example 1 is that the color space λ is selected to be a Lab color space.
Example 12
The difference from embodiment 1 is that, to avoid color distortion in image C, gray-edge-based white balance correction is applied to image C. To increase the brightness, contrast, and dynamic range of the image, image C is processed with nonlinear transformation operations such as Gamma correction and tone mapping. To make image C more vivid, a matrix method is adopted for color enhancement of the color image.
Example 13
Method a2 and method b2 are adopted to acquire image A and image B: the monochrome image sensor is a CMOS image sensor with an imaging resolution of 4 megapixels, an 850 nm near-infrared LED light source provides illumination, the Bayer image sensor has the same chip area as the monochrome image sensor with an imaging resolution of 40,000 pixels, and the color image C is obtained by the following processing flow:
The first step is as follows: the bilinear interpolation method in method d1 is adopted to upsample image B, obtaining an image B^u with the same resolution as image A; the superscript u denotes that B^u is an upsampled image;
The second step is that: referring to method b1-a or b1-b, image A is used as the guide image to apply local smoothing filtering to the three channels of image B^u, obtaining an image B^{u,s}; the superscripts u and s denote that B^{u,s} is the locally smooth-filtered version of the upsampled image B^u;
The third step: referring to method c2, image B^{u,s} is converted in the selected color space λ to image B^{u,s,λ}; the superscript λ denotes that B^{u,s,λ} is image B^{u,s} converted into color space λ; the image edge-preserving multi-scale decomposition method is adopted to fuse the luminance component of image B^{u,s,λ} with image A, and the fusion result is converted into an rgb color space image as the true-color high-definition night vision image C.
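The three-step flow above can be sketched in code. This is an illustrative toy, not the patented implementation: nearest-neighbour upsampling stands in for the bilinear interpolation of method d1, a plain box blur stands in for the A-guided smoothing of methods b1-a/b1-b, and standard YUV conversion matrices are assumed for the color space λ.

```python
import numpy as np

# Standard RGB<->YUV matrices (assumed; the patent only names the YUV color space).
RGB2YUV = np.array([[0.299, 0.587, 0.114],
                    [-0.14713, -0.28886, 0.436],
                    [0.615, -0.51499, -0.10001]])
YUV2RGB = np.linalg.inv(RGB2YUV)

def upsample(B, f):
    # step 1 stand-in: bring low-resolution B up to A's resolution (nearest, not bilinear)
    return np.repeat(np.repeat(B, f, axis=0), f, axis=1)

def box_smooth(img, r=1):
    # step 2 stand-in: crude local smoothing of each channel (the patent guides this with A)
    p = np.pad(img, ((r, r), (r, r), (0, 0)), mode='edge')
    k = 2 * r + 1
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse(A, B, f):
    # step 3: convert the smoothed color image to YUV and replace its luminance with A
    Bus = box_smooth(upsample(B, f))
    yuv = Bus @ RGB2YUV.T
    yuv[..., 0] = A
    return yuv @ YUV2RGB.T
```

With a gray low-resolution image B, the output keeps B's (neutral) chroma while taking all luminance detail from A, which is the essence of the fusion step.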
Example 14
As shown in fig. 7, two lenses are used for imaging: a monochrome image sensor 2 is arranged at the rear end of a near-infrared anti-reflection lens 1, and a Bayer image sensor 4 at the rear end of a visible-light lens 3. The coordinate mapping relation between the images captured through the near-infrared anti-reflection lens 1 and the visible-light lens 3 is found by a binocular stereo-vision calibration method, and coordinate transformation is applied to the captured gray-scale image A and the Bayer image Q or image Q' so that the fields of view of image A and image Q or Q' coincide.
Example 15
The difference from embodiment 14 is that, as shown in fig. 8, the optical axis (5) of the near-infrared anti-reflection lens 1 and the optical axis (6) of the visible-light lens 3 are placed in parallel, with an axis spacing of 40 mm; a near-infrared light source (7) with an 850 nm wavelength is added for auxiliary illumination, and a near-infrared cut-off filter (8) is arranged at the front end of the Bayer image sensor.
Example 16
As shown in fig. 9, a spectroscope (9) is arranged at the front end of the visible-light lens (3) at an angle of 45 degrees with the optical axis (6); the visible-light lens (3) receives the transmitted light of the spectroscope (9); a near-infrared anti-reflection lens (1) is arranged on the reflected-light path of the spectroscope (9) and receives the reflected light of the spectroscope (9), with the optical axis (5) perpendicular to the optical axis (6); a monochrome image sensor (2) is arranged at the rear end of the near-infrared anti-reflection lens (1) to capture the gray-scale image A, a Bayer image sensor (4) is arranged at the rear end of the visible-light lens (3) to capture the Bayer image Q or image Q', and a near-infrared cut-off filter (8) is arranged at the front end of the Bayer image sensor (4); the device is enclosed in a light-shielding box (10), in which a light inlet window (11) is formed, the light inlet window (11) being positioned on the optical axis (6) of the visible-light lens (3); the distances from the near-infrared anti-reflection lens (1) and the visible-light lens (3) to the spectroscope (9) are adjusted so that the gray-scale image A and the Bayer image Q or Q' have the same field angle. The light-shielding box (10) is a fully enclosed box except for the light inlet window (11) and shields external light.
Example 17
As shown in fig. 10, the difference from the embodiment 16 is that a near-infrared light source (7) is added outside the light shielding box (10) for illumination.
Example 18
As shown in fig. 11, the difference from embodiment 16 is that a reflecting mirror (12) is added in the reflected light path of the beam splitter (9), the mirror surface of the reflecting mirror (12) is parallel to the mirror surface of the beam splitter (9), the near-infrared anti-reflection lens (1) is disposed in the reflected light path of the reflecting mirror (12), and the optical axes of the near-infrared anti-reflection lens (1) and the visible light lens (3) are parallel. And adjusting the distances from the near-infrared anti-reflection lens (1) to the reflector (12) and from the visible light lens (3) to the spectroscope (9) to ensure that the gray image A and the Bayer image Q or the image Q' have the same field angle.
Example 19
As shown in fig. 12, the difference from the embodiment 18 is that the illumination by the near-infrared light source (7) is added.
Example 20
As shown in fig. 13, the difference from example 17 is that the spectroscope (9) is replaced with a dichroic mirror (13) that transmits visible light and reflects near-infrared light, and the near-infrared cut filter (8) at the front end of the Bayer image sensor is eliminated.
Example 21
As shown in fig. 14, the difference from embodiment 18 is that the spectroscope (9) is replaced with a dichroic mirror (13) that transmits visible light and reflects near-infrared light, and the selected reflecting mirror (12) has near-infrared reflecting capability.
Example 22
As shown in fig. 15, on the optical axis (5) of the near-infrared anti-reflection lens (1), a spectroscope (9) is arranged inside the casing (14) at an angle of 45 degrees with the optical axis (5) of the near-infrared anti-reflection lens; a Bayer image sensor (4) is arranged on the transmitted-light path of the spectroscope, with a near-infrared cut-off filter (8) at the front end of the Bayer image sensor (4); a monochrome image sensor (2) is arranged on the reflected-light path of the spectroscope (9); the imaging planes of the Bayer image sensor (4) and the monochrome image sensor (2) are perpendicular; the distances from the Bayer image sensor (4) and the monochrome image sensor (2) to the spectroscope (9) are adjusted so that image A and image Q or Q' have the same field of view.
Example 23
As shown in fig. 16, the difference from the embodiment 22 is that a near-infrared light source (7) is added for illumination outside the housing (14).
Example 24
As shown in fig. 17, the difference from embodiment 22 is that a reflecting mirror (12) is disposed in parallel with the spectroscope (9) on the reflected light path of the spectroscope (9), a monochrome image sensor (2) is disposed in the reflected light path of the reflecting mirror (12), the imaging planes of the monochrome image sensor (2) and the Bayer image sensor (4) are parallel, and the distances between the monochrome image sensor (2) and the reflecting mirror (12) and between the Bayer image sensor (4) and the spectroscope (9) are adjusted so that the image a coincides with the Bayer image Q or Q' field of view.
Example 25
As shown in fig. 18, the difference from example 23 is that the spectroscope (9) is replaced with a dichroic mirror (13) that transmits visible light and reflects near-infrared light, and the near-infrared cut filter at the front end of the Bayer image sensor is eliminated.
Example 26
As shown in fig. 19, the difference from embodiment 24 is that the spectroscope (9) is replaced by a dichroic mirror (13) that transmits visible light and reflects near-infrared light, the selected reflecting mirror (12) has near-infrared reflecting capability, the near-infrared cut filter (8) at the front end of the Bayer image sensor is eliminated, and near-infrared light source illumination is added outside the housing (14).
Example 27
An L-shaped prism is adopted to realize coaxial imaging. As shown in fig. 20, the L-shaped prism is fixed at the rear end of the lens 1 and has 5 light input and output end faces O1, O2, O3, O4, and O5. The O1 end face faces the lens, is perpendicular to the optical axis of the lens, and is plated with a near-infrared antireflection film; the O2 end face is positioned behind the O1 end face at an angle of 45 degrees with it, and is plated with a film that transmits visible light and reflects near-infrared light, serving as an equivalent dichroic mirror; the O3 end face is parallel to the O2 end face, is positioned on the reflected-light path of the O2 end face, and is plated with a near-infrared reflection film, serving as an equivalent reflector; the O4 end face is positioned on the reflected-light path of the O3 end face and is plated with a near-infrared antireflection film; the O5 end face is positioned on the transmitted-light path of the O2 end face and is plated with a visible-light antireflection film; the O4, O5, and O1 end faces are parallel, and the optical path lengths from the O1 end face to the O4 and O5 end faces are the same; the optical path lengths from the O1 end face to the O4 and O5 end faces are determined by the back focal length of the lens; a monochrome image sensor is attached to the O4 end face, and a Bayer image sensor to the O5 end face; the positions of the monochrome image sensor and the Bayer image sensor are adjusted so that the fields of view of the captured images coincide, and the gray-scale image A and the Bayer image Q or Q' are acquired.
Example 28
The coaxial imaging is realized by adopting an L-shaped prism, and the L-shaped prism is provided with 5 light inlet and outlet end faces of O1, O2, O3, O4 and O5; the end face of O1 is opposite to the lens and is vertical to the optical axis of the lens, and the end face of O1 is plated with a near infrared antireflection film; the end face of O2 is positioned behind the end face of O1 and forms an angle of 45 degrees with the end face of O1, and the end face of O2 is plated with a film which transmits near infrared light and reflects visible light and is used for an equivalent dichroic mirror; the end face of O3 is parallel to the end face of O2, is positioned on a light reflecting path of the end face of O2, and is plated with a visible light reflecting film for an equivalent reflector; the end face of O4 is positioned on the reflection light path of the end face of O3, and a visible light antireflection film is plated on the end face of O4; the end face of O5 is positioned on the transmission light path of the end face of O2, and is plated with a near infrared light antireflection film; the end faces of O4, O5 and O1 are parallel and vertical to the optical axis of the lens; the optical distances from the end face of O1 to the end faces of O4 and O5 are the same; determining the optical path lengths from the end face of O1 to the end faces of O4 and O5 according to the back focal length of the lens; a Bayer image sensor is attached to the end face of O4, and a monochrome image sensor is attached to the end face of O5; the positions of the monochrome image sensor and the Bayer image sensor are adjusted to enable the visual fields of the shot images to be overlapped, and a gray image A and a Bayer image Q or Q' are acquired.
Example 29
The coaxial imaging is realized by adopting an L-shaped prism, and the L-shaped prism is provided with 5 light inlet and outlet end faces of O1, O2, O3, O4 and O5; the end face of O1 is opposite to the lens and is vertical to the optical axis of the lens, and the end face of O1 is plated with a near infrared antireflection film; the end face of O2 is positioned behind the end face of O1 and forms an angle of 45 degrees with the end face of O1, and the end face of O2 is plated with a semi-transparent and semi-reflective film for an equivalent spectroscope; the end face of O3 is parallel to the end face of O2, is positioned on a reflection light path of the end face of O2, and is plated with a visible light and near infrared reflection film for an equivalent reflector; the end face of O4 is positioned on the reflection light path of the end face of O3 and is plated with a visible light and near infrared antireflection film; the end face of O5 is positioned on the transmission light path of the end face of O2, and is plated with a visible light antireflection film; the end face of O4, the end face of O5 and the end face of O1 are parallel, and the optical paths from the end face of O1 to the end faces of O4 and O5 are the same; selecting optical path lengths from the end face of O1 to the end faces of O4 and O5 according to the back focal length of the lens; a Bayer image sensor is attached to the end face of O4, and a monochrome image sensor is attached to the end face of O5; the positions of the monochrome image sensor and the Bayer image sensor are adjusted to enable the visual fields of the shot images to be overlapped, and a gray image A and a Bayer image Q or Q' are acquired.
Example 30
The difference from embodiment 29 is that the Bayer image sensor and the monochrome image sensor are interchanged in position.
Example 31
A near-infrared image A is captured, and the color information in image B is colored onto image A by method C3 to obtain a true-color high-definition night vision image C. In step C3-1, image B is upsampled by method d1 to obtain an image B' with the same resolution as image A; in step C3-2, VA,B' adopts the 2nd-order combination form, and the number of color blocks Y in the color card image is 24; in step C3-3, local neighborhood smoothing filtering is applied to image C by method a1, finally obtaining the true-color night vision image C.
Example 32
The difference from embodiment 31 is that, in step C3-1, image B is upsampled by method d2 to obtain an image B' with the same resolution as image A; in step C3-2, VA,B' adopts the 3rd-order combination form; in step C3-3, local neighborhood smoothing filtering is applied to image C by method a2, finally obtaining the true-color night vision image C.
Claims (9)
1. A true color high-definition night vision imaging method is characterized by comprising the following steps: acquiring a high-resolution gray image A and a low-resolution high-signal-to-noise-ratio color image B of the same scene, and coloring the high-resolution gray image A by using the low-resolution high-signal-to-noise-ratio color image B to obtain a true-color high-definition night vision image C;
the resolution of the true color high-definition night vision image C is not less than 1 megapixel (1,000,000 pixels);
the method for acquiring the high-resolution gray-scale image A comprises the following steps:
method a1: imaging with a high-sensitivity monochrome image sensor, the high-sensitivity monochrome image sensor comprising: EMCCD, ICCD, ICMOS, and SCMOS image sensors;
method a2: imaging with a monochrome image sensor plus near-infrared light source auxiliary illumination, preferably adopting a near-infrared-enhanced monochrome image sensor;
the method for acquiring the low-resolution high-signal-to-noise-ratio color image B comprises methods B1 and B2;
the method b1 is characterized in that: shooting a Bayer image Q by using a high-sensitivity Bayer image sensor; according to a Bayer imaging mode, taking 2 x 2 pixels as a processing unit, and taking the average value of r and b channel pixels and 2 g channel pixels in 2 x 2 pixels in a Bayer image Q to form an rgb color space image P, wherein the resolution of the image P is 1/4 of the image Q; and carrying out mean value or Gaussian filtering downsampling on the image P to obtain an image D, wherein the size of a mean value filtering window is n1 × n1 pixels, and the value range of n1 is as follows: 3 to 100; carrying out image denoising processing on the image A, and enhancing the signal-to-noise ratio of the image A; carrying out mean filtering or Gaussian filtering downsampling on the image A to obtain an image E with the same resolution as the image D; three channels { D ] of image D with image E as a guide imager,Dg,DbCarrying out image denoising to obtain a low-resolution high-signal-to-noise-ratio color image B; the gray level image A is superposed with a Bayer image Q view field;
the three channels { D of the image D are subjected to the image E as a guide imager,Dg,DbThe method for denoising the image comprises a method b1-a and a method b 1-b;
the method b1-a has the following operation steps:
step b 1-a-1:
carrying out Gaussian filtering on three channels of the image D, wherein the size of a Gaussian filtering window is s1 s1 pixels, and the value range of s1 is 3-111;
step b 1-a-2:
calculating a local smoothing filtering weight coefficient by using the image E, and carrying out local smoothing filtering on three channels of the image D:
taking a pixel E in an image EkWith EkIs in the neighborhood ΩjCalculating local smoothing filtering weight coefficients:
wherein, Ω is s2 × s2 pixel neighborhood, s2 is in the range of 3-111, I is pixel value, a is exponential function coefficient, in the range of 1-5, epsilon is adjustment factor, in the range of 0-100, | | | | | | represents the Euclidean distance of pixel value, k is pixel label, j is pixel label in the neighborhood Ω;
for the weight coefficient wjNormalization is carried out:
wherein, representing the summation of the weight coefficients of all pixels in the neighborhood Ω;
by normalized weight coefficient w'jAs a pixel D in an image DkFor pixel DkR, g, b three channel pixel valuesAnd (3) local smoothing filtering:
wherein the symbol ". multidot.The color space is defined by the color space,respectively representing the jth pixel D in the neighborhood omega of the image DjThe symbol "^" represents the pixel value after filtering;
traversing the image D to obtain a filtering resultAs low resolution high signal-to-noise ratio color imagesWherein,the result of the r, g and b channel pixel filtering in the image D is obtained;
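A brute-force sketch of the method b1-a idea follows: smooth each channel of the low-resolution color image D with weights computed from the guide image E. The exact weight formula does not survive in this text, so an exponential fall-off in guide-pixel difference (parameters a, eps) is assumed here purely for illustration.

```python
import numpy as np

def guided_local_smooth(D, E, s2=3, a=2.0, eps=0.05):
    """Joint-bilateral-style smoothing of color image D (H,W,3) guided by E (H,W).
    Weight form exp(-|dE|^a / eps) is an assumption, not the patent's formula."""
    r = s2 // 2
    H, W = E.shape
    Ep = np.pad(E, r, mode='edge')
    Dp = np.pad(D, ((r, r), (r, r), (0, 0)), mode='edge')
    out = np.zeros_like(D, dtype=float)
    for i in range(H):
        for j in range(W):
            patchE = Ep[i:i + s2, j:j + s2]
            w = np.exp(-np.abs(patchE - E[i, j]) ** a / eps)  # assumed weight form
            w = w / w.sum()                                   # normalization step
            out[i, j] = (w[..., None] * Dp[i:i + s2, j:j + s2]).sum(axis=(0, 1))
    return out
```

Because the weights are normalized to sum to 1, flat regions of D pass through unchanged while edges in E suppress averaging across them.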
the method b1-b comprises the following steps:
step b 1-b-1:
calculating the local neighborhood mean of the image E:
μE=fm(E) (6)
calculating three color channels D of an image Dr,Dg,DbLocal neighborhood mean of (c):
step b 1-b-2:
calculating the local neighborhood variance of image E:
σE=fm(E.*E)-μE.*μE(10)
calculating three color channels D of an image E and an image Dr,Dg,DbLocal neighborhood covariance of the corresponding pixel in (1):
step b 1-b-3:
calculating linear transformation coefficients:
wherein, taur,τg,τbA penalty coefficient is obtained, and the value range is (0-10);
step b 1-b-4:
calculating the local neighborhood mean of the linear transformation coefficient:
step b 1-b-5:
for image D three color channels Dr,Dg,DbPerforming linear transformation:
f ism() is the image mean filtering, window size s3 s3, s3 ranges from 3 to 111, is the dot product operation, which represents the multiplication of elements at the same position in the matrix,/is the dotA divide operation, representing a division of elements at the same position in the matrix;
by means of imagesAs low resolution high signal-to-noise ratio color images
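The b1-b steps follow a guided-filter-style local linear model, which can be sketched compactly. A simple box mean stands in for fm(); the window radius and penalty τ are illustrative choices, not values from the text.

```python
import numpy as np

def box_mean(img, r=1):
    """Local neighborhood mean fm() with an edge-padded (2r+1)^2 box window."""
    p = np.pad(img, r, mode='edge')
    k = 2 * r + 1
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def b1b_channel(Dc, E, r=1, tau=1e-4):
    """Filter one channel Dc of D guided by E (steps b1-b-1 .. b1-b-5)."""
    muE, muD = box_mean(E, r), box_mean(Dc, r)
    varE = box_mean(E * E, r) - muE * muE        # sigma_E
    cov = box_mean(E * Dc, r) - muE * muD        # sigma_{E,Dc}
    a = cov / (varE + tau)                       # linear transformation coefficients
    b = muD - a * muE
    return box_mean(a, r) * E + box_mean(b, r)   # means of a, b, then linear transform
```

A quick sanity check: guiding a channel by itself with a tiny penalty should return (almost) the same image, since the best local linear model is then a = 1, b = 0.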
The method b2 is characterized in that: without increasing the Bayer image sensor chip area, the image resolution is reduced and the pixel size increased to improve the light sensitivity of the Bayer image sensor, obtaining a high-signal-to-noise-ratio Bayer image Q', whose resolution is 1/f that of image A, with f ranging from 4 to 100000; according to the Bayer imaging pattern, with 2 × 2 pixels as a processing unit, the r and b channel pixels and the average of the 2 g channel pixels within each 2 × 2 block of image Q' form the low-resolution high-signal-to-noise-ratio color image B; the field of view of gray-scale image A coincides with that of Bayer image Q';
the method for obtaining the true-color high-definition night vision image C by coloring the high-resolution gray image A by using the low-resolution high-signal-to-noise-ratio color image B comprises a method C1 and a method C2;
the method c1 comprises the following steps:
step c 1-1:
carrying out image denoising processing on the image A, and enhancing the signal-to-noise ratio of the image A; selecting a color and brightness separation color space λ, creating a three channel blank image of the same resolution as image a: f ═ Fλ1,Fλ2,Fλ3In which Fλ1Is the luminance component, { Fλ2,Fλ3Is the color component; converting the image B into a color space lambda to obtain a color image B on the color space lambdaλ={Bλ1,Bλ2,Bλ3In which B isλ1Is the luminance component, { Bλ2,Bλ3Is the color component;
step c 1-2:
substitution of image A into luminance component F of image Fλ1A is ═ a; for image BλColor component { B }λ2,Bλ3Upsampling and thinning are carried out to obtain a color component { B 'with the same resolution as the image F'λ2,B′λ3}; in color component { B'λ2,B′λ3Fill the color component of the image F { F }λ2=B′λ2,Fλ3=B′λ2};
Step c 1-3:
converting the image F into an rgb color space image as a true color high-definition night vision image C, wherein the brightness information in the image C comes from the image A, and the color information comes from the image B;
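A minimal sketch of method c1 with λ = YUV follows: build a blank three-channel image F at A's resolution, set the luminance from A, take the upsampled chroma from B, and convert back to rgb. Nearest-neighbour upsampling stands in for methods d1/d2, and standard YUV matrices are assumed.

```python
import numpy as np

# Standard RGB<->YUV matrices (an assumption; the claim only names YUV among others).
RGB2YUV = np.array([[0.299, 0.587, 0.114],
                    [-0.14713, -0.28886, 0.436],
                    [0.615, -0.51499, -0.10001]])
YUV2RGB = np.linalg.inv(RGB2YUV)

def colorize_c1(A, B):
    """A: (H,W) high-res grayscale; B: (h,w,3) low-res rgb with H = f*h (integer f)."""
    f = A.shape[0] // B.shape[0]                 # resolution ratio, assumed integer
    Byuv = B @ RGB2YUV.T                         # step c1-1: B in color space lambda
    F = np.zeros(A.shape + (3,))
    F[..., 0] = A                                # step c1-2: luminance component from A
    for c in (1, 2):                             # color components from upsampled B
        F[..., c] = np.repeat(np.repeat(Byuv[..., c], f, axis=0), f, axis=1)
    return F @ YUV2RGB.T                         # step c1-3: back to rgb as image C
```

For a neutral-gray B, the chroma channels are (near) zero, so image C reproduces A's luminance as a gray image, as expected.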
the method c2 comprises the following steps:
step c 2-1:
selecting a color and brightness separation color space λ, creating a three channel blank image of the same resolution as image a: f ═ Fλ1,Fλ2,Fλ3In which Fλ1Is the luminance component, { Fλ2,Fλ3Is the color component; converting the image B into a color space lambda to obtain a color image B on the color space lambdaλ={Bλ1,Bλ2,Bλ3In which B isλ1Is the luminance component, { Bλ2,Bλ3Is the color component;
step c 2-2:
for image BλLuminance component B ofλ1Performing pixel interpolation to obtain a luminance component B 'with the same resolution as the image F'λ1;
Step c 2-3:
performing image edge-preserving multi-scale decomposition on the image A to obtain a base imageAnd detail imagesWherein (n +1) is the number of layers of image decomposition, and the value range is 1-10;
the base imageThe image edge-preserving smooth filtering results in:
multi-layer detail imageFrom the base imageAnd obtaining by difference with smooth images of different scales:
multi-layer detail imageIt can also be obtained by differencing two smooth images of different scales:
in formulae (29) to (31), fsmooth(. is) image edge-preserving smoothing filtering, superscript 0, z1、z2The index is a label of a filter scale layer, the larger the label is, the smaller the scale is, and the label is 0 to represent the maximum filtering scale;
step c 2-4: will be the base imageDetail imageAnd luminance component B'λ1Filling the weighted fusion into the brightness component F of the image Fλ1:
Wherein, { omega }0,ω1,…,ωn+1The weighting coefficient is in a value range of 0-1;
step c 2-5:
for image BλColor component { B }λ2,Bλ3Upsampling and thinning to obtain a color component { B 'with the same resolution as the image F'λ2,B′λ3} color component { B'λ2,B′λ3Fill to color components of image F: { Fλ2=B′λ2,Fλ3=B′λ2};
Step c 2-6:
converting the image F into an rgb color space color image as a true color high-definition night vision image C, wherein the brightness information in the image C comes from an image B and an image A, and the color information comes from the image B;
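The luminance fusion of method c2 can be sketched as a base-plus-details pyramid. Plain box blurs stand in for the edge-preserving filter f_smooth; the scale radii and weights below are illustrative, not values from the claim.

```python
import numpy as np

def box_mean(img, r):
    """Box blur of radius r (r = 0 returns the image unchanged); stand-in for f_smooth."""
    if r == 0:
        return img.astype(float)
    p = np.pad(img, r, mode='edge')
    k = 2 * r + 1
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse_luminance(A, B_lum, radii=(2, 1, 0), weights=(1.0, 1.0, 1.0, 0.0)):
    """radii: coarsest -> finest smoothing scales (0 = identity);
    weights: [w0 for base, w1..wn for details, wn+1 for B's luminance]."""
    smooths = [box_mean(A, r) for r in radii]
    base = smooths[0]                                          # A^0, maximum scale
    details = [smooths[i] - smooths[i - 1] for i in range(1, len(smooths))]
    F1 = weights[0] * base
    for w, d in zip(weights[1:-1], details):
        F1 += w * d                                            # weighted detail layers
    return F1 + weights[-1] * B_lum
```

With all base/detail weights set to 1 and B's weight to 0, the sum telescopes back to the finest-scale image, i.e. A itself, which is a useful consistency check on the decomposition.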
the color and brightness separation color space lambda comprises YUV, YCbCr, HSI, Lab and CIEXYZ color spaces;
in method c2, the image edge-preserving smoothing filter f_smooth(·) comprises weighted-least-squares edge-preserving smoothing, edge-preserving smoothing based on local extrema, edge-preserving smoothing based on the local Laplacian pyramid, median filtering, and weighted median filtering methods;
in methods c1 and c2, the methods for upsampling and refining the color components {Bλ2, Bλ3} of image Bλ to obtain color components {B'λ2, B'λ3} with the same resolution as image F comprise method d1 and method d2;
method d1 is characterized by: upsampling the color components {Bλ2, Bλ3} by sub-pixel interpolation to obtain the color components {B'λ2, B'λ3};
the sub-pixel interpolation methods comprise nearest-neighbor interpolation, bilinear interpolation, Gaussian interpolation, B-spline interpolation, and RBF interpolation;
method d2 is characterized by: with the luminance component Fλ1 of image F as the guide, the color components {Bλ2, Bλ3} are upsampled and refined; the specific methods comprise method d2-a and method d2-b;
the method d2-a operates as follows:
Step d2-a-1: upsample the color components {Bλ2, Bλ3} by linear interpolation to obtain color components {B̃λ2, B̃λ3} with the same resolution as image F;
Step d2-a-2: apply Gaussian filtering to the color components {B̃λ2, B̃λ3}, with a filtering window of s4 × s4 pixels and s4 ranging from 3 to 111;
Step d2-a-3: calculate the local smoothing filtering weight coefficients from the luminance component Fλ1 and apply local smoothing filtering to the color components {B̃λ2, B̃λ3}:
Take a pixel Fλ1,k' in the luminance component Fλ1 and the j'-th pixel Fλ1,j' in its neighborhood Ω', and calculate the local smoothing filtering weight coefficient:
w″j' = exp(−‖I_Fλ1,k' − I_Fλ1,j'‖^a / ε)
where Ω' is an s5 × s5 pixel neighborhood, s5 ranges from 3 to 111, I is a pixel value, a is the exponential function coefficient ranging from 1 to 5, ε is an adjustment factor ranging from 0 to 100, ‖ ‖ denotes the Euclidean distance between pixel values, and k' is a pixel index;
Normalize the weight coefficients w″j':
w‴j' = w″j' / Σ_{j'∈Ω'} w″j'
where Σ_{j'∈Ω'} w″j' denotes the sum of the weight coefficients w″j' over all pixels in the neighborhood Ω';
Use the normalized weight coefficients w‴j' as the local smoothing filtering weights of the pixels in the color components {B̃λ2, B̃λ3}, and apply local smoothing filtering to obtain the refined color component pixels:
B'λ2,k' = Σ_{j'∈Ω'} w‴j'·B̃λ2,j',  B'λ3,k' = Σ_{j'∈Ω'} w‴j'·B̃λ3,j'
where B̃λ2,j' and B̃λ3,j' are the j'-th pixels in the neighborhood Ω';
Traverse the color components {B̃λ2, B̃λ3} to obtain the filtered result {B'λ2, B'λ3} as the upsampling refinement result of the color components {Bλ2, Bλ3};
the method d2-b operates as follows:
Step d2-b-1:
Upsample the color components {Bλ2, Bλ3} by linear interpolation to obtain color components {B̃λ2, B̃λ3} with the same resolution as image F;
Step d2-b-2:
Calculate the local neighborhood mean of the luminance component Fλ1 of image F:
μF = fm(Fλ1)
Calculate the local neighborhood means of the color components {B̃λ2, B̃λ3}:
μλ2 = fm(B̃λ2), μλ3 = fm(B̃λ3)
Step d2-b-3:
Calculate the local neighborhood variance of Fλ1:
σF = fm(Fλ1.*Fλ1) − μF.*μF
Calculate the local neighborhood covariances between Fλ1 and the corresponding pixels in {B̃λ2, B̃λ3}:
σF,λ2 = fm(Fλ1.*B̃λ2) − μF.*μλ2, σF,λ3 = fm(Fλ1.*B̃λ3) − μF.*μλ3
Step d2-b-4:
Calculate the linear transformation coefficients:
aλ2 = σF,λ2 ./ (σF + τλ2), bλ2 = μλ2 − aλ2.*μF (and likewise aλ3, bλ3)
where τλ2, τλ3 are penalty coefficients with value range (0–10);
Step d2-b-5:
Calculate the local neighborhood means of the linear transformation coefficients:
μaλ2 = fm(aλ2), μbλ2 = fm(bλ2) (and likewise for λ3)
Step d2-b-6:
Apply the linear transformation to the color components {B̃λ2, B̃λ3}:
B'λ2 = μaλ2.*Fλ1 + μbλ2, B'λ3 = μaλ3.*Fλ1 + μbλ3
In method d2-b, fm(·) is image mean filtering with a window size of s6 × s6, s6 ranging from 3 to 111; .* denotes multiplication of corresponding elements in the matrices; ./ denotes division of corresponding elements in the matrices;
the method for denoising the image A comprises the following steps: gaussian filtering, median filtering, NLM, BM3D and KSVD;
finally, white balance correction is applied to the true-color high-definition night vision image C; image C is processed with nonlinear transformation operations such as Gamma correction and tone mapping to improve its brightness, contrast, and dynamic range; and color enhancement is applied to image C using a CCM matrix method.
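The post-processing named here can be illustrated with two small functions. A gray-world white balance is used as a simpler stand-in for the gray-edge method mentioned in the embodiments; pixel values are assumed to be floats in [0, 1], and gamma = 2.2 is a conventional choice, not a value from the text.

```python
import numpy as np

def gray_world_white_balance(C):
    """Scale each channel so the channel means agree (gray-world assumption)."""
    means = C.reshape(-1, 3).mean(axis=0)       # per-channel mean
    return C * (means.mean() / means)           # pull channels toward a common gray

def gamma_correct(C, gamma=2.2):
    """Nonlinear brightness transform C' = C^(1/gamma); brightens mid-tones."""
    return np.clip(C, 0.0, 1.0) ** (1.0 / gamma)
```

A CCM step would then be one more 3 × 3 matrix multiply per pixel, applied after white balance and before gamma.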
2. According to claim 1, when method a2 and method b2 are adopted to acquire image A and image B, the following processing flow can be adopted:
The first step is as follows: the bilinear interpolation method in method d1 is adopted to upsample image B, obtaining an image B^u with the same resolution as image A; the superscript u denotes that B^u is an upsampled image;
The second step is that: referring to method b1-a or b1-b, image A is used as the guide image to apply local smoothing filtering to the three channels of image B^u, obtaining an image B^{u,s}; the superscripts u and s denote that B^{u,s} is the locally smooth-filtered version of the upsampled image B^u;
The third step: referring to method c2, image B^{u,s} is converted in the selected color space λ to image B^{u,s,λ}; the superscript λ denotes that B^{u,s,λ} is image B^{u,s} converted into color space λ; the image edge-preserving multi-scale decomposition method is adopted to fuse the luminance component of image B^{u,s,λ} with image A, and the fusion result is converted into an rgb color space image as the true-color high-definition night vision image C.
3. According to claims 1 and 2, when image A is a near-infrared image, the color information in image B is colored onto image A by method C3 to obtain a true-color high-definition night vision image C:
step C3-1: adjusting the brightness of the image A and the image B to make the brightness of the image B similar to that of the image A, and upsampling the image B by adopting a method d1 or d2 to obtain an image B' with the same resolution as that of the image A;
the method for adjusting the brightness of image A and image B is as follows: calculate the mean ma and variance σa of image A; convert image B into a gray-scale image Gb and calculate the mean mGb and variance σGb of image Gb; then apply a linear transformation to all pixels in image B:
ÎB = (IB − mGb) .* (σa ./ σGb) + ma
where ÎB denotes the transformed rgb channel pixel values of image B, .* denotes a multiplication operation, and ./ denotes a division operation;
step C3-2: an input vector VA,B' is formed from the pixels in image A and image B', and the pixel values IC of image C are calculated by a polynomial regression method, where IC is the rgb pixel value of image C; the input vector VA,B' formed from the pixels in image A and image B' comprises 1st-order, 2nd-order, and 3rd-order combination forms;
VA,B' is an m-dimensional vector: when VA,B' is the 1st-order combination, m = 4; when VA,B' is the 2nd-order combination, m = 10; when VA,B' is the 3rd-order combination, m = 19;
the polynomial regression model is:

I_C = X * V_{A,B'}    (57)

where X is a polynomial coefficient matrix of dimension 3 × m;
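As an illustration, the 1st-order model (m = 4) can be sketched as below. The exact term listing of the input vector is not reproduced in the text, so the form used here (grayscale value of A plus the r, g, b values of B') is an assumption consistent with m = 4:

```python
import numpy as np

def first_order_vectors(A, Bp):
    """Stack 1st-order input vectors V_{A,B'} (m = 4), one column per pixel:
    the grayscale value of A and the r, g, b values of B'."""
    return np.vstack([A.ravel(),
                      Bp[..., 0].ravel(),
                      Bp[..., 1].ravel(),
                      Bp[..., 2].ravel()])         # shape (4, n_pixels)

def regress_color(X, A, Bp):
    """Evaluate the regression model I_C = X @ V_{A,B'} (eq. 57) and
    reshape the result back into an rgb image."""
    I_C = X @ first_order_vectors(A, Bp)           # shape (3, n_pixels)
    return I_C.T.reshape(A.shape + (3,))
```

Higher-order combinations would simply append product terms as extra rows of V_{A,B'} and extra columns of X.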
step C3-3: using image A as the guide image, apply local neighborhood smoothing filtering to image C by method a1 or a2 to eliminate the image noise introduced by the polynomial regression estimate, obtaining the true-color high-definition night vision image C;
in step C3-2, the coefficient matrix X is calculated by:
capture a color-chart color image C' under visible-light illumination; capture a color-chart near-infrared image N under near-infrared illumination; capture a color-chart low-illumination color image C'' under low illumination without near-infrared supplementary lighting;

take n pixels from each of Y color-patch regions of image C' to form the output vector I'_C; I'_C has dimension 3 × Y', where Y' = Y × n and Y ranges from 1 to 50;

take n pixels from the Y color-patch regions of image N and image C'' and form the input matrix V'_{A,B'} according to the selected combination form; V'_{A,B'} has dimension m × Y';

estimate the coefficient matrix X by least-squares regression:

X = (I'_C * V'_{A,B'}^T) * (V'_{A,B'} * V'_{A,B'}^T)^{-1}    (59)
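The calibration in eq. (59) is ordinary least squares for I'_C = X · V'_{A,B'}; a sketch:

```python
import numpy as np

def fit_coefficients(V, I):
    """Solve I = X @ V for the 3 x m coefficient matrix X in the
    least-squares sense via the normal equations (eq. 59).
    V: (m, n_samples) input matrix; I: (3, n_samples) target rgb values."""
    return I @ V.T @ np.linalg.inv(V @ V.T)
```

In practice `np.linalg.lstsq(V.T, I.T)` is numerically preferable to the explicit inverse, but the normal-equation form matches the claim.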
4. The method according to claim 1, wherein image A and the Bayer image Q or the Bayer image Q' are acquired by one of the following methods:
method 1: two lenses are used for imaging; a monochrome image sensor is arranged at the rear end of lens 1 to capture the grayscale image A, and a Bayer image sensor is arranged at the rear end of lens 2 to capture the Bayer image Q or Q'; the coordinate mapping between the images captured by lens 1 and lens 2 is found by a binocular stereo-vision calibration method, and a coordinate transformation is applied to the captured image A and image Q or Q' so that image A and image Q or Q' have the same field of view;
preferably, the optical axes of lens 1 and lens 2 are parallel and less than 50 mm apart;
preferably, lens 1 is a near-infrared antireflection lens, a near-infrared light source is added for auxiliary illumination, and a near-infrared cut-off filter is arranged at the front end of the Bayer image sensor;
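One minimal way to realize the coordinate transformation of method 1, once calibration has produced a mapping from the lens-2 image plane to the lens-1 image plane, is to apply a 3×3 homography. The homography H itself is assumed here; a real system would obtain it from a full stereo calibration (e.g. with OpenCV):

```python
import numpy as np

def apply_homography(H, pts):
    """Map an (n, 2) array of pixel coordinates through a 3x3 homography H,
    so points from the lens-2 image can be expressed in the lens-1 frame."""
    pts = np.asarray(pts, dtype=float)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coordinates
    out = pts_h @ H.T
    return out[:, :2] / out[:, 2:3]                     # back to inhomogeneous
```

Resampling the whole image Q or Q' onto the lens-1 grid is then a matter of evaluating this mapping (inverted) at every output pixel and interpolating.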
method 2: two lenses are used for imaging; a spectroscope is arranged at the front end of lens 1 at 45° to the optical axis of lens 1; lens 1 receives the light transmitted by the spectroscope, lens 2 is arranged on the reflected light path of the spectroscope and receives the light reflected by the spectroscope, and the optical axes of lens 1 and lens 2 are perpendicular; a monochrome image sensor is arranged at the rear end of lens 1 to capture image A, and a Bayer image sensor is arranged at the rear end of lens 2 to capture image Q or image Q'; the positions of lens 1, lens 2 and the spectroscope are adjusted so that image A and image Q or Q' have the same field of view and are imaged clearly;
method 3: two lenses are used for imaging; a spectroscope is arranged at the front end of lens 1 at 45° to the optical axis of lens 1, a reflecting mirror is arranged on the reflected light path of the spectroscope with its mirror surface parallel to that of the spectroscope, and lens 2 is arranged on the reflected light path of the reflecting mirror; lens 1 receives the light transmitted by the spectroscope, and lens 2 receives the light reflected by the spectroscope via the reflecting mirror; a monochrome image sensor is arranged at the rear end of lens 1 to capture the grayscale image A, and a Bayer image sensor is arranged at the rear end of lens 2 to capture the Bayer image Q or Q'; the positions of lens 1, lens 2, the spectroscope and the reflecting mirror are adjusted so that image A and image Q or Q' have the same field of view and are imaged clearly;
method 4: two lenses are used for imaging; a dichroic mirror is arranged at the front end of lens 1 at 45° to the optical axis of lens 1 to separate visible light and near-infrared light; lens 1 receives the light transmitted by the dichroic mirror, and lens 2 is arranged on the reflected light path of the dichroic mirror and receives the light reflected by it; when a dichroic mirror that reflects near-infrared light and transmits visible light is used, a Bayer image sensor is arranged at the rear end of lens 1 to capture the Bayer image Q or Q', and a monochrome image sensor is arranged at the rear end of lens 2 to capture the grayscale image A; when a dichroic mirror that reflects visible light and transmits near-infrared light is used, a monochrome image sensor is arranged at the rear end of lens 1 to capture the grayscale image A, and a Bayer image sensor is arranged at the rear end of lens 2 to capture the Bayer image Q or Q'; the positions of lens 1, lens 2 and the dichroic mirror are adjusted so that the captured Bayer image Q or Q' and grayscale image A have the same field of view and are imaged clearly; preferably, a near-infrared light source is added for auxiliary illumination;
method 5: two lenses are used for imaging; a dichroic mirror is arranged at the front end of lens 1 at 45° to the optical axis of lens 1 to separate visible light and near-infrared light; a reflecting mirror is arranged on the reflected light path of the dichroic mirror with its mirror surface parallel to that of the dichroic mirror, lens 2 is arranged on the reflected light path of the reflecting mirror, and the optical axes of lens 2 and lens 1 are parallel; when a dichroic mirror that reflects near-infrared light and transmits visible light is used, a Bayer image sensor is arranged at the rear end of lens 1 to capture the Bayer image Q or Q', and a monochrome image sensor is arranged at the rear end of lens 2 to capture the grayscale image A; when a dichroic mirror that reflects visible light and transmits near-infrared light is used, a Bayer image sensor is arranged at the rear end of lens 2 to capture the Bayer image Q or Q', and a monochrome image sensor is arranged at the rear end of lens 1 to capture the grayscale image A; the positions of lens 1, lens 2, the dichroic mirror and the reflecting mirror are adjusted so that the Bayer image Q or Q' and the grayscale image A have the same field of view and are imaged clearly; preferably, a near-infrared light source is added for auxiliary illumination;
method 6: one lens is used for imaging; a dichroic mirror is arranged at the rear end of the lens at 45° to the optical axis of the lens to separate visible light and near-infrared light; when a dichroic mirror that reflects near-infrared light and transmits visible light is used, a Bayer image sensor is arranged on the transmitted light path of the dichroic mirror to capture the Bayer image Q or Q', a monochrome image sensor is arranged on the reflected light path of the dichroic mirror to capture the grayscale image A, the plane of the Bayer image sensor is perpendicular to the optical axis of the lens, and the plane of the monochrome image sensor is parallel to the optical axis of the lens; when a dichroic mirror that reflects visible light and transmits near-infrared light is used, a Bayer image sensor is arranged on the reflected light path of the dichroic mirror to capture the Bayer image Q or Q', a monochrome image sensor is arranged on the transmitted light path of the dichroic mirror to capture the grayscale image A, the plane of the Bayer image sensor is parallel to the optical axis of the lens, and the plane of the monochrome image sensor is perpendicular to the optical axis of the lens; the positions of the Bayer image sensor, the monochrome image sensor and the dichroic mirror are adjusted so that the fields of view of the captured images coincide and the images are clear; preferably, a near-infrared light source is added for auxiliary illumination, and a near-infrared antireflection lens is used as the lens;
method 7: one lens is used for imaging; a dichroic mirror is arranged at the rear end of the lens at 45° to the optical axis of the lens to separate visible light and near-infrared light, and a reflecting mirror is arranged on the reflected light path of the dichroic mirror with its mirror surface parallel to that of the dichroic mirror; when a dichroic mirror that reflects near-infrared light and transmits visible light is used, a Bayer image sensor is arranged on the transmitted light path of the dichroic mirror to capture the Bayer image Q or Q', a monochrome image sensor is arranged on the reflected light path of the reflecting mirror to capture the grayscale image A, and the planes of the Bayer image sensor and the monochrome image sensor are perpendicular to the optical axis of the lens; when a dichroic mirror that transmits near-infrared light and reflects visible light is used, a monochrome image sensor is arranged on the transmitted light path of the dichroic mirror to capture the grayscale image A, a Bayer image sensor is arranged on the reflected light path of the reflecting mirror to capture the Bayer image Q or Q', and the planes of the monochrome image sensor and the Bayer image sensor are perpendicular to the optical axis of the lens; the positions of the Bayer image sensor, the monochrome image sensor, the dichroic mirror and the reflecting mirror are adjusted so that the fields of view of the captured images coincide and the images are clear; preferably, a near-infrared light source is added for auxiliary illumination, and a near-infrared antireflection lens is used as the lens;
method 8: one lens is used for imaging; a spectroscope is arranged at the rear end of the lens at 45° to the optical axis of the lens; a Bayer image sensor is arranged on the transmitted light path of the spectroscope to capture the Bayer image Q or Q', with its plane perpendicular to the optical axis of the lens, and a monochrome image sensor is arranged on the reflected light path of the spectroscope to capture the grayscale image A, with its plane parallel to the optical axis of the lens; the Bayer image sensor receives the light transmitted by the spectroscope, and the monochrome image sensor receives the light reflected by the spectroscope; the positions of the Bayer image sensor, the monochrome image sensor and the spectroscope are adjusted so that the fields of view of the captured images coincide and the images are clear;
method 9: the positions of the monochrome image sensor and the Bayer image sensor in method 8 are interchanged;
method 10: one lens is used for imaging; a spectroscope is arranged at the rear end of the lens at 45° to the optical axis of the lens; a Bayer image sensor is arranged on the transmitted light path of the spectroscope to capture the Bayer image Q or Q', with its plane perpendicular to the optical axis of the lens; a reflecting mirror is arranged on the reflected light path of the spectroscope with its mirror surface parallel to that of the spectroscope, and a monochrome image sensor is arranged on the reflected light path of the reflecting mirror to capture the grayscale image A, with its plane perpendicular to the optical axis of the lens; the Bayer image sensor receives the light transmitted by the spectroscope, and the monochrome image sensor receives the light reflected by the spectroscope; the positions of the Bayer image sensor, the monochrome image sensor, the spectroscope and the reflecting mirror are adjusted so that the fields of view of the captured images coincide and the images are clear;
method 11: the positions of the monochrome image sensor and the Bayer image sensor in method 10 are interchanged.
5. The method according to claim 4, wherein the spectroscope, the dichroic mirror and the reflecting mirror may each be implemented as a prism or a plane mirror.
6. The method according to claims 3 and 4, wherein, in method 7, the equivalent optical path can be realized with an L-shaped prism having five light input/output end faces O1, O2, O3, O4 and O5; the O1 end face faces the lens, is perpendicular to the optical axis of the lens, and is coated with a near-infrared antireflection film; the O2 end face is located behind the O1 end face at 45° to it, and is coated with a film that transmits visible light and reflects near-infrared light, serving as an equivalent dichroic mirror; the O3 end face is parallel to the O2 end face, is located on the reflected light path of the O2 end face, and is coated with a near-infrared reflection film, serving as an equivalent reflecting mirror; the O4 end face is located on the reflected light path of the O3 end face and is coated with a near-infrared antireflection film; the O5 end face is located on the transmitted light path of the O2 end face and is coated with a visible-light antireflection film; the O4, O5 and O1 end faces are parallel, and the optical path lengths from the O1 end face to the O4 and O5 end faces are the same; the optical path lengths from the O1 end face to the O4 and O5 end faces are determined according to the back focal length of the lens; a monochrome image sensor is attached to the O4 end face, and a Bayer image sensor is attached to the O5 end face; the positions of the monochrome image sensor and the Bayer image sensor are adjusted so that the fields of view of the captured images coincide.
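The condition that the optical path lengths from the O1 end face to the O4 and O5 end faces match the lens back focal length follows the usual reduced-thickness rule: a geometric length t of glass with refractive index n contributes only t/n of equivalent air path. A sketch, with BK7's index used as an assumed example value (the patent does not specify the glass):

```python
def glass_length_for_bfl(bfl_mm: float, n_glass: float = 1.5168) -> float:
    """Geometric glass path (mm) whose equivalent air path equals the lens
    back focal length bfl_mm; n_glass defaults to BK7 (an assumed material)."""
    return n_glass * bfl_mm

def equivalent_air_path(t_mm: float, n_glass: float = 1.5168) -> float:
    """Equivalent air path of a geometric glass thickness t_mm."""
    return t_mm / n_glass
```

Sizing both prism legs with the same equivalent air path puts both sensor faces at the lens's focus, which is why the claim requires equal optical paths from O1 to O4 and O5.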
7. The method according to claims 3, 4 and 6, wherein, in method 7, the equivalent optical path can be realized with an L-shaped prism having five light input/output end faces O1, O2, O3, O4 and O5; the O1 end face faces the lens, is perpendicular to the optical axis of the lens, and is coated with a near-infrared antireflection film; the O2 end face is located behind the O1 end face at 45° to it, and is coated with a film that transmits near-infrared light and reflects visible light, serving as an equivalent dichroic mirror; the O3 end face is parallel to the O2 end face, is located on the reflected light path of the O2 end face, and is coated with a visible-light reflection film, serving as an equivalent reflecting mirror; the O4 end face is located on the reflected light path of the O3 end face and is coated with a visible-light antireflection film; the O5 end face is located on the transmitted light path of the O2 end face and is coated with a near-infrared antireflection film; the O4, O5 and O1 end faces are parallel and perpendicular to the optical axis of the lens; the optical path lengths from the O1 end face to the O4 and O5 end faces are the same and are determined according to the back focal length of the lens; a Bayer image sensor is attached to the O4 end face, and a monochrome image sensor is attached to the O5 end face; the positions of the monochrome image sensor and the Bayer image sensor are adjusted so that the fields of view of the captured images coincide.
8. The method according to claim 4, wherein, in method 10, the equivalent optical path can be realized with an L-shaped prism having five light input/output end faces O1, O2, O3, O4 and O5; the O1 end face faces the lens, is perpendicular to the optical axis of the lens, and is coated with a near-infrared antireflection film; the O2 end face is located behind the O1 end face at 45° to it, and is coated with a half-transmitting, half-reflecting film, serving as an equivalent spectroscope; the O3 end face is parallel to the O2 end face, is located on the reflected light path of the O2 end face, and is coated with a visible-light and near-infrared reflection film, serving as an equivalent reflecting mirror; the O4 end face is located on the reflected light path of the O3 end face and is coated with a visible-light and near-infrared antireflection film; the O5 end face is located on the transmitted light path of the O2 end face and is coated with a visible-light antireflection film; the O4, O5 and O1 end faces are parallel, and the optical path lengths from the O1 end face to the O4 and O5 end faces are the same and are selected according to the back focal length of the lens; a Bayer image sensor is attached to the O4 end face, and a monochrome image sensor is attached to the O5 end face; the positions of the monochrome image sensor and the Bayer image sensor are adjusted so that the fields of view of the captured images coincide; the positions of the Bayer image sensor and the monochrome image sensor may be interchanged.
9. The method according to claims 1 and 3, wherein the field-of-view coincidence requirement for the grayscale image A and the Bayer image Q or Q' is that the mean positional offset of corresponding (same-name) pixels does not exceed 100 pixels.
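The 100-pixel tolerance can be checked on a set of matched (same-name) points; a sketch:

```python
import numpy as np

def mean_pixel_offset(pts_a, pts_b):
    """Mean Euclidean displacement (in pixels) between corresponding pixel
    coordinates detected in grayscale image A and Bayer image Q/Q'.
    pts_a, pts_b: (n, 2) arrays of matched coordinates."""
    d = np.asarray(pts_a, dtype=float) - np.asarray(pts_b, dtype=float)
    return float(np.linalg.norm(d, axis=1).mean())

def fields_coincide(pts_a, pts_b, tol=100.0):
    """Check the claim-9 coincidence requirement against tolerance tol."""
    return mean_pixel_offset(pts_a, pts_b) <= tol
```

The matched points themselves would come from the calibration or a feature matcher; how they are obtained is outside this claim.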
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710687848.XA CN107563971A (en) | 2017-08-12 | 2017-08-12 | A kind of very color high-definition night-viewing imaging method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107563971A true CN107563971A (en) | 2018-01-09 |
Family
ID=60975350
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710687848.XA Pending CN107563971A (en) | 2017-08-12 | 2017-08-12 | A kind of very color high-definition night-viewing imaging method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107563971A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101950412A (en) * | 2010-07-23 | 2011-01-19 | 北京理工大学 | Method for enhancing details and compressing dynamic range of infrared image |
CN103823296A (en) * | 2014-01-27 | 2014-05-28 | 杭州科汀光学技术有限公司 | Color night vision system based on Philips-type prism |
CN103839230A (en) * | 2012-11-27 | 2014-06-04 | 大连灵动科技发展有限公司 | Brain imaging grayscale image dyeing method |
US8836793B1 (en) * | 2010-08-13 | 2014-09-16 | Opto-Knowledge Systems, Inc. | True color night vision (TCNV) fusion |
CN104735351A (en) * | 2015-03-06 | 2015-06-24 | 中国科学院计算技术研究所 | High resolution light field image recreation method and imaging device |
Non-Patent Citations (1)
Title |
---|
Dirk Hertel, "A low-cost VIS-NIR true color night vision video system based on a wide dynamic range CMOS imager", 2009 IEEE Intelligent Vehicles Symposium. *
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11009772B2 (en) | 2017-04-24 | 2021-05-18 | Ramot At Tel-Aviv University Ltd. | Multi-frequency infrared imaging based on frequency conversion |
CN108134897A (en) * | 2018-01-10 | 2018-06-08 | 黄心铭 | A kind of lll night vision imaging arrangement and its application |
CN108038479A (en) * | 2018-01-17 | 2018-05-15 | 昆山龙腾光电有限公司 | Fingerprint identification device and recognition methods |
CN108540736A (en) * | 2018-04-03 | 2018-09-14 | 深圳新亮智能技术有限公司 | Infrared laser illuminates the camera chain of Color License Plate |
CN108340837A (en) * | 2018-04-25 | 2018-07-31 | 深圳新亮智能技术有限公司 | Vehicle night vision DAS (Driver Assistant System) |
CN109672808A (en) * | 2018-05-23 | 2019-04-23 | 李芝宏 | Light shunt multisensor camera system and method |
CN108828769A (en) * | 2018-05-25 | 2018-11-16 | 深圳新亮智能技术有限公司 | The laser monitoring system of occupant |
CN108717691A (en) * | 2018-06-06 | 2018-10-30 | 成都西纬科技有限公司 | A kind of image interfusion method, device, electronic equipment and medium |
CN108717691B (en) * | 2018-06-06 | 2022-04-15 | 成都西纬科技有限公司 | Image fusion method and device, electronic equipment and medium |
US20210368080A1 (en) * | 2018-08-09 | 2021-11-25 | Corephotonics Ltd. | Multi-cameras with shared camera apertures |
CN110891138A (en) * | 2018-09-10 | 2020-03-17 | 杭州萤石软件有限公司 | Black light full-color realization method and black light full-color camera |
CN109462722A (en) * | 2018-12-06 | 2019-03-12 | Oppo广东移动通信有限公司 | Image collecting device and electronic equipment |
CN109714546A (en) * | 2018-12-26 | 2019-05-03 | 呈像科技(北京)有限公司 | Image processing method and device |
CN109714546B (en) * | 2018-12-26 | 2020-10-09 | 呈像科技(北京)有限公司 | Image processing method and device |
CN113170048A (en) * | 2019-02-19 | 2021-07-23 | 华为技术有限公司 | Image processing device and method |
CN110572626A (en) * | 2019-09-03 | 2019-12-13 | 云南白药集团健康产品有限公司 | Image processing method and device |
CN110572626B (en) * | 2019-09-03 | 2021-05-28 | 云南白药集团健康产品有限公司 | Image processing method and device |
CN110913098A (en) * | 2019-10-28 | 2020-03-24 | 香港理工大学深圳研究院 | High-definition depth information acquisition system, system preparation method and system ranging method |
CN112785510A (en) * | 2019-11-11 | 2021-05-11 | 华为技术有限公司 | Image processing method and related product |
CN112785510B (en) * | 2019-11-11 | 2024-03-05 | 华为技术有限公司 | Image processing method and related product |
WO2021093712A1 (en) * | 2019-11-11 | 2021-05-20 | 华为技术有限公司 | Image processing method and related product |
CN112991244A (en) * | 2019-12-17 | 2021-06-18 | 华为技术有限公司 | Image fusion method, device, camera, storage medium and program product |
CN113728618A (en) * | 2020-03-20 | 2021-11-30 | 华为技术有限公司 | Camera device |
WO2021184353A1 (en) * | 2020-03-20 | 2021-09-23 | 华为技术有限公司 | Camera device |
CN111458890B (en) * | 2020-04-12 | 2021-03-16 | 国科天成科技股份有限公司 | True-color double-light night vision device system and implementation method |
CN111458890A (en) * | 2020-04-12 | 2020-07-28 | 国科天成(北京)科技有限公司 | True-color double-light night vision device system and implementation method |
CN113947535B (en) * | 2020-07-17 | 2023-10-13 | 四川大学 | Low-illumination image enhancement method based on illumination component optimization |
CN113947535A (en) * | 2020-07-17 | 2022-01-18 | 四川大学 | Low-illumination image enhancement method based on illumination component optimization |
CN111986084A (en) * | 2020-08-03 | 2020-11-24 | 南京大学 | Multi-camera low-illumination image quality enhancement method based on multi-task fusion |
CN111986084B (en) * | 2020-08-03 | 2023-12-12 | 南京大学 | Multi-camera low-illumination image quality enhancement method based on multi-task fusion |
WO2022068598A1 (en) * | 2020-09-29 | 2022-04-07 | 华为技术有限公司 | Imaging method and apparatus |
CN114338962A (en) * | 2020-09-29 | 2022-04-12 | 华为技术有限公司 | Image forming method and apparatus |
CN114338962B (en) * | 2020-09-29 | 2023-04-18 | 华为技术有限公司 | Image forming method and apparatus |
CN112449095A (en) * | 2020-11-12 | 2021-03-05 | Oppo广东移动通信有限公司 | Image processing method and device, electronic equipment and readable storage medium |
WO2022199416A1 (en) * | 2021-03-24 | 2022-09-29 | 华为技术有限公司 | Camera module, terminal device, and imaging method |
CN113408396A (en) * | 2021-06-15 | 2021-09-17 | 广西交科集团有限公司 | Bridge intelligent sensing system based on cloud computing |
CN114284306A (en) * | 2021-12-15 | 2022-04-05 | 武汉新芯集成电路制造有限公司 | Depth and image sensor device, manufacturing method thereof and depth and image sensor chip |
WO2024098351A1 (en) * | 2022-11-11 | 2024-05-16 | Lenovo (Beijing) Limited | Imaging system and method for high resolution imaging of a subject |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107563971A (en) | A kind of very color high-definition night-viewing imaging method | |
US20240106971A1 (en) | Method and system for generating at least one image of a real environment | |
TWI737979B (en) | Image demosaicer and method | |
JP5989076B2 (en) | Image capture and processing using monolithic camera arrays with different types of imagers | |
EP3416369B1 (en) | Image processing method and apparatus for terminal, and terminal | |
WO2014185064A1 (en) | Image processing method and system | |
CN106960428A (en) | Visible ray and infrared double-waveband image co-registration Enhancement Method | |
CN110660088A (en) | Image processing method and device | |
CN113676628A (en) | Multispectral sensor, imaging device and image processing method | |
CN107580163A (en) | A kind of twin-lens black light camera | |
CN104683767A (en) | Fog penetrating image generation method and device | |
CN110519489A (en) | Image-pickup method and device | |
CN103430551A (en) | An imaging system using a lens unit with longitudinal chromatic aberrations and a method of operating | |
JP5406151B2 (en) | 3D imaging device | |
JP7171254B2 (en) | Image processing device, imaging device, and image processing method | |
CN109804619A (en) | Image processing apparatus, image processing method and camera | |
JP5186517B2 (en) | Imaging device | |
CN111131798B (en) | Image processing method, image processing apparatus, and imaging apparatus | |
CN109118463B (en) | SAR image and optical image fusion method based on HSL and image entropy | |
CN114757831A (en) | High-resolution video hyperspectral imaging method, device and medium based on intelligent space-spectrum fusion | |
CN108370422A (en) | Visible and near-infrared image system and method are acquired using single matrix sensor | |
CN114868384B (en) | Apparatus and method for image processing | |
CN115550570A (en) | Image processing method and electronic equipment | |
CN110580684A (en) | image enhancement method based on black-white-color binocular camera | |
CN110493493B (en) | Panoramic detail camera and method for acquiring image signal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20180109 |