CN110517206B - Method and device for eliminating color moire - Google Patents
- Publication number
- CN110517206B (application CN201910840627.0A)
- Authority
- CN
- China
- Prior art keywords
- color
- image
- component
- coefficient
- original image
- Prior art date
- Legal status: Active (assumed status; not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Quality & Reliability (AREA)
- Color Image Communication Systems (AREA)
- Image Processing (AREA)
- Color Television Image Signal Generators (AREA)
Abstract
A method and device for eliminating color moire. The method comprises the following steps: acquiring an original image; calculating a luminance component and three color components of the original image; obtaining a spectral overlap coefficient between the luminance component and each of the three color components; obtaining, according to the spectral overlap coefficients, a weight coefficient corresponding to each of the three color components, wherein the weight coefficient is greater than 0 and less than or equal to 1; and obtaining output values of four color channels according to the luminance component, the three color components and the weight coefficients of the three color components.
Description
Technical Field
The invention relates to the field of image processing, and in particular to a method and device for eliminating color moire.
Background
An image sensor is a semiconductor device that converts an optical signal into an electrical signal. Conventional image sensors include charge-coupled device (CCD) sensors and complementary metal-oxide-semiconductor (CMOS) sensors. To capture a color image, the most common approach is to place a color filter array (CFA), such as the Bayer CFA shown in fig. 1, in front of the image sensor; this CFA comprises three color channels: red, green and blue (RGB).
During imaging, each pixel position of the CFA passes light of only one color, which enters the image sensor and is converted into an electrical signal, so the raw image captured by the sensor is a mosaic image sampled in a regular pattern. Demosaicing reconstructs a full color image from this mosaic for output to a device such as a display, printer or projector.
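As a hedged illustration of the mosaic sampling just described, the sketch below shows how each pixel keeps only the channel its CFA cell passes; the RGGB layout and the helper name are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an RGB image through a hypothetical Bayer RGGB pattern.

    Row-pair layout assumed here:
      even rows: R G R G ...
      odd rows:  G B G B ...
    Each output pixel keeps only the channel its CFA cell passes.
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites (even rows)
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites (odd rows)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic
```

Demosaicing is the inverse problem: estimating the two missing channels at every pixel of such a mosaic.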
When the photographed scene contains regular, dense texture, the limited sampling frequency produces slowly varying color stripes on the demosaiced image, known as color moire. Color moire not only spoils the appearance of the image but can also cause image information to be misread, seriously degrading imaging quality.
The most common way to eliminate color moire is to add an optical low-pass filter to the optical system; however, such a filter also blurs image detail and reduces imaging quality. Another approach is to redesign the CFA, which is complex and expensive, and is unavailable both for CFAs already in practical use and for digital images that have already been captured.
Among image-processing approaches, the published literature and patents mainly convert the image into luminance and color components, blur the color components with a filter, and then convert back to a color image. The drawback of this method is that sharp color edges become blurred and tinged with surrounding colors. The larger the blurring filter, the worse this color bleeding; and because eliminating color moire usually requires a very large filter, the resulting color bleeding is unacceptable. A further drawback is that such methods are computationally expensive and cannot be used in real-time systems.
To improve imaging under low illumination, many newer CFAs add a transparent (white) channel W to the three primary colors red, green and blue (RGB), as shown in fig. 2. Such a CFA has four color channels, which reduces sensor noise under low illumination, but its lower per-channel sampling rate makes color moire in the captured image even more severe.
Disclosure of Invention
In order to solve the problem of color moire in images, an embodiment of the present invention provides a color moire elimination method, comprising: acquiring an original image; calculating a luminance component and three color components of the original image; obtaining a spectral overlap coefficient between the luminance component and each of the three color components; obtaining, according to the spectral overlap coefficients, a weight coefficient corresponding to each of the three color components, wherein the weight coefficient is greater than 0 and less than or equal to 1; and obtaining output values of four color channels according to the luminance component, the three color components and the weight coefficients of the three color components.
Optionally, the method further comprises: after an original image is acquired, the original image is converted into a spectral image.
Optionally, calculating the luminance component of the original image comprises: filtering the spectral image in the frequency domain with a first filter and converting the filtered spectral image back to the spatial domain to obtain the luminance component; or convolving the original image in the spatial domain with a first convolution template to obtain the luminance component; wherein the center frequency of the first filter, and the center frequency corresponding to the first convolution template, are related to the peak position of the luminance component in the spectral image.
Optionally, calculating the three color components of the original image comprises: filtering the spectral image in the frequency domain multiple times and converting each filtered spectral image back to the spatial domain, each filtering pass yielding one color component; or convolving the original image in the spatial domain multiple times, each convolution yielding one color component; wherein each filtering pass, and each convolution, corresponds to a different center frequency related to the peak position of the corresponding color component in the spectral image.
Optionally, obtaining the spectral overlap coefficient between the luminance component and each of the three color components comprises: for each color component, filtering the spectral image in the frequency domain with a second filter and converting the filtered spectral image back to the spatial domain to obtain a first filtering result; and squaring the first filtering result to obtain the spectral overlap coefficient for that color component; wherein the center frequency of the second filter lies on the line connecting the peak position of that color component and the peak position of the luminance component in the spectral image.
Optionally, the larger the spectral overlap coefficient between the luminance component and a color component, the smaller the weight coefficient assigned to that color component.
Optionally, the weight coefficient is calculated by the formula w = e^(−E/(2σ²)), wherein w represents the weight coefficient, E represents the spectral overlap coefficient, and σ is an adjustable parameter.
Optionally, the weight coefficient is calculated by the following formula: w = a when E ≤ Emin; w = a + (b − a)·(E − Emin)/(Emax − Emin) when Emin < E < Emax; and w = b when E ≥ Emax; wherein w represents the weight coefficient, E represents the spectral overlap coefficient, and a, b, Emin and Emax are all adjustable parameters, with Emin less than Emax.
Optionally, the output values of the four color channels are calculated by the following formula:
P1=L1+w1*C2+w2*C3+w3*C4;
P2=L1+w1*C2-w2*C3-w3*C4;
P3=L1-w1*C2+w2*C3-w3*C4;
P4=L1-w1*C2-w2*C3+w3*C4;
where P1-P4 represent output values of four color channels, L1 represents the luminance component, C2-C4 represent the three color components, and w1-w3 represent weight coefficients corresponding to the three color components, respectively.
Optionally, the method further comprises: after the output values of the four color channels are obtained, synthesizing a color-moire-free image from the output values.
An embodiment of the present invention further provides a color moire elimination device, comprising a memory, a processor and a signal processing module, the memory storing a computer program runnable on the processor; when the processor executes the program, it controls the signal processing module to perform the following steps: acquiring an original image; calculating a luminance component and three color components of the original image; obtaining a spectral overlap coefficient between the luminance component and each of the three color components; obtaining, according to the spectral overlap coefficients, a weight coefficient corresponding to each of the three color components, wherein the weight coefficient is greater than 0 and less than or equal to 1; and obtaining output values of four color channels according to the luminance component, the three color components and the weight coefficients of the three color components.
Optionally, the signal processing module is further adapted to perform: after an original image is acquired, the original image is converted into a spectral image.
Optionally, calculating the luminance component of the original image comprises: filtering the spectral image in the frequency domain with a first filter and converting the filtered spectral image back to the spatial domain to obtain the luminance component; or convolving the original image in the spatial domain with a first convolution template to obtain the luminance component; wherein the center frequency of the first filter, and the center frequency corresponding to the first convolution template, are related to the peak position of the luminance component in the spectral image.
Optionally, calculating the three color components of the original image comprises: filtering the spectral image in the frequency domain multiple times and converting each filtered spectral image back to the spatial domain, each filtering pass yielding one color component; or convolving the original image in the spatial domain multiple times, each convolution yielding one color component; wherein each filtering pass, and each convolution, corresponds to a different center frequency related to the peak position of the corresponding color component in the spectral image.
Optionally, obtaining the spectral overlap coefficient between the luminance component and each of the three color components comprises: for each color component, filtering the spectral image in the frequency domain with a second filter and converting the filtered spectral image back to the spatial domain to obtain a first filtering result; and squaring the first filtering result to obtain the spectral overlap coefficient for that color component; wherein the center frequency of the second filter lies on the line connecting the peak position of that color component and the peak position of the luminance component in the spectral image.
Optionally, the larger the spectral overlap coefficient between the luminance component and a color component, the smaller the weight coefficient assigned to that color component.
Optionally, the weight coefficient is calculated by the formula w = e^(−E/(2σ²)), wherein w represents the weight coefficient, E represents the spectral overlap coefficient, and σ is an adjustable parameter.
Optionally, the weight coefficient is calculated by the following formula: w = a when E ≤ Emin; w = a + (b − a)·(E − Emin)/(Emax − Emin) when Emin < E < Emax; and w = b when E ≥ Emax; wherein w represents the weight coefficient, E represents the spectral overlap coefficient, and a, b, Emin and Emax are all adjustable parameters, with Emin less than Emax.
Optionally, the output values of the four color channels are calculated by the following formula:
P1=L1+w1*C2+w2*C3+w3*C4;
P2=L1+w1*C2-w2*C3-w3*C4;
P3=L1-w1*C2+w2*C3-w3*C4;
P4=L1-w1*C2-w2*C3+w3*C4;
where P1-P4 represent output values of four color channels, L1 represents the luminance component, C2-C4 represent the three color components, and w1-w3 represent weight coefficients corresponding to the three color components, respectively.
Optionally, the signal processing module is further adapted to: after the output values of the four color channels are obtained, synthesize a color-moire-free image from the output values.
An embodiment of the present invention further provides a camera device comprising an optical lens, an image sensor, and the color moire elimination device described above.
Compared with the prior art, the technical scheme of the embodiment of the invention has the following advantages:
in an embodiment of the present invention, the color moire elimination method comprises: acquiring an original image; calculating a luminance component and three color components of the original image; obtaining a spectral overlap coefficient between the luminance component and each of the three color components; obtaining, according to the spectral overlap coefficients, a weight coefficient corresponding to each of the three color components, wherein the weight coefficient is greater than 0 and less than or equal to 1; and obtaining output values of four color channels according to the luminance component, the three color components and their weight coefficients. By applying a smaller weight coefficient to the color components in which color moire resides, the color moire in the image can be effectively eliminated; and because the computational cost is low, the method can be applied in real-time systems.
Further, compared with color moire elimination methods that blur the color components with a filter, this scheme does not blur the details of the processed image.
Drawings
FIG. 1 is a schematic diagram of a Bayer RGGB color filter array;
FIG. 2 is a schematic diagram of an RGBW color filter array having 4 color channels;
FIG. 3 is a schematic flow chart of a color moire removal method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a spectral image in an embodiment of the present invention;
FIG. 5 is a convolution template for computing a luminance component according to an embodiment of the present invention;
FIGS. 6-8 illustrate a convolution template for computing color components according to an embodiment of the present invention;
FIG. 9 is a diagram illustrating a weighting factor as a function of spectral overlap factors according to an embodiment of the present invention; and
fig. 10 is a schematic structural diagram of an image pickup apparatus according to an embodiment of the present invention.
Detailed Description
Referring to fig. 3, which is a schematic flow chart of the color moire elimination method provided by the present invention, embodiments of the invention can be applied to a camera device.
In S11, an original image is acquired.
In an embodiment of the invention, light is imaged onto the image sensor through the optical lens of the camera device, and the camera device acquires the original image from the image sensor; the original image may be in RGBW format. Since the image sensor converts the optical signal into an electrical signal for output, in some embodiments the original image may be represented as a matrix, for example an 800 × 600 matrix.
In S12, the original image is converted into a spectral image.
In some embodiments, the color moire elimination method further comprises: after the original image is acquired, converting it into a spectral image. Specifically, the original image may be transformed into the frequency domain by a Fast Fourier Transform (FFT) to obtain the spectral image. Fig. 4 is a schematic diagram of a spectral image in an embodiment of the present invention: in the coordinate system of fig. 4, the u-axis and v-axis on the base plane represent spatial frequency in the horizontal and vertical directions, and the axis perpendicular to the base plane represents amplitude. The peak positions of the luminance component L1 and the color components C2, C3 and C4 of the original image in the frequency domain can be seen in fig. 4.
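The FFT step can be sketched in a few lines. The `fftshift` here is used only so the luminance peak sits at the center of the array, matching the fig. 4 view; function name and conventions are illustrative, not the patent's.

```python
import numpy as np

def spectral_image(raw):
    """2-D FFT of the raw mosaic.

    For a CFA with 2x2 periodicity, the luminance peak lies at (0, 0)
    cycles/pixel and the chroma carriers at (0.5, 0), (0, 0.5) and
    (0.5, 0.5); fftshift moves the (0, 0) peak to the array center.
    """
    return np.fft.fftshift(np.fft.fft2(raw))
```

The DC bin of the shifted spectrum equals the sum of all pixels, which is a quick sanity check for the transform conventions.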
In S13, a luminance component and three color components of the original image are calculated.
In some embodiments, calculating the luminance component of the original image comprises: filtering the spectral image in the frequency domain with a first filter and converting the result back to the spatial domain to obtain the luminance component; or convolving the original image in the spatial domain with a first convolution template to obtain the luminance component.
In some embodiments, the luminance component L1 is obtained by frequency-domain filtering and converting the filtered spectral image back to the spatial domain. The first filter may be a low-pass filter whose center frequency is related to the peak position of L1 in the spectral image. Given the arrangement of color channels in the CFA, in the embodiment shown in fig. 4 the peak of L1 is located at (0, 0); the center frequency of the first filter may therefore be set to (0, 0).
In some embodiments, the luminance component L1 may instead be obtained by convolving the original image in the spatial domain with a convolution template. The center frequency corresponding to the template is likewise related to the peak position of the luminance component in the spectral image; in the embodiment of fig. 4 that peak is at (0, 0), so the template's center frequency may be (0, 0). Fig. 5 shows a convolution template for calculating the luminance component according to an embodiment of the invention; convolving the original image with this template yields L1.
It is to be noted that if the original image is an 800 × 600 matrix, the luminance component L1 obtained by filtering or spatial domain convolution is also an 800 × 600 matrix.
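A minimal frequency-domain sketch of the luminance extraction follows. The patent's actual first filter and the fig. 5 template are not reproduced in this text, so the Gaussian low-pass shape and its bandwidth `sigma` below are hypothetical stand-ins centered at the (0, 0) peak.

```python
import numpy as np

def luminance_component(raw, sigma=0.1):
    """Estimate L1 by low-pass filtering the spectrum around (0, 0).

    sigma is a hypothetical Gaussian bandwidth in cycles/pixel; the
    output has the same size as the input, as noted in the text.
    """
    h, w = raw.shape
    fu = np.fft.fftfreq(h)[:, None]   # vertical spatial frequency
    fv = np.fft.fftfreq(w)[None, :]   # horizontal spatial frequency
    lowpass = np.exp(-(fu**2 + fv**2) / (2.0 * sigma**2))
    return np.real(np.fft.ifft2(np.fft.fft2(raw) * lowpass))
```

On a checkerboard mosaic, whose energy sits entirely at DC and at the (0.5, 0.5) carrier, this filter keeps the mean level and suppresses the carrier.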
In some embodiments, calculating the three color components of the original image comprises: filtering the spectral image in the frequency domain multiple times and converting each filtered spectral image back to the spatial domain, each filtering pass yielding one color component; or convolving the original image in the spatial domain multiple times, each convolution yielding one color component; wherein each filtering pass, and each convolution, corresponds to a different center frequency related to the peak position of the corresponding color component in the spectral image.
Specifically, the three color components C2, C3 and C4 of the original image may each be obtained by one of three frequency-domain filtering passes followed by conversion back to the spatial domain. That is, the spectral image may be filtered with three filters whose different center frequencies are related to the peak positions of the color components in the spectral image. For example, in the embodiment shown in fig. 4, the peak of C2 is at (0.5, 0), the peak of C3 at (0, 0.5), and the peak of C4 at (0.5, 0.5). The spectral image is therefore filtered with a filter centered at (0.5, 0) and converted back to the spatial domain to obtain C2; filtered with a filter centered at (0, 0.5) to obtain C3; and filtered with a filter centered at (0.5, 0.5) to obtain C4.
In some embodiments, the three color components may instead be obtained by three spatial-domain convolutions of the original image, using three different convolution templates whose center frequencies are related to the peak positions of the color components in the spectral image. In the embodiment of fig. 4, a template with center frequency (0.5, 0) is convolved with the original image to obtain C2; a template with center frequency (0, 0.5) yields C3; and a template with center frequency (0.5, 0.5) yields C4.
In a specific implementation, the convolution templates shown in figs. 6 to 8 may be used to convolve the original image in the spatial domain to obtain C2, C3 and C4, respectively.
It is to be noted that if the original image is an 800 × 600 matrix, the three color components C2, C3, and C4 obtained by filtering or spatial domain convolution are also all 800 × 600 matrices.
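The same idea extends to the color components: band-pass the spectrum around each carrier frequency instead of around (0, 0). The Gaussian filter shape below is a hypothetical stand-in for the patent's filters and fig. 6 to 8 templates; the frequency wrap-around handles carriers at the Nyquist frequency 0.5.

```python
import numpy as np

def color_component(raw, center, sigma=0.1):
    """Band-pass the spectrum around center = (cu, cv) in cycles/pixel
    and return the spatial-domain result (filter shape assumed)."""
    h, w = raw.shape
    fu = np.fft.fftfreq(h)[:, None]
    fv = np.fft.fftfreq(w)[None, :]
    cu, cv = center
    # distance to the carrier, wrapped onto [-0.5, 0.5) in each axis
    du = (fu - cu + 0.5) % 1.0 - 0.5
    dv = (fv - cv + 0.5) % 1.0 - 0.5
    bandpass = np.exp(-(du**2 + dv**2) / (2.0 * sigma**2))
    return np.real(np.fft.ifft2(np.fft.fft2(raw) * bandpass))
```

For example, `color_component(raw, (0.5, 0))` would estimate C2 and `color_component(raw, (0.5, 0.5))` would estimate C4 under these assumptions.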
In S14, a spectral overlap coefficient of the luminance component and each of the three color components is acquired.
Specifically, the spectral overlap coefficients between the luminance component L1 and each of the color components C2, C3 and C4 are calculated; the larger the spectral overlap coefficient, the stronger the color moire in that color component.
In some embodiments, the method of obtaining spectral overlap coefficients comprises: for each color component, performing frequency domain filtering on the frequency spectrum image by adopting a second filter, and converting the filtered frequency spectrum image back to a spatial domain to obtain a first filtering result; and performing a square operation on the first filtering result to obtain a spectral overlapping coefficient of the color component.
In a specific implementation, the second filter may be a gaussian band pass filter. The center frequency of the second filter is on a line connecting the spectral image peak position of each color component and the spectral image peak position of the luminance component.
For example, in the embodiment shown in fig. 4, when calculating the spectral overlap coefficient E1 between the luminance component L1 and the color component C2, the center frequency of the second filter lies on the line connecting the spectral peaks of L1 and C2, i.e. at (Uc, 0). In some embodiments Uc may be 0.25, so the center frequency of the second filter may be (0.25, 0). Similarly, when calculating the coefficient E2 between L1 and C3 the center frequency may be (0, 0.25), and when calculating the coefficient E3 between L1 and C4 it may be (0.25, 0.25).
The resulting spectral overlap coefficients E1, E2 and E3 correspond to the three color components C2, C3 and C4, respectively, and are all matrices of the same size as the original image.
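Step S14 can be sketched under the same assumptions as above: a Gaussian band-pass at the midpoint frequency between the luminance peak and a chroma carrier, e.g. (0, 0.25) for one component, followed by the squaring operation. Filter shape and bandwidth are hypothetical.

```python
import numpy as np

def overlap_coefficient(raw, midpoint, sigma=0.05):
    """Per-pixel spectral overlap coefficient E for one color component.

    Band-pass the spectrum around the midpoint between the L1 peak and
    that component's carrier (e.g. (0.25, 0) for C2), go back to the
    spatial domain, then square to get an energy-like measure.
    """
    h, w = raw.shape
    fu = np.fft.fftfreq(h)[:, None]
    fv = np.fft.fftfreq(w)[None, :]
    cu, cv = midpoint
    du = (fu - cu + 0.5) % 1.0 - 0.5
    dv = (fv - cv + 0.5) % 1.0 - 0.5
    bandpass = np.exp(-(du**2 + dv**2) / (2.0 * sigma**2))
    filtered = np.real(np.fft.ifft2(np.fft.fft2(raw) * bandpass))
    return filtered**2  # squaring yields a non-negative overlap measure
```

A texture oscillating at the midpoint frequency produces a large E (strong aliasing risk), while a smooth image produces E near zero.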
In S15, a weighting coefficient corresponding to each of the three color components is obtained based on the spectral overlap coefficient.
A larger spectral overlap coefficient indicates stronger color moire in the corresponding color component; therefore, the larger the spectral overlap coefficient between the luminance component and a color component, the smaller the weight coefficient assigned to that component. The weight coefficients are matrices of the same size as the color components, and each element of a weight coefficient is greater than 0 and less than or equal to 1.
In some embodiments, the weight coefficient is calculated by the following formula:
w = e^(−E/(2σ²))
wherein w represents the weight coefficient, E represents the spectral overlap coefficient, and σ is an adjustable parameter.
In some embodiments, for ease of implementation in hardware, the weight coefficient may instead be obtained from the function plotted in fig. 9; specifically, it is calculated by the following formula:
w = a, when E ≤ Emin;
w = a + (b − a)·(E − Emin)/(Emax − Emin), when Emin < E < Emax;
w = b, when E ≥ Emax;
wherein w represents the weight coefficient, E represents the spectral overlap coefficient, and a, b, Emin and Emax are all adjustable parameters, with Emin less than Emax. In a specific implementation, Emin and Emax can be set to control the strength of the color moire suppression.
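A minimal sketch of the piecewise-linear weighting of fig. 9, assuming the weight stays at a for small overlap, falls linearly between Emin and Emax, and settles at b for large overlap; all parameter values below are hypothetical.

```python
import numpy as np

def weight(E, a=1.0, b=0.1, e_min=0.01, e_max=0.2):
    """Piecewise-linear weight: a for E <= e_min, falling linearly to b
    over [e_min, e_max], and b for E >= e_max. Larger overlap means a
    smaller weight. Works element-wise on matrices of E."""
    E = np.asarray(E, dtype=float)
    t = np.clip((E - e_min) / (e_max - e_min), 0.0, 1.0)
    return a + (b - a) * t
```

Because `np.clip` is branch-free, the same expression applies element-wise to a full-size overlap matrix, which matches the hardware-friendly motivation in the text.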
In S16, output values of four color channels are obtained according to the luminance component, the three color components, and the weighting coefficients of the three color components.
In some embodiments, the output values of the four color channels may be calculated by the following formulas.
P1=L1+w1*C2+w2*C3+w3*C4
P2=L1+w1*C2-w2*C3-w3*C4
P3=L1-w1*C2+w2*C3-w3*C4
P4=L1-w1*C2-w2*C3+w3*C4
Where P1-P4 represent output values of four color channels, L1 represents the luminance component, C2-C4 represent the three color components, and w1-w3 represent weight coefficients corresponding to the three color components, respectively.
In a CFA with four RGBW color channels, P1 to P4 represent the R, G, B and W channels; which color channel each of P1 to P4 represents can be determined from the arrangement of the four channels in the CFA. P1 to P4 are all matrices of the same size as the original image.
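The four output formulas above translate directly to code, operating element-wise on equal-size matrices (or on scalars):

```python
import numpy as np  # only needed when the inputs are matrices

def synthesize_channels(L1, C2, C3, C4, w1, w2, w3):
    """Output values of the four color channels, per the sign pattern
    in the formulas for P1..P4. With w1 = w2 = w3 = 1 this is the plain
    inverse transform; weights below 1 attenuate moire-carrying chroma."""
    P1 = L1 + w1 * C2 + w2 * C3 + w3 * C4
    P2 = L1 + w1 * C2 - w2 * C3 - w3 * C4
    P3 = L1 - w1 * C2 + w2 * C3 - w3 * C4
    P4 = L1 - w1 * C2 - w2 * C3 + w3 * C4
    return P1, P2, P3, P4
```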
In S17, the image after the elimination of the color moire is synthesized from the output values.
In some embodiments, the color moire elimination method further comprises: after the output values of the four color channels are obtained, synthesizing the color-moire-free image from them. In a specific implementation, the four channel outputs may be converted into a three-channel RGB color image by multiplication with a color correction matrix, or output directly to a display supporting the RGBW format to produce an RGBW color image.
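The RGBW-to-RGB conversion via a color correction matrix can be sketched as a 3×4 matrix applied per pixel; the matrix entries are device-specific, and the identity-like example used in testing is purely illustrative.

```python
import numpy as np

def rgbw_to_rgb(P, ccm):
    """Convert stacked RGBW planes P of shape (4, h, w) to RGB planes of
    shape (3, h, w) with a 3x4 color correction matrix ccm: for every
    pixel, rgb = ccm @ [R, G, B, W]."""
    return np.tensordot(ccm, P, axes=([1], [0]))
```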
An embodiment of the present invention further provides a color moire elimination device, comprising a memory, a processor and a signal processing module, the memory storing a computer program runnable on the processor; when the processor executes the program, it controls the signal processing module to perform the following steps: acquiring an original image; calculating a luminance component and three color components of the original image; obtaining a spectral overlap coefficient between the luminance component and each of the three color components; obtaining, according to the spectral overlap coefficients, a weight coefficient for each of the three color components, wherein the weight coefficient is greater than 0 and less than or equal to 1; and obtaining output values of four color channels according to the luminance component, the three color components and the weight coefficients of the three color components.
For more details of the color moire removing device, reference may be made to the above description of the color moire removing method, which is not repeated herein.
Fig. 10 is a schematic structural diagram of an image capturing apparatus 100 according to an embodiment of the present invention. The image pickup apparatus includes: an optical lens 110, an image sensor 120, and the color moire removing device, i.e., a signal processing module 130, a processor 140, and a memory 150. Light passes through the optical lens 110 and is imaged on the image sensor 120 to form an original image, with an RGBW CFA disposed in front of the image sensor 120; the processor 140 controls the signal processing module 130 to perform the color moire removing operation on the original image, and the color image with the color moire removed is stored in the memory 150.
In order to make the aforementioned objects, features and advantages of the embodiments of the present invention comprehensible, specific embodiments accompanied with figures are described in detail below.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing the relevant hardware, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a ROM, a RAM, a magnetic disk, an optical disk, and the like.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (15)
1. A color moire removal method, comprising:
acquiring an original image;
calculating a brightness component and three color components of the original image;
obtaining a spectral overlap coefficient of the luminance component and each of the three color components;
acquiring a weight coefficient corresponding to each color component in the three color components according to the spectral overlapping coefficient, wherein the weight coefficient is greater than 0 and less than or equal to 1; and
acquiring output values of four color channels according to the brightness component, the three color components and the weight coefficients of the three color components;
after an original image is obtained, converting the original image into a frequency spectrum image;
obtaining a spectral overlap coefficient of the luminance component and each of the three color components, comprising:
for each color component, performing frequency domain filtering on the frequency spectrum image using a second filter, and converting the filtered frequency spectrum image back to the spatial domain to obtain a first filtering result; and performing a square operation on the first filtering result to obtain the spectral overlap coefficient of that color component; wherein the center frequency of the second filter lies on a line connecting the spectral image peak position of each color component and the spectral image peak position of the luminance component; and
the larger the spectral overlap coefficient of the luminance component and a color component, the smaller the weight coefficient set for that color component.
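As an illustrative (non-claim) sketch of the overlap and weighting steps: the Gaussian band-pass shape of the second filter and the weight mapping w = 1/(1 + k*S) below are assumptions, since the claim only requires that a larger spectral overlap coefficient yield a smaller weight coefficient in (0, 1].

```python
import numpy as np

# Sketch of the claim-1 overlap step under assumed filter and weight shapes.
def overlap_coefficient(raw, center, sigma=0.1):
    """Band-pass the spectrum around `center` (normalized cycles/pixel),
    convert back to the spatial domain, and square the first filtering result."""
    h, w = raw.shape
    spec = np.fft.fftshift(np.fft.fft2(raw))
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    g = np.exp(-((fy - center[0]) ** 2 + (fx - center[1]) ** 2) / (2 * sigma ** 2))
    filtered = np.fft.ifft2(np.fft.ifftshift(spec * g))
    return np.abs(filtered) ** 2  # per-pixel spectral overlap coefficient

def weight_from_overlap(s, k=1.0):
    # Monotone decreasing: larger overlap -> smaller weight, always in (0, 1].
    return 1.0 / (1.0 + k * s)

raw = np.random.default_rng(0).random((16, 16))
s = overlap_coefficient(raw, center=(0.25, 0.25))  # an assumed color-carrier position
w = weight_from_overlap(s.mean())
```

Where luminance energy leaks toward a color carrier, `s` grows locally, so that color component is attenuated exactly where moire would otherwise appear.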
2. The method according to claim 1, wherein the calculating the luminance component of the original image comprises:
performing frequency domain filtering on the frequency spectrum image through a first filter, and converting the filtered frequency spectrum image back to the spatial domain to obtain the brightness component; or performing spatial domain convolution on the original image using a first convolution template to obtain the brightness component;
wherein the center frequency of the first filter, and correspondingly of the first convolution template, is related to the spectral image peak position of the luminance component.
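A minimal sketch of the spatial-domain branch of claim 2: since the luminance spectrum peaks at DC, an assumed 3x3 averaging kernel stands in here for the "first convolution template"; the actual patented template may differ.

```python
import numpy as np

# Assumed low-pass template: a 3x3 box average applied in the spatial domain,
# equivalent to a filter whose center frequency sits at DC (the luminance peak).
def luminance_component(raw):
    padded = np.pad(raw, 1, mode='edge')  # replicate borders
    h, w = raw.shape
    out = np.zeros_like(raw, dtype=float)
    for dy in range(3):                   # accumulate the 3x3 neighborhood
        for dx in range(3):
            out += padded[dy:dy + h, dx:dx + w]
    return out / 9.0                      # averaged estimate of luminance

lum = luminance_component(np.ones((4, 4)))  # a flat input stays flat
```

The spatial convolution and the frequency-domain branch of the claim are equivalent up to the filter's shape; the convolution avoids the two FFTs.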
3. The method of claim 1, wherein the calculating three color components of the original image comprises:
performing frequency domain filtering on the frequency spectrum image multiple times and converting each filtered frequency spectrum image back to the spatial domain, each frequency domain filtering yielding one color component; or performing multiple spatial domain convolutions on the original image, each spatial domain convolution yielding one color component;
wherein each frequency domain filtering, and likewise each spatial domain convolution, corresponds to a different center frequency, the center frequency being related to the spectral image peak position of the corresponding color component.
6. The color moire removal method as defined in claim 1, wherein the output values of said four color channels are calculated by the following formula:
P1=L1+w1*C2+w2*C3+w3*C4;
P2=L1+w1*C2-w2*C3-w3*C4;
P3=L1-w1*C2+w2*C3-w3*C4;
P4=L1-w1*C2-w2*C3+w3*C4;
where P1-P4 represent output values of four color channels, L1 represents the luminance component, C2-C4 represent the three color components, and w1-w3 represent weight coefficients corresponding to the three color components, respectively.
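The claim-6 formulas can be transcribed directly as a non-claim illustration; note that the four outputs sum to 4*L1, so the weighted color terms cancel across the channel set.

```python
# Direct transcription of the claim-6 synthesis formulas: each channel output
# is the luminance component plus or minus the weighted color components, the
# sign pattern distinguishing the four channels.
def synthesize_channels(l1, c2, c3, c4, w1, w2, w3):
    p1 = l1 + w1 * c2 + w2 * c3 + w3 * c4
    p2 = l1 + w1 * c2 - w2 * c3 - w3 * c4
    p3 = l1 - w1 * c2 + w2 * c3 - w3 * c4
    p4 = l1 - w1 * c2 - w2 * c3 + w3 * c4
    return p1, p2, p3, p4

# Scalar demo; in practice L1 and C2-C4 are full-size image arrays and the
# same expressions apply element-wise.
p1, p2, p3, p4 = synthesize_channels(1.0, 0.1, 0.1, 0.1, 1.0, 1.0, 1.0)
```

Shrinking a weight wi toward 0 pulls all four channels toward the shared luminance L1 for that component, which is how a heavily overlapped (moire-prone) color component is suppressed.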
7. The color moire removal method as defined in claim 1, further comprising: after the output values of the four color channels are obtained, synthesizing the image free of color moire from the output values.
8. A color moire removal device, comprising a memory, a processor and a signal processing module, wherein the memory stores a computer program runnable on the processor, and when the processor executes the program, the processor controls the signal processing module to perform the following steps:
acquiring an original image;
calculating a brightness component and three color components of the original image;
acquiring a spectrum overlapping coefficient of the brightness component and each of the three color components, and acquiring a weight coefficient of each of the three color components according to the spectrum overlapping coefficient, wherein the weight coefficient is greater than 0 and less than or equal to 1; and
acquiring output values of four color channels according to the brightness component, the three color components and the weight coefficients of the three color components;
the signal processing module is further adapted to perform: after an original image is obtained, converting the original image into a frequency spectrum image;
wherein obtaining the spectral overlap coefficient of the luminance component and each of the three color components comprises: for each color component, performing frequency domain filtering on the frequency spectrum image using a second filter, and converting the filtered frequency spectrum image back to the spatial domain to obtain a first filtering result; and performing a square operation on the first filtering result to obtain the spectral overlap coefficient of that color component; wherein the center frequency of the second filter lies on a line connecting the spectral image peak position of each color component and the spectral image peak position of the luminance component; and
the larger the spectral overlap coefficient of the luminance component and a color component, the smaller the weight coefficient set for that color component.
9. The color moire removal device as defined in claim 8, wherein said calculating a luminance component of said original image comprises:
performing frequency domain filtering on the frequency spectrum image through a first filter, and converting the filtered frequency spectrum image back to the spatial domain to obtain the brightness component; or performing spatial domain convolution on the original image using a first convolution template to obtain the brightness component;
wherein the center frequency of the first filter, and correspondingly of the first convolution template, is related to the spectral image peak position of the luminance component.
10. The color moire removal device as recited in claim 8, wherein said calculating three color components of said original image comprises:
performing frequency domain filtering on the frequency spectrum image multiple times and converting each filtered frequency spectrum image back to the spatial domain, each frequency domain filtering yielding one color component; or performing multiple spatial domain convolutions on the original image, each spatial domain convolution yielding one color component;
wherein each frequency domain filtering, and likewise each spatial domain convolution, corresponds to a different center frequency, the center frequency being related to the spectral image peak position of the corresponding color component.
13. The color moire removal device as defined in claim 8, wherein the output values of said four color channels are calculated by the following formula:
P1=L1+w1*C2+w2*C3+w3*C4;
P2=L1+w1*C2-w2*C3-w3*C4;
P3=L1-w1*C2+w2*C3-w3*C4;
P4=L1-w1*C2-w2*C3+w3*C4;
where P1-P4 represent output values of four color channels, L1 represents the luminance component, C2-C4 represent the three color components, and w1-w3 represent weight coefficients corresponding to the three color components, respectively.
14. The color moire removal device as defined in claim 8, wherein said signal processing module is further adapted to: after the output values of the four color channels are obtained, synthesize the image free of color moire from the output values.
15. An image pickup apparatus comprising an optical lens, an image sensor, and the color moire removing device according to any one of claims 8 to 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910840627.0A CN110517206B (en) | 2019-09-05 | 2019-09-05 | Method and device for eliminating color moire |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110517206A CN110517206A (en) | 2019-11-29 |
CN110517206B true CN110517206B (en) | 2021-10-15 |
Family
ID=68630037
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111311511B (en) * | 2020-01-22 | 2023-08-29 | 凌云光技术股份有限公司 | Method and device for removing moire patterns |
CN112070671B (en) * | 2020-09-04 | 2024-07-19 | 平安科技(深圳)有限公司 | Mosaic removing method, system, terminal and storage medium based on spectrum analysis |
CN112907467B (en) * | 2021-02-03 | 2023-04-28 | 杭州海康威视数字技术股份有限公司 | Rainbow pattern removing method and device and electronic equipment |
CN113707110B (en) * | 2021-06-15 | 2023-12-01 | 浙江意博高科技术有限公司 | Intelligent illumination control method and system |
CN116013190A (en) * | 2022-11-03 | 2023-04-25 | 深圳创维-Rgb电子有限公司 | Color bar picture detection method and device, display equipment and readable storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105303530A (en) * | 2015-09-30 | 2016-02-03 | 天津大学 | Fabric image mole stripe elimination method based on low-rank sparse matrix decomposition |
CN105678718A (en) * | 2016-03-29 | 2016-06-15 | 努比亚技术有限公司 | Method and device for image denoising |
CN108615227A (en) * | 2018-05-08 | 2018-10-02 | 浙江大华技术股份有限公司 | A kind of suppressing method and equipment of image moire fringes |
Non-Patent Citations (2)
Title |
---|
"Automatic moire pattern removal in microscopic images";Giorgian-Marius ionita;《2015 19th International Conference on System Theory,Control and Computing》;20150909;第776-779页 * |
"摄屏类图像重构算法";陈申渭;《计算机系统应用》;20190515;第110-118页 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110517206B (en) | Method and device for eliminating color moire | |
US11625815B2 (en) | Image processor and method | |
US10298863B2 (en) | Automatic compensation of lens flare | |
US9710896B2 (en) | Systems and methods for chroma noise reduction | |
EP2055093B1 (en) | Adaptive spatial image filter for filtering image information | |
US7916940B2 (en) | Processing of mosaic digital images | |
US8482659B2 (en) | Image processing apparatus and image processing method | |
CN111784605B (en) | Image noise reduction method based on region guidance, computer device and computer readable storage medium | |
EP1766569A1 (en) | Methods, system, program modules and computer program product for restoration of color components in an image model | |
US8238685B2 (en) | Image noise reduction method and image processing apparatus using the same | |
JP4328424B2 (en) | Image conversion method | |
CN111784603A (en) | RAW domain image denoising method, computer device and computer readable storage medium | |
JP2008527861A (en) | Noise removal from scattered color digital images | |
CN111539893A (en) | Bayer image joint demosaicing denoising method based on guided filtering | |
US7430334B2 (en) | Digital imaging systems, articles of manufacture, and digital image processing methods | |
US7269295B2 (en) | Digital image processing methods, digital image devices, and articles of manufacture | |
JP2013055623A (en) | Image processing apparatus, image processing method, information recording medium, and program | |
WO2012008116A1 (en) | Image processing apparatus, image processing method, and program | |
US8629917B2 (en) | Image processing apparatus, method, and recording medium | |
JP5268321B2 (en) | Image processing apparatus, image processing method, and image processing program | |
JP7183015B2 (en) | Image processing device, image processing method, and program | |
KR20150094350A (en) | Method and apparatus for executing demosaicking using edge information | |
JP2013055622A (en) | Image processing apparatus, image processing method, information recording medium, and program | |
Saito et al. | Demosaicing method using the extended color total-variation regularization | |
EP4332834A1 (en) | Method and camera device for generating a moiré-corrected image file |
Legal Events

- PB01: Publication
- SE01: Entry into force of request for substantive examination
- CB02: Change of applicant information
  - Address after: Room 508-511, building a, Modern Plaza, No. 18, Weiye Road, Kunshan Development Zone, Suzhou, Jiangsu; Applicant after: Ruixin Microelectronics Co., Ltd
  - Address before: Room 508-511, block A, Modern Plaza, 18 Weiye Road, Kunshan, Jiangsu, Suzhou, 215300; Applicant before: BRIGATES MICROELECTRONICS (KUNSHAN) Co.,Ltd.
- GR01: Patent grant