WO2010053029A1 - Image input device - Google Patents
Image input device
- Publication number
- WO2010053029A1 (PCT/JP2009/068480)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- signal
- color
- color space
- intensity
- image
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/48—Picture signal generators
- H04N1/486—Picture signal generators with separate detectors, each detector being used for one specific colour component
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/131—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/133—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing panchromatic light, e.g. filters passing white light
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/135—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
Definitions
- the present invention relates to an image input device that performs image processing on original image data captured by an image sensor.
- An object of the present invention is to provide an image input device capable of generating a luminance signal having a high S / N ratio of an image even at night when the amount of photons is small.
- An image input device according to the present invention includes an image sensor in which at least three types of pixels having different spectral sensitivities are arranged and which captures original image data including at least three types of original image components, and a color space conversion unit that converts the original image data into a color space including a luminance signal and a chromaticity signal, the color space conversion unit calculating a first intensity signal, obtained by adding the original image components, as the luminance signal of the color space.
- original image data including at least three kinds of original image components is imaged by the image sensor.
- The color space conversion unit generates a first intensity signal by adding the original image components constituting the original image data, and calculates the first intensity signal as the luminance signal of the color space. Therefore, a luminance signal with a high S/N ratio can be generated even at night, when the amount of photons is small.
- the color space conversion unit performs a smoothing process on the original image component or a chromaticity signal based on the original image component.
- Preferably, a color signal generation unit that generates RGB color signals from the original image components is further provided, and the color space conversion unit generates a second intensity signal by converting the RGB color signals into the color space and calculates the chromaticity signal of the color space by a calculation using a first ratio, which is the ratio of the first intensity signal to the second intensity signal.
- the second intensity signal is generated by performing color space conversion on the RGB color signal.
- the ratio of the first intensity signal to the second intensity signal is calculated as the first ratio.
- The chromaticity signal of the color space is calculated by a calculation using the first ratio. Therefore, the color space conversion unit can generate a chromaticity signal and a luminance signal that are well balanced in the color space to be converted into.
- Preferably, a color signal generation unit that generates RGB color signals from the original image components is further provided, and the color space conversion unit generates the second intensity signal by converting the RGB color signals into the color space, corrects the first intensity signal by a calculation using a second ratio, which is the ratio of the smoothed second intensity signal to the smoothed first intensity signal, and calculates the corrected first intensity signal as the luminance signal of the color space.
- The first intensity signal is corrected by a calculation using the second ratio, which is the ratio of the smoothed second intensity signal to the smoothed first intensity signal. Therefore, it is possible to generate a luminance signal that accurately reproduces the intensity of the image as perceived by a human.
- Preferably, the color space conversion unit generates the second intensity signal by converting the RGB color signals into the color space, and calculates the chromaticity signal of the color space by a calculation using the first ratio, which is the ratio of the first intensity signal to the second intensity signal.
- the chromaticity signal of the color space is corrected by calculating using the first ratio. Therefore, the balance between the intensity signal and the chromaticity signal in the color space to be converted can be maintained.
- Preferably, the original image data includes an infrared image component, and the color space conversion unit compares the intensity of infrared light with the intensity of visible light based on the infrared image component and calculates the luminance signal of the color space by weighted addition of the first intensity signal and a signal using the original image component as it is, such that the weighting of the first intensity signal decreases as the intensity of infrared light increases.
- In this way, the luminance signal of the color space to be converted into is calculated; therefore, it is possible to generate a luminance signal that accurately represents the intensity of the image viewed by a human.
- the color space conversion unit calculates the chromaticity signal of the color space to be lower as the intensity of infrared light is higher than the intensity of visible light.
- Preferably, a smoothing processing unit that performs smoothing processing on the original image components, a color interpolation unit that interpolates missing pixel data of the original image components smoothed by the smoothing processing unit, and a color signal generation unit that generates RGB color signals from the original image components are further provided, and the color signal generation unit generates the RGB color signals from the original image components whose missing pixel data have been interpolated by the color interpolation unit.
- Preferably, in the image sensor, unit pixel portions each including a first pixel, a second pixel, a third pixel, and a fourth pixel having different spectral sensitivities are arranged in a matrix, and, with the visible wavelength region and the infrared wavelength region taken as the sensitive wavelength band, the first pixel includes a first color filter that transmits light in the sensitive wavelength band excluding the blue region of the visible wavelength region, the second pixel includes a second color filter that transmits light in the sensitive wavelength band excluding the blue region and the green region of the visible wavelength region, the third pixel includes an infrared filter that transmits light in the sensitive wavelength band excluding the visible wavelength region, and the fourth pixel includes no filter.
- an infrared image component having a high S / N ratio can be obtained.
- an image sensor with high spectral transmission characteristics can be produced at low cost.
- At least one of at least three types of pixels having different spectral sensitivities of the image sensor has sensitivity in an infrared wavelength region.
- Preferably, at least one of the at least three types of pixels having different spectral sensitivities of the image sensor has sensitivity only in the infrared wavelength region, and the first intensity signal is calculated only by adding original image components including the infrared image component.
- Since at least one pixel has sensitivity in the infrared wavelength region and the first intensity signal is calculated only by adding the original image components, including the infrared image component, without any subtraction processing, an infrared image component having a high S/N ratio can be obtained. Therefore, a luminance signal with a high S/N ratio can be generated even at night, when the amount of photons is small.
- As described above, the first intensity signal is generated by adding the original image components constituting the original image data, and the first intensity signal is calculated as the luminance signal of the color space to be converted into. Therefore, a luminance signal having a high S/N ratio can be generated even at night, when the amount of photons is small.
- In addition, a chromaticity signal having a high S/N ratio can be generated even at night, when the amount of photons is small.
- FIG. 1 is a block diagram of an image input device 1 according to Embodiment 1.
- FIG. 2 is a diagram showing the arrangement of the pixels of the image sensor 3.
- FIG. 6(a) is a table showing output image components, including noise components, subjected to subtraction processing, and FIG. 6(b) is a table showing output image components, including noise components, subjected to addition processing.
- FIG. 7 is a flowchart illustrating the operation of the image input device 1 according to Embodiment 1.
- FIG. 8 is a flowchart illustrating the operation of the image input device 1 according to Embodiment 2.
- FIG. 9 is a block diagram of the image processing unit 4 when a smoothing processing unit is provided before the color interpolation unit, and FIG. 10 is a diagram showing the spectral sensitivity characteristics of cyan and magenta.
- FIG. 1 shows a block diagram of the image input apparatus 1.
- the image input device 1 includes a lens 2, an image sensor 3, an image processing unit 4, and a control unit 5.
- the image input device 1 is mounted on, for example, an automobile and images a subject around the automobile.
- the lens 2 is composed of an optical lens system that captures an optical image of a subject and guides it to the image sensor 3.
- As the optical lens system, for example, a zoom lens, a single-focus lens, other lens blocks, or the like arranged in series along the optical axis L of the optical image of the subject can be employed.
- The lens 2 may include a diaphragm (not shown) for adjusting the amount of transmitted light, a shutter (not shown), and the like; in this case, the diaphragm and the shutter are driven under the control of the control unit 5.
- The image sensor 3 includes a light receiving unit composed of PDs (photodiodes), an output circuit that outputs the signals photoelectrically converted by the light receiving unit, and a drive circuit that drives the image sensor 3, and generates original image data whose level corresponds to the amount of light.
- As the image sensor 3, a CMOS image sensor, a VMIS image sensor, a CCD image sensor, or the like may be employed.
- The image sensor 3 captures at least visible color image components with pixels including color filters, captures an infrared image component with pixels including an infrared filter, and captures an image component including both a visible image component and an infrared image component with pixels including no filter.
- The image processing unit 4 includes an arithmetic circuit and a memory used as a work area for the arithmetic circuit. It A/D-converts the original image data output from the image sensor 3 into a digital signal, executes the image processing described later, and then outputs the result to, for example, a memory or a display device (not shown).
- the control unit 5 includes a CPU and a memory that stores a program executed by the CPU, and controls the entire image input apparatus 1 in response to an external control signal.
- FIG. 2 is a diagram showing an array of pixels of the image sensor 3.
- In the image sensor 3, unit pixel portions 31, each including a Ye pixel (first pixel), an R pixel (second pixel), an IR pixel (third pixel), and a W pixel (fourth pixel), are arranged in a matrix.
- R pixels are arranged in the first row and first column
- IR pixels are arranged in the second row and first column
- W pixels are arranged in the first row and second column
- Ye pixels are arranged in the second row and the second column.
- R pixels, IR pixels, W pixels, and Ye pixels may be arranged in other patterns.
- Since the Ye pixel includes a Ye filter (first color filter), it captures an image component Ye (original image component) containing the Ye visible color image component and an infrared image component.
- Since the R pixel includes an R filter (second color filter), it captures an image component R (original image component) containing the R visible color image component and an infrared image component.
- Since the IR pixel includes an IR filter (infrared filter), it captures an image component IR (original image component), which is an infrared image component.
- Since the W pixel includes no filter, it captures an image component W (original image component) containing the visible image component and the image component IR.
- FIG. 3 is a diagram showing the spectral transmission characteristics of the Ye, R, and IR filters, where the vertical axis shows the transmittance (sensitivity) and the horizontal axis shows the wavelength (nm).
- a graph indicated by a dotted line shows the spectral sensitivity characteristics of the pixel in a state where the filter is removed. It can be seen that this spectral sensitivity characteristic has a peak in the infrared wavelength region and changes in a convex curve.
- the visible wavelength region is 400 nm to 700 nm
- the infrared wavelength region is 700 nm to 1100 nm
- the sensitive wavelength band is 400 nm to 1100 nm.
- the spectral sensitivity characteristic of the pixel without the filter has sensitivity up to the ultraviolet region, but since the ultraviolet region is cut by the lens 2, the sensitive wavelength band is 400 nm to 1100 nm. Further, the spectral sensitivity characteristic is not limited to that shown in FIG. 3, and any spectral sensitivity may be used as long as it has sensitivity from the visible light region to the infrared light region.
- the Ye filter has a characteristic of transmitting light in the sensitive wavelength band excluding the blue region in the visible wavelength region. Therefore, the Ye filter mainly transmits yellow light and infrared light.
- the R filter has a characteristic of transmitting light in a sensitive wavelength band excluding a blue region and a green region in the visible wavelength region. Therefore, the R filter mainly transmits red light and infrared light.
- the IR filter has a characteristic of transmitting light in a sensitive wavelength band excluding the visible wavelength region, that is, in the infrared wavelength band.
- W indicates a case where no filter is provided, and all light in the sensitive wavelength band of the pixel is transmitted.
- Instead of Ye, R, and IR, it is also possible to use Ye, M (magenta)+IR, and C (cyan)+IR (where M+IR blocks only the green region and C+IR blocks only the red region).
- However, with R pixels, IR pixels, and Ye pixels, the spectral transmission characteristics can be made steep; for example, they are better than those of the M+IR and C+IR filters. That is, each of the M+IR and C+IR filters must shield only a partial region in the middle of the sensitive wavelength band (the green region or the red region, respectively) while transmitting the wavelength regions before and after it.
- In contrast, the R filter, the IR filter, and the Ye filter only need to have two adjacent wavelength regions, a shielding wavelength region and a transmission wavelength region, and such characteristics are easy to realize. Therefore, even after the arithmetic processing, the R, IR, and Ye filters allow RGB image components to be extracted with higher accuracy than the M+IR and C+IR filters. Accordingly, by configuring the image sensor 3 with R pixels, IR pixels, Ye pixels, and W pixels, the performance of the image sensor 3 can be improved.
- FIG. 4 is a block diagram showing a detailed configuration of the image processing unit 4.
- the image processing unit 4 includes a color interpolation unit 41, a color signal generation unit 42, a color space conversion unit 43, and an RGB color signal generation unit 44.
- the color interpolation unit 41 interpolates missing pixel data into each of an image component Ye obtained from Ye pixels, an image component R obtained from R pixels, an image component IR obtained from IR pixels, and an image component W obtained from W pixels.
- the image component R, the image component IR, the image component W, and the image component Ye are converted into image data having the same number of pixels as the number of pixels of the image sensor 3.
- the missing pixel data is generated in the image components Ye, R, IR, and W because of the above-described arrangement of the R pixel, the IR pixel, the W pixel, and the Ye pixel.
- As the interpolation process, for example, linear interpolation may be employed.
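As an informal illustration only, the following Python sketch fills the missing samples of one original image component by neighbor-averaging (linear) interpolation on the 2 × 2 R/W/IR/Ye mosaic described above. The array layout, the convolution kernel, and the use of SciPy are assumptions made for this sketch, not details taken from the patent.

```python
import numpy as np
from scipy.ndimage import convolve

def interpolate_channel(mosaic, mask):
    """Fill missing samples of one original image component.

    mosaic: 2-D array of raw sensor values.
    mask:   2-D array of 0/1 marking where this component was actually sampled.
    """
    # Small separable-style kernel: weighted average of the sampled neighbors.
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])
    num = convolve(mosaic * mask, kernel, mode="mirror")      # sum of sampled values
    den = convolve(mask.astype(float), kernel, mode="mirror")  # sum of sample weights
    return num / np.maximum(den, 1e-6)

# Example: a 4x4 sensor with an assumed 2x2 unit pixel portion
#   row 0: R  W  R  W
#   row 1: IR Ye IR Ye
raw = np.random.randint(0, 1024, (4, 4)).astype(float)
r_mask = np.zeros((4, 4)); r_mask[0::2, 0::2] = 1
R_full = interpolate_channel(raw, r_mask)   # image component R at every pixel
```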
- The color signal generation unit 42 combines the image component Ye, the image component R, the image component IR, and the image component W interpolated by the color interpolation unit 41, using equation (1), to generate the color signals dR, dG, and dB (RGB color signals).
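For illustration, this is a minimal sketch of the color signal generation step. The lines dG = Ye − R and dB = W − Ye appear as fragments of equation (1) later in this document; the dR line is not visible in this extract, so dR = R − IR is an assumed reconstruction based on the fact that the R pixel also collects infrared light.

```python
def generate_rgb(Ye, R, IR, W):
    """Sketch of equation (1) for Ye/R/IR/W original image components."""
    dR = R - IR   # assumption: subtract the infrared part from the R component
    dG = Ye - R   # yellow minus red leaves green (the IR parts cancel)
    dB = W - Ye   # white minus yellow leaves blue (the IR parts cancel)
    return dR, dG, dB
```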
- the color space conversion unit 43 converts the color signals dR, dG, and dB into a luminance signal Y (an example of a second intensity signal) and color difference signals Cb and Cr (an example of a chromaticity signal).
- the color difference signal Cb indicates a blue color difference signal
- the color difference signal Cr indicates a red color difference signal.
- As shown in expression (3), the color space conversion unit 43 calculates the luminance signal Yadd (an example of the first intensity signal), obtained by adding the image components Ye, R, IR, and W, as the luminance signal of the color space to be converted into.
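As an informal sketch of expression (3) and of the S/N argument behind it: the first intensity signal is the plain sum of the interpolated components, so independent noise grows only as the square root of the number of terms while the signal grows linearly. The toy noise check below is illustrative only and is not taken from the patent.

```python
import numpy as np

def first_intensity_signal(Ye, R, IR, W):
    """Expression (3): Yadd is the sum of the original image components."""
    return Ye + R + IR + W

# Toy check: addition vs. subtraction of noisy components (illustrative assumption)
rng = np.random.default_rng(0)
clean = np.full((4, 100_000), 10.0)                 # four components, true level 10
noisy = clean + rng.normal(0.0, 2.0, clean.shape)   # additive Gaussian noise, sigma = 2
Yadd = noisy.sum(axis=0)          # signal ~40, noise std ~ 2*sqrt(4) = 4  -> S/N improves
diff = noisy[0] - noisy[1]        # subtraction: signal cancels, noise std ~ 2*sqrt(2)
print(Yadd.mean(), Yadd.std(), diff.std())
```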
- FIG. 5 is a graph showing the distribution of image components including noise components.
- FIG. 6A is a table showing output image components including noise components subjected to subtraction processing
- FIG. 6B is a table showing output image components including noise components subjected to addition processing.
- the vertical axis indicates the frequency of image components including noise components
- the horizontal axis indicates the intensity of output image components including noise components.
- a indicates an output distribution including a noise component of a certain image component A
- b indicates an output distribution including a noise component of a certain image component B.
- the output distributions a and b including noise components have normal distributions having the same standard deviation, for example, with 10 and 9 as an average value.
- the color space conversion unit 43 performs the smoothing process on the color difference signals Cb and Cr obtained by Expression (2) to calculate the color difference signals Cbs and Crs.
- As the smoothing process, for example, a cascade filter process may be adopted, that is, a filter process that renders the color difference signals Cb and Cr at multiple resolutions by repeatedly applying a relatively small low-pass filter such as a 5 × 5 filter. Alternatively, a single filter process using a relatively large low-pass filter of a predetermined size may be employed.
- Alternatively, an edge-preserving filter process may be employed, which smooths areas other than edges so as not to blur subjects that emit light (a filter that smooths where the signal level difference between pixels is smaller than a certain reference value and does not smooth where the difference is larger than the reference value). Note that light emission can be estimated by comparing the infrared component with the visible light component.
- the noise components included in the color difference signals Cb and Cr are blurred, and the S / N ratio of the color difference signals Cb and Cr can be improved.
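To make the cascade smoothing concrete, the sketch below repeatedly applies a small low-pass filter to a chroma plane. A 5 × 5 box filter and three passes are illustrative assumptions; the patent text only requires a relatively small low-pass filter applied repeatedly (or a single larger filter).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def cascade_smooth(chroma, size=5, passes=3):
    """Repeatedly apply a small low-pass filter so each pass widens the
    effective smoothing radius, approximating a multi-resolution blur."""
    out = np.asarray(chroma, dtype=float)
    for _ in range(passes):
        out = uniform_filter(out, size=size, mode="mirror")
    return out

# Usage sketch: Cbs = cascade_smooth(Cb); Crs = cascade_smooth(Cr)
```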
- The color signal is thus calculated by a calculation using the ratio RT1; more specifically, by correcting the color difference signals Crs and Cbs using the ratio RT1, the color difference signals and the luminance signal in the color space to be converted into can be calculated in a well-balanced manner.
- The RGB color signal generation unit 44 calculates the color signals dR′, dG′, and dB′ from the luminance signal Yadd and the color difference signals Crm and Cbm by inversely transforming equation (2). Specifically, it suffices to invert equation (2) with Y replaced by Yadd, Cb by Cbm, Cr by Crm, and dR, dG, dB by dR′, dG′, dB′.
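The following sketch combines the chroma correction of equation (4) (Cbm = Cbs × Yadd/Y, and Crm analogously) with an inverse of a YCbCr-style equation (2). The Cb = dB − Y and Cr = dR − Y relations follow the fragments of equation (2) shown later in this document; the 0.299/0.587/0.114 luminance weights are an assumption (standard BT.601-style values), since the exact coefficients of equation (2) are not reproduced here.

```python
import numpy as np

def correct_chroma(Cbs, Crs, Yadd, Y, eps=1e-6):
    """Equation (4): scale smoothed chroma by RT1 = Yadd / Y."""
    rt1 = Yadd / np.maximum(Y, eps)
    return Cbs * rt1, Crs * rt1

def ycbcr_to_rgb(Y, Cb, Cr):
    """Inverse of an assumed BT.601-style equation (2)."""
    dR = Cr + Y                                   # from Cr = dR - Y
    dB = Cb + Y                                   # from Cb = dB - Y
    dG = (Y - 0.299 * dR - 0.114 * dB) / 0.587    # assumed luminance weights
    return dR, dG, dB

# Usage sketch:
# Cbm, Crm = correct_chroma(Cbs, Crs, Yadd, Y)
# dRp, dGp, dBp = ycbcr_to_rgb(Yadd, Cbm, Crm)
```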
- The color signals dR′, dG′, and dB′ calculated through the above processing are much more accurate than the color signals dR, dG, and dB calculated by subtracting the image components Ye, R, IR, and W.
- FIG. 7 is a flowchart showing the operation of the image input apparatus 1 according to the first embodiment.
- the control unit 5 causes the image sensor 3 to capture one frame of original image data.
- image components Ye, R, IR, and W are obtained (step S1).
- the imaging device 3 images the image component Ye by the Ye pixel, images the image component R by the R pixel, images the image component IR by the IR pixel, and images the image component W by the W pixel.
- the control unit 5 may cause the image sensor 3 to capture the original image data at a frame rate such as 30 fps or 60 fps.
- the control unit 5 may cause the image sensor 3 to capture the original image data when the user presses the release button.
- the color interpolation unit 41 performs color interpolation processing on the image components Ye, R, IR, and W.
- the color space conversion unit 43 performs a smoothing process on the color difference signals Cr and Cb to calculate the color difference signals Crs and Cbs (step S4).
- In the above description, the luminance signal Yadd is calculated by adding R, IR, W, and Ye as shown in equation (1); however, the present invention is not limited to this, and the luminance signal Yadd may be calculated by weighted addition as shown in equation (1)′.
- Yadd = α × R + β × IR + γ × W + σ × Ye   (1)′
- Here, α, β, γ, and σ are weighting coefficients, and α + β + γ + σ = 1. As α, β, γ, and σ, for example, predetermined values may be adopted.
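A minimal sketch of equation (1)′ is given below. The equal default weights are only an illustrative assumption; the text says predetermined values may be adopted, subject to α + β + γ + σ = 1.

```python
def weighted_first_intensity(R, IR, W, Ye,
                             alpha=0.25, beta=0.25, gamma=0.25, sigma=0.25):
    """Equation (1)': weighted sum of the original image components."""
    assert abs(alpha + beta + gamma + sigma - 1.0) < 1e-9  # weights must sum to 1
    return alpha * R + beta * IR + gamma * W + sigma * Ye
```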
- the color space conversion unit 43 calculates the color difference signals Crm and Cbm by performing the calculation shown in Expression (4) (step S6).
- the color difference signals Crm and Cbm are collectively expressed as Cm
- the color difference signals Crs and Cbs are collectively expressed as Cs.
- the RGB color signal generation unit 44 inversely converts the equation (2) to calculate the color signals dR ′, dG ′, and dB ′ from the luminance signal Yadd and the color difference signals Crm and Cbm (step S7).
- the luminance signal Yadd is calculated using the equation (1), the luminance signal Yadd having a high S / N ratio can be calculated even at night. Further, since the color difference signals Cr and Cb are smoothed, the color difference signals Crs and Cbs having a high S / N ratio can be calculated even at night.
- the image input apparatus 1 according to the second embodiment is characterized in that the calculation method of the color difference signals Crm and Cbm is different from that of the first embodiment.
- the same elements as those in the first embodiment are not described. Further, since the detailed configuration of the image processing unit 4 is the same as that of the first embodiment, FIG. 4 is used.
- the color space conversion unit 43 shown in FIG. 4 performs a smoothing process on the luminance signal Y obtained by Expression (2) to obtain the luminance signal Ys. Further, the luminance signal Yadd obtained by the equation (3) is subjected to smoothing processing to calculate the luminance signal Yadds.
- As the smoothing process, a cascade filter process or a normal filter process may be employed.
- Since the luminance signal Y is obtained by the color space conversion process shown in equation (2), it is a signal that accurately reproduces the luminance of the image as perceived by humans.
- On the other hand, since the luminance signal Yadd is calculated by the addition process shown in expression (3), it is more difficult for it to reproduce the luminance perceived by humans as accurately as equation (2). Therefore, as the luminance signal Yadds becomes larger than the luminance signal Ys, that is, as the ratio RT2 decreases from 1, the luminance signal Yadd may no longer reproduce the luminance of the image viewed by humans with sufficient accuracy.
- the luminance signal Yadd can be made into a signal that accurately reproduces the luminance of the image visually recognized by humans.
- Since a smoothed value is used for Ys, the noise component is reduced; however, depending on the pattern, unnatural artifacts may occur at edges.
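Equation (5) itself is not reproduced in this extract; from the definition of the second ratio RT2 = Ys/Yadds given above, a natural reading is Yadd′ = Yadd × RT2, and that assumption is what the sketch below implements. The smoothing filter choice is also an assumption.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def correct_luminance_rt2(Yadd, Y, size=5):
    """Sketch of the Embodiment 2 luminance correction.

    Both intensity signals are smoothed, the ratio RT2 = Ys / Yadds is formed,
    and Yadd is scaled by it (assumed form of equation (5))."""
    Ys = uniform_filter(np.asarray(Y, dtype=float), size=size, mode="mirror")
    Yadds = uniform_filter(np.asarray(Yadd, dtype=float), size=size, mode="mirror")
    rt2 = Ys / np.maximum(Yadds, 1e-6)
    return Yadd * rt2, rt2
```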
- the color space conversion unit 43 corrects the color difference signals Crs and Cbs according to the above-described ratio RT1 as shown in the equation (6), and calculates the color difference signals Crm ′ and Cbm ′.
- In equation (6), Yadd′ is used instead of Yadd.
- In addition, the color space conversion unit 43 may compare the intensity of infrared light with the intensity of visible light and, when the intensity of infrared light is high, calculate the luminance signal of the color space by weighted addition of the luminance signal Yadd and a signal Yt that uses the image components Ye, R, IR, and W as they are, such that the weighting of the luminance signal Yadd decreases. Specifically, the luminance signal Yadd′ may be calculated by weighting and adding the luminance signal Yadd and the signal Yt so that the weighting of the luminance signal Yadd decreases, as shown in equation (7).
- Yadd′ = Yt × (1 − Ys/Yadds) + Yadd × (Ys/Yadds)   (7)
- When infrared light is dominant, the luminance signal Yadd may differ greatly from the luminance of the image viewed by humans. Therefore, in such a case the luminance signal Yadd′ is calculated so that the proportion of the signal Yt relative to the luminance signal Yadd becomes large, whereby the luminance signal Yadd′ can be made a signal close to the luminance of the image viewed by humans.
- When the luminance signal Yadds is dominant in the ratio RT2, infrared light is dominant, and the signal level differences due to subject color between the Ye, R, IR, and W pixels can therefore be ignored; that is, the sensor can be regarded as equivalent to a monochrome sensor. Accordingly, by handling the Ye, R, IR, and W pixels as independent signals rather than adding them, the resolution can be maximized.
- The color space conversion unit 43 may compare the intensity of infrared light with the intensity of visible light using the difference or the ratio between the image component W and the image component IR. In this case, when IR/W is larger than a predetermined value close to 1, it may be determined that the intensity of infrared light is stronger than the intensity of visible light, and when IR/W is smaller than a predetermined value close to 0, it may be determined that the intensity of visible light is stronger than the intensity of infrared light.
- Alternatively, when W − IR is smaller than a predetermined value close to 0, it may be determined that the intensity of infrared light is stronger than the intensity of visible light, and when IR is smaller than a predetermined value close to 0, it may be determined that the intensity of visible light is stronger than the intensity of infrared light.
- the luminance signal Yadd ′ may be calculated by performing the calculation of Expression (7) ′.
- Yadd′ = k × Yt + Yadd × (1 − k)   (7)′
- Here, k ≤ 1, and k is a predetermined weighting coefficient for the signal Yt whose value increases as IR/W or IR − W increases.
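As an informal sketch of equation (7)′: the raw per-pixel component Yt is blended with Yadd using a weight k that grows as infrared dominates. The specific mapping k = clip(IR/W, 0, 1) is an assumption made for this sketch; the text only requires k ≤ 1 and that k increase with IR/W or IR − W.

```python
import numpy as np

def blend_with_raw_signal(Yadd, Yt, IR, W, eps=1e-6):
    """Equation (7)': Yadd' = k*Yt + (1-k)*Yadd, with k derived from IR dominance."""
    k = np.clip(IR / np.maximum(W, eps), 0.0, 1.0)   # assumed mapping of IR/W to k
    return k * Yt + (1.0 - k) * Yadd
```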
- the RGB color signal generation unit 44 calculates color signals dR ′, dG ′, and dB ′ from the luminance signal Yadd ′ and the color difference signals Crm ′ and Cbm ′ by inversely transforming the equation (2).
- FIG. 8 is a flowchart showing the operation of the image input apparatus 1 according to the second embodiment.
- Steps S11 to S15 are the same as steps S1 to S5 shown in FIG.
- the color space conversion unit 43 performs a smoothing process on the luminance signal Yadd to calculate the luminance signal Yadds, and also performs a smoothing process on the luminance signal Y to calculate the luminance signal Ys.
- The color space conversion unit 43 calculates the luminance signal Yadd′ by the calculation of equation (5) when the ratio RT2 is greater than or equal to the threshold Th1, and by the calculation of equation (7) when the ratio RT2 is less than the threshold Th1 (step S17).
- the color space conversion unit 43 obtains the color difference signals Crm ′ and Cbm ′ by the calculation of Expression (6) (step S18).
- the RGB color signal generation unit 44 inversely converts the equation (2) to obtain the color signals dR ′, dG ′, and dB ′ (step S19).
- the luminance signal Yadd ′ is calculated using the equation (5).
- the luminance signal Yadd ′ is calculated using the equation (7), so that it is possible to generate a luminance signal that accurately reproduces the luminance of the image viewed by humans.
- In addition, the luminance signal and the color difference signals can be obtained in a well-balanced manner in the color space to be converted into.
- the color space conversion unit 43 may correct the color difference signals Crm ′ and Cbm ′ so that the color difference signals Crm ′ and Cbm ′ become lower as the intensity of infrared light approaches the intensity of visible light.
- the color space conversion unit 43 may calculate the color difference signals Crm ′ and Cbm ′ using, for example, equation (8) instead of equation (6).
- In this way, the color signals dR′, dG′, and dB′ can be calculated without visual incongruity. Further, the color signals dR′, dG′, and dB′ may be further color-converted to match a standard display (e.g., the IEC 61966-2-1 sRGB standard).
- The method of reducing the color difference signals is not limited to switching between equations (6) and (8) by comparing the ratio RT2 with the threshold; correction values predetermined so as to decrease the color difference signals Crs and Cbs as the ratio RT2 decreases may be stored, and the color difference signals Crm′ and Cbm′ may be calculated using these correction values. In this case, the correction value may be adopted instead of RT2 in equation (8).
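For illustration, the sketch below switches between equation (6) (Cbm′ = Cbs × Yadd′/Y) and equation (8) (Cbm′ = Cbs × Yadd × RT2/Y) on the ratio RT2, so that unreliable night-time chroma is attenuated. The threshold value Th1 = 0.5 is purely an illustrative assumption; the patent does not give a numeric value here.

```python
import numpy as np

def chroma_correction_emb2(Cbs, Crs, Yadd, Yadd_prime, Y, rt2, th1=0.5, eps=1e-6):
    """Equation (6) when RT2 >= Th1, equation (8) when RT2 < Th1."""
    Ysafe = np.maximum(Y, eps)
    scale = np.where(rt2 >= th1,
                     Yadd_prime / Ysafe,        # equation (6)
                     Yadd * rt2 / Ysafe)        # equation (8): extra RT2 attenuation
    return Cbs * scale, Crs * scale
```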
- In the above description, R pixels and IR pixels are used as the second and third pixels; however, the present invention is not limited to this, and cyan (C) pixels and magenta (M) pixels may be used instead. That is, the first to fourth pixels may be Ye, M, C, and W pixels. In this case, shot noise is reduced, and the noise components included in the original image data can be further reduced.
- In this case, the color signal generation unit 42 may obtain the color signals dR, dG, and dB using equation (9) instead of equation (1).
- the color space conversion unit 43 may obtain the luminance signal Yadd using the equation (10) instead of the equation (3).
- the color space conversion unit 43 performs the smoothing process on the color difference signals Cb and Cr.
- However, the present invention is not limited to this. That is, as shown in FIG. 9, a smoothing processing unit 45 may be provided before the color interpolation unit 41 and may perform the smoothing process on the image components Ye, R, IR, and W, in which case the smoothing process by the color space conversion unit 43 may be omitted.
- In short, the smoothing process may be performed at any stage in the course of calculating the color signals dR, dG, and dB.
- In the above embodiments, the image sensor 3 includes four types of pixels, the first to fourth pixels; however, the present invention is not limited to this, and it suffices that the image sensor 3 includes at least three types of pixels.
- As the types of pixels constituting the image sensor 3, any combination from whose spectral transmission characteristics R, G, and B color signals can be generated may be employed.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Color Television Image Signal Generators (AREA)
Abstract
Description
Preferably, the color space conversion unit compares the intensity of infrared light with the intensity of visible light based on the infrared image component, and calculates the luminance signal of the color space by weighted addition of the first intensity signal and a signal using the original image component as it is, such that the weighting of the first intensity signal decreases as the intensity of infrared light increases.
The image input device 1 according to Embodiment 1 of the present invention will be described below. FIG. 1 shows a block diagram of the image input device 1. As shown in FIG. 1, the image input device 1 includes a lens 2, an image sensor 3, an image processing unit 4, and a control unit 5. The image input device 1 is mounted on, for example, an automobile and images subjects around the automobile.
dG=Ye-R (1)
dB=W-Ye
As shown in equation (2), the color space conversion unit 43 converts the color signals dR, dG, and dB into a color space including a luminance signal Y (an example of the second intensity signal) and color difference signals Cb and Cr (examples of the chromaticity signal). Here, the color difference signal Cb is the blue color difference signal, and the color difference signal Cr is the red color difference signal.
Cb=dB-Y (2)
Cr=dR-Y
Further, as shown in equation (3), the color space conversion unit 43 calculates the luminance signal Yadd (an example of the first intensity signal), obtained by adding the image components Ye, R, IR, and W, as the luminance signal of the color space to be converted into.
Here, since the luminance signal Yadd is calculated by addition processing, its noise component can be made lower than when the luminance signal Y is calculated by subtraction processing.
Cbm=Cbs×Yadd/Y (4)
By calculating the color signal using the ratio RT1 in this way, specifically by correcting the color difference signals Crs and Cbs using the ratio RT1, the color difference signals and the luminance signal in the color space to be converted into can be calculated in a well-balanced manner. By performing this processing, when the color signals dR′, dG′, and dB′ are calculated, vividness is not lost even when the luminance signal Yadd is larger than the luminance signal Y, and the problem of excessive vividness is prevented even when the luminance signal Yadd is smaller than the luminance signal Y.
Here, α, β, γ, and σ are weighting coefficients, and α + β + γ + σ = 1. As α, β, γ, and σ, for example, predetermined values may be adopted.
The image input device 1 according to Embodiment 2 is characterized in that the method of calculating the color difference signals Crm and Cbm differs from that of Embodiment 1. Elements identical to those of Embodiment 1 are not described again. Since the detailed configuration of the image processing unit 4 is the same as in Embodiment 1, FIG. 4 is used.
Since the luminance signal Y is obtained by the color space conversion process shown in equation (2), it is a signal that accurately reproduces the luminance of the image as perceived by humans. On the other hand, since the luminance signal Yadd is calculated by the addition process shown in equation (3), it is more difficult for it to reproduce the luminance perceived by humans as accurately as equation (2). Therefore, as the luminance signal Yadds becomes larger than the luminance signal Ys, that is, as the ratio RT2 decreases from 1, the luminance signal Yadd may no longer reproduce the luminance of the image perceived by humans with sufficient accuracy. By obtaining the luminance signal Yadd′ using equation (5), the luminance signal Yadd can be made into a signal that accurately reproduces the luminance of the image perceived by humans. Since a smoothed value is used for Ys, its noise component is reduced; however, depending on the pattern, unnatural artifacts may occur at edges.
Cbm′=Cbs×Yadd′/Y (6)
In this way, the balance between the luminance signal and the color difference signals in the color space to be converted into is maintained, and the calculation accuracy of the color signals dR′, dG′, and dB′ generated by the subsequent RGB color signal generation unit 44 can be improved.
Here, Yt represents a signal using one of the image components Ye, R, IR, and W as it is. Specifically, when calculating the luminance signal Yadd′ of a given pixel, Yt = W if the pixel corresponds to the W filter, Yt = Ye if it corresponds to the Ye filter, Yt = R if it corresponds to the R filter, and Yt = IR if it corresponds to the IR filter.
Here, k ≤ 1, and k is a predetermined weighting coefficient for the signal Yt whose value increases as IR/W or IR − W increases.
Cbm′=Cbs×Yadd×RT2/Y (8)
In other words, when the ratio RT2 is smaller than the threshold Th1, the luminance signal Yadd′ is dominant relative to the luminance signal Y, and the image was likely captured at night; color information is then insufficient, so the color difference signals may not be calculated accurately. By adopting equation (8), the color difference signals are lowered so that the influence of color difference signals with low calculation accuracy is reduced, and the color signals dR′, dG′, and dB′ can be calculated without visual incongruity. Furthermore, the color signals dR′, dG′, and dB′ may be further color-converted to match a standard display (for example, the IEC 61966-2-1 sRGB standard).
dG=W-M (9)
dB=W-Ye
The color space conversion unit 43 may obtain the luminance signal Yadd using equation (10) instead of equation (3).
The spectral sensitivity characteristics of cyan (C) and magenta (M) are shown in FIG. 10.
The above processing provides the following effects.
2 Lens
3 Image sensor
4 Image processing unit
5 Control unit
41 Color interpolation unit
42 Color signal generation unit
43 Color space conversion unit
44 RGB color signal generation unit
45 Smoothing processing unit
Claims (11)
1. An image input device comprising: an image sensor in which at least three types of pixels having different spectral sensitivities are arranged and which captures original image data including at least three types of original image components; and a color space conversion unit that converts the original image data into a color space including a luminance signal and a chromaticity signal, wherein the color space conversion unit calculates a first intensity signal, obtained by adding the original image components, as the luminance signal of the color space.
2. The image input device according to claim 1, wherein the color space conversion unit performs smoothing processing on the original image components or on a chromaticity signal based on the original image components.
3. The image input device according to claim 1 or 2, further comprising a color signal generation unit that generates RGB color signals from the original image components, wherein the color space conversion unit generates a second intensity signal by converting the RGB color signals into the color space, and calculates the chromaticity signal of the color space by a calculation using a first ratio, which is the ratio of the first intensity signal to the second intensity signal.
4. The image input device according to claim 1 or 2, further comprising a color signal generation unit that generates RGB color signals from the original image components, wherein the color space conversion unit generates the second intensity signal by converting the RGB color signals into the color space, corrects the first intensity signal by a calculation using a second ratio, which is the ratio of the smoothed second intensity signal to the smoothed first intensity signal, and calculates the corrected first intensity signal as the luminance signal of the color space.
5. The image input device according to claim 4, wherein the color space conversion unit generates the second intensity signal by converting the RGB color signals into the color space, and calculates the chromaticity signal of the color space by a calculation using a first ratio, which is the ratio of the first intensity signal to the second intensity signal.
6. The image input device according to any one of claims 1 to 5, wherein the original image data includes an infrared image component, and the color space conversion unit compares the intensity of infrared light with the intensity of visible light based on the infrared image component and calculates the luminance signal of the color space by weighted addition of the first intensity signal and a signal using the original image component as it is, such that the weighting of the first intensity signal decreases as the intensity of infrared light increases.
7. The image input device according to any one of claims 1 to 6, wherein the color space conversion unit performs the calculation such that the chromaticity signal of the color space becomes lower as the intensity of infrared light becomes stronger than the intensity of visible light.
8. The image input device according to claim 1, further comprising: a smoothing processing unit that performs smoothing processing on the original image components; a color interpolation unit that interpolates missing pixel data of the original image components smoothed by the smoothing processing unit; and a color signal generation unit that generates RGB color signals from the original image components, wherein the color signal generation unit generates the RGB color signals from the original image components whose missing pixel data have been interpolated by the color interpolation unit.
9. The image input device according to any one of claims 1 to 8, wherein, in the image sensor, unit pixel portions each including a first pixel, a second pixel, a third pixel, and a fourth pixel having mutually different spectral sensitivities are arranged in a matrix, and, with the visible wavelength region and the infrared wavelength region taken as the sensitive wavelength band, the first pixel includes a first color filter that transmits light in the sensitive wavelength band excluding the blue region of the visible wavelength region, the second pixel includes a second color filter that transmits light in the sensitive wavelength band excluding the blue region and the green region of the visible wavelength region, the third pixel includes an infrared filter that transmits light in the sensitive wavelength band excluding the visible wavelength region, and the fourth pixel includes no filter.
10. The image input device according to any one of claims 1 to 8, wherein at least one of the at least three types of pixels having different spectral sensitivities of the image sensor has sensitivity in the infrared wavelength region.
11. The image input device according to any one of claims 1 to 10, wherein at least one of the at least three types of pixels having different spectral sensitivities of the image sensor has sensitivity only in the infrared wavelength region, and the first intensity signal is calculated only by adding original image components including an infrared image component.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010536745A JP5206796B2 (ja) | 2008-11-04 | 2009-10-28 | Image input device |
US13/125,930 US8666153B2 (en) | 2008-11-04 | 2009-10-28 | Image input apparatus |
CN200980142922.3A CN102204258B (zh) | 2008-11-04 | 2009-10-28 | Image input device |
EP20090824729 EP2343904A4 (en) | 2008-11-04 | 2009-10-28 | Image input device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-283744 | 2008-11-04 | ||
JP2008283744 | 2008-11-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010053029A1 true WO2010053029A1 (ja) | 2010-05-14 |
Family
ID=42152838
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/068480 WO2010053029A1 (ja) | 2008-11-04 | 2009-10-28 | Image input device |
Country Status (5)
Country | Link |
---|---|
US (1) | US8666153B2 (ja) |
EP (1) | EP2343904A4 (ja) |
JP (2) | JP5206796B2 (ja) |
CN (1) | CN102204258B (ja) |
WO (1) | WO2010053029A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011155136A1 (ja) * | 2010-06-07 | 2011-12-15 | Konica Minolta Opto, Inc. | Imaging device |
WO2011155135A1 (ja) * | 2010-06-07 | 2011-12-15 | Konica Minolta Opto, Inc. | Imaging device |
WO2011162155A1 (ja) * | 2010-06-23 | 2011-12-29 | Konica Minolta Opto, Inc. | Imaging device |
JP2012010141A (ja) * | 2010-06-25 | 2012-01-12 | Konica Minolta Opto Inc | Image processing device |
WO2012067028A1 (ja) * | 2010-11-16 | 2012-05-24 | Konica Minolta Opto, Inc. | Image input device and image processing device |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1064786A4 (en) * | 1998-01-27 | 2005-09-28 | Collaboration Properties Inc | MULTI-FUNCTIONAL SERVICE DEVICE FOR VIDEO COMMUNICATIONS |
US8643701B2 (en) | 2009-11-18 | 2014-02-04 | University Of Illinois At Urbana-Champaign | System for executing 3D propagation for depth image-based rendering |
US9735303B2 (en) * | 2010-03-25 | 2017-08-15 | Nri R&D Patent Licensing, Llc | Color imaging using color OLED or LED array as color light-field imaging sensor |
US9628722B2 (en) | 2010-03-30 | 2017-04-18 | Personify, Inc. | Systems and methods for embedding a foreground video into a background feed based on a control input |
US8818028B2 (en) | 2010-04-09 | 2014-08-26 | Personify, Inc. | Systems and methods for accurate user foreground video extraction |
US9008457B2 (en) * | 2010-05-31 | 2015-04-14 | Pesonify, Inc. | Systems and methods for illumination correction of an image |
US8649592B2 (en) | 2010-08-30 | 2014-02-11 | University Of Illinois At Urbana-Champaign | System for background subtraction with 3D camera |
JP5910043B2 (ja) * | 2011-12-02 | 2016-04-27 | Fujitsu Limited | Imaging device, image processing program, image processing method, and image processing device |
JP6136669B2 (ja) * | 2013-07-08 | 2017-05-31 | Nikon Corporation | Imaging device |
US10136107B2 (en) * | 2013-11-21 | 2018-11-20 | Semiconductor Components Industries, Llc | Imaging systems with visible light sensitive pixels and infrared light sensitive pixels |
US9774548B2 (en) | 2013-12-18 | 2017-09-26 | Personify, Inc. | Integrating user personas with chat sessions |
US9485433B2 (en) | 2013-12-31 | 2016-11-01 | Personify, Inc. | Systems and methods for iterative adjustment of video-capture settings based on identified persona |
US9386303B2 (en) | 2013-12-31 | 2016-07-05 | Personify, Inc. | Transmitting video and sharing content via a network using multiple encoding techniques |
US9414016B2 (en) | 2013-12-31 | 2016-08-09 | Personify, Inc. | System and methods for persona identification using combined probability maps |
JP6291048B2 (ja) * | 2014-06-24 | 2018-03-14 | Maxell, Ltd. | Imaging processing device and imaging processing method |
DE102014115292A1 (de) * | 2014-10-21 | 2016-04-21 | Connaught Electronics Ltd. | Method for providing image files from a camera system, camera system and motor vehicle |
US9671931B2 (en) * | 2015-01-04 | 2017-06-06 | Personify, Inc. | Methods and systems for visually deemphasizing a displayed persona |
US9563962B2 (en) | 2015-05-19 | 2017-02-07 | Personify, Inc. | Methods and systems for assigning pixels distance-cost values using a flood fill technique |
US9916668B2 (en) | 2015-05-19 | 2018-03-13 | Personify, Inc. | Methods and systems for identifying background in video data using geometric primitives |
US10244224B2 (en) | 2015-05-26 | 2019-03-26 | Personify, Inc. | Methods and systems for classifying pixels as foreground using both short-range depth data and long-range depth data |
US9607397B2 (en) | 2015-09-01 | 2017-03-28 | Personify, Inc. | Methods and systems for generating a user-hair-color model |
CN105430359B (zh) * | 2015-12-18 | 2018-07-10 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Imaging method, image sensor, imaging device, and electronic device |
CN105578081B (zh) * | 2015-12-18 | 2018-03-06 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Imaging method, image sensor, imaging device, and electronic device |
US9883155B2 (en) | 2016-06-14 | 2018-01-30 | Personify, Inc. | Methods and systems for combining foreground video and background video using chromatic matching |
US9881207B1 (en) | 2016-10-25 | 2018-01-30 | Personify, Inc. | Methods and systems for real-time user extraction using deep learning networks |
JP7468061B2 (ja) * | 2020-03-27 | 2024-04-16 | FUJIFILM Business Innovation Corp. | Image processing device, image processing system, and program |
US11800056B2 (en) | 2021-02-11 | 2023-10-24 | Logitech Europe S.A. | Smart webcam system |
US11800048B2 (en) | 2021-02-24 | 2023-10-24 | Logitech Europe S.A. | Image generating system with background replacement or modification capabilities |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002142228A (ja) * | 2000-10-31 | 2002-05-17 | Toyota Central Res & Dev Lab Inc | Imaging device |
JP2005184690A (ja) * | 2003-12-22 | 2005-07-07 | Sanyo Electric Co Ltd | Color imaging element and color signal processing circuit |
JP2007184805A (ja) | 2006-01-10 | 2007-07-19 | Toyota Central Res & Dev Lab Inc | Color image reproduction device |
JP2008289001A (ja) * | 2007-05-18 | 2008-11-27 | Sony Corp | Image input processing device and method |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5754448A (en) * | 1995-07-12 | 1998-05-19 | Minnesota Mining And Manufacturing Company | System and method for color characterization and transformation |
JP4282113B2 (ja) * | 1998-07-24 | 2009-06-17 | Olympus Corporation | Imaging device, imaging method, and recording medium on which an imaging program is recorded |
JP3960901B2 (ja) * | 2002-10-24 | 2007-08-15 | The Nippon Synthetic Chemical Industry Co., Ltd. | Method for producing polyvinyl alcohol film |
US7274393B2 (en) * | 2003-02-28 | 2007-09-25 | Intel Corporation | Four-color mosaic pattern for depth and image capture |
JP3880553B2 (ja) | 2003-07-31 | 2007-02-14 | Canon Inc | Image processing method and apparatus |
JP2005136497A (ja) * | 2003-10-28 | 2005-05-26 | Canon Inc | Image processing method and image processing apparatus |
JP5028791B2 (ja) * | 2005-11-10 | 2012-09-19 | Toppan Printing Co., Ltd. | Imaging element and imaging device |
JP5194363B2 (ja) * | 2006-01-20 | 2013-05-08 | Toppan Printing Co., Ltd. | Optical sensor |
KR20070115243A (ko) * | 2006-06-01 | 2007-12-05 | 삼성전자주식회사 | 이미지 촬상 장치, 및 그 동작 방법 |
KR100818987B1 (ko) * | 2006-09-19 | 2008-04-04 | 삼성전자주식회사 | 이미지 촬상 장치 및 상기 이미지 촬상 장치의 동작 방법 |
TW200830271A (en) * | 2007-01-04 | 2008-07-16 | Novatek Microelectronics Corp | Method and apparatus of color adjustment |
EP2418856A4 (en) * | 2009-04-07 | 2014-02-26 | Konica Minolta Opto Inc | Image input device |
-
2009
- 2009-10-28 US US13/125,930 patent/US8666153B2/en not_active Expired - Fee Related
- 2009-10-28 CN CN200980142922.3A patent/CN102204258B/zh not_active Expired - Fee Related
- 2009-10-28 EP EP20090824729 patent/EP2343904A4/en not_active Withdrawn
- 2009-10-28 JP JP2010536745A patent/JP5206796B2/ja not_active Expired - Fee Related
- 2009-10-28 WO PCT/JP2009/068480 patent/WO2010053029A1/ja active Application Filing
-
2013
- 2013-02-20 JP JP2013031377A patent/JP5527448B2/ja not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002142228A (ja) * | 2000-10-31 | 2002-05-17 | Toyota Central Res & Dev Lab Inc | Imaging device |
JP2005184690A (ja) * | 2003-12-22 | 2005-07-07 | Sanyo Electric Co Ltd | Color imaging element and color signal processing circuit |
JP2007184805A (ja) | 2006-01-10 | 2007-07-19 | Toyota Central Res & Dev Lab Inc | Color image reproduction device |
JP2008289001A (ja) * | 2007-05-18 | 2008-11-27 | Sony Corp | Image input processing device and method |
Non-Patent Citations (1)
Title |
---|
See also references of EP2343904A4 |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011155136A1 (ja) * | 2010-06-07 | 2011-12-15 | Konica Minolta Opto, Inc. | Imaging device |
WO2011155135A1 (ja) * | 2010-06-07 | 2011-12-15 | Konica Minolta Opto, Inc. | Imaging device |
US9294740B2 (en) | 2010-06-07 | 2016-03-22 | Konica Minolta Advanced Layers, Inc. | Imaging device having a color image generator generating a color image using edge data and a fake color suppressing coefficient |
EP2579573A4 (en) * | 2010-06-07 | 2014-09-10 | Konica Minolta Advanced Layers | IMAGING DEVICE |
EP2579574A4 (en) * | 2010-06-07 | 2013-12-25 | Konica Minolta Advanced Layers | IMAGING DEVICE |
EP2579573A1 (en) * | 2010-06-07 | 2013-04-10 | Konica Minolta Opto, Inc. | Imaging device |
EP2579574A1 (en) * | 2010-06-07 | 2013-04-10 | Konica Minolta Opto, Inc. | Imaging device |
JPWO2011155135A1 (ja) * | 2010-06-07 | 2013-08-01 | Konica Minolta, Inc. | Imaging device |
JPWO2011155136A1 (ja) * | 2010-06-07 | 2013-08-01 | Konica Minolta, Inc. | Imaging device |
JPWO2011162155A1 (ja) * | 2010-06-23 | 2013-08-22 | Konica Minolta, Inc. | Imaging device |
WO2011162155A1 (ja) * | 2010-06-23 | 2011-12-29 | Konica Minolta Opto, Inc. | Imaging device |
JP2012010141A (ja) * | 2010-06-25 | 2012-01-12 | Konica Minolta Opto Inc | Image processing device |
WO2012067028A1 (ja) * | 2010-11-16 | 2012-05-24 | Konica Minolta Opto, Inc. | Image input device and image processing device |
US9200895B2 (en) | 2010-11-16 | 2015-12-01 | Konica Minolta, Inc. | Image input device and image processing device |
Also Published As
Publication number | Publication date |
---|---|
EP2343904A4 (en) | 2012-09-05 |
EP2343904A1 (en) | 2011-07-13 |
US20110243430A1 (en) | 2011-10-06 |
CN102204258B (zh) | 2014-07-23 |
JPWO2010053029A1 (ja) | 2012-04-05 |
JP5206796B2 (ja) | 2013-06-12 |
JP2013093914A (ja) | 2013-05-16 |
CN102204258A (zh) | 2011-09-28 |
JP5527448B2 (ja) | 2014-06-18 |
US8666153B2 (en) | 2014-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5527448B2 (ja) | Image input device | |
JP5397788B2 (ja) | Image input device | |
JP5168353B2 (ja) | Imaging device and imaging element | |
JP5223742B2 (ja) | Edge-enhanced image processing device | |
WO2011118071A1 (ja) | Image processing method and device, image processing program, and medium on which the program is recorded | |
WO2011155136A1 (ja) | Imaging device | |
US10395347B2 (en) | Image processing device, imaging device, image processing method, and image processing program | |
JP5098908B2 (ja) | Image input device | |
JP2010161455A (ja) | Infrared-mixed imaging device | |
US20130083157A1 (en) | Imaging Device | |
JP2012010141A (ja) | Image processing device | |
JP5464008B2 (ja) | Image input device | |
JP5330291B2 (ja) | Signal processing device and imaging device | |
JP2020198540A (ja) | Image signal processing method | |
JP4993275B2 (ja) | Image processing device | |
CN112335233B (zh) | Image generation device and imaging device | |
WO2011162155A1 (ja) | Imaging device | |
JP2009284010A (ja) | Image processing device, imaging device, and image processing method | |
JP5920144B2 (ja) | Imaging device and imaging method | |
JP2006324789A (ja) | Video signal processing method and video signal processing device | |
JP2005303702A (ja) | Imaging device, camera, and signal processing method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980142922.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09824729 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2010536745 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009824729 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13125930 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |