
WO2017101546A1 - Image sensor, imaging device, mobile terminal and imaging method - Google Patents

Image sensor, imaging device, mobile terminal and imaging method

Info

Publication number
WO2017101546A1
WO2017101546A1 · PCT/CN2016/099753 · CN2016099753W
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
photosensitive
merged
pixels
filter
Prior art date
Application number
PCT/CN2016/099753
Other languages
English (en)
French (fr)
Inventor
康健
Original Assignee
广东欧珀移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东欧珀移动通信有限公司 filed Critical 广东欧珀移动通信有限公司
Priority to KR1020177026309A priority Critical patent/KR102083292B1/ko
Priority to MYPI2017702822A priority patent/MY184809A/en
Priority to JP2017541006A priority patent/JP6325755B2/ja
Priority to US15/544,537 priority patent/US10594962B2/en
Priority to EP16874606.3A priority patent/EP3242479B1/en
Priority to SG11201706246XA priority patent/SG11201706246XA/en
Priority to AU2016369789A priority patent/AU2016369789B2/en
Publication of WO2017101546A1 publication Critical patent/WO2017101546A1/zh
Priority to ZA2017/06230A priority patent/ZA201706230B/en

Classifications

    • H04N25/60 — Circuitry of solid-state image sensors [SSIS]; noise processing, e.g. detecting, correcting, reducing or removing noise
    • H01L27/14603 — Imager structures; special geometry or disposition of pixel-elements, address-lines or gate-electrodes
    • H01L27/14621 — Imager structures; coatings; colour filter arrangements
    • H01L27/14627 — Imager structures; optical elements or arrangements associated with the device; microlenses
    • H01L27/14645 — Photodiode arrays; MOS imagers; colour imagers
    • H04N23/12 — Cameras or camera modules comprising electronic image sensors; generating image signals from different wavelengths with one sensor only
    • H04N23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N25/133 — Arrangement of colour filter arrays [CFA]; filter mosaics including elements passing panchromatic light, e.g. filters passing white light
    • H04N25/135 — Arrangement of colour filter arrays [CFA]; filter mosaics based on four or more different wavelength filter elements
    • H04N25/46 — Extracting pixel data from image sensors by controlling scanning circuits, e.g. by combining or binning pixels
    • H04N25/75 — Circuitry for providing, modifying or processing image signals from the pixel array
    • H04N25/78 — Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters

Definitions

  • the present invention relates to imaging technology, and in particular to an image sensor, an imaging device, a mobile terminal, and an imaging method of an image sensor.
  • images generated by the image sensor of a related imaging device in a low-illumination environment may suffer from heavy noise and poor definition.
  • the present invention aims to solve at least one of the technical problems in the related art to some extent.
  • an embodiment of the present invention provides an image sensor including: a photosensitive pixel array; and a filter disposed on the photosensitive pixel array, the filter including a filter unit array having a plurality of filter units, wherein each filter unit covers N photosensitive pixels, some of the filter units include at least a white filter region, the white filter region covers at least one of the N photosensitive pixels, the N photosensitive pixels covered by the same filter unit constitute one merged pixel, and N is a positive integer.
  • in the image sensor proposed by the embodiment of the present invention, inserting a white filter region in some of the filter units increases the amount of incident light, so that a higher signal-to-noise ratio, brightness, and sharpness are obtained under low illumination and an image with less noise is generated.
  • the filter unit array includes an R filter unit, a G filter unit, and a B filter unit, wherein the G filter unit includes at least the white filter region, and the white filter region covers at least one of the N photosensitive pixels covered by the G filter unit.
  • each of the filter units covers 2*2 of the photosensitive pixels, wherein the white filter region covers 1 photosensitive pixel covered by the G filter unit and the G filter unit may further include a green filter region covering the other 3 photosensitive pixels; or the white filter region covers 2 photosensitive pixels covered by the G filter unit and the G filter unit may further include a green filter region covering the other 2 photosensitive pixels; or the white filter region covers 3 photosensitive pixels covered by the G filter unit and the G filter unit may further include a green filter region covering the other 1 photosensitive pixel; or the white filter region covers the 4 photosensitive pixels covered by the G filter unit.
  • the image sensor further includes: a control module, configured to control the photosensitive pixel array to be progressively exposed.
  • the image sensor further comprises: an array of analog to digital converters having a plurality of analog to digital converters, each of the analog to digital converters being coupled to one of the photosensitive pixels.
  • the image sensor further includes: a micro mirror array, each of the micro mirrors corresponding to one of the photosensitive pixels.
  • an imaging apparatus including: the image sensor; and an image processing module coupled to the image sensor, the image processing module being configured to read and process the output of the photosensitive pixel array in the image sensor to obtain pixel values of the merged pixels so as to form a merged image.
  • the amount of light entering can be increased by the image sensor, thereby obtaining a higher signal-to-noise ratio, brightness, and sharpness under low illumination, and generating an image with less noise.
  • the image processing module is further configured to add the outputs of the corresponding N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
  • the image processing module is further configured to add the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as a first pixel value of the merged pixel, and to add the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as a second pixel value of the merged pixel.
  • merging the pixel outputs can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
  • another embodiment of the present invention provides a mobile terminal including the imaging device.
  • the amount of light entering can be increased by the above-mentioned imaging device, thereby obtaining a higher signal-to-noise ratio, brightness and sharpness under low illumination, and generating an image with less noise.
  • since the noise of the merged pixel is smaller than the sum of the noises of the pixels before merging, merging the outputs can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
  • the mobile terminal may be a mobile phone.
  • the imaging device may be a front camera of the handset.
  • the mobile terminal further includes: a central processor and an external memory connected to the imaging device, the central processor being configured to control the external memory to store the merged image.
  • the mobile terminal further includes: a central processing unit and a display device coupled to the imaging device, the central processor configured to control the display device to display the merged image.
  • a further embodiment of the present invention provides an imaging method of the image sensor, including the following steps: reading the output of the photosensitive pixel array in the image sensor; and calculating the pixel values of the merged pixels according to the outputs of the photosensitive pixels of the same merged pixel to generate a merged image.
  • the amount of light entering can be increased by the above image sensor, thereby obtaining a higher signal-to-noise ratio, brightness and sharpness under low illumination, and generating an image with less noise.
  • when the filter unit includes only a white filter region or only a non-white filter region, calculating the pixel values of the merged pixels according to the outputs of the photosensitive pixels of the same merged pixel to generate the merged image further includes adding the outputs of the corresponding N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
  • when the filter unit includes a white filter region and a non-white filter region, the pixel value of the merged pixel includes a first pixel value corresponding to the white filter region and a second pixel value corresponding to the non-white filter region, and calculating the pixel value of the merged pixel according to the outputs of the photosensitive pixels of the same merged pixel to generate the merged image further includes: adding the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as the first pixel value of the merged pixel, and adding the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as the second pixel value of the merged pixel.
  • merging the pixel outputs can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
  • each of the photosensitive pixels is connected to an analog-to-digital converter, and the imaging method further includes: converting the analog signal outputs generated by the photosensitive pixels into digital signal outputs; and calculating the pixel values of the merged pixels according to the digital signal outputs of the photosensitive pixels of the same merged pixel.
  • FIG. 1 is a side elevational view of an image sensor in accordance with an embodiment of the present invention.
  • FIGS. 2a-2d are schematic illustrations of partial filter units of an image sensor in accordance with one embodiment of the present invention.
  • FIG. 3 is a schematic view of a filter unit array of a Bayer structure;
  • FIGS. 4a-4d are schematic diagrams of a filter unit array of an image sensor in accordance with one embodiment of the present invention;
  • FIGS. 5a-5d are schematic perspective structural views of an image sensor according to an embodiment of the present invention;
  • FIG. 6 is a block schematic diagram of an image sensor in accordance with one embodiment of the present invention.
  • FIG. 7 is a schematic diagram showing the circuit structure of a photosensitive pixel of an image sensor according to an embodiment of the present invention.
  • FIG. 8 is a block schematic diagram of an image sensor in accordance with an embodiment of the present invention, wherein the image sensor includes an analog to digital converter;
  • FIG. 9 is a perspective structural view of an image sensor according to an embodiment of the present invention, wherein the image sensor includes a micro mirror array;
  • Figure 10 is a block schematic diagram of an image forming apparatus according to an embodiment of the present invention.
  • FIG. 11 is a block diagram of a mobile terminal in accordance with one embodiment of the present invention.
  • FIG. 12 is a block diagram showing a mobile terminal according to another embodiment of the present invention.
  • Figure 13 is a flow chart of an imaging method in accordance with an embodiment of the present invention.
  • FIG. 14 is a flow chart showing reading of a photosensitive pixel output and generating an image by an imaging method according to an embodiment of the present invention
  • Figure 15 is a flow diagram showing the processing of a photosensitive pixel output and generating an image in accordance with an imaging method in accordance with one embodiment of the present invention.
  • the image sensor 10 of the embodiment of the invention includes a photosensitive pixel array 11 and a filter 13 disposed on the photosensitive pixel array 11.
  • the filter 13 includes a filter unit array 131, wherein the filter unit array 131 has a plurality of filter units 1311, each filter unit 1311 covers N photosensitive pixels 111, and some of the filter units 1311 include at least a white filter region 1313.
  • the white filter region 1313 covers at least one of the N photosensitive pixels, wherein the N photosensitive pixels covered by the same filter unit constitute one combined pixel, and N is a positive integer.
  • the external light is irradiated to the photosensitive portion 1111 of the photosensitive pixel 111 through the filter 13 to generate an electric signal, that is, an output of the photosensitive pixel 111.
  • the white filter region 1313 mainly transmits natural light without filtering it; therefore, the white filter region 1313 may refer to a region in which a transparent filter is disposed, or to a region without a filter, that is, a "hollowed-out" area in the filter 13.
  • the filter units 1311 other than the partial filter units 1311 include only a non-white filter region, such as a green filter region, a red filter region, or a blue filter region, and do not include a white filter region.
  • the partial filter unit 1311 may also include a non-white filter region; in other words, the partial filter unit 1311 may consist of two parts, the white filter region 1313 and a non-white filter region, which together cover the corresponding N photosensitive pixels.
  • the non-white filter area is used to obtain the color information of the merged pixels
  • the white filter area is used to obtain the information of the whole "white light"; that is, the white filter area lets natural light pass through, so it transmits light better and the brightness value output by the photosensitive pixel is higher, and under low illumination the white filter area can obtain the brightness information of the merged pixel with less noise.
  • a white filter region is embedded in the partial filter units, so that the brightness information of the merged pixels is acquired under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise.
  • the filter unit array of the embodiment of the present invention is basically arranged according to the Bayer pattern shown in FIG. 3; the Bayer array includes filter structures 1317, and each filter structure 1317 includes 2*2 filter units 1311, namely green, red, blue, and green filter units 1311.
  • with the Bayer structure, conventional algorithms for the Bayer structure can be used to process the image signals, so that no major adjustment of the hardware structure is required.
  • the filter unit array 131 includes R (red) filter units 1311, G (green) filter units 1311, and B (blue) filter units 1311, wherein the G filter unit 1311 includes at least a white filter region 1313, and the white filter region covers at least one of the N photosensitive pixels covered by the G filter unit.
  • in a conventional filter unit array structure, each filter unit corresponds to one photosensitive pixel.
  • the filter unit array 131 may adopt a Bayer structure including filter structures 1317, each of which includes G, R, B, and G filter units 1311; unlike the conventional structure, each filter unit 1311 of the embodiment of the present invention corresponds to N photosensitive pixels 111.
  • each G filter unit 1311 includes a white filter region 1313 corresponding to at least one of the N photosensitive pixels, and when the number of photosensitive pixels 111 covered by the white filter region 1313 is smaller than N, the G filter unit 1311 further includes a green filter region 1315 corresponding to the other photosensitive pixels among the N photosensitive pixels.
  • each of the R filter units 1311 includes only a red filter region, and does not include a white filter region, that is, the red filter region covers the four photosensitive pixels corresponding to the R filter unit 1311.
  • each B filter unit 1311 includes only a blue filter region, and does not include a white filter region, that is, the blue filter region covers the four photosensitive pixels corresponding to the B filter unit 1311.
  • each filter unit 1311 includes 2*2 photosensitive pixels, that is, each filter unit 1311 covers 2*2 photosensitive pixels 111 to form a combined pixel.
  • the number of photosensitive pixels 111 that can be arranged on the photosensitive pixel array 11 is limited, so if each merged pixel contains too many photosensitive pixels 111, the achievable image resolution is limited. For example, if the photosensitive pixel array 11 has 16M pixels, the 2*2 merged-pixel structure yields a merged image with a resolution of 4M, whereas a 4*4 structure yields a merged image with a resolution of only 1M. Therefore, the 2*2 merged-pixel structure is a preferred arrangement, enhancing image brightness and sharpness while sacrificing as little resolution as possible. At the same time, the 2*2 structure makes it easy to read out and merge the photosensitive pixel outputs in hardware.
  • as shown in FIG. 4a and FIG. 5a, the white filter region 1313 covers 1 photosensitive pixel covered by the G filter unit 1311, and the G filter unit 1311 may further include a green filter region 1315 covering the other 3 photosensitive pixels; or, as shown in FIG. 4b and FIG. 5b, the white filter region 1313 covers 2 photosensitive pixels covered by the G filter unit 1311, and the G filter unit 1311 may further include a green filter region 1315 covering the other 2 photosensitive pixels; or, as shown in FIG. 4c and FIG. 5c, the white filter region 1313 covers 3 photosensitive pixels covered by the G filter unit 1311, and the G filter unit 1311 may further include a green filter region 1315 covering the other 1 photosensitive pixel; or, as shown in FIG. 4d and FIG. 5d, the white filter region 1313 covers the 4 photosensitive pixels covered by the G filter unit 1311.
  • in this way, in the G filter unit 1311, the non-white filter region 1315 (the green filter region) together with the white filter region 1313 fully covers the N photosensitive pixels 111 of the merged pixel, or the white filter region 1313 alone fully covers the N photosensitive pixels 111 of the merged pixel; in the R filter unit 1311, the non-white filter region (the red filter region) alone fully covers the N photosensitive pixels 111 of the merged pixel, and in the B filter unit 1311 the non-white filter region (the blue filter region) alone fully covers the N photosensitive pixels 111 of the merged pixel.
  • the image sensor further includes a control module 17 for controlling the photosensitive pixel array 11 to be progressively exposed.
  • the control module 17 is connected to the row selection logic unit 171 and the column selection logic unit 173 to control the processing of the output of the photosensitive pixel 111 row by row.
  • the method of progressive exposure and output is easier to implement on hardware.
  • the image sensor 10 includes a row selection logic unit 171 and a column selection logic unit 173.
  • the row selection logic unit 171 and the column selection logic unit 173 are respectively connected to the control module 17, and the row selection logic unit 171 and the column selection logic unit 173 are connected to the switch tube 1115 corresponding to each of the photosensitive pixels 111.
  • the control module 17 is configured to control the row selection logic unit 171 and the column selection logic unit 173 to gate the switch tube 1115 of the photosensitive pixel 111 at a specific position.
  • the control module 17 first acquires the outputs of the photosensitive pixels of the first row and the second row and stores them in the register 19. Subsequent circuits process the outputs of the four photosensitive pixels 111 with position coordinates 1-1, 1-2, 2-1, and 2-2 to obtain the pixel value of the merged pixel. The number on the left of a position coordinate denotes the row and the number on the right denotes the column.
  • the outputs of the four photosensitive pixels whose position coordinates are 1-3, 1-4, 2-3, 2-4 are processed to obtain the pixel values of the corresponding merged pixels.
  • the outputs of the photosensitive pixels of the third row, the fourth row, the fifth row, and the sixth row are processed until the outputs of all the photosensitive pixels are processed.
  • the image sensor 10 further includes an array of analog-to-digital converters 21, each analog-to-digital converter 21 being coupled to one photosensitive pixel 111 and used to convert the analog signal output of the photosensitive pixel 111 into a digital signal output.
  • the photosensitive pixel 111 includes a photodiode 1113.
  • Photodiode 1113 is used to convert light into electrical charge, and the resulting charge is proportional to the intensity of the light.
  • the switch tube 1115 is configured to control the turn-on and turn-off of the circuit according to the control signals of the row select logic unit 171 and the column select logic unit 173.
  • when the circuit is turned on, the source follower 1117 converts the charge signal generated by the photodiode 1113 under illumination into a voltage signal.
  • the analog-to-digital converter 211 converts the voltage signal into a digital signal for transmission to subsequent circuit processing.
  • This output processing method converts the output of the photosensitive pixel into a digital signal, which is processed by software in a subsequent digital circuit or in a chip. Therefore, the output information of each photosensitive pixel can be retained.
  • the imaging method of the embodiment of the present invention can retain the information of the 16M pixels (i.e., the image before merging) and, on this basis, obtain through processing a 4M-pixel merged image or an image of another resolution. The probability of a dead pixel appearing in the resulting image is low.
  • this output processing method has less noise and higher signal-to-noise ratio.
  • the image sensor 10 includes a micro mirror array 23 disposed on the filter 13, each micro mirror corresponding to one photosensitive pixel 111.
  • each micro mirror 231 corresponds to one photosensitive pixel 111 in both size and position.
  • each filter unit 1311 corresponds to 2*2 photosensitive pixels 111 and 2*2 micro mirrors 231.
  • the image sensor according to the embodiment of the present invention embeds a white filter region in some of the filter units, thereby acquiring the brightness information of the merged pixels under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise.
  • the imaging device 100 is also proposed in the embodiment of the present invention.
  • an image forming apparatus 100 includes an image sensor 10 of an embodiment of the present invention and an image processing module 50 connected to the image sensor 10.
  • the image processing module 50 is for reading and processing the output of the photosensitive pixel array 11 to obtain pixel values of the merged pixels to form a merged image.
  • the image sensor 10 may include a control module 17, a row selection logic unit 171, a column selection logic unit 173, an analog-to-digital converter array 21, a register 19, etc., and the output of the photosensitive pixel array 11 is converted by the analog-to-digital converter array 21 into The digital signals are stored line by line in register 19 and passed to image processing module 50 for processing until the output of all of the photosensitive pixels is processed to generate a combined image.
  • the image processing module 50 calculates the pixel values of the merged pixels from the output of the photosensitive pixels of the same merged pixel to generate a merged image.
  • the image processing module 50 is further configured to add the outputs of the N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
  • the image processing module 50 is further configured to add the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as the first pixel value of the merged pixel, and to add the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as the second pixel value of the merged pixel.
  • in each filter unit, the outputs of the photosensitive pixels covered by the filter of the same color are added to obtain a pixel value.
  • taking the embodiment of FIG. 4b as an example, in the G filter units located at the upper left and lower right corners of each filter structure, the outputs of the 2 photosensitive pixels covered by the white filter region are added as the first pixel value of the merged pixel, and the outputs of the 2 photosensitive pixels covered by the green filter region are added as the second pixel value of the merged pixel; in the B filter unit located in the lower left corner, the outputs of the 4 photosensitive pixels covered by the blue filter region are added as the pixel value of the merged pixel; and in the R filter unit located in the upper right corner, the outputs of the 4 photosensitive pixels covered by the red filter region are added as the pixel value of the merged pixel.
  • the image processing module 50 may generate the merged image from the first pixel value of the merged pixel of the G filter unit, the second pixel value of the merged pixel of the G filter unit, the pixel value of the merged pixel of the B filter unit, and the pixel value of the merged pixel of the R filter unit.
  • the outputs of a plurality of photosensitive pixels are added, and the resulting merged pixel has a higher signal-to-noise ratio. For example, assuming that the output of each photosensitive pixel is S, its noise is Ns, and the merged pixel includes N photosensitive pixels, then the pixel value of the merged pixel is N*S and the noise of the merged pixel is Ns/√N, where N is a positive integer greater than or equal to 1.
  • the output of the merged pixel is the sum of the outputs of the photosensitive pixels before merging, so that, overall, the merged image has lower noise, a higher signal-to-noise ratio, and improved sharpness.
  • the image sensor embeds a white filter region in the partial filter units, thereby acquiring the brightness information of the merged pixels under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise.
  • since the noise of the merged pixel is smaller than the sum of the noises of the pixels before merging, merging the pixels by the image processor can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
  • the present invention further provides a mobile terminal to which an imaging device is applied.
  • a mobile terminal includes the imaging device of the above embodiment. Therefore, the mobile terminal has a photographing function and can generate a merged image with complete color, high signal to noise ratio, and high definition under low illumination.
  • the mobile terminal can be a mobile phone.
  • the imaging device may be a front camera of the mobile phone. Since the front camera is mostly used for self-timer, and the self-timer generally requires the definition of the image and the image resolution is not high, the mobile terminal according to the embodiment of the present invention can satisfy the requirement.
  • the mobile terminal 200 includes a central processing unit 81 and an external memory 83 connected to the imaging device 100, and the central processing unit 81 is configured to control the external memory 83 to store the merged image.
  • the external memory 83 includes an SM (Smart Media) card, a CF (Compact Flash) card, and the like.
  • the mobile terminal 200 further includes a central processing unit 81 and a display device 85 connected to the imaging device 100 for controlling the display device 85 to display the merged image.
  • the image captured by the mobile terminal 200 can be displayed on the display device for viewing by the user.
  • the display device includes an LED display or the like.
  • the mobile terminal according to the embodiment of the present invention has a photographing function and can generate a merged image with complete color, high signal to noise ratio, and high definition under low illumination.
  • when the imaging device serves as the front camera of the mobile phone, the brightness and sharpness of selfie images taken under low illumination can be improved and noise reduced.
  • An embodiment of the present invention further provides an imaging method of an image sensor.
  • the imaging method of the image sensor of the embodiment of the present invention includes the following steps:
  • the image sensor includes a photosensitive pixel array and a filter disposed on the photosensitive pixel array, and the filter includes a filter unit array having a plurality of filter units; each filter unit covers N photosensitive pixels, some of the filter units include at least a white filter region, and the white filter region covers at least one of the N photosensitive pixels, wherein the N photosensitive pixels covered by the same filter unit constitute one merged pixel, and N is a positive integer.
  • the external light is irradiated to the photosensitive portion of the photosensitive pixel through the filter to generate an electrical signal, that is, an output of the photosensitive pixel.
  • a white filter region is embedded in the partial filter units, thereby acquiring the brightness information of the merged pixels under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise.
  • step S2 further includes: adding the outputs of the corresponding N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
  • when the filter unit includes a white filter region and a non-white filter region, step S2 further includes: adding the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as the first pixel value of the merged pixel, and adding the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as the second pixel value of the merged pixel.
  • the output of the merged pixel is the sum of the outputs of the pixels before the merge, and the noise of the merged pixels is smaller than the sum of the noises of the pixels before the merge, so that the image generated after the merge has less noise and the signal-to-noise ratio is high.
  • calculating the pixel value of the merged pixel according to the outputs of the photosensitive pixels of the same merged pixel, i.e. step S2, specifically includes: collecting the outputs of the photosensitive pixels of the k-th row and the (k+1)-th row and storing them in a register, where k = 2n-1, n is a natural number, and k+1 is less than or equal to the total number of rows of the photosensitive pixel array; and extracting the outputs of the photosensitive pixels of the k-th row and the (k+1)-th row from the register to obtain the pixel values of the merged pixels.
  • each of the photosensitive pixels is connected to an analog-to-digital converter, and the imaging method of the embodiment of the present invention further includes: converting the analog signal outputs generated by the photosensitive pixels into digital signal outputs; and calculating the pixel values of the merged pixels according to the digital signal outputs of the photosensitive pixels of the same merged pixel.
  • in this way, first, the image processing module, which is generally a digital signal processor (DSP), can directly process the output of the image sensor; second, compared with schemes that process the analog signal output of the image sensor directly through circuitry, the image information is better preserved.
  • for example, for a 16M-pixel image sensor, the imaging method of the embodiment of the present invention can retain the information of the 16M pixels (i.e., the image before merging) and, on this basis, obtain through processing a 4M-pixel merged image or an image of another resolution.
  • a white filter region is embedded in some of the filter units, thereby acquiring the brightness information of the merged pixels under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise.
  • since the noise of the merged pixel is smaller than the sum of the noises of the pixels before merging, merging the pixels by the image processor can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
  • first and second are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated.
  • features defining “first” or “second” may include at least one of the features, either explicitly or implicitly.
  • the meaning of "a plurality” is at least two, such as two, three, etc., unless specifically defined otherwise.
  • the terms "mounted", "connected with", "connected", "fixed" and the like shall be understood broadly unless otherwise explicitly stated and defined: a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; and it may be a direct connection, an indirect connection through an intermediate medium, an internal communication between two elements, or an interaction between two elements, unless otherwise explicitly defined.
  • the specific meanings of the above terms in the present invention can be understood on a case-by-case basis.
  • a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that the first and second features are in indirect contact through an intermediate medium, unless otherwise explicitly stated and defined.
  • a first feature being "above", "over" or "on top of" a second feature may mean that the first feature is directly above or obliquely above the second feature, or merely that the level of the first feature is higher than that of the second feature.
  • a first feature being "below", "under" or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or merely that the level of the first feature is lower than that of the second feature.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Power Engineering (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Color Television Image Signal Generators (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

The present invention discloses an image sensor, an imaging device, a mobile terminal, and an imaging method. The image sensor includes: a photosensitive pixel array; and a filter disposed on the photosensitive pixel array, the filter including a filter unit array having a plurality of filter units, wherein each filter unit covers N photosensitive pixels, some of the filter units include at least a white filter region, and the white filter region covers at least one of the N photosensitive pixels, wherein the N photosensitive pixels covered by the same filter unit constitute one merged pixel, and N is a positive integer. In the image sensor of the embodiments of the present invention, by embedding a white filter region in some of the filter units, the amount of incident light can be increased, so that a higher signal-to-noise ratio, brightness, and sharpness are obtained under low illumination and an image with less noise is generated.

Description

Image sensor, imaging device, mobile terminal and imaging method
Cross-reference to related applications
This application is based on and claims priority to Chinese Patent Application No. 201510963465.1, filed on December 18, 2015, the entire contents of which are incorporated herein by reference.
Technical field
The present invention relates to imaging technology, and in particular to an image sensor, an imaging device, a mobile terminal, and an imaging method of an image sensor.
Background
Images generated by the image sensor of a related imaging device in a low-illumination environment may suffer from heavy noise and poor definition.
Summary
The present invention aims to solve at least one of the technical problems in the related art to some extent.
To this end, an embodiment of one aspect of the present invention provides an image sensor, including: a photosensitive pixel array; and a filter disposed on the photosensitive pixel array, the filter including a filter unit array having a plurality of filter units, wherein each filter unit covers N photosensitive pixels, some of the filter units include at least a white filter region, and the white filter region covers at least one of the N photosensitive pixels, wherein the N photosensitive pixels covered by the same filter unit constitute one merged pixel, and N is a positive integer.
With the image sensor according to the embodiment of the present invention, by embedding a white filter region in some of the filter units, the amount of incident light can be increased, so that a higher signal-to-noise ratio, brightness, and sharpness are obtained under low illumination and an image with less noise is generated.
According to some embodiments of the present invention, the filter unit array includes R filter units, G filter units, and B filter units, wherein the G filter unit includes at least the white filter region, and the white filter region covers at least one of the N photosensitive pixels covered by the G filter unit.
According to some embodiments of the present invention, each filter unit covers 2*2 photosensitive pixels, wherein the white filter region covers 1 photosensitive pixel covered by the G filter unit and the G filter unit may further include a green filter region covering the other 3 photosensitive pixels; or the white filter region covers 2 photosensitive pixels covered by the G filter unit and the G filter unit may further include a green filter region covering the other 2 photosensitive pixels; or the white filter region covers 3 photosensitive pixels covered by the G filter unit and the G filter unit may further include a green filter region covering the other 1 photosensitive pixel; or the white filter region covers the 4 photosensitive pixels covered by the G filter unit.
According to some embodiments of the present invention, the image sensor further includes a control module configured to control the photosensitive pixel array to be exposed row by row.
According to some embodiments of the present invention, the image sensor further includes a register, and the control module is configured to sequentially collect the outputs of the photosensitive pixels of the k-th row and the (k+1)-th row whose exposure has been completed and store them in the register, where k = 2n-1, n is a natural number, and k+1 is less than or equal to the total number of rows of the photosensitive pixel array.
According to some embodiments of the present invention, the image sensor further includes an analog-to-digital converter array having a plurality of analog-to-digital converters, each analog-to-digital converter being connected to one photosensitive pixel.
According to some embodiments of the present invention, the image sensor further includes a micro mirror array, each micro mirror corresponding to one photosensitive pixel.
To achieve the above object, an embodiment of another aspect of the present invention provides an imaging device, including: the image sensor described above; and an image processing module connected to the image sensor, the image processing module being configured to read and process the output of the photosensitive pixel array in the image sensor to obtain pixel values of the merged pixels so as to form a merged image.
With the imaging device according to the embodiment of the present invention, the image sensor described above increases the amount of incident light, so that a higher signal-to-noise ratio, brightness, and sharpness are obtained under low illumination and an image with less noise is generated.
According to some embodiments of the present invention, when a filter unit includes only a white filter region or only a non-white filter region, the image processing module is further configured to add the outputs of the corresponding N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
According to some embodiments of the present invention, when a filter unit includes a white filter region and a non-white filter region, the image processing module is further configured to add the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as a first pixel value of the merged pixel, and to add the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as a second pixel value of the merged pixel.
Since the noise of a merged pixel is smaller than the sum of the noises of the individual pixels before merging, merging pixels can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
To achieve the above object, an embodiment of yet another aspect of the present invention provides a mobile terminal including the imaging device described above.
With the mobile terminal according to the embodiment of the present invention, the imaging device described above increases the amount of incident light, so that a higher signal-to-noise ratio, brightness, and sharpness are obtained under low illumination and an image with less noise is generated. Moreover, since the noise of a merged pixel is smaller than the sum of the noises of the individual pixels before merging, merging pixels can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
According to some embodiments of the present invention, the mobile terminal may be a mobile phone.
According to some embodiments of the present invention, the imaging device may be a front camera of the mobile phone.
According to some embodiments of the present invention, the mobile terminal further includes a central processor and an external memory connected to the imaging device, the central processor being configured to control the external memory to store the merged image.
According to some embodiments of the present invention, the mobile terminal further includes a central processor and a display device connected to the imaging device, the central processor being configured to control the display device to display the merged image.
To achieve the above object, an embodiment of a further aspect of the present invention provides an imaging method of the image sensor described above, including the following steps: reading the output of the photosensitive pixel array in the image sensor; and calculating pixel values of the merged pixels according to the outputs of the photosensitive pixels of the same merged pixel to generate a merged image.
With the imaging method according to the embodiment of the present invention, the image sensor described above increases the amount of incident light, so that a higher signal-to-noise ratio, brightness, and sharpness are obtained under low illumination and an image with less noise is generated.
According to some embodiments of the present invention, each filter unit covers 2*2 photosensitive pixels, and calculating the pixel value of the merged pixel according to the outputs of the photosensitive pixels of the same merged pixel specifically includes: collecting the outputs of the photosensitive pixels of the k-th row and the (k+1)-th row and storing them in a register, where k = 2n-1, n is a natural number, and k+1 is less than or equal to the total number of rows of the photosensitive pixel array; and extracting the outputs of the photosensitive pixels of the k-th row and the (k+1)-th row from the register to obtain the pixel values of the merged pixels.
According to some embodiments of the present invention, when a filter unit includes only a white filter region or only a non-white filter region, calculating the pixel value of the merged pixel according to the outputs of the photosensitive pixels of the same merged pixel to generate a merged image further includes: adding the outputs of the corresponding N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
According to some embodiments of the present invention, when a filter unit includes a white filter region and a non-white filter region, the pixel value of the merged pixel includes a first pixel value corresponding to the white filter region and a second pixel value corresponding to the non-white filter region, and calculating the pixel value of the merged pixel according to the outputs of the photosensitive pixels of the same merged pixel to generate a merged image further includes: adding the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as the first pixel value of the merged pixel, and adding the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as the second pixel value of the merged pixel.
Since the noise of a merged pixel is smaller than the sum of the noises of the individual pixels before merging, merging pixels can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
According to some embodiments of the present invention, each photosensitive pixel is connected to an analog-to-digital converter, and the imaging method further includes: converting the analog signal outputs generated by the photosensitive pixels into digital signal outputs; and calculating the pixel value of the merged pixel according to the digital signal outputs of the photosensitive pixels of the same merged pixel.
Brief description of the drawings
FIG. 1 is a schematic side view of an image sensor according to an embodiment of the present invention;
FIGS. 2a-2d are schematic diagrams of some filter units of an image sensor according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a filter unit array of a Bayer structure;
FIGS. 4a-4d are schematic diagrams of a filter unit array of an image sensor according to an embodiment of the present invention;
FIGS. 5a-5d are schematic perspective views of an image sensor according to an embodiment of the present invention;
FIG. 6 is a block diagram of an image sensor according to an embodiment of the present invention;
FIG. 7 is a schematic circuit diagram of a photosensitive pixel of an image sensor according to an embodiment of the present invention;
FIG. 8 is a block diagram of an image sensor according to an embodiment of the present invention, in which the image sensor includes analog-to-digital converters;
FIG. 9 is a schematic perspective view of an image sensor according to an embodiment of the present invention, in which the image sensor includes a micro mirror array;
FIG. 10 is a block diagram of an imaging device according to an embodiment of the present invention;
FIG. 11 is a block diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 12 is a block diagram of a mobile terminal according to another embodiment of the present invention;
FIG. 13 is a flow chart of an imaging method according to an embodiment of the present invention;
FIG. 14 is a schematic flow chart of reading photosensitive pixel outputs and generating an image in an imaging method according to an embodiment of the present invention; and
FIG. 15 is a schematic flow chart of processing photosensitive pixel outputs and generating an image in an imaging method according to an embodiment of the present invention.
Detailed description
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary and are intended to explain the present invention, and shall not be construed as limiting the present invention.
The image sensor, imaging device, mobile terminal, and imaging method according to embodiments of the present invention are described below with reference to the accompanying drawings.
According to the embodiments of FIG. 1 and FIGS. 2a-2d, the image sensor 10 of the embodiment of the present invention includes a photosensitive pixel array 11 and a filter 13 disposed on the photosensitive pixel array 11.
The filter 13 includes a filter unit array 131, wherein the filter unit array 131 has a plurality of filter units 1311, each filter unit 1311 covers N photosensitive pixels 111, and some of the filter units 1311 include at least a white filter region 1313 that covers at least one of the N photosensitive pixels, wherein the N photosensitive pixels covered by the same filter unit constitute one merged pixel, and N is a positive integer. External light passes through the filter 13 and illuminates the photosensitive portion 1111 of the photosensitive pixel 111 to generate an electric signal, that is, the output of the photosensitive pixel 111.
It should be noted that the white filter region 1313 mainly lets natural light pass through without filtering it. Therefore, the white filter region 1313 may refer to a region provided with a transparent filter, or to a region without any filter, that is, a "hollowed-out" area in the filter 13.
It should also be noted that the filter units 1311 other than these partial filter units 1311 include only a non-white filter region, such as a green, red, or blue filter region, and do not include a white filter region. Moreover, when the number of photosensitive pixels covered by the white filter region 1313 is smaller than N, the partial filter unit 1311 may also include a non-white filter region; in other words, the partial filter unit 1311 may consist of two parts, the white filter region 1313 and a non-white filter region, which together cover the corresponding N photosensitive pixels. The non-white filter region is used to obtain the color information of the merged pixel, while the white filter region is used to obtain the information of the whole "white light"; that is, the white filter region lets natural light pass, so it transmits light better, the brightness value output by the photosensitive pixel is higher, and under low illumination the white filter region can obtain the brightness information of the merged pixel with less noise.
In the image sensor of the embodiment of the present invention, a white filter region is embedded in some of the filter units, so that the brightness information of the merged pixels is obtained under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise.
According to some embodiments of the present invention, the filter unit array is basically arranged according to the Bayer pattern shown in FIG. 3; the Bayer array includes filter structures 1317, and each filter structure 1317 includes 2*2 filter units 1311, namely green, red, blue, and green filter units 1311.
With the Bayer structure, conventional algorithms for the Bayer structure can be used to process the image signals, so that no major adjustment of the hardware structure is required.
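Since each filter structure 1317 is binned down to one G, one R, one B, and one G merged pixel, the merged frame is itself an ordinary Bayer mosaic, which is why conventional Bayer algorithms apply. As a minimal illustration (NumPy; the function and array names are ours, not the patent's), the per-unit merged values could be interleaved into such a mosaic before handing it to any standard demosaicing routine:

```python
import numpy as np

def pack_bayer(g_tl, r_tr, b_bl, g_br):
    """Interleave per-unit merged values into a GRBG Bayer mosaic.

    g_tl, r_tr, b_bl, g_br: 2-D arrays with one entry per filter structure
    (the 2*2 group of filter units).  The names are illustrative only.
    """
    rows, cols = g_tl.shape
    mosaic = np.empty((2 * rows, 2 * cols), dtype=g_tl.dtype)
    mosaic[0::2, 0::2] = g_tl   # merged G pixel (white and/or green regions)
    mosaic[0::2, 1::2] = r_tr   # merged R pixel
    mosaic[1::2, 0::2] = b_bl   # merged B pixel
    mosaic[1::2, 1::2] = g_br   # merged G pixel
    return mosaic
```

For the G units that also contain a white region, whichever of the first or second pixel value the downstream pipeline treats as the green sample would be the value packed here; that choice is left to the processing stage and is not fixed by this sketch.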
According to the embodiments of FIGS. 4a-4d, the filter unit array 131 includes R (red) filter units 1311, G (green) filter units 1311, and B (blue) filter units 1311, wherein the G filter unit 1311 includes at least a white filter region 1313, and the white filter region covers at least one photosensitive pixel 111 of the N photosensitive pixels covered by the G filter unit.
Specifically, in a conventional filter unit array structure, each filter unit corresponds to one photosensitive pixel. In some embodiments of the present invention, the filter unit array 131 may adopt a Bayer structure including filter structures 1317, each filter structure 1317 including G, R, B, and G filter units 1311; unlike the conventional structure, each filter unit 1311 of the embodiment of the present invention corresponds to N photosensitive pixels 111.
Each G filter unit 1311 includes a white filter region 1313 corresponding to at least one photosensitive pixel 111 of the N photosensitive pixels, and when the number of photosensitive pixels 111 covered by the white filter region 1313 is smaller than N, the G filter unit 1311 further includes a green filter region 1315 corresponding to the other photosensitive pixels among the N photosensitive pixels. In addition, each R filter unit 1311 includes only a red filter region and no white filter region, that is, the red filter region covers the 4 photosensitive pixels corresponding to the R filter unit 1311. Likewise, each B filter unit 1311 includes only a blue filter region and no white filter region, that is, the blue filter region covers the 4 photosensitive pixels corresponding to the B filter unit 1311.
As shown in FIGS. 4a-4d and FIGS. 5a-5d, each filter unit 1311 covers 2*2 photosensitive pixels, that is, each filter unit 1311 covers 2*2 photosensitive pixels 111 to form a merged pixel.
Besides the 2*2 structure, 3*3, 4*4, or even arbitrary n*m structures (n and m being natural numbers) are possible. It will be appreciated that the number of photosensitive pixels 111 that can be arranged on the photosensitive pixel array 11 is limited, and if each merged pixel contains too many photosensitive pixels 111, the achievable image resolution is limited. For example, if the photosensitive pixel array 11 has 16M pixels, the 2*2 merged-pixel structure yields a merged image with a resolution of 4M, whereas a 4*4 structure yields a merged image with a resolution of only 1M. Therefore, the 2*2 merged-pixel structure is a preferred arrangement, improving image brightness and sharpness while sacrificing as little resolution as possible. At the same time, the 2*2 structure makes it easy to read out and merge the photosensitive pixel outputs in hardware.
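The resolution trade-off described above is plain arithmetic; a small sketch of it follows (the 4608 x 3456 layout assumed here for a 16M array is our own example geometry, not a figure from the patent):

```python
def merged_resolution(width, height, n):
    """Resolution of the merged image when every n*n photosensitive pixels
    form one merged pixel."""
    return width // n, height // n

# Assuming a 16M sensor laid out as 4608 x 3456:
#   2*2 merging -> 2304 x 1728 (about 4M)
#   4*4 merging -> 1152 x 864  (about 1M)
print(merged_resolution(4608, 3456, 2), merged_resolution(4608, 3456, 4))
```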
As shown in FIG. 4a and FIG. 5a, the white filter region 1313 covers 1 photosensitive pixel covered by the G filter unit 1311, and the G filter unit 1311 may further include a green filter region 1315 covering the other 3 photosensitive pixels; or, as shown in FIG. 4b and FIG. 5b, the white filter region 1313 covers 2 photosensitive pixels covered by the G filter unit 1311, and the G filter unit 1311 may further include a green filter region 1315 covering the other 2 photosensitive pixels; or, as shown in FIG. 4c and FIG. 5c, the white filter region 1313 covers 3 photosensitive pixels covered by the G filter unit 1311, and the G filter unit 1311 may further include a green filter region 1315 covering the other 1 photosensitive pixel; or, as shown in FIG. 4d and FIG. 5d, the white filter region 1313 covers the 4 photosensitive pixels covered by the G filter unit 1311.
In this way, in the G filter unit 1311, the non-white filter region 1315 (the green filter region) together with the white filter region 1313 fully covers the N photosensitive pixels 111 of the merged pixel, or the white filter region 1313 alone fully covers the N photosensitive pixels 111 of the merged pixel. In the R filter unit 1311, the non-white filter region (the red filter region) alone fully covers the N photosensitive pixels 111 of the merged pixel, and in the B filter unit 1311 the non-white filter region (the blue filter region) alone fully covers the N photosensitive pixels 111 of the merged pixel.
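To make the layouts of FIGS. 4a-4d concrete, the following sketch prints the per-subpixel filter map of one 4*4 filter structure. The count of white subpixels per G unit follows the figures, while their exact placement inside the unit is an assumption of this sketch, not something the text above fixes:

```python
def filter_structure(w_count):
    """4x4 filter map of one filter structure: G unit (top left), R unit
    (top right), B unit (bottom left), G unit (bottom right).  w_count of
    the four subpixels in each G unit are white; FIGS. 4a-4d correspond to
    w_count = 1, 2, 3, 4."""
    flat = ['W'] * w_count + ['G'] * (4 - w_count)
    g_unit = [flat[:2], flat[2:]]
    r_unit = [['R', 'R'], ['R', 'R']]
    b_unit = [['B', 'B'], ['B', 'B']]
    return [g_unit[0] + r_unit[0],
            g_unit[1] + r_unit[1],
            b_unit[0] + g_unit[0],
            b_unit[1] + g_unit[1]]

for row in filter_structure(w_count=2):  # the FIG. 4b variant: two W, two G
    print(' '.join(row))
# W W R R
# G G R R
# B B W W
# B B G G
```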
According to an embodiment of the present invention, as shown in FIG. 6, the image sensor further includes a control module 17 configured to control the photosensitive pixel array 11 to be exposed row by row. Specifically, the control module 17 is connected to a row selection logic unit 171 and a column selection logic unit 173 so as to control the outputs of the photosensitive pixels 111 to be processed row by row.
Exposing and reading out row by row is easier to implement in hardware.
Further, as shown in FIG. 6, the image sensor 10 further includes a register 19, and the control module 17 is configured to sequentially collect the outputs of the photosensitive pixels 111 of the k-th row and the (k+1)-th row whose exposure has been completed and store them in the register 19, where k = 2n-1, n is a natural number, and k+1 is less than or equal to the total number of rows of the photosensitive pixel array 11.
In this way, the register can be fully used to implement the process of reading out, buffering, and merging the outputs of the photosensitive units; this is easy to implement in hardware and processing is fast.
Specifically, as shown in FIGS. 6 and 7, the image sensor 10 includes a row selection logic unit 171 and a column selection logic unit 173. The row selection logic unit 171 and the column selection logic unit 173 are respectively connected to the control module 17 and to the switch tube 1115 corresponding to each photosensitive pixel 111, and the control module 17 is configured to control the row selection logic unit 171 and the column selection logic unit 173 to gate the switch tube 1115 of the photosensitive pixel 111 at a specific position.
The control module 17 first collects the outputs of the photosensitive pixels of the first and second rows and stores them in the register 19. Subsequent circuits process the outputs of the four photosensitive pixels 111 with position coordinates 1-1, 1-2, 2-1, and 2-2 to obtain the pixel value of the merged pixel, where the left number of a position coordinate denotes the row and the right number denotes the column.
Then the outputs of the four photosensitive pixels with position coordinates 1-3, 1-4, 2-3, and 2-4 are processed to obtain the pixel value of the corresponding merged pixel.
This continues until the last group of four photosensitive pixels of the first and second rows has been processed.
In the same manner, the outputs of the photosensitive pixels of the third and fourth rows, the fifth and sixth rows, and so on are processed, until the outputs of all photosensitive pixels have been processed.
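A host-side model of this row-pair readout might look as follows. The function name and the `sensor_rows` iterable are illustrative stand-ins, not a real driver interface, and for simplicity every 2*2 group is summed into a single value; for G units containing both white and green regions, the two sub-sums would be kept separately as described above:

```python
import numpy as np

def read_and_merge(sensor_rows, width):
    """Buffer rows k and k+1 (k = 2n-1, 1-indexed) as the register does,
    then consume them as 2*2 groups of photosensitive-pixel outputs."""
    merged_rows = []
    buffer = []
    for row in sensor_rows:
        buffer.append(np.asarray(row))
        if len(buffer) == 2:              # rows k and k+1 are in the register
            top, bottom = buffer
            merged = [top[c] + top[c + 1] + bottom[c] + bottom[c + 1]
                      for c in range(0, width, 2)]  # groups 1-1,1-2,2-1,2-2 ...
            merged_rows.append(merged)
            buffer.clear()
    return np.asarray(merged_rows)
```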
According to an embodiment of the present invention, as shown in FIGS. 7 and 8, the image sensor 10 further includes an array of analog-to-digital converters 21, each analog-to-digital converter 21 being connected to one photosensitive pixel 111 and used to convert the analog signal output of the photosensitive pixel 111 into a digital signal output.
In the example of FIG. 7, the photosensitive pixel 111 includes a photodiode 1113. The photodiode 1113 converts light into charge, and the generated charge is proportional to the light intensity. The switch tube 1115 controls the turn-on and turn-off of the circuit according to the control signals of the row selection logic unit 171 and the column selection logic unit 173; when the circuit is turned on, the source follower 1117 converts the charge signal generated by the photodiode 1113 under illumination into a voltage signal. The analog-to-digital converter 211 converts the voltage signal into a digital signal and transmits it to subsequent circuits for processing.
This output processing method converts the output of each photosensitive pixel into a digital signal that is then processed by software in subsequent digital circuits or in a chip, so the output information of every photosensitive pixel can be retained. For example, for a 16M-pixel image sensor, the imaging method of the embodiment of the present invention can retain the information of the 16M pixels (i.e., the image before merging) and, on this basis, obtain through processing a 4M-pixel merged image or an image of another resolution. The probability of dead pixels appearing in the final image is low. In addition, this output processing method has less noise and a higher signal-to-noise ratio.
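As a rough numeric picture of the chain just described (photodiode charge proportional to light, source-follower voltage conversion, per-pixel ADC quantization), here is a toy model; all conversion constants are placeholders rather than values from the patent:

```python
def pixel_output(light_intensity, exposure_s,
                 charge_per_lux_s=1.0, volts_per_charge=0.5,
                 full_scale_v=1.0, bits=10):
    """Toy model of one photosensitive pixel's signal chain."""
    charge = light_intensity * exposure_s * charge_per_lux_s   # photodiode 1113
    voltage = min(charge * volts_per_charge, full_scale_v)     # source follower 1117
    code = round(voltage / full_scale_v * (2 ** bits - 1))     # ADC 211
    return code
```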
According to an embodiment of the present invention, as shown in FIG. 9, the image sensor 10 includes a micro mirror array 23 disposed on the filter 13, each micro mirror 231 corresponding to one photosensitive pixel 111.
Specifically, each micro mirror 231 corresponds to one photosensitive pixel 111 in both size and position. In some embodiments, each filter unit 1311 corresponds to 2*2 photosensitive pixels 111 and 2*2 micro mirrors 231. As technology develops, in order to obtain images of higher resolution, more and more photosensitive pixels 111 are arranged ever more densely on the photosensitive chip, each photosensitive pixel 111 becomes smaller and its light reception is affected, and the area of the photosensitive portion 1111 of the photosensitive pixel 111 is limited; the micro mirror 231 gathers light onto the photosensitive portion 1111, thereby increasing the light intensity received by the photosensitive pixel 111 and improving image quality.
In summary, the image sensor according to the embodiment of the present invention embeds a white filter region in some of the filter units, so that the brightness information of the merged pixels is obtained under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise.
An embodiment of the present invention further provides an imaging device 100.
As shown in FIG. 10, the imaging device 100 of the embodiment of the present invention includes the image sensor 10 of the embodiment of the present invention and an image processing module 50 connected to the image sensor 10. The image processing module 50 is configured to read and process the output of the photosensitive pixel array 11 to obtain the pixel values of the merged pixels so as to form a merged image.
Specifically, the image sensor 10 may include the control module 17, the row selection logic unit 171, the column selection logic unit 173, the analog-to-digital converter array 21, the register 19, and so on; the output of the photosensitive pixel array 11 is converted into digital signals by the analog-to-digital converter array 21, stored row by row in the register 19, and transferred to the image processing module 50 for processing, until the outputs of all photosensitive pixels have been processed to generate the merged image.
In this way, the image processing module 50 calculates the pixel values of the merged pixels from the outputs of the photosensitive pixels of the same merged pixel to generate the merged image.
Specifically, according to an embodiment of the present invention, when a filter unit includes only a white filter region or only a non-white filter region, the image processing module 50 is further configured to add the outputs of the N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
Moreover, when a filter unit includes a white filter region and a non-white filter region, the image processing module 50 is further configured to add the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as the first pixel value of the merged pixel, and to add the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as the second pixel value of the merged pixel.
That is, in each filter unit the outputs of the photosensitive pixels covered by the filter of the same color are added to obtain a pixel value. Taking the embodiment of FIG. 4b as an example, in each filter structure, in the G filter units located at the upper left and lower right corners, the outputs of the 2 photosensitive pixels covered by the white filter region are added to give the first pixel value of the merged pixel, and the outputs of the 2 photosensitive pixels covered by the green filter region are added to give the second pixel value of the merged pixel; in the B filter unit located at the lower left corner, the outputs of the 4 photosensitive pixels covered by the blue filter region are added to give the pixel value of the merged pixel; and in the R filter unit located at the upper right corner, the outputs of the 4 photosensitive pixels covered by the red filter region are added to give the pixel value of the merged pixel.
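A compact sketch of this per-filter-unit summation, using the FIG. 4b layout; the function name and the dictionary-based return value are our own choices for illustration:

```python
import numpy as np

def merge_filter_unit(outputs_2x2, filter_map_2x2):
    """Sum photosensitive-pixel outputs per filter colour inside one filter
    unit.  For a FIG. 4b style G unit this yields both the first pixel value
    (W subpixels) and the second pixel value (G subpixels); for an R or B
    unit it yields a single value."""
    values = {}
    for out, colour in zip(np.ravel(outputs_2x2), np.ravel(filter_map_2x2)):
        values[colour] = values.get(colour, 0) + out
    return values

# e.g. a G filter unit with outputs [[10, 12], [9, 11]] and filter map
# [['W', 'W'], ['G', 'G']] gives {'W': 22, 'G': 20}: 22 is the merged
# pixel's first pixel value and 20 its second pixel value.
```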
The image processing module 50 can then generate the merged image from the first pixel value of the merged pixel of the G filter unit, the second pixel value of the merged pixel of the G filter unit, the pixel value of the merged pixel of the B filter unit, and the pixel value of the merged pixel of the R filter unit. In this way, the outputs of a plurality of photosensitive pixels are added, and the resulting merged pixel has a higher signal-to-noise ratio. For example, assuming that the output of each photosensitive pixel is S, its noise is Ns, and a merged pixel includes N photosensitive pixels, then the pixel value of the merged pixel is N*S and the noise of the merged pixel is Ns/√N, where N is a positive integer greater than or equal to 1.
It can be understood that, for N > 1, the noise of the merged pixel is smaller than the sum of the noises output by the individual photosensitive pixels before merging; for example, when N = 4, the noise of the merged pixel is Ns/2, which is smaller than 4*Ns, the sum of the noises output by the individual photosensitive pixels before merging. Since the output of the merged pixel is the sum of the outputs of the photosensitive pixels before merging, the merged image as a whole has lower noise, a higher signal-to-noise ratio, and better sharpness. In summary, in the imaging device according to the embodiment of the present invention, the image sensor embeds a white filter region in some of the filter units, so that the brightness information of the merged pixels is obtained under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise. Moreover, since the noise of a merged pixel is smaller than the sum of the noises of the individual pixels before merging, merging pixels by the image processor can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
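Whether the merged noise is quoted on the summed scale (√N·Ns) or per averaged input pixel (Ns/√N, the Ns/2 figure used above for N = 4), the signal-to-noise ratio improves by the same factor √N. A small Monte Carlo sketch of this, under the common assumption of independent, zero-mean noise of standard deviation Ns on each photosensitive pixel and with purely illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
S, Ns, N, trials = 100.0, 10.0, 4, 100_000   # illustrative values only

# N photosensitive pixels per merged pixel, independent noise of std Ns each
samples = S + rng.normal(0.0, Ns, size=(trials, N))
merged = samples.sum(axis=1)                  # merged pixel value ~ N*S

snr_single = S / Ns                           # 10.0
snr_merged = merged.mean() / merged.std()     # ~ 20.0, i.e. sqrt(N) times higher
print(snr_single, snr_merged)
```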
The present invention further provides a mobile terminal to which the imaging device is applied.
According to an embodiment of the present invention, the mobile terminal includes the imaging device of the above embodiment. The mobile terminal therefore has a photographing function and can generate, under low illumination, a merged image with complete color, a high signal-to-noise ratio, and high definition.
The mobile terminal may be a mobile phone.
According to an embodiment of the present invention, the imaging device may be the front camera of the mobile phone. Since the front camera is mostly used for selfies, and selfies generally require good image clarity but not a high image resolution, the mobile terminal according to the embodiment of the present invention can satisfy this requirement.
Further, according to the embodiment of FIG. 11, the mobile terminal 200 includes a central processor 81 and an external memory 83 connected to the imaging device 100, and the central processor 81 is configured to control the external memory 83 to store the merged image.
In this way, the generated merged image can be stored for later viewing, use, or transfer. The external memory 83 includes an SM (Smart Media) card, a CF (Compact Flash) card, and the like.
Further, according to the embodiment of FIG. 12, the mobile terminal 200 further includes a central processor 81 and a display device 85 connected to the imaging device 100, and the central processor 81 is configured to control the display device 85 to display the merged image. In this way, images captured by the mobile terminal 200 can be displayed on the display device for the user to view. The display device includes an LED display and the like.
In summary, the mobile terminal according to the embodiment of the present invention has a photographing function and can generate, under low illumination, a merged image with complete color, a high signal-to-noise ratio, and high definition. In particular, when the imaging device serves as the front camera of a mobile phone, the brightness and sharpness of selfie images taken under low illumination can be improved and noise reduced.
An embodiment of the present invention further provides an imaging method of an image sensor.
According to the embodiment of FIG. 13, the imaging method of the image sensor of the embodiment of the present invention includes the following steps:
S1: reading the output of the photosensitive pixel array in the image sensor.
The image sensor includes a photosensitive pixel array and a filter disposed on the photosensitive pixel array; the filter includes a filter unit array having a plurality of filter units, each filter unit covers N photosensitive pixels, some of the filter units include at least a white filter region, and the white filter region covers at least one of the N photosensitive pixels, wherein the N photosensitive pixels covered by the same filter unit constitute one merged pixel, and N is a positive integer. External light passes through the filter and illuminates the photosensitive portion of the photosensitive pixel to generate an electric signal, that is, the output of the photosensitive pixel.
Thus, a white filter region is embedded in some of the filter units, so that the brightness information of the merged pixels is obtained under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise.
S2: calculating the pixel values of the merged pixels according to the outputs of the photosensitive pixels of the same merged pixel to generate a merged image.
Specifically, when a filter unit includes only a white filter region or only a non-white filter region, calculating the pixel value of the merged pixel according to the outputs of the photosensitive pixels of the same merged pixel to generate a merged image, i.e. step S2, further includes: adding the outputs of the corresponding N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
Moreover, when a filter unit includes a white filter region and a non-white filter region, the pixel value of the merged pixel includes a first pixel value corresponding to the white filter region and a second pixel value corresponding to the non-white filter region, and calculating the pixel value of the merged pixel according to the outputs of the photosensitive pixels of the same merged pixel to generate a merged image, i.e. step S2, further includes:
adding the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as the first pixel value of the merged pixel, and adding the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as the second pixel value of the merged pixel.
In this way, with the pixel merging method, the output of a merged pixel is the sum of the outputs of the individual pixels before merging, while the noise of the merged pixel is smaller than the sum of the noises of the individual pixels before merging, so the image generated after merging has less noise and a higher signal-to-noise ratio. Further, according to an embodiment of the present invention, as shown in FIG. 14, calculating the pixel value of the merged pixel according to the outputs of the photosensitive pixels of the same merged pixel, i.e. step S2, specifically includes:
S21: collecting the outputs of the photosensitive pixels of the k-th row and the (k+1)-th row and storing them in a register, where k = 2n-1, n is a natural number, and k+1 is less than or equal to the total number of rows of the photosensitive pixel array.
S22: extracting the outputs of the photosensitive pixels of the k-th row and the (k+1)-th row from the register to obtain the pixel values of the merged pixels.
In this way, the register can be fully used to implement the process of reading out, buffering, and merging the outputs of the photosensitive units; this is easy to implement in hardware and processing is fast.
In addition, as shown in FIG. 15, each photosensitive pixel is connected to an analog-to-digital converter, and the imaging method of the embodiment of the present invention further includes:
S31: converting the analog signal outputs generated by the photosensitive pixels into digital signal outputs.
S32: calculating the pixel values of the merged pixels according to the digital signal outputs of the photosensitive pixels of the same merged pixel.
In this way, first, the image processing module, which is generally a digital signal processor (DSP) chip, can directly process the output of the image sensor; second, compared with schemes that process the analog signal output of the image sensor directly through circuitry, the image information is better preserved. For example, for a 16M-pixel image sensor, the imaging method of the embodiment of the present invention can retain the information of the 16M pixels (i.e., the image before merging) and, on this basis, obtain through processing a 4M-pixel merged image or an image of another resolution.
In summary, with the imaging method according to the embodiment of the present invention, a white filter region is embedded in some of the filter units, so that the brightness information of the merged pixels is obtained under low illumination with less noise; the pixel values of the composite image generated in this way contain both color information and low-noise brightness information, and the composite image has better brightness and sharpness and less noise. Moreover, since the noise of a merged pixel is smaller than the sum of the noises of the individual pixels before merging, merging pixels by the image processor can further improve the signal-to-noise ratio, brightness, and sharpness under low illumination and further reduce image noise.
For the parts of the imaging method and the mobile terminal of the embodiments of the present invention that are not expanded upon here, reference may be made to the corresponding parts of the image sensor or imaging device of the above embodiments, which are not detailed again.
In the description of the present invention, it should be understood that orientation or position relationships indicated by the terms "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial", "circumferential" and the like are based on the orientation or position relationships shown in the drawings, are only for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they therefore shall not be construed as limiting the present invention.
In addition, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one of that feature. In the description of the present invention, "a plurality of" means at least two, for example two or three, unless otherwise specifically defined.
In the present invention, unless otherwise explicitly specified and defined, the terms "mounted", "connected with", "connected", "fixed" and the like shall be understood broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediate medium; and it may be an internal communication between two elements or an interaction between two elements, unless otherwise explicitly defined. Those of ordinary skill in the art can understand the specific meanings of the above terms in the present invention according to the specific circumstances.
In the present invention, unless otherwise explicitly specified and defined, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that the first and second features are in indirect contact through an intermediate medium. Moreover, a first feature being "above", "over" or "on top of" a second feature may mean that the first feature is directly above or obliquely above the second feature, or merely that the level of the first feature is higher than that of the second feature. A first feature being "below", "under" or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or merely that the level of the first feature is lower than that of the second feature.
In the description of this specification, description with reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", "some examples" and the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine and integrate the different embodiments or examples described in this specification and the features of the different embodiments or examples, provided they do not contradict each other.
Although embodiments of the present invention have been shown and described above, it can be understood that the above embodiments are exemplary and shall not be construed as limiting the present invention, and those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present invention.

Claims (20)

  1. An image sensor, comprising:
    a photosensitive pixel array; and
    a filter disposed on the photosensitive pixel array, the filter comprising a filter unit array having a plurality of filter units, wherein each filter unit covers N photosensitive pixels, some of the filter units comprise at least a white filter region, and the white filter region covers at least one of the N photosensitive pixels, wherein the N photosensitive pixels covered by the same filter unit constitute one merged pixel, and N is a positive integer.
  2. The image sensor according to claim 1, wherein the filter unit array comprises R filter units, G filter units, and B filter units, wherein the G filter unit comprises at least the white filter region, and the white filter region covers at least one of the N photosensitive pixels covered by the G filter unit.
  3. The image sensor according to claim 2, wherein each filter unit covers 2*2 photosensitive pixels, and wherein:
    the white filter region covers 1 photosensitive pixel covered by the G filter unit, and the G filter unit further comprises a green filter region covering the other 3 photosensitive pixels;
    or the white filter region covers 2 photosensitive pixels covered by the G filter unit, and the G filter unit further comprises a green filter region covering the other 2 photosensitive pixels;
    or the white filter region covers 3 photosensitive pixels covered by the G filter unit, and the G filter unit further comprises a green filter region covering the other 1 photosensitive pixel;
    or the white filter region covers the 4 photosensitive pixels covered by the G filter unit.
  4. The image sensor according to any one of claims 1-3, further comprising:
    a control module configured to control the photosensitive pixel array to be exposed row by row.
  5. The image sensor according to claim 4, further comprising:
    a register, wherein the control module is configured to sequentially collect the outputs of the photosensitive pixels of the k-th row and the (k+1)-th row whose exposure has been completed and store them in the register, where k = 2n-1, n is a natural number, and k+1 is less than or equal to the total number of rows of the photosensitive pixel array.
  6. The image sensor according to any one of claims 1-5, further comprising:
    an analog-to-digital converter array having a plurality of analog-to-digital converters, each analog-to-digital converter being connected to one of the photosensitive pixels.
  7. The image sensor according to any one of claims 1-6, further comprising:
    a micro mirror array having a plurality of micro mirrors, each micro mirror corresponding to one of the photosensitive pixels.
  8. An imaging device, comprising:
    the image sensor according to any one of claims 1-7; and
    an image processing module connected to the image sensor, the image processing module being configured to read and process the output of the photosensitive pixel array in the image sensor to obtain pixel values of the merged pixels so as to form a merged image.
  9. The imaging device according to claim 8, wherein, when a filter unit comprises only a white filter region or only a non-white filter region, the image processing module is further configured to add the outputs of the corresponding N photosensitive pixels of the same merged pixel as the pixel value of the merged pixel.
  10. The imaging device according to claim 8 or 9, wherein, when a filter unit comprises a white filter region and a non-white filter region, the image processing module is further configured to add the outputs of the photosensitive pixels corresponding to the white filter region in the merged pixel as a first pixel value of the merged pixel, and to add the outputs of the photosensitive pixels corresponding to the non-white filter region in the merged pixel as a second pixel value of the merged pixel.
  11. A mobile terminal, comprising the imaging device according to any one of claims 8-10.
  12. The mobile terminal according to claim 11, wherein the mobile terminal is a mobile phone.
  13. The mobile terminal according to claim 12, wherein the imaging device is a front camera of the mobile phone.
  14. The mobile terminal according to any one of claims 11-13, further comprising:
    a central processor and an external memory connected to the imaging device, the central processor being configured to control the external memory to store the merged image.
  15. The mobile terminal according to any one of claims 11-14, further comprising:
    a central processor and a display device connected to the imaging device, the central processor being configured to control the display device to display the merged image.
  16. 一种如权利要求1-7任意一项所述的图像传感器的成像方法,其特征在于,包括以下步骤:
    读取所述图像传感器中感光像素阵列的输出;
    根据同一所述合并像素的所述感光像素的输出计算所述合并像素的像素值以生成合并图像。
  17. 如权利要求16所述的图像传感器的成像方法,其特征在于,每个所述滤光单元包括2*2个所述感光像素,所述根据同一所述合并像素的所述感光像素的输出计算所述合并像素的像素值具体包括:
    采集第k行及第k+1行的所述感光像素的输出并存入寄存器,其中k=2n-1,n为自然数,k+1小于等于所述感光像素阵列的总行数;及
    从所述寄存器中提取所述第k行及第k+1行的所述感光像素的输出以得到所述合并像素的像素值。
  18. The imaging method of the image sensor according to claim 16 or 17, wherein, when a filter unit comprises only a white filter region or only a non-white filter region, calculating the pixel values of the merged pixels according to the outputs of the photosensitive pixels of the same merged pixel so as to generate the merged image further comprises:
    adding together the outputs of the N photosensitive pixels corresponding to the same merged pixel as the pixel value of the merged pixel.
  19. The imaging method of the image sensor according to any one of claims 16 to 18, wherein, when a filter unit comprises a white filter region and a non-white filter region, the pixel value of the merged pixel comprises a first pixel value corresponding to the white filter region and a second pixel value corresponding to the non-white filter region, and calculating the pixel values of the merged pixels according to the outputs of the photosensitive pixels of the same merged pixel so as to generate the merged image further comprises:
    adding together the outputs of the photosensitive pixels of the merged pixel corresponding to the white filter region as the first pixel value of the merged pixel, and adding together the outputs of the photosensitive pixels of the merged pixel corresponding to the non-white filter region as the second pixel value of the merged pixel.
  20. The imaging method of the image sensor according to any one of claims 16 to 19, wherein each photosensitive pixel is connected to one analog-to-digital converter, and the imaging method further comprises:
    converting the analog signal outputs generated by the photosensitive pixels into digital signal outputs; and
    calculating the pixel values of the merged pixels according to the digital signal outputs of the photosensitive pixels of the same merged pixel.
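As a further illustrative aside, and not part of the claims or the disclosed hardware, the row-pair readout recited in claims 5 and 17 and the summation of claim 18 can be pictured with the software sketch below. The function name merged_rows, the Python list standing in for the register, and the sample 4x4 readings are assumptions made for illustration; the white/non-white split of claims 10 and 19 would additionally separate the summed outputs as in the earlier sketch.

```python
# Illustrative sketch (assumed names; a Python list stands in for the hardware
# register): read the photosensitive-pixel array two rows at a time and form
# 2*2 merged pixels from each buffered row pair.

def merged_rows(pixel_array):
    """Yield rows of 2*2 sums from a 2D list of photosensitive-pixel outputs."""
    total_rows = len(pixel_array)
    for k in range(0, total_rows - 1, 2):   # k = 0, 2, 4, ... (rows 2n-1 in the claims' 1-based numbering)
        register = [pixel_array[k], pixel_array[k + 1]]   # buffer the exposed row pair
        row_out = []
        for col in range(0, len(register[0]) - 1, 2):
            merged = (register[0][col] + register[0][col + 1] +
                      register[1][col] + register[1][col + 1])
            row_out.append(merged)
        yield row_out

# Example: a 4x4 array of readings becomes a 2x2 merged image.
raw = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
print(list(merged_rows(raw)))   # -> [[14, 22], [46, 54]]
```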
PCT/CN2016/099753 2015-12-18 2016-09-22 图像传感器、成像装置、移动终端及成像方法 WO2017101546A1 (zh)

Priority Applications (8)

Application Number Priority Date Filing Date Title
KR1020177026309A KR102083292B1 (ko) 2015-12-18 2016-09-22 이미지 센서, 이미징 장치, 이동 단말기 및 이미징 방법
MYPI2017702822A MY184809A (en) 2015-12-18 2016-09-22 Image sensor, imaging device, mobile terminal and imaging method for producing high resolution image
JP2017541006A JP6325755B2 (ja) 2015-12-18 2016-09-22 イメージセンサ、結像装置、モバイル端末及び結像方法
US15/544,537 US10594962B2 (en) 2015-12-18 2016-09-22 Image sensor, imaging device, mobile terminal and imaging method for producing high resolution image
EP16874606.3A EP3242479B1 (en) 2015-12-18 2016-09-22 Imaging device and mobile terminal
SG11201706246XA SG11201706246XA (en) 2015-12-18 2016-09-22 Image sensor, imaging device, mobile terminal and imaging method
AU2016369789A AU2016369789B2 (en) 2015-12-18 2016-09-22 Image sensor, imaging device, mobile terminal and imaging method
ZA2017/06230A ZA201706230B (en) 2015-12-18 2017-09-13 Image sensor, imaging device, mobile terminal and imaging method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510963465.1A CN105516697B (zh) 2015-12-18 2015-12-18 图像传感器、成像装置、移动终端及成像方法
CN201510963465.1 2015-12-18

Publications (1)

Publication Number Publication Date
WO2017101546A1 true WO2017101546A1 (zh) 2017-06-22

Family

ID=55724292

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/099753 WO2017101546A1 (zh) 2015-12-18 2016-09-22 图像传感器、成像装置、移动终端及成像方法

Country Status (11)

Country Link
US (1) US10594962B2 (zh)
EP (1) EP3242479B1 (zh)
JP (1) JP6325755B2 (zh)
KR (1) KR102083292B1 (zh)
CN (1) CN105516697B (zh)
AU (1) AU2016369789B2 (zh)
MY (1) MY184809A (zh)
SG (1) SG11201706246XA (zh)
TW (1) TWI617196B (zh)
WO (1) WO2017101546A1 (zh)
ZA (1) ZA201706230B (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105578072A (zh) 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 成像方法、成像装置及电子装置
CN105516697B (zh) * 2015-12-18 2018-04-17 广东欧珀移动通信有限公司 图像传感器、成像装置、移动终端及成像方法
CN106504218B (zh) 2016-11-29 2019-03-12 Oppo广东移动通信有限公司 控制方法、控制装置及电子装置
CN106341670B (zh) 2016-11-29 2017-09-22 广东欧珀移动通信有限公司 控制方法、控制装置及电子装置
CN106454054B (zh) * 2016-11-29 2019-03-19 Oppo广东移动通信有限公司 控制方法、控制装置及电子装置
CN107040724B (zh) 2017-04-28 2020-05-15 Oppo广东移动通信有限公司 双核对焦图像传感器及其对焦控制方法和成像装置
CN108269243B (zh) * 2018-01-18 2021-08-31 福州鑫图光电有限公司 一种图像信噪比的增强方法及终端
CN108323208A (zh) * 2018-02-12 2018-07-24 深圳市汇顶科技股份有限公司 图像获取方法和装置
CN111835977B (zh) * 2019-04-18 2021-11-02 北京小米移动软件有限公司 图像传感器、图像生成方法及装置、电子设备、存储介质
CN111756974A (zh) * 2020-05-15 2020-10-09 深圳市汇顶科技股份有限公司 图像传感器和电子设备
KR20220072116A (ko) 2020-11-24 2022-06-02 삼성전자주식회사 이미지 센서
CN113676708B (zh) * 2021-07-01 2023-11-14 Oppo广东移动通信有限公司 图像生成方法、装置、电子设备和计算机可读存储介质

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1324363C (zh) * 2002-05-04 2007-07-04 三星电子株式会社 液晶显示器及其滤色片阵列板
US8139130B2 (en) * 2005-07-28 2012-03-20 Omnivision Technologies, Inc. Image sensor with improved light sensitivity
US7872681B2 (en) * 2005-10-13 2011-01-18 Rjs Technology, Inc. System and method for a high performance color filter mosaic array
KR100976284B1 (ko) * 2007-06-07 2010-08-16 가부시끼가이샤 도시바 촬상 장치
TWI413242B (zh) * 2007-08-10 2013-10-21 Hon Hai Prec Ind Co Ltd 固態圖像感測器
JP4683121B2 (ja) 2008-12-08 2011-05-11 ソニー株式会社 固体撮像装置、固体撮像装置の信号処理方法および撮像装置
US8237831B2 (en) 2009-05-28 2012-08-07 Omnivision Technologies, Inc. Four-channel color filter array interpolation
US8134115B2 (en) * 2009-06-23 2012-03-13 Nokia Corporation Color filters for sub-diffraction limit-sized light sensors
US20110013056A1 (en) * 2009-07-17 2011-01-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Color filters and demosaicing techniques for digital imaging
KR20110040402A (ko) 2009-10-14 2011-04-20 삼성전자주식회사 필터 어레이, 이를 포함하는 이미지 센서, 및 신호 보간 방법
US8345132B2 (en) 2010-07-23 2013-01-01 Omnivision Technologies, Inc. Image sensor with dual element color filter array and three channel color output
JP2013021660A (ja) * 2011-07-14 2013-01-31 Sony Corp 画像処理装置、撮像装置、および画像処理方法、並びにプログラム
US9165526B2 (en) * 2012-02-28 2015-10-20 Shenzhen Yunyinggu Technology Co., Ltd. Subpixel arrangements of displays and method for rendering the same
US9191635B2 (en) 2012-03-19 2015-11-17 Semiconductor Components Industries, Llc Imaging systems with clear filter pixels
TWI521965B (zh) * 2012-05-14 2016-02-11 Sony Corp Camera and camera methods, electronic machines and programs
JP5927068B2 (ja) * 2012-07-06 2016-05-25 富士フイルム株式会社 カラー撮像素子
US9692992B2 (en) * 2013-07-01 2017-06-27 Omnivision Technologies, Inc. Color and infrared filter array patterns to reduce color aliasing
TWI644568B (zh) * 2013-07-23 2018-12-11 新力股份有限公司 攝像元件、攝像方法及攝像程式
US10136107B2 (en) 2013-11-21 2018-11-20 Semiconductor Components Industries, Llc Imaging systems with visible light sensitive pixels and infrared light sensitive pixels
EP2887655A1 (fr) * 2013-12-20 2015-06-24 Swiss Timing Ltd. Filtre couleur adaptatif pour capteur numérique
US9888198B2 (en) * 2014-06-03 2018-02-06 Semiconductor Components Industries, Llc Imaging systems having image sensor pixel arrays with sub-pixel resolution capabilities

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090200451A1 (en) * 2008-02-08 2009-08-13 Micron Technology, Inc. Color pixel arrays having common color filters for multiple adjacent pixels for use in cmos imagers
CN103703413A (zh) * 2011-05-31 2014-04-02 全视技术有限公司 使用颜色相关波前编码扩展透镜系统中景深的系统和方法
CN104025577A (zh) * 2011-12-28 2014-09-03 富士胶片株式会社 图像处理装置、方法以及摄像装置
CN104429060A (zh) * 2012-07-06 2015-03-18 富士胶片株式会社 彩色摄像元件及摄像装置
CN104184967A (zh) * 2013-05-28 2014-12-03 全视科技有限公司 用于校正图像传感器固定图案噪声的设备、系统和方法
CN105516697A (zh) * 2015-12-18 2016-04-20 广东欧珀移动通信有限公司 图像传感器、成像装置、移动终端及成像方法
CN105578078A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 图像传感器、成像装置、移动终端及成像方法
CN105578071A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置
CN105578066A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 成像方法、成像装置及电子装置
CN105578006A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 成像方法、成像装置及电子装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3242479A4 *

Also Published As

Publication number Publication date
CN105516697A (zh) 2016-04-20
SG11201706246XA (en) 2017-08-30
KR20170122772A (ko) 2017-11-06
KR102083292B1 (ko) 2020-03-02
EP3242479A4 (en) 2018-04-25
TW201724845A (zh) 2017-07-01
JP2018509813A (ja) 2018-04-05
ZA201706230B (en) 2019-01-30
US20180007289A1 (en) 2018-01-04
AU2016369789B2 (en) 2019-06-27
US10594962B2 (en) 2020-03-17
JP6325755B2 (ja) 2018-05-16
EP3242479B1 (en) 2022-03-23
EP3242479A1 (en) 2017-11-08
TWI617196B (zh) 2018-03-01
CN105516697B (zh) 2018-04-17
MY184809A (en) 2021-04-23
AU2016369789A1 (en) 2017-08-24

Similar Documents

Publication Publication Date Title
WO2017101546A1 (zh) 图像传感器、成像装置、移动终端及成像方法
CN102892008B (zh) 双图像捕获处理
JP5026951B2 (ja) 撮像素子の駆動装置、撮像素子の駆動方法、撮像装置、及び撮像素子
WO2017101451A1 (zh) 成像方法、成像装置及电子装置
TWI511558B (zh) 具有高動態範圍攝取能力之影像感測器
CN103501416B (zh) 成像系统
JP5461568B2 (ja) カラーおよび全色性チャネルcfa画像の修正
TWI504257B (zh) 在產生數位影像中曝光像素群組
TWI615027B (zh) 高動態範圍圖像的生成方法、拍照裝置和終端裝置、成像方法
TWI613918B (zh) 圖像感測器之成像方法、成像裝置、行動終端和非易失性電腦儲存媒體
JP2006191622A (ja) Isp内蔵型イメージセンサ及びデュアルカメラシステム
TW201038061A (en) Extended depth of field for image sensor
JP2013515442A (ja) 静止画像及びプレビュー画像を用いた高ダイナミックレンジ画像の生成
TW201102968A (en) CFA image with synthetic panchromatic image
TW200903792A (en) Image sensor
JP5843027B1 (ja) 撮像装置、制御方法およびプログラム
JP5009880B2 (ja) 撮像装置及び撮像方法
JP2008193163A (ja) 固体撮像装置
JP2006157600A (ja) デジタルカメラ
JP6070301B2 (ja) 固体撮像素子及びこれを用いた撮像装置
CN105611257B (zh) 成像方法、图像传感器、成像装置及电子装置
TW201724843A (zh) 圖像感測器及終端與成像方法
TW201042355A (en) Imaging apparatus, auto-focusing method and recording medium
WO2017101864A1 (zh) 图像传感器、控制方法和电子装置
JP2007243917A (ja) 撮像装置および画像処理プログラム

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase Ref document number: 15544537; Country of ref document: US
REEP Request for entry into the european phase Ref document number: 2016874606; Country of ref document: EP
WWE Wipo information: entry into national phase Ref document number: 11201706246X; Country of ref document: SG
ENP Entry into the national phase Ref document number: 2017541006; Country of ref document: JP; Kind code of ref document: A
121 Ep: the epo has been informed by wipo that ep was designated in this application Ref document number: 16874606; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase Ref document number: 2016369789; Country of ref document: AU; Date of ref document: 20160922; Kind code of ref document: A
ENP Entry into the national phase Ref document number: 20177026309; Country of ref document: KR; Kind code of ref document: A
NENP Non-entry into the national phase Ref country code: DE