US20140118572A1 - Alternative Color Image Array And Associated Methods - Google Patents
- Publication number
- US20140118572A1 (U.S. application Ser. No. 14/148,078)
- Authority
- US
- United States
- Prior art keywords
- color
- light
- light sensitive
- green
- sensitive elements
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H04N9/045—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
- H04N25/447—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by preserving the colour pattern with or without loss of information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/42—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by switching between different modes of operation using different resolutions or aspect ratios, e.g. switching between interlaced and non-interlaced mode
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/14625—Optical elements or arrangements associated with the device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2209/00—Details of colour television systems
- H04N2209/04—Picture signal generators
- H04N2209/041—Picture signal generators using solid-state devices
- H04N2209/042—Picture signal generators using solid-state devices having a single pick-up sensor
- H04N2209/045—Picture signal generators using solid-state devices having a single pick-up sensor using mosaic colour filter
Definitions
- CMOS image sensors are typically formed as an array of pixels, where each pixel includes a photodetector that transforms incident light photons into current signals.
- Each pixel may also include other known elements, such as a reset switch, a signal amplifier, and output circuits that operate to set the exposure time of the photodetector and perform a read out indicative of light photons incident thereon. Where incident light is too high for the set exposure time of the pixel, the photodetector typically saturates.
- FIG. 1 illustrates a prior art CMOS image sensor pixel array 100 .
- Pixel array 100 is configured for column parallel readout and has a plurality of columns, each having a pixel 102 for each of a plurality of rows.
- In column parallel readout architecture, for each row, one pixel 102 in each column is read out and processed simultaneously. That is, pixels 102 of Row 0 are read out in parallel, then pixels 102 of Row 1 are read out in parallel, then pixels 102 of Row 2 are read out in parallel, and so on, until Row M is read out.
- Pixels 102 within each column connect to a column readout line 105 , such that when a row is triggered for output, each pixel in that row outputs a signal to its associated column readout line 105 , while outputs of other pixels in the column remain inactive.
- Array 100 is shown with one sample and hold element 104 for each column read out line 105 .
- Sample and hold elements 104 and column read out lines 105 cooperate to provide a row-by-row read out of pixels 102 .
- a second stage amplifier 106 connects to each of the sample and hold elements 104 . All rows are typically output to form an image (also known as a frame).
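- The following is a minimal Python sketch (not part of the patent) of the row-by-row, column-parallel readout described above; the array contents and function name are illustrative assumptions only.

```python
import numpy as np

def column_parallel_readout(pixel_array: np.ndarray) -> np.ndarray:
    """Simulate row-by-row readout of an M-row by N-column pixel array."""
    rows, _ = pixel_array.shape
    frame = np.empty_like(pixel_array)
    for r in range(rows):                            # Row 0, Row 1, ..., Row M
        sample_and_hold = pixel_array[r, :].copy()   # all columns sampled in parallel
        frame[r, :] = sample_and_hold                # passed to the second stage amplifier
    return frame

frame = column_parallel_readout(np.arange(12.0).reshape(3, 4))
```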
- CMOS image sensors are often used in applications in which both very bright and very dark conditions are encountered.
- a variety of techniques have been developed to improve the response of CMOS image sensors in a variety of light conditions.
- U.S. Patent Publication No. 2004/0141075 entitled “Image Sensor Having Dual Automatic Exposure Control”, by Xiangchen Xu et al., is assigned to Omnivision Technologies, Inc. and is hereby incorporated by reference.
- Xu teaches that the gain and exposure time can be adjusted over a sequence of frames to compensate for varying light conditions. An adjustment in exposure time is determined by analyzing one frame and then used to make an adjustment for a subsequent frame.
- the dynamic range is the ratio of the largest detectable signal to the smallest, which for a CMOS image sensor is often defined by the ratio of the largest non-saturating signal to the standard deviation of the noise under dark conditions.
- a binning process is used to combine data from two or more pixels to increase a signal to noise ratio (SNR), and a high dynamic range (HDR) combination process is used to combine data from two or more pixels to increase dynamic range.
- a Bayer pattern, which is one of the most commonly used patterns for down-sampling, generates zigzag edges during both the HDR combination process and the binning process.
- corrective algorithms for these zigzag edges have been developed for use with the Bayer pattern, these corrective algorithms have certain disadvantages, such as reducing sharpness and resolution of output frames and increasing cost of image sensors.
- a binning re-interpolation algorithm can partly smooth zigzag edges caused by the Bayer pattern, but with a sacrifice in sharpness and resolution of the resultant frame. Re-interpolation also becomes very expensive since more memory is necessary.
- the present disclosure presents a modified Bayer pattern as an alternative to the conventional Bayer pattern.
- the down-sampling problem of a zigzag effect resulting from binning or HDR combination of pixel values configured in a conventional Bayer pattern is solved by using a modified Bayer pattern.
- a sensor having pixels based upon the modified Bayer pattern outputs images with smooth edges without sacrificing sharpness or resolution.
- Image sensors based upon the modified Bayer pattern have less edge zigzag and have improved sharpness and resolution in generated images.
- an image sensor includes an array of light sensitive elements and a filter array including a plurality of red, green, and blue filter elements.
- Each filter element is in optical communication with a respective light sensitive element.
- Each red filter element is configured to transmit only red colored light
- each green filter element is configured to transmit only green colored light
- each blue filter element is configured to transmit only blue colored light.
- the filter array is arranged such that successive columns of the filter array have alternating first and second configurations.
- the first configuration is characterized by a repeating pattern of successive blue, green, red, and green filter elements
- the second configuration is characterized by a repeating pattern of successive green, blue, green, and red filter elements.
- a method for down-sampling an image produced by an image sensor including an array of light sensitive elements includes filtering light incident on the image sensor. The light is filtered such that successive columns of the array of light sensitive elements alternately receive light having a first pattern and a second pattern.
- the first pattern is characterized by each four successive light sensitive elements in a column respectively receiving blue, green, red, and green colored light.
- the second pattern is characterized by each four successive light sensitive elements in a column respectively receiving green, blue, green, and red colored light.
- the method further includes sampling output values of the light sensitive elements and combining output values of pairs of light sensitive elements to generate a down-sampled image.
- a method for down-sampling an image produced by an image sensor including an array of light sensitive elements includes filtering light incident on the image sensor. The light is filtered such that successive columns of the array of light sensitive elements alternately receive light having a first pattern and a second pattern.
- the first pattern is characterized by each four successive light sensitive elements in a column respectively receiving blue, green, red, and green colored light.
- the second pattern is characterized by each four successive light sensitive elements in a column respectively receiving green, blue, green, and red colored light.
- the method additionally includes sampling output values of the light sensitive elements such that light sensitive elements of successive rows alternately have long and short exposure times.
- the method further includes combining output values of pairs of light sensitive elements to generate a down-sampled image.
- an image sensor has an array of light sensitive elements and a filter array including a plurality of first, second, third, and fourth filter elements, each filter element in optical communication with a respective light sensitive element.
- Each first filter element is configured to transmit light of a first color
- each second filter element is configured to transmit light of a second color
- each third filter element is configured to transmit light of a third color
- each fourth filter element is configured to transmit light of a fourth color.
- the filter array is configured to include a repeating pattern of filter elements characterized by: at least two successive rows of alternating first and second filter elements where common columns of the at least two successive rows also include alternating first and second filter elements, and at least two additional successive rows of alternating third and fourth filter elements where common columns of the at least two additional successive rows also include alternating third and fourth filter elements.
- a method down-samples an image produced by an image sensor including an array of light sensitive elements.
- Light incident on the image sensor is filtered such that the image sensor receives light having a repeating pattern characterized by: (a) light sensitive elements in at least two successive rows alternately receiving light having a first color and a second color, and light sensitive elements in common columns of the at least two successive rows alternately receiving light having the first color and the second color, and (b) light sensitive elements in at least two additional successive rows alternately receive light having a third and a fourth color, and light sensitive elements in common columns of the at least two additional successive rows alternately receiving light having the third color and the fourth color.
- Output values of the light sensitive elements are sampled, and output values of pairs of light sensitive elements receiving light of a common color and from successive rows of the array are combined to generate a down-sampled image.
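- The summary above can be made concrete with a small Python sketch (illustrative only, not the patent's reference implementation) that builds the claimed filter layout from the two alternating column configurations; the function and constant names are assumptions.

```python
import numpy as np

COLUMN_CONFIG_1 = ["B", "G", "R", "G"]   # first column configuration
COLUMN_CONFIG_2 = ["G", "B", "G", "R"]   # second column configuration

def modified_bayer_cfa(rows: int, cols: int) -> np.ndarray:
    """Return a rows x cols array of filter colors for the modified pattern."""
    cfa = np.empty((rows, cols), dtype="<U1")
    for c in range(cols):
        pattern = COLUMN_CONFIG_1 if c % 2 == 0 else COLUMN_CONFIG_2
        for r in range(rows):
            cfa[r, c] = pattern[r % 4]
    return cfa

print(modified_bayer_cfa(4, 4))
# [['B' 'G' 'B' 'G']
#  ['G' 'B' 'G' 'B']
#  ['R' 'G' 'R' 'G']
#  ['G' 'R' 'G' 'R']]
```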
- FIG. 1 illustrates operation of a prior art image sensor.
- FIG. 2 is a block diagram illustrating one exemplary modified Bayer pattern imaging system that supports down-sampling with both a high dynamic range combination and binning, according to an embodiment.
- FIG. 3 illustrates binning for a conventional Bayer pattern in a down-sampling mode.
- FIG. 4 illustrates binning for an exemplary modified Bayer pattern in a down-sampling mode, according to an embodiment.
- FIG. 5 illustrates prior art data selection between long exposure and short exposure for a conventional Bayer pattern in a down-sampling mode.
- FIG. 6 illustrates data selection between long exposure and short exposure for an exemplary modified Bayer pattern in a down-sampling mode, according to an embodiment.
- FIG. 7 shows a portion of a pixel array configured in a modified Bayer pattern illustrating exemplary connectivity for binning pixel data values to generate output data, according to an embodiment.
- FIG. 8 shows a portion of a pixel array configured in a modified Bayer pattern illustrating alternate exemplary connectivity for binning pixel data values to generate output data, according to an embodiment.
- FIG. 9 is a block diagram illustrating one exemplary rotated modified Bayer pattern image sensor that supports down-sampling with both a high dynamic range (HDR) combination and binning, according to an embodiment.
- FIG. 10 illustrates binning for an exemplary rotated modified Bayer pattern in a down-sampling mode, according to an embodiment.
- the terms sensor array, pixel array, and image array may be used interchangeably to mean an array of photosensors.
- FIG. 2 is a block diagram illustrating one exemplary modified Bayer pattern image sensor 200 that supports down-sampling with both a high dynamic range (HDR) combination and binning. Certain components are omitted for clarity of illustration.
- Image sensor 200 includes an image array 203 that has a plurality of light sensitive elements or photo-sensitive pixels 202 arranged as a plurality of rows and a plurality of columns and a filter array including a number of filter elements. Each filter element is in optical alignment with a respective pixel and is configured to allow light of only a certain color to pass through.
- the filter array conforms to a modified Bayer pattern defining the color sensitivity of each pixel 202 . That is, the colors of pixel sensors within image array 203 conform to the exemplary modified Bayer pattern sensor 400 of FIG. 4 .
- Image sensor 200 may be implemented as a complementary metal-oxide-semiconductor (CMOS) image sensor where each pixel 202 includes a photodetector and associated circuitry that supports setting an exposure time and reading out pixel values.
- image array 203 has a column parallel readout architecture where, for each row, pixels 202 are read out simultaneously and processed in parallel.
- for each column, a readout line 205 connects, in parallel, to pixels 202 of that column and to a sample and hold (S/H) element 204 .
- Outputs of S/H elements 204 connect to a second stage amplifier 206 , which in turn connects to a processor 250 .
- Processor 250 processes signals (i.e., image sensor data) from amplifier 206 to generate an image.
- Processor 250 may be implemented as a digital signal processor having a local line memory.
- a row address decoder 208 and a column address decoder 210 operate to decode signals from a timing and control block 215 to address pixels 202 .
- Timing and control block 215 includes a first pre-charge address block 220 , a second pre-charge address block 225 , and a sampling address block 230 .
- the first pre-charge address block 220 may be set to a first pre-charge value
- the second pre-charge address block 225 may be set to a second pre-charge value.
- sampling address 230 of timing and control block 215 selects a row, and a pre-charge is applied to pixels of that row from either the first pre-charge address block or the second pre-charge address block.
- the first pre-charge address block 220 supports a full resolution mode with the same gain and exposure time setting for each row.
- the first pre-charge address block 220 also supports a down-sampling mode that reduces resolution and permits the same exposure time to be set for all the rows during binning to achieve high SNR.
- the first pre-charge address block 220 and the second pre-charge address block 225 cooperate to support a down-sampling mode that reduces resolution and permits different exposure times to be set for different rows during the HDR combination process to achieve high dynamic range.
- Additional pre-charge address blocks may be included within timing and control block 215 to provide additional pre-charge values for additional down-sampling modes.
- the resolution of an image generated by processor 250 using data from image sensor 200 depends upon how the raw pixel data generated by photo-sensitive pixel elements is sampled and processed to generate pixels for the processed image.
- the term “raw pixel data” is used to distinguish data generated by image sensor 200 from the pixel data obtained after the raw data has been sampled and further processed by processor 250 .
- the raw pixel data received from image sensor 200 may be down-sampled to reduce the effective vertical resolution of the processed image.
- a variety of standard resolution formats are used in the image sensing art. For example, a 1.3 megapixel super extended graphics array (SXGA) format has 1280 × 1024 pixels of resolution while a video graphics array (VGA) format has a resolution of 640 × 480 pixels.
- in a down-sampling mode, the vertical resolution of the raw pixel data is reduced by processor 250 to implement format conversion and simultaneously achieve a higher dynamic range.
- a down sampling mode may be selected that also provides a higher dynamic range.
- the down-sampling mode implements a 1:2 reduction in vertical resolution, and thus, since there is a simple geometric ratio of 1:2 in vertical resolution, down-sampling may combine data from two rows (e.g., Row 0 and Row 1) of pixels 202 .
- processor 250 operates to combine raw pixel data values to generate pixel values in the final image.
- values resulting from two different exposure times controlled by the pre-charge values are processed by processor 250 to effectively increase the dynamic range of array 200 , as compared to the dynamic range when full resolution is used.
- for example, even rows (e.g., Row 0, Row 2, Row 4, . . . Row M−1) may be configured with one exposure time while odd rows (e.g., Row 1, Row 3, Row 5, . . . Row M) are configured with a different exposure time.
- processor 250 includes a local line memory to store and synchronize the processing of lines having either the same or different row exposure times.
- the local memory may be used to store sets of long exposure rows and short exposure rows sampled at different times to permit aligning and combining rows with either the same or different exposure times.
- processor 250 reads the memory and combines the raw pixel data of pixels that are neighbors along the vertical dimension that are of a compatible type and that have the same exposure time for the binning process.
- processor 250 reads the memory and selects the raw pixel data of pixels that are neighbors along the vertical dimension that are of a compatible type and that have the different exposure times for the HDR combination process.
- the exposure time of a pixel affects its output response.
- When a pixel is operated with a long exposure time, the pixel is very sensitive to received light, but tends to saturate at a low light level.
- when the pixel is operated with a short exposure time, the pixel is less sensitive to light, and saturates at a higher light level as compared to operation with a long exposure time.
- a higher dynamic range is achieved as compared to down-sampling of rows with the same exposure time.
- any down-sampling mode with a 1:N reduction (where N is an integer value) in vertical resolution may be supported, such as 1:2, 1:3, 1:4 and so on.
- the exposure times of the rows are varied in an interleaved sequence of row exposure times that permits down-sampling to be achieved with increased dynamic range.
- for a 1:3 reduction, for example, the three rows that are to be combined may have a sequence of a long exposure time, medium exposure time, and short exposure time.
- in the HDR combination process, rows of pixels have different exposure times. By combining data from two or more pixels of rows having different exposure times, dynamic range may be increased.
- data is selected from either the long exposure pixel or the short exposure pixel.
- data from pixels of long exposure time is referred to as L pixel data, and data from pixels of short exposure time is referred to as S pixel data.
- the S pixel data is normalized to match the scale of long exposure pixels. For example, the normalization may follow Equation (1): data_N = data_O × ( L_exposuretime / S_exposuretime ), where data_N is the determined normalized pixel data value, data_O is the original pixel data value, L_exposuretime represents the long exposure time, and S_exposuretime represents the exposure time of the selected pixel data value. If the pixel data value selected has the short exposure time, data_N is normalized based upon the long exposure time as shown in Equation (1). If the pixel data value selected has the long exposure time, data_N is the same as data_O.
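- A short Python sketch of the normalization of Equation (1) follows; the 10-bit saturation threshold and the rule of preferring the long-exposure value unless it is saturated are assumptions for illustration, not taken verbatim from the text.

```python
def normalize(data_o: float, exposure_time: float, l_exposure: float) -> float:
    """Scale a pixel value to the long-exposure scale per Equation (1)."""
    return data_o * (l_exposure / exposure_time)

def hdr_select(long_val: float, short_val: float,
               l_exposure: float, s_exposure: float,
               saturation: float = 1023.0) -> float:
    """Assumed rule: keep the long-exposure value unless it is saturated."""
    if long_val < saturation:
        return long_val                                   # data_N equals data_O
    return normalize(short_val, s_exposure, l_exposure)   # short data, rescaled

# Example with an assumed 4:1 exposure ratio and 10-bit saturation.
print(hdr_select(1023.0, 200.0, l_exposure=40.0, s_exposure=10.0))   # 800.0
```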
- in the binning process, all rows of pixels are configured to have the same exposure time. Binning of two rows having the same exposure time achieves a higher signal to noise ratio (SNR) in the down-sampling.
- Down-sampling modes that have higher dynamic range are also compatible with a variety of color filter array formats.
- a color filter array pattern is applied to an array of photosensors such that output from the photosensors creates a color image.
- the incoming light to each photosensor is filtered such that typically each photosensor in the pixel array records only one color, such as red, green or blue.
- the row exposure times used in down-sampling are selected such that pixels having compatible filter types are combined during down-sampling.
- FIG. 3 illustrates prior art binning in a down-sampling mode for a Bayer pattern sensor 300 .
- Binning results 310 after down-sampling are shown in the right portion of FIG. 3 .
- the Bayer pattern is an RGB filter pattern that is 50% green, 25% red, and 25% blue.
- a blue-green-blue-green (BGBG) row of pixels is followed by a green-red-green-red (GRGR) row of pixels.
- Patterns include the CYGM filter array pattern (cyan, yellow, green, and magenta) formed of alternate rows of cyan-yellow and green-magenta, and the RGBE filter array pattern (red, green, blue, and emerald) having alternating rows of red-green and blue-emerald. Patterns may also include clear pixels, such as Red-Green-Blue-Clear (RGBC) and, similarly, Red-Green-Blue-White (RGBW). As noted above, the problem with prior art sensors that utilize the Bayer pattern is that significant zigzagging results during the binning process.
- FIG. 4 illustrates exemplary binning in a down-sampling mode for a modified Bayer pattern sensor 400 .
- Modified Bayer pattern sensor 400 may represent image array 203 of FIG. 2 ; a down sampling result 410 (the right portion of FIG. 4 ) illustrates binning results after down-sampling and has a conventional Bayer pattern as a result of binning.
- a blue-green row of pixels is followed by a green-blue row of pixels, which is followed by a red-green row of pixels, which is followed by a green-red row of pixels.
- the rows of colors repeat every four rows.
- This novel pattern is modified from the Bayer pattern sensor 300 by inserting row GBGB 420 and row RGRG 430 between row BGBG 415 and row GRGR 435 , which are equivalent to row BGBG 315 and row GRGR 320 , respectively, of the Bayer pattern sensor 300 .
- Row GBGB 420 is formed by shifting elements G and B of row BGBG 415 by one column to either the right or the left.
- Row RGRG 430 is formed by shifting elements G and R of row GRGR 435 by one column to either the right or the left.
- the rows have four color patterns that repeat every four rows: Blue-Green-Blue-Green (BGBG) 415 , Green-Blue-Green-Blue (GBGB) 420 , Red-Green-Red-Green (RGRG) 430 , and Green-Red-Green-Red (GRGR) 435 , and so on in repeating sequence.
- Binning of BGBG row 415 and GBGB row 420 generates a single BGBG row 425 after down-sampling in which the G combines data from the two green pixels of the two rows having the same exposure times, the B combines data from the two blue pixels of the two rows having the same exposure times, and so on.
- binning of RGRG row 430 and GRGR row 435 generates a single GRGR row 440 after down-sampling.
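- The binning just described can be sketched in Python as follows; pairing each output pixel with the horizontally nearest same-color pixel in the adjacent row, and the example pixel values, are illustrative assumptions about FIG. 4 rather than a literal reading of the figure.

```python
import numpy as np

def bin_row_pair(colors0, vals0, colors1, vals1, out_colors):
    """Average same-color pixel pairs from two adjacent rows into one row."""
    out = []
    for c, color in enumerate(out_colors):
        # nearest pixel of the output color in each of the two rows
        c0 = min((j for j, k in enumerate(colors0) if k == color),
                 key=lambda j: abs(j - c))
        c1 = min((j for j, k in enumerate(colors1) if k == color),
                 key=lambda j: abs(j - c))
        out.append(0.5 * (vals0[c0] + vals1[c1]))
    return np.array(out)

# Rows 415 (B G B G) and 420 (G B G B) bin into a single B G B G row 425.
row425 = bin_row_pair(list("BGBG"), [10.0, 20.0, 30.0, 40.0],
                      list("GBGB"), [22.0, 12.0, 42.0, 32.0], list("BGBG"))
print(row425)
```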
- FIG. 5 illustrates HDR combination in a down-sampling mode for a prior art Bayer filter pattern sensor 500 .
- Down-sampling from Bayer filter pattern sensor 500 gives down-sampling results 510 (in the right portion of FIG. 5 ).
- subscripts indicate whether the pixel is configured with a long (L) or short (S) exposure time. Every pair of the two nearest rows having the same color pattern but different exposure times has its pixel data combined during down-sampling to give down-sampling results 510 .
- GRGR row 525 results from selection between pixel values of GRGR row 515 and GRGR row 520 .
- BGBG row 540 results from pixel value selection between BGBG row 530 and BGBG row 535 .
- the problem with prior art sensors that utilize the Bayer pattern is that significant zigzagging results during the HDR combination process.
- FIG. 6 illustrates HDR combination in a down-sampling mode for a modified Bayer pattern sensor 600 .
- Modified Bayer filter pattern sensor 600 may represent a portion of image array 203 of FIG. 2 ; a down-sampling result 610 (shown in the right portion of FIG. 6 ) represents the result of down-sampling from modified Bayer filter pattern sensor 600 , and has a conventional Bayer pattern as a result of HDR combination.
- color sequences are similar to modified Bayer filter pattern sensor 400 of FIG. 4 .
- Subscripts within each pixel indicate the configured exposure time as either long (L) or short (S). In the example of FIG. 6 , row 615 has a long exposure time, row 620 has a short exposure time, row 630 has a long exposure time, and row 635 has a short exposure time. This sequence then repeats. The sequence is selected to be compatible with the modified Bayer pattern, which also repeats after every four rows. Adjacent rows of corresponding colors and different exposure times have pixel data combined during down-sampling.
- one value is selected between a long exposure time BGBG row 615 and a short exposure time GBGB row 620 to generate a single BGBG row 625 after down-sampling, in which the G has selected data from the long and short exposure time pixel data (for the two green pixels from the rows with different exposure times), the B has selected data from long and short exposure time pixel data (for the two blue pixels from the rows with different exposure times), and so on.
- similarly, one value is selected between a long exposure time RGRG row 630 and a short exposure time GRGR row 635 to generate a single GRGR row 640 after down-sampling.
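- A Python sketch of this data selection for a long-exposure/short-exposure row pair follows; the nearest-neighbor pairing of same-color pixels across the two rows and the saturation-based selection rule are assumptions for illustration.

```python
def hdr_select_row_pair(colors_long, vals_long, colors_short, vals_short,
                        out_colors, choose):
    """Select one value per output pixel from a long/short same-color pair."""
    out = []
    for c, color in enumerate(out_colors):
        cl = min((j for j, k in enumerate(colors_long) if k == color),
                 key=lambda j: abs(j - c))
        cs = min((j for j, k in enumerate(colors_short) if k == color),
                 key=lambda j: abs(j - c))
        out.append(choose(vals_long[cl], vals_short[cs]))
    return out

# Long-exposure BGBG row 615 and short-exposure GBGB row 620 -> BGBG row 625.
choose = lambda lv, sv: lv if lv < 1023 else sv * 4.0   # assumed rule, 4:1 ratio
print(hdr_select_row_pair(list("BGBG"), [1023.0, 300.0, 400.0, 1023.0],
                          list("GBGB"), [100.0, 80.0, 120.0, 90.0],
                          list("BGBG"), choose))
```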
- CYGM pattern could be modified to have the following color pattern that repeats every four rows: cyan, yellow, cyan, yellow (row 1); yellow, cyan, yellow, cyan (row 2), green, magenta, green, magenta (row 3); and magenta, green, magenta, green (row 4).
- the RGBE, RGBC, and RGBW patterns could also be modified in similar manners, for example.
- filter patterns are modified to repeat after more than four rows to achieve a reduction in vertical resolution larger than 1:2.
- the Bayer pattern could be modified to have the following pattern that repeats every six rows to achieve a 1:3 reduction in vertical resolution: blue, green, blue, green (row 1); green, blue, green, blue (row 2); blue, green, blue, green (row 3); red, green, red, green (row 4); green, red, green, red (row 5); and red, green, red, green (row 6).
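- A small Python sketch of this generalization follows; the construction rule (repeat each base two-color row N times, shifting every other repetition by one column) is inferred from the 1:2 and 1:3 examples above, not stated as a general formula in the text.

```python
def modified_pattern_rows(n: int, width: int = 4):
    """Row color sequences for a modified pattern giving a 1:n vertical reduction."""
    base_rows = [["B", "G"], ["R", "G"]]            # base two-color rows
    rows = []
    for base in base_rows:
        for i in range(n):
            shift = i % 2                           # shift every other repetition
            rows.append([base[(c + shift) % 2] for c in range(width)])
    return rows

for row in modified_pattern_rows(3):                # reproduces the six-row 1:3 example
    print(" ".join(row))
# B G B G / G B G B / B G B G / R G R G / G R G R / R G R G
```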
- the binning process benefits from use of the modified Bayer pattern (e.g., modified Bayer pattern sensors 400 and 600 ) and generates high quality images.
- the HDR combination process also benefits from the modified Bayer pattern, and thus generates high quality images.
- in the normal full resolution mode, captured image quality from a sensor utilizing the modified Bayer pattern may not be as good as an image captured with a sensor configured with a conventional Bayer pattern.
- artifacts within the normal mode image captured from the sensor utilizing the modified Bayer pattern are minor compared to the zigzag problem, and these artifacts may be easily corrected by image processing algorithms.
- image sensor 200 supports a full resolution (i.e., row-by-row) readout of pixel data in which each row has the same exposure time.
- image sensor 200 has two modes of operation: (1) a normal full resolution mode with dynamic range limited by photosensors within each pixel, and (2) a down-sampling mode that has reduced vertical resolution.
- In the down-sampling mode, binning achieves a high SNR when HDR is not required, while HDR combination achieves a higher dynamic range when HDR is desired.
- a comparatively small amount of chip ‘real estate’ is required for the additional functionality to provide the second pre-charge address block 225 and row independent exposure times for HDR combination. Only comparatively inexpensive modifications to processor 250 are required to implement the down-sampling mode with HDR combination. In essence “spare lines” are used during down-sampling to achieve a high dynamic range sensing mode at a very low marginal cost.
- Down-sampling schemes of the prior art typically emphasize reduction of noise and gain, and the exposure time of each row remains nominally the same.
- Prior art down-sampling either discards data from a portion of the lines or averages data across multiple rows. Thus, these prior art down-sampling approaches do not increase the dynamic range of the resulting image.
- HDR combination may be implemented at least in part within the analog domain, such as using sample and hold registers or may be implemented in the digital domain, such as using analog to digital converters and software.
- where processor 250 represents a digital signal processor, down-sampling functionality (such as binning and HDR combination) of processor 250 may be implemented as machine readable instructions stored in memory accessible by the processor.
- At least part of the embodiments disclosed herein may relate to a computer storage product with a computer-readable medium having computer code thereon for performing various computer-implemented operations.
- Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs, DVDs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (“ASICs”), programmable logic devices (“PLDs”) and ROM and RAM devices.
- Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter.
- an embodiment of the invention may be implemented using Java, C++, or other object-oriented programming language and development tools.
- Another embodiment of the invention may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.
- FIG. 7 shows a portion 700 of a pixel array in modified Bayer pattern configuration illustrating exemplary connectivity for binning pixel data values to generate output data in a conventional Bayer pattern.
- Portion 700 may represent a portion of image array 203 of FIG. 2 .
- Portion 700 includes sixteen pixels 202 divided into subgroups 710 A-D, each subgroup having four pixels.
- Portion 700 has four outputs 702 , 704 , 706 , and 708 .
- Output 702 combines data of two green pixels on a diagonal of subgroup 710 A, output 704 combines data of two red pixels of subgroup 710 D, output 706 combines data of two green pixels of subgroup 710 C, and output 708 combines data of two blue pixels of subgroup 710 B.
- Outputs 708 , 702 , 706 and 704 thus generate two rows of two pixels arranged in a conventional Bayer pattern with a 2:1 reduction in vertical resolution and a 2:1 reduction in horizontal resolution.
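- A Python sketch of this subgroup binning follows; the mapping of each 2x2 subgroup to its output color is derived from the pattern itself, so the exact wiring shown is illustrative rather than a literal transcription of FIG. 7.

```python
import numpy as np

def bin_2x2_subgroups(colors: np.ndarray, values: np.ndarray):
    """Bin a 4x4 modified-pattern patch into a 2x2 conventional Bayer patch."""
    bayer_target = np.array([["B", "G"], ["G", "R"]])    # desired output pattern
    out_values = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            sub_c = colors[2 * i:2 * i + 2, 2 * j:2 * j + 2]
            sub_v = values[2 * i:2 * i + 2, 2 * j:2 * j + 2]
            mask = sub_c == bayer_target[i, j]           # the two same-color pixels
            out_values[i, j] = sub_v[mask].mean()        # combine them
    return bayer_target, out_values

colors = np.array([list("BGBG"), list("GBGB"), list("RGRG"), list("GRGR")])
values = np.arange(16.0).reshape(4, 4)
print(bin_2x2_subgroups(colors, values))
```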
- FIG. 8 shows a portion 800 of a pixel array in modified Bayer pattern configuration illustrating alternate exemplary connectivity for binning pixel data values to generate output data in a conventional Bayer pattern.
- Portion 800 may represent a portion of image array 203 of FIG. 2 .
- Portion 800 has twenty four pixels that are divided into six subgroups 810 A-F of four pixels each.
- Each of subgroups 810 A-C has four pixels with BG on a first row and GB on a second row next to the first row, and each of subgroups 810 D-F has four pixels with RG on a first row and GR on a second row next to the first row.
- Outputs 808 , 802 , 806 and 804 thus generate two rows of two pixels arranged in a conventional Bayer pattern with a 2:1 reduction in vertical resolution and a 3:1 reduction in horizontal resolution.
- FIG. 9 is a block diagram illustrating one exemplary rotated modified Bayer pattern image sensor 900 that supports down-sampling with both a high dynamic range (HDR) combination and binning.
- Image sensor 900 is similar to image sensor 200 of FIG. 2 , with the modified Bayer pattern of the filter array rotated by ninety degrees. Certain components are omitted for clarity of illustration.
- Image sensor 900 includes an image array 903 that has a plurality of light sensitive elements or photo-sensitive pixels 902 arranged as a plurality of rows and a plurality of columns and a filter array including a number of filter elements. Each filter element is in optical alignment with a respective pixel and is configured to allow light of only a certain color to pass through.
- Image sensor 900 may be implemented as a complementary metal-oxide-semiconductor (CMOS) image sensor where each pixel 902 includes a photodetector and associated circuitry that supports setting an exposure time and reading out pixel values.
- CMOS complementary metal-oxide-semiconductor
- image array 903 has a column parallel readout architecture where, for each row, pixels 902 are read out simultaneously and processed in parallel.
- for each column, a readout line 905 connects, in parallel, to pixels 902 of that column and to a sample and hold (S/H) element 904 .
- Outputs of S/H elements 904 connect to a second stage amplifier 906 , which in turn connects to a processor 950 .
- Processor 950 processes signals (i.e., image sensor data) from amplifier 906 to generate an image.
- Processor 950 may be implemented as a digital signal processor having a local line memory.
- Timing and control block 915 includes a first pre-charge address block 920 , a second pre-charge address block 925 , and a sampling address block 930 .
- the first pre-charge address block 920 may be set to a first pre-charge value
- the second pre-charge address block 925 may be set to a second pre-charge value.
- sampling address 930 of timing and control block 915 selects a row, and a pre-charge is applied to pixels of that row from either the first pre-charge address block or the second pre-charge address block.
- the first pre-charge address block 920 supports a full resolution mode with the same gain and exposure time setting for each row.
- the first pre-charge address block 920 also supports a down-sampling mode that reduces resolution and permits the same exposure time to be set for all the rows during binning to achieve high SNR.
- the first pre-charge address block 920 and the second pre-charge address block 925 cooperate to support a down-sampling mode that reduces resolution and permits different exposure times to be set for different rows during the HDR combination process to achieve high dynamic range.
- Additional pre-charge address blocks may be included within timing and control block 915 to provide additional pre-charge values for additional down-sampling modes.
- the resolution of an image generated by processor 950 using data from image sensor 900 depends upon how the raw pixel data generated by photo-sensitive pixel elements is sampled and processed to generate pixels for the processed image.
- the term “raw pixel data” is used to distinguish data generated by image sensor 900 from the pixel data obtained after the raw data has been sampled and further processed by processor 950 .
- the raw pixel data received from image sensor 900 may be down-sampled to reduce the effective vertical resolution of the processed image.
- a variety of standard resolution formats are used in the image sensing art. For example, a 1.3 megapixel super extended graphics array (SXGA) format has 1280 × 1024 pixels of resolution while a video graphics array (VGA) format has a resolution of 640 × 480 pixels.
- in a down-sampling mode, the vertical resolution of the raw pixel data is reduced by processor 950 to implement format conversion and simultaneously achieve a higher dynamic range.
- a down sampling mode may be selected that also provides a higher dynamic range.
- the down-sampling mode implements a 1:2 reduction in vertical resolution, and thus, since there is a simple geometric ratio of 1:2 in vertical resolution, down-sampling may combine data from two rows (e.g., Row 0 and Row 1) of pixels 902 .
- processor 950 operates to combine raw pixel data values to generate pixel values in the final image.
- values resulting from two different exposure times controlled by the pre-charge values are processed by processor 950 to effectively increase the dynamic range of array 900 , as compared to the dynamic range when full resolution is used.
- for example, even rows (e.g., Row 0, Row 2, Row 4, . . . Row M−1) may be configured with one exposure time while odd rows (e.g., Row 1, Row 3, Row 5, . . . Row M) are configured with a different exposure time.
- processor 950 includes a local line memory to store and synchronize the processing of lines having either the same or different row exposure times.
- the local memory may be used to store sets of long exposure rows and short exposure rows sampled at different times to permit aligning and combining rows with either the same or different exposure times.
- processor 950 reads the memory and combines the raw pixel data of pixels that are neighbors along the vertical dimension that are of a compatible type and that have the same exposure time for the binning process.
- processor 950 reads the memory and selects the raw pixel data of pixels that are neighbors along the vertical dimension that are of a compatible type and that have the different exposure times for the HDR combination process.
- the exposure time of a pixel affects its output response.
- When a pixel is operated with a long exposure time, the pixel is very sensitive to received light, but tends to saturate at a low light level.
- when the pixel is operated with a short exposure time, the pixel is less sensitive to light, and saturates at a higher light level as compared to operation with a long exposure time.
- a higher dynamic range is achieved as compared to down-sampling of rows with the same exposure time.
- any down-sampling mode with a 1:N reduction (where N is an integer value) in vertical resolution may be supported, such as 1:2, 1:3, 1:4 and so on.
- the exposure times of the rows are varied in an interleaved sequence of row exposure times that permits down-sampling to be achieved with increased dynamic range.
- for a 1:3 reduction, for example, the three rows that are to be combined may have a sequence of a long exposure time, medium exposure time, and short exposure time.
- in the HDR combination process, rows of pixels have different exposure times. By combining data from two or more pixels of rows having different exposure times, dynamic range may be increased.
- data is selected from either the long exposure pixel or the short exposure pixel.
- data from pixels of long exposure time is referred to as L pixel data, and data from pixels of short exposure time is referred to as S pixel data.
- the S pixel data is normalized to match the scale of long exposure pixels. For example, the normalization may follow Equation (1): data_N = data_O × ( L_exposuretime / S_exposuretime ), where data_N is the determined normalized pixel data value, data_O is the original pixel data value, L_exposuretime represents the long exposure time, and S_exposuretime represents the exposure time of the selected pixel data value. If the pixel data value selected has the short exposure time, data_N is normalized based upon the long exposure time as shown in Equation (1). If the pixel data value selected has the long exposure time, data_N is the same as data_O.
- in the binning process, all rows of pixels are configured to have the same exposure time. Binning of two rows having the same exposure time achieves a higher signal to noise ratio (SNR) in the down-sampling.
- Down-sampling modes that have higher dynamic range are also compatible with a variety of color filter array formats.
- a color filter array pattern is applied to an array of photosensors such that output from the photosensors creates a color image.
- the incoming light to each photosensor is filtered such that typically each photosensor in the pixel array records only one color, such as red, green or blue.
- the row exposure times used in down-sampling are selected such that pixels having compatible filter types are combined during down-sampling.
- FIG. 10 illustrates exemplary binning in a down-sampling mode for a rotated modified Bayer pattern sensor 1000 .
- Rotated modified Bayer pattern sensor 1000 may represent image array 903 of FIG. 9 ;
- a down sampling result 1010 (the right portion of FIG. 10 ) illustrates binning results after down-sampling and has a conventional Bayer pattern as a result of binning.
- a blue-green-red-green row 1015 of pixels is followed by a green-blue-green-red row 1020 of pixels, which is followed by a blue-green-red-green row 1030 of pixels, which is followed by a green-blue-green-red row 1035 of pixels.
- the rows of colors repeat every two rows and the columns repeat every four columns.
- the other modified Bayer patterns discussed above (e.g., modified CYGM, RGBE, RGBC, and RGBW) may be rotated in the same manner.
- the columns have four color patterns that repeat every four rows: Blue-Green-Blue-Green (BGBG) 1050 , Green-Blue-Green-Blue (GBGB) 1052 , Red-Green-Red-Green (RGRG) 1054 , and Green-Red-Green-Red (GRGR) 1056 , and so on in repeating sequence.
- Binning of row 1015 and row 1020 generates a single BGBG row 1025 after down-sampling in which the G combines data from the two green pixels of the two rows having the same exposure times, the B combines data from the two blue pixels of the two rows having the same exposure times, and so on.
- binning of row 1030 and row 1035 generates a single GRGR row 1040 after down-sampling. It should be noted that in this example, both horizontal and vertical down-sampling results.
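- A Python sketch of the rotation follows; it simply rotates the FIG. 4 tile by ninety degrees and prints the resulting row and column color sequences, with the phase of the tile relative to the figure being an assumption.

```python
import numpy as np

modified_tile = np.array([list("BGBG"), list("GBGB"), list("RGRG"), list("GRGR")])
rotated_tile = np.rot90(modified_tile)                # ninety-degree rotation

print(rotated_tile)                                   # rows alternate G B G R / B G R G
print(["".join(col) for col in rotated_tile.T])       # columns: GBGB, BGBG, GRGR, RGRG
```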
- the modified Bayer filter pattern and rotated modified Bayer filter pattern may be used in, but are not limited to, high resolution sensors, and low noise and high sensitivity sensors for HD video.
- the use of sensors with higher resolution than needed for the final image resolution (e.g., image sensor 200 , FIG. 2 and image sensor 900 , FIG. 9 ) together with the above described binning technique improves sharpness and resolution with minimized zigzag edges in the final image.
- these image sensors (e.g., image sensor 200 , FIG. 2 and image sensor 900 , FIG. 9 ) also benefit from use of the modified Bayer filter pattern and the rotated modified Bayer filter pattern for improved image quality.
- other benefits accrue from image sensor 200 with a modified Bayer pattern sensor array (e.g., image array 203 ) and image sensor 900 with a rotated modified Bayer pattern sensor array (e.g., image array 903 , FIG. 9 ), beyond the above described improvements in image quality.
- cost may be reduced for system on a chip (SOC) image sensors.
- an extra memory buffer may be required for Bayer pattern output.
- DPC: defect pixel correction
- AWB: automatic white balance
Abstract
An image sensor includes an array of light sensitive elements and a filter array. Each filter element is in optical communication with a respective light sensitive element. The image sensor receives filtered light having a repeating pattern. Light sensitive elements in at least two successive rows alternately receive light having a first color and a second color, and light sensitive elements in common columns of the successive rows alternately receive light having the first color and the second color. Light sensitive elements in at least two additional successive rows alternately receive light having a third and a fourth color, and light sensitive elements in common columns of the additional successive rows alternately receive light having the third color and the fourth color. Output values of pairs of sampled light sensitive elements receiving light of a common color and from successive rows are combined to generate a down-sampled image.
Description
- This application is a continuation of U.S. patent application Ser. No. 13/035,785, filed Feb. 25, 2011, which is a continuation-in-part of International Application No. PCT/US2010/049368, filed Sep. 17, 2010, which claims priority to U.S. Patent Application Ser. No. 61/334,886, filed May 14, 2010, each of which is incorporated herein by reference.
- U.S. Patent Publication No. 2009/0059048, entitled “Image Sensor with High Dynamic Range in Down-Sampling Mode”, by Xiaodong Luo et al, is also assigned to Omnivision Technologies, Inc. and is hereby incorporated by reference. Luo introduces a system and method to achieve a high dynamic range in a down-sampling operation mode by varying exposure times for different pixel rows and combining rows with different exposures, thus simultaneously reducing the vertical resolution and extending the dynamic range.
- In the binning process, all rows have the same exposure time, while in the HDR combination process, rows of pixels can have different exposure times.
FIG. 5 illustrates prior art data selection between long exposure and short exposure for a conventional Bayer pattern in a down-sampling mode. -
FIG. 6 illustrates data selection among long exposure and short exposure for an exemplary modified Bayer pattern in a down-sampling mode, according to an embodiment. -
FIG. 7 shows a portion of a pixel array configured in a modified Bayer pattern illustrating exemplary connectivity for binning pixel data values to generate output data, according to an embodiment. -
FIG. 8 shows a portion of a pixel array configured in a modified Bayer pattern illustrating alternate exemplary connectivity for binning pixel data values to generate output data, according to an embodiment. -
FIG. 9 is a block diagram illustrating one exemplary rotated modified Bayer pattern image sensor that supports down-sampling with both a high dynamic range (HDR) combination and binning, according to an embodiment. -
FIG. 10 illustrates binning for an exemplary rotated modified Bayer pattern in a down-sampling mode, according to an embodiment. - In the following description, the terms sensor array, pixel array, and image array may be used interchangeably to mean an array of photosensors.
-
FIG. 2 is a block diagram illustrating one exemplary modified Bayer pattern image sensor 200 that supports down-sampling with both a high dynamic range (HDR) combination and binning. Certain components are omitted for clarity of illustration. Image sensor 200 includes an image array 203 that has a plurality of light sensitive elements or photo-sensitive pixels 202 arranged as a plurality of rows and a plurality of columns, and a filter array including a number of filter elements. Each filter element is in optical alignment with a respective pixel and is configured to allow light of only a certain color to pass through. The filter array conforms to a modified Bayer pattern defining the color sensitivity of each pixel 202. That is, the colors of the pixel sensors within image array 203 conform to the exemplary modified Bayer pattern sensor 400 of FIG. 4. Image sensor 200 may be implemented as a complementary metal-oxide-semiconductor (CMOS) image sensor where each pixel 202 includes a photodetector and associated circuitry that supports setting an exposure time and reading out pixel values. - As shown,
image array 203 has a column parallel readout architecture where, for each row, pixels 202 are read out simultaneously and processed in parallel. For each column, a readout line 205 connects, in parallel, to pixels 202 of that column and to a sample and hold (S/H) element 204. Outputs of S/H elements 204 connect to a second stage amplifier 206, which in turn connects to a processor 250. Processor 250 processes signals (i.e., image sensor data) from amplifier 206 to generate an image. Processor 250 may be implemented as a digital signal processor having a local line memory. - A
row address decoder 208 and a column address decoder 210 operate to decode signals from a timing and control block 215 to address pixels 202. Timing and control block 215 includes a first pre-charge address block 220, a second pre-charge address block 225, and a sampling address block 230. The first pre-charge address block 220 may be set to a first pre-charge value, and the second pre-charge address block 225 may be set to a second pre-charge value. In one example of operation, sampling address block 230 of timing and control block 215 selects a row, and a pre-charge is applied to pixels of that row from either the first pre-charge address block or the second pre-charge address block. - In one embodiment, the first
pre-charge address block 220 supports a full resolution mode with the same gain and exposure time setting for each row. The first pre-charge address block 220 also supports a down-sampling mode that reduces resolution and permits the same exposure time to be set for all the rows during binning to achieve high SNR. The first pre-charge address block 220 and the second pre-charge address block 225 cooperate to support a down-sampling mode that reduces resolution and permits different exposure times to be set for different rows during the HDR combination process to achieve high dynamic range. Additional pre-charge address blocks (not shown) may be included within timing and control block 215 to provide additional pre-charge values for additional down-sampling modes. - The resolution of an image generated by
processor 250 using data from image sensor 200 depends upon how the raw pixel data generated by the photo-sensitive pixel elements is sampled and processed to generate pixels for the processed image. The term "raw pixel data" is used to distinguish data generated by image sensor 200 from the pixel data after the raw data has been sampled and subjected to additional signal processing by processor 250. In particular, the raw pixel data received from image sensor 200 may be down-sampled to reduce the effective vertical resolution of the processed image. A variety of standard resolution formats are used in the image sensing art. For example, a 1.3 megapixel super extended graphics array (SXGA) format has 1280×1024 pixels of resolution while a video graphics array (VGA) format has a resolution of 640×480 pixels. - In accordance with an embodiment, in a down-sampling mode, the vertical resolution of the raw pixel data is reduced by
processor 250 to implement format conversion and simultaneously achieve a higher dynamic range. For example, when converting a 1.3 megapixel format into VGA, a down-sampling mode may be selected that also provides a higher dynamic range. In this example, the down-sampling mode implements a 1:2 reduction in vertical resolution, and thus, since there is a simple geometric ratio of 1:2 in vertical resolution, down-sampling may combine data from two rows (e.g., Row 0 and Row 1) of pixels 202. In particular, processor 250 operates to combine raw pixel data values to generate pixel values in the final image. Where the first of the two rows being combined has a first pre-charge value (e.g., as set from the first pre-charge address block 220) and the second of the two rows has a second pre-charge value (e.g., as set from the second pre-charge address block 225), values resulting from the two different exposure times controlled by the pre-charge values are processed by processor 250 to effectively increase the dynamic range of image sensor 200, as compared to the dynamic range when full resolution is used. In one example, even rows (e.g., Row 0, Row 2, Row 4, . . . Row M−1) have a long exposure time and odd rows (e.g., Row 1, Row 3, Row 5, . . . Row M) have a short exposure time. - As previously described, in one embodiment,
processor 250 includes a local line memory to store and synchronize the processing of lines having either the same or different row exposure times. In particular, the local memory may be used to store sets of long exposure rows and short exposure rows sampled at different times to permit aligning and combining rows with either the same or different exposure times. In one embodiment, during down-sampling, processor 250 reads the memory and combines the raw pixel data of pixels that are neighbors along the vertical dimension, that are of a compatible type, and that have the same exposure time for the binning process. In another embodiment, during down-sampling, processor 250 reads the memory and selects the raw pixel data of pixels that are neighbors along the vertical dimension, that are of a compatible type, and that have different exposure times for the HDR combination process.
- Various extensions and modifications of the down-sampling mode with high dynamic range are contemplated. In a first scenario, any down-sampling mode with a 1:N reduction (where N is an integer value) in vertical resolution may be supported, such as 1:2, 1:3, 1:4 and so on. In this scenario, the exposure times of the rows are varied in an interleaved sequence of row exposure times that permits down-sampling to be achieved with increased dynamic range. For example, for down-sampling with a 1:3 reduction in vertical resolution, the three rows that are to be combined have a sequence of a long exposure time, medium exposure time, and short exposure time.
- In the HDR combination process, implemented within
processor 250, rows of pixels have different exposure times. By combining data from two or more pixels of rows having different exposure times, dynamic range may be increased. There are many ways of combining the data from long exposure pixels and short exposure pixels. In one way, data is selected from either the long exposure pixel or the short exposure pixel. In particular, data from pixels of long exposure time (L pixels) is selected by processor 250 where the L pixels are not saturated, and data from pixels of short exposure time (S pixels) is selected by processor 250 where the L pixels are saturated. Where the short exposure data is selected by processor 250, the S pixel data is normalized to match the scale of the long exposure pixels. For example,
dataN=dataO*(L_exposuretime/S_exposuretime) (1) - Where dataN is the determined normalized pixel data value, dataO is the original pixel data value, L_exposuretime represents the long exposure time, and S_exposuretime represents the exposure time of the selected pixel data value. If the pixel data value selected has the short exposure time, dataN is normalized based upon the long exposure time as shown in Equation (1). If the pixel data value selected has the long exposure time, dataN is the same as dataO.
- In one embodiment, when HDR combination is not required, all rows of pixels are configured to have the same exposure time. Binning of two rows having the same exposure time achieves a higher signal to noise ratio (SNR) in the down-sampling.
- Down-sampling modes that have higher dynamic range are also compatible with a variety of color filter array formats. In color sensing arrays in the art, a color filter array pattern is applied to an array of photosensors such that output from the photosensors creates a color image. The incoming light to each photosensor is filtered such that typically each photosensor in the pixel array records only one color, such as red, green or blue. In one embodiment, for a particular color filter array pattern, the row exposure times used in down-sampling are selected such that pixels having compatible filter types are combined during down-sampling.
-
FIG. 3 illustrates prior art binning in a down-sampling mode for a Bayer pattern sensor 300. Binning results 310 after down-sampling are shown in the right portion of FIG. 3. In Bayer pattern sensor 300, a blue-green-blue-green (BGBG) row 315 of pixels is followed by a green-red-green-red (GRGR) row 320 of pixels. The Bayer pattern is an RGB filter pattern that is 50% green, 25% red, and 25% blue, in which a blue-green row of pixels is followed by a green-red row of pixels. Other similar patterns include the CYGM filter array pattern (cyan, yellow, green, and magenta), formed of alternate rows of cyan-yellow and green-magenta, and the RGBE filter array pattern (red, green, blue, and emerald), having alternating rows of red-green and blue-emerald. Patterns may also include clear pixels, such as Red-Green-Blue-Clear (RGBC) and, similarly, Red-Green-Blue-White (RGBW). As noted above, the problem with prior art sensors that utilize the Bayer pattern is that significant zigzag artifacts result during the binning process. -
FIG. 4 illustrates exemplary binning in a down-sampling mode for a modified Bayer pattern sensor 400. Modified Bayer pattern sensor 400 may represent image array 203 of FIG. 2; a down-sampling result 410 (the right portion of FIG. 4) illustrates binning results after down-sampling and has a conventional Bayer pattern as a result of binning. In the modified Bayer pattern sensor 400, a blue-green row of pixels is followed by a green-blue row of pixels, which is followed by a red-green row of pixels, which is followed by a green-red row of pixels. Within the modified Bayer pattern sensor 400, the rows of colors repeat every four rows. This novel pattern is modified from the Bayer pattern sensor 300 by inserting row GBGB 420 and row RGRG 430 between row BGBG 415 and row GRGR 435, which are equivalent to respective row BGBG 315 and row GRGR 320 of the Bayer pattern sensor 300. Row GBGB 420 is formed by shifting elements G and B of row BGBG 415 by one column to either the right or the left. Row RGRG 430 is formed by shifting elements G and R of row GRGR 435 by one column to either the right or the left. - With the modified
Bayer pattern sensor 400, the rows have four color patterns that repeat every four rows: Blue-Green-Blue-Green (BGBG) 415, Green-Blue-Green-Blue (GBGB) 420, Red-Green-Red-Green (RGRG) 430, and Green-Red-Green-Red (GRGR) 435, and so on in repeating sequence. In this example all rows have the same exposure time. The repeating sequence is selected to be compatible with the modified Bayer pattern, which also repeats after every four rows. Binning of BGBG row 415 and GBGB row 420 generates a single BGBG row 425 after down-sampling, in which each G combines data from the two green pixels of the two rows having the same exposure time, each B combines data from the two blue pixels of the two rows having the same exposure time, and so on. Similarly, binning of RGRG row 430 and GRGR row 435 generates a single GRGR row 440 after down-sampling.
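One way to express this binning in code (Python/NumPy; a sketch under the assumption that each output pixel takes the conventional Bayer color for its location and averages the two diagonally adjacent pixels of that color in the pair of source rows — the down-sampled frame then reads BGBG / GRGR as described):

```python
import numpy as np

BAYER = np.array([list("BG"), list("GR")])          # conventional Bayer tile
MODIFIED = np.array([list("BGBG"), list("GBGB"),
                     list("RGRG"), list("GRGR")])    # modified pattern, 4-row period

def modified_cfa(rows, cols):
    """Color of each photosite in a sensor laid out in the modified Bayer pattern."""
    return np.array([[MODIFIED[r % 4, c % 4] for c in range(cols)]
                     for r in range(rows)])

def bin_1to2_vertical(raw):
    """1:2 vertical binning of a modified-Bayer frame: for each output location,
    one of the two source rows carries the target Bayer color in that very
    column, and its same-color partner sits diagonally in the other row."""
    rows, cols = raw.shape
    cfa = modified_cfa(rows, cols)
    out = np.empty((rows // 2, cols))
    for r_out in range(rows // 2):
        r0, r1 = 2 * r_out, 2 * r_out + 1
        for c in range(cols):
            want = BAYER[r_out % 2, c % 2]
            anchor, other = (r0, r1) if cfa[r0, c] == want else (r1, r0)
            mate = next(cc for cc in (c - 1, c + 1)
                        if 0 <= cc < cols and cfa[other, cc] == want)
            out[r_out, c] = (raw[anchor, c] + raw[other, mate]) / 2.0
    return out

print(bin_1to2_vertical(np.arange(64, dtype=float).reshape(8, 8)).shape)  # (4, 8)
```

The same construction carries over to the six-row variant for 1:3 reduction mentioned below, since each group of rows to be combined still places same-color pixels on diagonals.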
FIG. 5 illustrates HDR combination in a down-sampling mode for a prior art Bayer filter pattern sensor 500. Down-sampling from Bayer filter pattern sensor 500 gives down-sampling results 510 (in the right portion of FIG. 5). Within Bayer filter pattern sensor 500, subscripts indicate whether each pixel is configured in a long (L) or short (S) exposure time row. Every pair of two nearest rows having the same color pattern but different exposure times has its pixel data combined during down-sampling to give down-sampling results 510. GRGR row 525 results from selection between pixel values of GRGR row 515 and GRGR row 520. Similarly, BGBG row 540 results from pixel value selection between BGBG row 530 and BGBG row 535. As noted above, the problem with prior art sensors that utilize the Bayer pattern is that significant zigzag artifacts result during the HDR combination process. -
FIG. 6 illustrates HDR combination in a down-sampling mode for a modified Bayer pattern sensor 600. Modified Bayer filter pattern sensor 600 may represent a portion of image array 203 of FIG. 2; a down-sampling result 610 (shown in the right portion of FIG. 6) represents the result of down-sampling from modified Bayer filter pattern sensor 600, and has a conventional Bayer pattern as a result of HDR combination. Within modified Bayer filter pattern sensor 600, color sequences are similar to modified Bayer filter pattern sensor 400 of FIG. 4. Subscripts within each pixel indicate the configured exposure time as either long (L) or short (S). In the example of FIG. 6, row 615 has a long exposure time, row 620 has a short exposure time, row 630 has a long exposure time, and row 635 has a short exposure time. This sequence then repeats. The sequence is selected to be compatible with the modified Bayer pattern, which also repeats after every four rows. Adjacent rows of corresponding colors and different exposure times have pixel data combined during down-sampling. For example, data selection between a long exposure time BGBG row 615 and a short exposure time GBGB row 620 generates a single BGBG row 625 after down-sampling, in which each G selects data from the long or short exposure time pixel data (for the two green pixels from the rows with different exposure times), each B selects data from the long or short exposure time pixel data (for the two blue pixels from the rows with different exposure times), and so on. Similarly, data is selected from a long exposure time RGRG row 630 and a short exposure time GRGR row 635 to generate a single GRGR row 640 after down-sampling. - Other common filter patterns may also repeat after every four rows, such that the principles illustrated in
FIGS. 4 and 6 may be applied. For example, the CYGM pattern could be modified to have the following color pattern that repeats every four rows: cyan, yellow, cyan, yellow (row 1); yellow, cyan, yellow, cyan (row 2); green, magenta, green, magenta (row 3); and magenta, green, magenta, green (row 4). The RGBE, RGBC, and RGBW patterns could also be modified in similar manners, for example. Furthermore, it is anticipated that in alternate embodiments filter patterns are modified to repeat after more than four rows to achieve a reduction in vertical resolution larger than 1:2. For example, the Bayer pattern could be modified to have the following pattern that repeats every six rows to achieve a 1:3 reduction in vertical resolution: blue, green, blue, green (row 1); green, blue, green, blue (row 2); blue, green, blue, green (row 3); red, green, red, green (row 4); green, red, green, red (row 5); and red, green, red, green (row 6). - Binning for the modified Bayer pattern (e.g., modified
Bayer pattern sensors 400 and 600) results in a uniform sampling and thereby minimizes zigzag edges, as compared to binning for conventional Bayer patterns. The HDR combination process also benefits from the modified Bayer pattern, and thus generates high quality images. - In a normal mode (i.e., when not down-sampling), captured image quality from a sensor utilizing the modified Bayer pattern (e.g.,
sensor 200 configured with modified Bayer pattern sensor 400) may not be as good as an image captured with a sensor configured with a conventional Bayer pattern. However, artifacts within the normal mode image captured from the sensor utilizing the modified Bayer pattern are minor compared to the zigzag problem, and these artifacts may be easily corrected by image processing algorithms. - As previously discussed,
image sensor 200 supports a full resolution (i.e., row-by-row) readout of pixel data in which each row has the same exposure time. In a preferred embodiment, image sensor 200 has two modes of operation: (1) a normal full resolution mode with dynamic range limited by the photosensors within each pixel, and (2) a down-sampling mode that has reduced vertical resolution. In the down-sampling mode, binning achieves high SNR when HDR is not required, while HDR combination achieves a higher dynamic range when HDR is desired. A comparatively small amount of chip 'real estate' is required for the additional functionality to provide the second pre-charge address block 225 and row-independent exposure times for HDR combination. Only comparatively inexpensive modifications to processor 250 are required to implement the down-sampling mode with HDR combination. In essence, "spare lines" are used during down-sampling to achieve a high dynamic range sensing mode at a very low marginal cost.
- HDR combination may be implemented at least in part within the analog domain, such as using sample and hold registers or may be implemented in the digital domain, such as using analog to digital converters and software.
- Where
processor 250 represents a digital signal processor, down-sampling, such as binning and HDR combination, may be implemented as machine readable instructions stored in memory accessible by the processor. At least part of the embodiments disclosed herein may relate to a computer storage product with a computer-readable medium having computer code thereon for performing various computer-implemented operations. Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs, DVDs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (“ASICs”), programmable logic devices (“PLDs”) and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment of the invention may be implemented using Java, C++, or other object-oriented programming language and development tools. Another embodiment of the invention may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions. -
FIG. 7 shows aportion 700 of a pixel array in modified Bayer pattern configuration illustrating exemplary connectivity for binning pixel data values to generate output data in a conventional Bayer pattern.Portion 700 may represent a portion ofimage array 203 ofFIG. 2 .Portion 700 includes sixteenpixels 202 divided intosubgroups 710A-D, each subgroup having four pixels.Portion 700 has fouroutputs Output 702 combines data of two green pixels on a diagonal ofsubgroup 710A. Similarly,output 704 combines data of two red pixels ofsubgroup 710D,output 706 combines data of two green pixels ofsubgroup 710C, andoutput 708 combines data of two blue pixels ofsubgroup 710B.Outputs -
FIG. 8 shows aportion 800 of a pixel array in modified Bayer pattern configuration illustrating alternate exemplary connectivity for binning pixel data values to generate output data in a conventional Bayer pattern.Portion 800 may represent a portion ofimage array 203 ofFIG. 2 .Portion 800 has twenty four pixels that are divided into sixsubgroups 810A-F of four pixels each. Each ofsubgroups 810A-C has four pixels with BG on a first row and GB on a second row next to the first row, and each ofsubgroups 810D-F has four pixels with RG on a first row and GR on a second row next to the first row. As illustrated inFIG. 8 , outputs from each pair of pixels on a diagonal are combined, and then three pairs of the same color output are combined to generategreen output 802,red output 804,green output 806, andblue output 808.Outputs -
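A sketch of the per-subgroup connectivity as I read FIG. 7 (Python/NumPy; the phase of the resulting Bayer quad and the exact pairings are inferred from the text rather than taken from the figure): each 2×2 subgroup contributes one output, formed from the diagonal pair whose color matches the conventional Bayer color expected at that output position, which reduces resolution 2:1 in both directions.

```python
import numpy as np

BAYER = np.array([list("BG"), list("GR")])           # assumed phase of the binned output
MODIFIED = np.array([list("BGBG"), list("GBGB"),
                     list("RGRG"), list("GRGR")])

def cfa_color(r, c):
    return MODIFIED[r % 4, c % 4]

def bin_subgroups(raw):
    """One output per 2x2 subgroup: average the diagonal pair whose color
    matches the Bayer color wanted at that output position."""
    rows, cols = raw.shape
    out = np.empty((rows // 2, cols // 2))
    for br in range(rows // 2):
        for bc in range(cols // 2):
            r, c = 2 * br, 2 * bc
            want = BAYER[br % 2, bc % 2]
            if cfa_color(r, c) == want:              # wanted color on the main diagonal
                out[br, bc] = (raw[r, c] + raw[r + 1, c + 1]) / 2.0
            else:                                    # otherwise on the anti-diagonal
                out[br, bc] = (raw[r, c + 1] + raw[r + 1, c]) / 2.0
    return out

print(bin_subgroups(np.arange(64, dtype=float).reshape(8, 8)).shape)      # (4, 4)
```

The FIG. 8 variant additionally averages three same-color diagonal pairs per output, which this sketch does not attempt.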
FIG. 9 is a block diagram illustrating one exemplary rotated modified Bayerpattern image sensor 900 that supports down-sampling with both a high dynamic range (HDR) combination and binning.Image sensor 900 is similar toimage sensor 200 ofFIG. 2 , with the modified Bayer pattern of the filter array rotated by ninety degrees. Certain components are omitted for clarity of illustration.Image sensor 900 includes animage array 903 that has a plurality of light sensitive elements or photo-sensitive pixels 902 arranged as a plurality of rows and a plurality of columns and a filter array including a number of filter elements. Each filter element is in optical alignment with a respective pixel and is configured to allow light of only a certain color to pass through. The filter array conforms to a rotated modified Bayer pattern defining the color sensitivity of eachpixel 902. That is, color of pixel sensors withinimage array 903 conform to the exemplary rotated modifiedBayer pattern sensor 1000 ofFIG. 10 .Image sensor 900 may be implemented as a complementary metal-oxide-semiconductor (CMOS) image sensor where eachpixel 902 includes a photodetector and associated circuitry that supports setting an exposure time and reading out pixel values. - As shown,
image array 903 has a column parallel readout architecture where, for each row,pixels 902 are read out simultaneously and processed in parallel. For each column, areadout line 905 connects, in parallel, topixels 902 of that column and to a sample and hold (S/H)element 904. Outputs of S/H elements 904 connect to asecond stage amplifier 906, which in turn connects to aprocessor 950.Processor 950 processes signals (i.e., image sensor data) fromamplifier 906 to generate an image.Processor 950 may be implemented as a digital signal processor having a local line memory. - A
row address decoder 908 and acolumn address decoder 910 operate to decode signals from a timing and control block 915 to addresspixels 902. Timing and control block 915 includes a firstpre-charge address block 920, a secondpre-charge address block 925, and asampling address block 930. The firstpre-charge address block 920 may be set to a first pre-charge value, and the secondpre-charge address block 925 may be set to a second pre-charge value. In one example of operation,sampling address 930 of timing and control block 915 selects a row, and a pre-charge is applied to pixels of that row from either the first pre-charge address block or the second pre-charge address block. - In one embodiment, the first
pre-charge address block 920 supports a full resolution mode with the same gain and exposure time setting for each row. The firstpre-charge address block 920 also supports a down-sampling mode that reduces resolution and permits the same exposure time to be set for all the rows during binning to achieve high SNR. The firstpre-charge address block 920 and the secondpre-charge address block 925 cooperate to support a down-sampling mode that reduces resolution and permits different exposure times to be set for different rows during the HDR combination process to achieve high dynamic range. Additional pre-charge address blocks (not shown) may be included within timing and control block 915 to provide additional pre-charge values for additional down-sampling modes. - The resolution of an image generated by
processor 950 using data from image sensor 900 depends upon how the raw pixel data generated by the photo-sensitive pixel elements is sampled and processed to generate pixels for the processed image. The term "raw pixel data" is used to distinguish data generated by image sensor 900 from the pixel data after the raw data has been sampled and subjected to additional signal processing by processor 950. In particular, the raw pixel data received from image sensor 900 may be down-sampled to reduce the effective vertical resolution of the processed image. A variety of standard resolution formats are used in the image sensing art. For example, a 1.3 megapixel super extended graphics array (SXGA) format has 1280×1024 pixels of resolution while a video graphics array (VGA) format has a resolution of 640×480 pixels. - In accordance with an embodiment, in a down-sampling mode, the vertical resolution of the raw pixel data is reduced by
processor 950 to implement format conversion and simultaneously achieve a higher dynamic range. For example, when converting a 1.3 megapixel format into VGA, a down sampling mode may be selected that also provides a higher dynamic range. In this example, the down-sampling mode implements a 1:2 reduction in vertical resolution, and thus, since there is a simple geometric ratio of 1:2 in vertical resolution, down-sampling may combine data from two rows (e.g.,Row 0 and Row 1) ofpixels 902. In particular,processor 950 operates to combine raw pixel data values to generate pixel values in the final image. Where the first of the two rows being combined has a first pre-charge value (e.g., as set from the first pre-charge address block 920) and the second of the two rows has a second pre-charge value (e.g., as set from the second pre-charge address block 925), values resulting from two different exposure times controlled by the pre-charge values are processed byprocessor 950 to effectively increase the dynamic range ofarray 900, as compared to the dynamic range when full resolution is used. In one example, even rows (e.g.,Row 0,Row 2,Row 4, . . . Row M−1) have a long exposure time and odd rows (e.g.,Row 1,Row 3,Row 5, . . . Row M) have a short exposure time. - As previously described, in one embodiment,
processor 950 includes a local line memory to store and synchronize the processing of lines having either the same or different row exposure times. In particular, the local memory may be used to store sets of long exposure rows and short exposure rows sampled at different times to permit aligning and combining rows with either the same or different exposure times. In one embodiment, during down-sampling,processor 950 reads the memory and combines the raw pixel data of pixels that are neighbors along the vertical dimension that are of a compatible type and that have the same exposure time for the binning process. In another embodiment, during down-sampling,processor 950 reads the memory and selects the raw pixel data of pixels that are neighbors along the vertical dimension that are of a compatible type and that have the different exposure times for the HDR combination process. - The exposure time of a pixel affects its output response. When a pixel is operated with a long exposure time, the pixel is very sensitive to received light, but tends to saturate at a low light level. In contrast, when the pixel is operated with a short exposure time, the pixel is less sensitive to light, and saturates at a higher light level as compared to operation with a short exposure time. Thus, by using different exposure times for rows that are down-sampled, a higher dynamic range is achieved as compared to down-sampling of rows with the same exposure time.
- Various extensions and modifications of the down-sampling mode with high dynamic range are contemplated. In a first scenario, any down-sampling mode with a 1:N reduction (where N is an integer value) in vertical resolution may be supported, such as 1:2, 1:3, 1:4 and so on. In this scenario, the exposure times of the rows are varied in an interleaved sequence of row exposure times that permits down-sampling to be achieved with increased dynamic range. For example, for down-sampling with a 1:3 reduction in vertical resolution, the three rows that are to be combined have a sequence of a long exposure time, medium exposure time, and short exposure time.
- In the HDR combination process, implemented within
processor 950, rows of pixels have different exposure times. By combining data from two or more pixels of rows having different exposure times, dynamic range may be increased. There are many ways of combining the data from long exposure pixels and short exposure pixels. In one way, data is selected from either the long exposure pixel or the short exposure pixel. In particular, data from pixels of long exposure time (L pixels) are selected byprocessor 950 where the L pixels are not saturated, and data from pixels of short exposure time (S pixels) are selected byprocessor 950 where the L pixels are saturated. Where the short exposure data is selected byprocessor 950, the S pixel data is normalized to match the scale of long exposure pixels. For example, -
dataN=dataO*(L_exposuretime/S_exposuretime) (2) - Where dataN is the determined normalized pixel data value, dataO is the original pixel data value, L_exposuretime represents the long exposure time, and S_exposuretime represents the exposure time of the selected pixel data value. If the pixel data value selected has the short exposure time, dataN is normalized based upon the long exposure time as shown in Equation (1). If the pixel data value selected has the long exposure time, dataN is the same as data°.
- In one embodiment, when HDR combination is not required, all rows of pixels are configured to have the same exposure time. Binning of two rows having the same exposure time achieves a higher signal to noise ratio (SNR) in the down-sampling.
- Down-sampling modes that have higher dynamic range are also compatible with a variety of color filter array formats. In color sensing arrays in the art, a color filter array pattern is applied to an array of photosensors such that output from the photosensors creates a color image. The incoming light to each photosensor is filtered such that typically each photosensor in the pixel array records only one color, such as red, green or blue. In one embodiment, for a particular color filter array pattern, the row exposure times used in down-sampling are selected such that pixels having compatible filter types are combined during down-sampling.
-
FIG. 10 illustrates exemplary binning in a down-sampling mode for a rotated modified Bayer pattern sensor 1000. Rotated modified Bayer pattern sensor 1000 may represent image array 903 of FIG. 9; a down-sampling result 1010 (the right portion of FIG. 10) illustrates binning results after down-sampling and has a conventional Bayer pattern as a result of binning. In the rotated modified Bayer pattern sensor 1000, a blue-green-red-green row 1015 of pixels is followed by a green-blue-green-red row 1020 of pixels, which is followed by a blue-green-red-green row 1030 of pixels, which is followed by a green-blue-green-red row 1035 of pixels. Within the rotated modified Bayer pattern sensor 1000, the rows of colors repeat every two rows and the columns repeat every four columns. The other modified Bayer patterns discussed above (e.g., modified CYGM, RGBE, RGBC, and RGBW) could also be rotated in a similar manner. - With the rotated modified
Bayer pattern sensor 1000, the columns have four color patterns that repeat every four columns: Blue-Green-Blue-Green (BGBG) 1050, Green-Blue-Green-Blue (GBGB) 1052, Red-Green-Red-Green (RGRG) 1054, and Green-Red-Green-Red (GRGR) 1056, and so on in repeating sequence. In this example all rows have the same exposure time. The repeating sequence is selected to be compatible with the rotated modified Bayer pattern, whose columns also repeat after every four columns. Binning of row 1015 and row 1020 generates a single BGBG row 1025 after down-sampling, in which each G combines data from two green pixels of the two rows having the same exposure time, each B combines data from two blue pixels of the two rows having the same exposure time, and so on. Similarly, binning of row 1030 and row 1035 generates a single GRGR row 1040 after down-sampling. It should be noted that this example performs both horizontal and vertical down-sampling. - According to embodiments of the present invention, the modified Bayer filter pattern and rotated modified Bayer filter pattern may be used in, but are not limited to, high resolution sensors, and low noise and high sensitivity sensors for HD video. The use of higher resolution (than needed for a final image resolution) sensors (e.g.,
image sensor 200,FIG. 2 andimage sensor 900,FIG. 9 ) and the above described binning technique improves sharpness and resolution with minimized zigzag edges in the final image. Where an image sensor (e.g.,image sensor 200,FIG. 2 andimage sensor 900,FIG. 9 ) also optionally implements down-sampling, such as binning and HDR combination to generate output, these sensors also benefit from use of the modified Bayer filter pattern and the rotated modified Bayer filter pattern for improved image quality. - Other improvements may be realized through use of
sensor 200 with a modified Bayer pattern sensor array (e.g., image array 203) andimage sensor 900 with a rotated modified Bayer pattern sensor array (e.g.,image array 903,FIG. 9 ), beyond the above described improvements in image quality. For example, cost may be reduced for system on a chip (SOC) image sensors. For sensors that output raw image data, an extra memory buffer may be required for Bayer pattern output. However, such raw image sensors may also share memory with other processing modules, such as defect pixel correction (DPC) and automatic white balance (AWB) processors. - Having described several embodiments, it will be recognized by those skilled in the art that various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the invention, for example, variations in sequence of steps and configuration and number of pixels, etc. Additionally, a number of well known processes and elements have not been described in order to avoid unnecessarily obscuring the present invention. Accordingly, the above description should not be taken as limiting the scope of the invention.
- It should thus be noted that the matter contained in the above description or shown in the accompanying drawings should be interpreted as illustrative and not in a limiting sense. The following claims are intended to cover generic and specific features described herein, as well as all statements of the scope of the present method and system which, as a matter of language, might be said to fall therebetween.
Claims (16)
1. An image sensor, comprising:
an array of light sensitive elements; and
a filter array including a plurality of first, second, third, and fourth filter elements, each filter element in optical communication with a respective light sensitive element,
each first filter element configured to transmit light of a first color,
each second filter element configured to transmit light of a second color,
each third filter element configured to transmit light of a third color,
each fourth filter element configured to transmit light of a fourth color, and
the filter array configured to include a repeating pattern of filter elements characterized by:
at least two successive rows of alternating first and second filter elements where common columns of the at least two successive rows also include alternating first and second filter elements, and
at least two additional successive rows of alternating third and fourth filter elements where common columns of the at least two additional successive rows also include alternating third and fourth filter elements.
2. The image sensor of claim 1 , the first and fourth colors being the color green, the second color being the color blue, and the third color being the color red.
3. The image sensor of claim 1 , the first color being the color cyan, the second color being the color yellow, the third color being the color green, and the fourth color being the color magenta.
4. The image sensor of claim 1 , the first color being the color red, the second color being the color green, the third color being the color blue, and the fourth color being the color emerald.
5. The image sensor of claim 1 , the first color being the color red, the second color being the color green, the third color being the color blue, and the fourth color being the color white.
6. The image sensor of claim 1 , the first color being the color red, the second color being the color green, the third color being the color blue, and the fourth color being at least all colors of visible light.
7. A method for down-sampling an image produced by an image sensor including an array of light sensitive elements, comprising:
filtering light incident on the image sensor such that the image sensor receives light having a repeating pattern characterized by:
light sensitive elements in at least two successive rows alternately receiving light having a first color and a second color, and light sensitive elements in common columns of the at least two successive rows alternately receiving light having the first color and the second color, and
light sensitive elements in at least two additional successive rows alternately receiving light having a third color and a fourth color, and light sensitive elements in common columns of the at least two additional successive rows alternately receiving light having the third color and the fourth color;
sampling output values of the light sensitive elements; and
combining output values of pairs of light sensitive elements receiving light of a common color and from successive rows of the array to generate a down-sampled image.
8. The method of claim 7 , the first and fourth colors being the color green, the second color being the color blue, and the third color being the color red.
9. The method of claim 7 , further comprising configuring the light sensitive elements to have a common exposure time, wherein the down-sampled image has an increased signal-to-noise ratio.
10. The method of claim 7 , further comprising configuring the light sensitive elements such that light sensitive elements of successive rows alternately have long and short exposure times, wherein the down-sampled image has increased dynamic range.
11. The method of claim 10 , the step of combining comprising, for each pair of light sensitive elements:
selecting a value from a light sensitive element of the pair having a long exposure time when the light sensitive element having a long exposure time has not saturated; and
selecting a value from a light sensitive element of the pair having a short exposure time when the light sensitive element having a long exposure time has saturated, the value being normalized to the value of the light sensitive element of the pair having the long exposure time.
12. An image sensor, comprising:
an array of light sensitive elements; and
a filter array including a plurality of red, green, and blue filter elements, each filter element in optical communication with a respective light sensitive element,
each red filter element configured to transmit only red colored light,
each green filter element configured to transmit only green colored light,
each blue filter element configured to transmit only blue colored light, and
the filter array arranged such that successive columns of the filter array have alternating first and second configurations, the first configuration characterized by a repeating pattern of successive blue, green, red, and green filter elements, the second configuration characterized by a repeating pattern of successive green, blue, green, and red filter elements.
13. A method for down-sampling an image produced by an image sensor including an array of light sensitive elements, comprising:
filtering light incident on the image sensor such that successive columns of the array of light sensitive elements alternately receive light having a first pattern and a second pattern, the first pattern characterized by each four successive light sensitive elements in a column respectively receiving blue, green, red, and green colored light, and the second pattern characterized by each four successive light sensitive elements in a column respectively receiving green, blue, green, and red colored light;
sampling output values of the light sensitive elements; and
combining output values of pairs of light sensitive elements receiving light of a common color and from successive rows of the array to generate a down-sampled image.
14. The method of claim 13 , further comprising configuring the light sensitive elements to have a common exposure time, wherein the down-sampled image has an increased signal-to-noise ratio.
15. The method of claim 13, further comprising configuring the light sensitive elements such that light sensitive elements of successive rows alternately have long and short exposure times, wherein the down-sampled image has an increased dynamic range.
16. The method of claim 15 , the step of combining comprising, for each pair of light sensitive elements:
selecting a value from a light sensitive element of the pair having a long exposure time when the light sensitive element having a long exposure time has not saturated; and
selecting a value from a light sensitive element of the pair having a short exposure time when the light sensitive element having a long exposure time has saturated, the value being normalized to the value of the light sensitive element of the pair having the long exposure time.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/148,078 US20140118572A1 (en) | 2010-05-14 | 2014-01-06 | Alternative Color Image Array And Associated Methods |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US33488610P | 2010-05-14 | 2010-05-14 | |
US13/035,785 US8624997B2 (en) | 2010-05-14 | 2011-02-25 | Alternative color image array and associated methods |
US14/148,078 US20140118572A1 (en) | 2010-05-14 | 2014-01-06 | Alternative Color Image Array And Associated Methods |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/035,785 Continuation US8624997B2 (en) | 2010-05-14 | 2011-02-25 | Alternative color image array and associated methods |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140118572A1 true US20140118572A1 (en) | 2014-05-01 |
Family
ID=44914622
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/035,785 Active 2031-08-07 US8624997B2 (en) | 2010-05-14 | 2011-02-25 | Alternative color image array and associated methods |
US14/148,078 Abandoned US20140118572A1 (en) | 2010-05-14 | 2014-01-06 | Alternative Color Image Array And Associated Methods |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/035,785 Active 2031-08-07 US8624997B2 (en) | 2010-05-14 | 2011-02-25 | Alternative color image array and associated methods |
Country Status (3)
Country | Link |
---|---|
US (2) | US8624997B2 (en) |
TW (1) | TW201141209A (en) |
WO (1) | WO2011142774A1 (en) |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
KR20230008496A (en) * | 2021-07-07 | 2023-01-16 | 삼성전자주식회사 | Image sensor and operating method thereof |
US11350048B1 (en) * | 2021-07-25 | 2022-05-31 | Shenzhen GOODIX Technology Co., Ltd. | Luminance-adaptive processing of hexa-deca RGBW color filter arrays in CMOS image sensors |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7072508B2 (en) * | 2001-01-10 | 2006-07-04 | Xerox Corporation | Document optimized reconstruction of color filter array images |
JP3902525B2 (en) * | 2002-09-05 | 2007-04-11 | 三洋電機株式会社 | Image signal processing device |
US7430011B2 (en) | 2003-01-22 | 2008-09-30 | Omnivision Technologies, Inc. | Image sensor having dual automatic exposure control |
US7554588B2 (en) | 2005-02-01 | 2009-06-30 | TransChip Israel, Ltd. | Dual exposure for image sensor |
US7829832B2 (en) | 2005-08-30 | 2010-11-09 | Aptina Imaging Corporation | Method for operating a pixel cell using multiple pulses to a transistor transfer gate |
JP4852321B2 (en) | 2006-02-28 | 2012-01-11 | パナソニック株式会社 | Imaging device |
KR100932217B1 (en) * | 2007-08-09 | 2009-12-16 | 연세대학교 산학협력단 | Color interpolation method and device |
KR101473720B1 (en) * | 2008-06-26 | 2014-12-18 | 삼성전자주식회사 | Color filter array and method of fabricating the same, and image pick-up device of the same |
2010
- 2010-09-17 WO PCT/US2010/049368 patent/WO2011142774A1/en active Application Filing
- 2010-10-01 TW TW099133622A patent/TW201141209A/en unknown

2011
- 2011-02-25 US US13/035,785 patent/US8624997B2/en active Active

2014
- 2014-01-06 US US14/148,078 patent/US20140118572A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7479998B2 (en) * | 2001-01-09 | 2009-01-20 | Sony Corporation | Image pickup and conversion apparatus |
US20040080652A1 (en) * | 2002-07-25 | 2004-04-29 | Shinichi Nonaka | Electric camera |
US20070273785A1 (en) * | 2004-11-01 | 2007-11-29 | Masahiro Ogawa | Image Sensor |
US20070285526A1 (en) * | 2006-05-31 | 2007-12-13 | Ess Technology, Inc. | CMOS imager system with interleaved readout for providing an image with increased dynamic range |
US20090059048A1 (en) * | 2007-08-31 | 2009-03-05 | Omnivision Technologies, Inc. | Image sensor with high dynamic range in down-sampling mode |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130057736A1 (en) * | 2011-03-30 | 2013-03-07 | Fujifilm Corporation | Driving method of solid-state imaging device, solid-state imaging device, and imaging apparatus |
US8830364B2 (en) * | 2011-03-30 | 2014-09-09 | Fujifilm Corporation | Driving method of solid-state imaging device, solid-state imaging device, and imaging apparatus |
US20150195463A1 (en) * | 2014-01-08 | 2015-07-09 | Kyoung-Min Koh | Image sensor |
US9584741B2 (en) * | 2014-01-08 | 2017-02-28 | Samsung Electronics Co., Ltd. | Image sensor having multiple operation modes |
US10205905B2 (en) | 2014-01-08 | 2019-02-12 | Samsung Electronics Co., Ltd. | Image sensor having multiple operation modes |
US20160358958A1 (en) * | 2015-06-08 | 2016-12-08 | Ricoh Company, Ltd. | Solid-state imaging device |
US10446595B2 (en) * | 2015-06-08 | 2019-10-15 | Ricoh Company, Ltd. | Solid-state imaging device |
US20190393252A1 (en) * | 2015-06-08 | 2019-12-26 | Ricoh Company, Ltd. | Solid-state imaging device |
US10868057B2 (en) * | 2015-06-08 | 2020-12-15 | Ricoh Company, Ltd. | Solid-state imaging device |
US11665439B2 (en) | 2020-06-01 | 2023-05-30 | Samsung Electronics Co., Ltd. | Image sensor, a mobile device including the same and a method of controlling sensing sensitivity of an image sensor |
Also Published As
Publication number | Publication date |
---|---|
TW201141209A (en) | 2011-11-16 |
US20110279705A1 (en) | 2011-11-17 |
US8624997B2 (en) | 2014-01-07 |
WO2011142774A1 (en) | 2011-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8624997B2 (en) | | Alternative color image array and associated methods |
US8022994B2 (en) | | Image sensor with high dynamic range in down-sampling mode |
US8125543B2 (en) | | Solid-state imaging device and imaging apparatus with color correction based on light sensitivity detection |
US10021358B2 (en) | | Imaging apparatus, imaging system, and signal processing method |
US7745779B2 (en) | | Color pixel arrays having common color filters for multiple adjacent pixels for use in CMOS imagers |
JP4846608B2 (en) | | Solid-state imaging device |
US7777804B2 (en) | | High dynamic range sensor with reduced line memory for color interpolation |
US7593047B2 (en) | | CMOS image sensor for suppressing degradation of spatial resolution and generating compressed image signals |
US8013914B2 (en) | | Imaging apparatus including noise suppression circuit |
WO2013145487A1 (en) | | Image processing device, image-capturing element, image processing method, and program |
US20080180556A1 (en) | | Solid-state image pickup device |
US10477165B2 (en) | | Solid-state imaging apparatus, driving method therefor, and imaging system |
US8497925B2 (en) | | Solid-state imaging device, color filter arrangement method therefor and image recording apparatus |
US8830364B2 (en) | | Driving method of solid-state imaging device, solid-state imaging device, and imaging apparatus |
TW201351986A (en) | | Imaging device and imaging method, electronic apparatus, as well as program |
JP4600315B2 (en) | | Camera device control method and camera device using the same |
US8471936B2 (en) | | Imaging device and signal processing method |
TWI491252B (en) | | Image sensor and method for down-sampling an image produced by image sensor |
EP2680590B1 (en) | | Color image pick-up element |
WO2006064587A1 (en) | | Multi-segment read ccd correcting apparatus and method |
WO2006061923A1 (en) | | Correction approximating straight line group information generating method of multi-divided reading ccd, and correction processing device manufacturing method of multi-divided reading ccd |
JP2009055151A (en) | | Color solid-state imaging apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: OMNIVISION TECHNOLOGIES, INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KUANG, JIANGTAO; WU, DONGHUI; SHAN, JIZHANG; REEL/FRAME: 032620/0917; Effective date: 20110221 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |