WO2012039180A1 - Imaging device and imaging apparatus - Google Patents
Imaging device and imaging apparatus
- Publication number
- WO2012039180A1 (application PCT/JP2011/065314)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- microlens
- image
- imaging device
- photoelectric conversion
- shooting mode
- Prior art date
Classifications
- H04N13/286 — Image signal generators having separate monoscopic and stereoscopic modes
- H04N13/218 — Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
- H04N13/257 — Image signal generators, colour aspects
- H04N13/296 — Image signal generators, synchronisation or control thereof
- H04N25/134 — Arrangement of colour filter arrays [CFA] based on three different wavelength filter elements
- H10F39/182 — Colour image sensors (CMOS/photodiode-array image sensors)
- H10F39/8023 — Disposition of the elements in pixels
- H10F39/8063 — Microlenses
Definitions
- The present invention relates to an imaging device and an imaging apparatus, and more particularly to an imaging device and an imaging apparatus capable of capturing both two-dimensional images (2D images) and three-dimensional images (3D images).
- Patent Document 1 describes generating a plurality of parallax images having different parallaxes from the plurality of pixels to which one microlens is assigned.
- A stereoscopic video imaging apparatus has also been proposed (Patent Document 2) in which a lens-array camera, having a plurality of lenses arranged in an array, and a normal camera are arranged side by side in the horizontal direction; the lens-array camera captures a plurality of low-resolution parallax images, while the normal camera captures a high-resolution video whose parallax vector matches them.
- The video shot by this stereoscopic video imaging apparatus thus comprises a plurality of low-resolution videos with fine parallax intervals and one high-resolution video with a large parallax interval. By interpolating between them in both parallax and resolution, a high-resolution multi-parallax video can be obtained.
- In the arrangement described in Patent Document 1, a plurality of two-dimensionally arranged microlenses (a microlens array) is placed on the imaging plane of the photographing lens, an image sensor is placed at the imaging position of the microlens array, and a light beam enters each pixel of the image sensor through the microlens array.
- The device of Patent Document 1 can thus acquire a plurality of parallax images having different parallaxes from the plurality of pixels to which one microlens is assigned, but it cannot obtain a high-resolution 2D image.
- Although Patent Document 1 mentions that color filters may be arranged two-dimensionally in units of imaging pixels (paragraph [0022] of Patent Document 1), it contains no description of arranging color filters of the same color over the plurality of pixels assigned to one microlens.
- The stereoscopic video imaging apparatus described in Patent Document 2 requires two cameras, a lens-array camera and a normal camera, so the apparatus becomes large and costs increase.
- The present invention has been made in view of these circumstances, and its purpose is to provide an imaging device and an imaging apparatus that can capture high-resolution 2D images as well as 3D images, at low cost and in a small size.
- To achieve this, an imaging device according to the present invention includes a plurality of photoelectric conversion elements arranged in the row direction and the column direction on a semiconductor substrate, above which two kinds of microlens portions are mixed: a 1-pixel 1-microlens portion, in which one microlens is disposed over one photoelectric conversion element (one pixel), and an n×n-pixel 1-microlens portion, in which one microlens is disposed over n×n vertically and horizontally adjacent photoelectric conversion elements (n×n pixels). A high-resolution two-dimensional image can be generated from the first output signal produced by the 1-pixel 1-microlens portions, whose pixel pitch is small, while a three-dimensional image can be generated from the second output signal produced by the n×n-pixel 1-microlens portions, from which an n×n-viewpoint parallax image is obtained.
- A color filter of one of a plurality of colors is disposed above each photoelectric conversion element, and the n×n photoelectric conversion elements corresponding to one second microlens are all provided with color filters of the same color. Making the filter color uniform within each n×n-pixel 1-microlens portion allows the n×n pixels to be added together when necessary.
- Preferably, the number of photoelectric conversion elements provided with the first microlenses is the same as the number of photoelectric conversion elements provided with the second microlenses.
- In one embodiment, with 4×4 photoelectric conversion elements forming one block, first regions in which sixteen first microlenses are arranged per block and second regions in which four second microlenses are arranged per block are laid out in a checkered pattern. The color filter array can be a Bayer array.
- In another embodiment, with 2×2 photoelectric conversion elements forming one block, first regions in which four first microlenses are arranged per block and second regions in which one second microlens is arranged per block are laid out in a checkered pattern.
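The checkered layout of first and second regions can be pictured with a small sketch (not from the patent; the block size parameter and the "A"/"B" labels are illustrative):

```python
# Sketch: build a map marking which photodiodes belong to 1-pixel
# 1-microlens ("A") regions and which to n x n-pixel 1-microlens ("B")
# regions, with fixed-size pixel blocks laid out in a checkerboard.
# block=4 corresponds to the first embodiment (16 small or 4 large
# microlenses per block); block=2 would correspond to the second variant.

def region_map(rows, cols, block=4):
    """Return a rows x cols grid of 'A'/'B' labels, one per photodiode."""
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Blocks alternate A/B like a checkerboard.
            if ((r // block) + (c // block)) % 2 == 0:
                row.append("A")  # region of small (first) microlenses
            else:
                row.append("B")  # region of large (second) microlenses
        grid.append(row)
    return grid

m = region_map(8, 8)
# The top-left 4x4 block is all "A"; the block to its right is all "B".
```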
- An imaging apparatus according to the present invention includes: a single photographing optical system; the above imaging device, on which a subject image is formed via the photographing optical system; a shooting mode selection unit that switches between a 2D shooting mode for capturing two-dimensional images and a 3D shooting mode for capturing three-dimensional images; a first image generation unit that, when the 2D shooting mode is selected, generates a two-dimensional image based on the first output signal output from the photoelectric conversion elements corresponding to the first microlenses of the imaging device; a second image generation unit that, when the 3D shooting mode is selected, generates a three-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device; and a recording unit that records the generated two-dimensional or three-dimensional image on a recording medium.
- That is, the signal used is switched between the first output signal from the 1-pixel 1-microlens portions and the second output signal from the 4-pixel 1-microlens portions according to the selected mode: when the 2D shooting mode is selected, a high-resolution two-dimensional image is generated based on the first output signal, and when the 3D shooting mode is selected, a three-dimensional image (a plurality of parallax images) is generated based on the second output signal.
- Another imaging apparatus according to the present invention includes: a single photographing optical system; the imaging device; a shooting mode selection unit that switches between the 2D and 3D shooting modes; a determination unit that determines whether an image captured through the photographing optical system and the imaging device contains many high-frequency components; a first image generation unit that, when the 2D shooting mode is selected, generates a two-dimensional image based on the first output signal if the determination unit judges the image to contain many high-frequency components, and based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses if it does not; a second image generation unit that generates a three-dimensional image based on the second output signal; and a recording unit that records the generated two-dimensional or three-dimensional image on a recording medium.
- In other words, when high-resolution shooting is required (an image containing many high-frequency components), the high-resolution two-dimensional image is generated based on the first output signal; when it is not (an image that does not contain many high-frequency components), the two-dimensional image is generated based on the second output signal. When a two-dimensional image is generated based on the second output signal, the four pixels corresponding to each microlens are added together to form one pixel.
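The four-pixel addition described here can be sketched as follows (an illustrative reading of the text, not the patent's implementation; plain summation is assumed):

```python
# Sketch: when a 2D image is generated from the 4-pixel 1-microlens
# portions, the four photodiodes under each large microlens are summed
# into a single pixel value (2x2 pixel addition).

def bin_2x2(raw):
    """raw: 2D list of photodiode values with even dimensions."""
    h, w = len(raw), len(raw[0])
    out = []
    for r in range(0, h, 2):
        row = []
        for c in range(0, w, 2):
            # One large microlens covers the 2x2 cell at (r, c).
            row.append(raw[r][c] + raw[r][c + 1]
                       + raw[r + 1][c] + raw[r + 1][c + 1])
        out.append(row)
    return out

raw = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
binned = bin_2x2(raw)  # each output pixel is the sum of one 2x2 cell
```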
- The imaging apparatus may further include a brightness detection unit that detects the brightness of the subject. In that case, when the 2D shooting mode is selected, the first image generation unit generates the two-dimensional image based on the first output signal only if the determination unit judges the image to contain many high-frequency components and the detected subject brightness exceeds a predetermined threshold; if the image does not contain many high-frequency components, or if the detected brightness is at or below the threshold, the two-dimensional image is generated based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses.
- That is, a high-resolution two-dimensional image is generated based on the first output signal only when the 2D shooting mode is selected, high-resolution shooting is required (an image containing many high-frequency components), and the subject brightness exceeds the predetermined threshold; under all other shooting conditions the two-dimensional image is generated based on the second output signal. When the subject brightness is at or below the threshold, the second output signal is used even for an image containing many high-frequency components, because pixel addition gives a better signal in low light.
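The selection logic above can be summarised in a short sketch (the threshold names and values are placeholders, not values from the patent):

```python
# Sketch of the 2D-mode source selection: use the high-resolution
# (1-pixel 1-microlens) signal only when the scene is detailed enough
# AND bright enough; otherwise fall back to the pixel-added
# (4-pixel 1-microlens) signal, which has better SNR when dark.

def select_2d_source(high_freq_ratio, subject_ev,
                     freq_threshold=0.3, ev_threshold=8.0):
    """Return 'first' (1-pixel signal) or 'second' (4-pixel signal)."""
    if high_freq_ratio > freq_threshold and subject_ev > ev_threshold:
        return "first"   # high-resolution two-dimensional image
    return "second"      # pixel-added two-dimensional image

choice = select_2d_source(0.5, 10.0)  # detailed and bright scene
```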
- Still another imaging apparatus according to the present invention includes: a single photographing optical system; the imaging device; a shooting mode selection unit that switches between the 2D shooting mode for capturing two-dimensional images and the 3D shooting mode for capturing three-dimensional images; a brightness detection unit that detects the brightness of the subject; a first image generation unit that, when the 2D shooting mode is selected, generates a two-dimensional image based on the first output signal if the detected subject brightness exceeds a predetermined threshold, and based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses otherwise; a second image generation unit that, when the 3D shooting mode is selected, generates a three-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses; and a recording unit that records the generated two-dimensional or three-dimensional image on a recording medium.
- That is, when the subject is sufficiently bright, a high-resolution two-dimensional image is generated based on the first output signal; otherwise, a two-dimensional image is generated based on the second output signal with n×n pixel addition, so that an adequate output signal is obtained even when the subject is dark.
- Yet another imaging apparatus according to the present invention includes: a single photographing optical system; the imaging device; a shooting mode selection unit that switches between the 2D and 3D shooting modes; a determination unit that divides one screen into N×M areas and determines, for each divided area, whether the image captured through the photographing optical system and the imaging device contains many high-frequency components; a first image generation unit that, when the 2D shooting mode is selected, acquires for each divided area the first output signal (from the photoelectric conversion elements corresponding to the first microlenses) if that area contains many high-frequency components, or the second output signal (from the photoelectric conversion elements corresponding to the second microlenses) if it does not, and generates a two-dimensional image based on the acquired first and second output signals; a second image generation unit that, when the 3D shooting mode is selected, generates a three-dimensional image based on the second output signal; and a recording unit that records the generated two-dimensional or three-dimensional image on a recording medium.
- That is, for each of the N×M divided areas of one screen, the appropriate signal is selectively acquired from the first and second output signals according to whether that area was judged to contain many high-frequency components.
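This area-by-area selection can be sketched as follows (the N×M high-frequency classification itself is assumed to be done elsewhere; the tile contents are placeholders):

```python
# Sketch: assemble a 2D image tile by tile, taking each N x M area
# from the high-resolution (first) signal when it contains many
# high-frequency components, and from the pixel-added (second) signal
# when it does not.

def assemble_2d(tiles_high_freq, first_signal, second_signal):
    """tiles_high_freq: N x M booleans; *_signal: N x M tile images."""
    out = []
    for n, row in enumerate(tiles_high_freq):
        out_row = []
        for m, detailed in enumerate(row):
            out_row.append(first_signal[n][m] if detailed
                           else second_signal[n][m])
        out.append(out_row)
    return out

hf = [[True, False],
      [False, True]]
first = [["F00", "F01"], ["F10", "F11"]]    # placeholder tile data
second = [["S00", "S01"], ["S10", "S11"]]
picture = assemble_2d(hf, first, second)
```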
- The second image generation unit generates, based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device, either a four-viewpoint parallax image (up, down, left, and right) or a two-viewpoint parallax image (up/down or left/right).
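Assuming each large microlens covers 2×2 pupil-divided photodiodes, the four-viewpoint and two-viewpoint extraction might be sketched like this (an illustrative reading; forming two viewpoints by averaging vertically paired viewpoints is an assumption, not stated in the patent):

```python
# Sketch: under each large microlens the 2x2 photodiodes receive
# pupil-divided light, so collecting the same sub-position from every
# microlens yields one of four viewpoint images (up-left, up-right,
# down-left, down-right). Left/right two-viewpoint images are then
# formed here by averaging vertically paired viewpoints.

def extract_viewpoints(raw):
    """raw: 2D list whose 2x2 cells each sit under one large microlens."""
    views = {}
    for dr, dc, name in ((0, 0, "UL"), (0, 1, "UR"),
                         (1, 0, "DL"), (1, 1, "DR")):
        views[name] = [row[dc::2] for row in raw[dr::2]]
    left = [[(a + b) / 2 for a, b in zip(r1, r2)]
            for r1, r2 in zip(views["UL"], views["DL"])]
    right = [[(a + b) / 2 for a, b in zip(r1, r2)]
             for r1, r2 in zip(views["UR"], views["DR"])]
    return views, left, right

raw = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
views, left, right = extract_viewpoints(raw)
```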
- According to the present invention, high-resolution 2D images and 3D images can both be captured with a novel imaging device in which 1-pixel 1-microlens portions and 4-pixel 1-microlens portions are mixed, and the apparatus can be made cheaper and smaller.
- FIG. 1 is a plan view of the main part of a first embodiment of the imaging device according to the present invention.
- Figures showing the 1-pixel 1-microlens portion and the 4-pixel 1-microlens portion of the imaging device.
- FIG. 3 is a plan view of the main part of a second embodiment of the imaging device according to the present invention, and FIG. 4 is a block diagram showing an embodiment of the imaging apparatus according to the present invention.
- Figures showing the pixels of the 4-pixel 1-microlens portion and a method of adding those pixels.
- A block diagram showing the internal structure of the digital signal processing unit of the imaging apparatus according to the present invention.
- Flowcharts showing the operation of the imaging apparatus of the first and second embodiments of the present invention.
- FIG. 1 is a main part plan view showing a first embodiment of an imaging device according to the present invention.
- The imaging device 1 is a CCD or CMOS color image sensor. It is mainly composed of a plurality of photoelectric conversion elements (photodiodes PD; see FIGS. 2A and 2B) arranged in the row direction and the column direction on a semiconductor substrate, two kinds of microlenses L1 and L2 (small and large), and color filters of a plurality of colors (the three primary colors red (R), green (G), and blue (B)).
- One small microlens L1 is disposed over each single photodiode PD, and one large microlens L2 is disposed over each group of four vertically and horizontally adjacent photodiodes PD.
- A portion where one microlens L1 is disposed over one photodiode PD is referred to as a 1-pixel 1-microlens portion 1A, and a portion where one microlens L2 is disposed over four photodiodes PD (four pixels) is referred to as a 4-pixel 1-microlens portion 1B.
- The imaging device 1 is provided with a mixture of 1-pixel 1-microlens portions 1A and 4-pixel 1-microlens portions 1B.
- A color filter of any one of R, G, and B is disposed on each pixel of a 1-pixel 1-microlens portion 1A.
- A color filter of a single color among R, G, and B is disposed on each 4-pixel 1-microlens portion 1B; that is, color filters of the same color are disposed on all four photodiodes PD of a 4-pixel 1-microlens portion 1B.
- In the 1-pixel 1-microlens portions 1A, color filters are arranged in the order RGRG... on the odd lines l1, l3, l5, l7, ..., and in the order GBGB... on the even lines l2, l4, l6, l8, ....
- In the 4-pixel 1-microlens portions 1B, color filters are arranged in the order RGRG... on lines l1, l2, l5, l6, ..., and in the order GBGB... on lines l3, l4, l7, l8, ....
- The imaging device 1 takes 4×4 pixels as one block: first regions, in which sixteen 1-pixel 1-microlens portions 1A are disposed per block, and second regions, in which four 4-pixel 1-microlens portions 1B are disposed per block, are arranged in a checkered pattern, with the R, G, and B color filters of the portions 1A and 1B arrayed as described above.
- The microlens L1 of a 1-pixel 1-microlens portion 1A condenses the light flux onto the light-receiving surface of one photodiode PD.
- The microlens L2 of a 4-pixel 1-microlens portion 1B condenses the light flux onto the light-receiving surfaces of four photodiodes PD (only two of which are shown in FIG. 2B), so that pupil-divided light, restricted in its direction of incidence, enters each of the four photodiodes PD.
- A high-resolution 2D image can be generated based on the output signal of the 1-pixel 1-microlens portions 1A, and a 3D image can be generated based on the output signal of the 4-pixel 1-microlens portions 1B. The methods of generating the 2D image and the 3D image are described later.
- FIG. 3 is a plan view of an essential part showing a second embodiment of the imaging device according to the present invention.
- This imaging device 1' differs from the imaging device 1 shown in FIG. 1 only in the arrangement of the 1-pixel 1-microlens portions 1A and the 4-pixel 1-microlens portions 1B.
- In the imaging device 1', the 4-pixel 1-microlens portions 1B are arranged in a checkered pattern, with the 1-pixel 1-microlens portions 1A arranged between them.
- The color filters of the 1-pixel 1-microlens portions 1A form a Bayer array, while the color filters of the 4-pixel 1-microlens portions 1B alternate between G lines and RB lines.
- The arrangement of the 1-pixel 1-microlens portions 1A and the 4-pixel 1-microlens portions 1B is not limited to the embodiments shown in FIGS. 1 and 3; for example, they may be arranged alternately in stripes.
- Nor do the numbers of photodiodes PD in the 1-pixel 1-microlens portions 1A and in the 4-pixel 1-microlens portions 1B have to be the same; it is sufficient that both high-resolution 2D images and 3D images can be acquired.
- The color filters are not limited to R, G, and B color filters; complementary color filters such as yellow (Y), magenta (M), and cyan (C) may also be used.
- FIG. 4 is a block diagram showing an embodiment of the imaging apparatus 10 according to the present invention.
- The imaging apparatus 10 incorporates the imaging device 1 shown in FIG. 1 and can capture both 2D images and 3D images.
- The operation of the entire apparatus is centrally controlled by a central processing unit (CPU) 40.
- The imaging apparatus 10 is provided with an operation unit 38 that includes a shutter button, a mode dial, a playback button, a MENU/OK key, a cross key, and a BACK key.
- A signal from the operation unit 38 is input to the CPU 40, and the CPU 40 controls each circuit of the imaging apparatus 10 based on the input signal.
- The shutter button is an operation button for inputting an instruction to start shooting, and is configured as a two-stage switch with an S1 switch that turns on when half-pressed and an S2 switch that turns on when fully pressed.
- The mode dial is a selection unit for selecting the 2D shooting mode, the 3D shooting mode, an auto shooting mode, a manual shooting mode, scene positions such as person, landscape, and night view, a macro mode, a moving image mode, and the parallax-priority shooting mode according to the present invention.
- The playback button is a button for switching to a playback mode in which recorded still images or moving images, whether plural-parallax images (3D images) or planar images (2D images), are displayed on the liquid crystal monitor 30.
- The MENU/OK key is an operation key that serves both as a menu button for instructing that a menu be displayed on the screen of the liquid crystal monitor 30 and as an OK button for confirming and executing the selected content.
- The cross key is an operation unit for inputting instructions in four directions (up, down, left, and right) and functions as cursor-movement buttons for selecting items from the menu screen and for selecting various setting items from each menu.
- The up/down keys of the cross key also function as a zoom switch during shooting and as a playback zoom switch in playback mode, while the left/right keys function as frame-advance (forward/reverse) buttons in playback mode.
- The BACK key is used to delete a desired item such as a selection, cancel an instruction, or return to the previous operation state.
- Image light representing the subject is formed on the light-receiving surface of the imaging device 1 through the single photographing optical system (zoom lens) 12 and the diaphragm 14.
- The photographing optical system 12 is driven by a lens driving unit 36 controlled by the CPU 40, which performs focus control, zoom control, and the like.
- The diaphragm 14 consists of, for example, five diaphragm blades and is driven by a diaphragm drive unit 34 controlled by the CPU 40; its aperture value is controlled in six steps in 1 AV increments, from F1.4 to F11.
- The CPU 40 controls the diaphragm 14 via the diaphragm drive unit 34, and via the device control unit 32 controls the charge accumulation time (shutter speed) in the imaging device 1 and the reading of image signals from the imaging device 1.
- The signal charge accumulated in the imaging device 1 is read out as a voltage signal corresponding to the signal charge, based on a readout signal applied from the device control unit 32.
- The voltage signal read from the imaging device 1 is applied to the analog signal processing unit 18, where the R, G, and B signals of each pixel are sampled and held and amplified by a gain designated by the CPU 40 (corresponding to the ISO sensitivity), and then supplied to the A/D converter 20.
- The A/D converter 20 converts the sequentially input R, G, and B signals into digital R, G, and B signals and outputs them to the image input controller 22.
- The digital signal processing unit 24 performs predetermined signal processing on the digital image signals input via the image input controller 22, including offset processing, gain control processing (white balance correction and sensitivity correction), gamma correction, synchronization (demosaicing), YC processing, and sharpness correction.
- Reference numeral 46 denotes an EEPROM in which a camera control program, defect information of the imaging device 1, various parameters and tables used for image processing, and program diagrams (an ordinary program diagram that selects aperture priority, shutter-speed priority, or a combination thereof according to subject brightness, and a parallax-priority program diagram) are stored.
- Because the parallax of the four-viewpoint parallax image obtained from the output signal of the 4-pixel 1-microlens portions 1B changes with the size of the aperture opening, control is not limited to the parallax-priority program diagram; during 3D shooting, the aperture may instead be controlled so that it does not become smaller than a fixed opening.
- The digital signal processing unit 24 performs image processing according to the selected 2D or 3D shooting mode, and in the 2D shooting mode also performs image processing according to the subject and the shooting conditions. Details of the image processing in the digital signal processing unit 24 are described later.
- When the 2D shooting mode is selected, the 2D image data processed by the digital signal processing unit 24 is output to the VRAM 50; when the 3D shooting mode is selected, the 3D image data processed by the digital signal processing unit 24 is output to the VRAM 50.
- The VRAM 50 includes an A area and a B area, each storing image data representing one frame. Image data representing one frame is rewritten alternately in the A area and the B area, and the written image data is read from whichever of the two areas is not currently being rewritten.
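The A/B double buffering can be pictured with a minimal sketch (the class and method names are hypothetical):

```python
# Sketch of the A/B (ping-pong) buffering described above: one area is
# written with the newest frame while the other area is read for display.

class PingPongVRAM:
    def __init__(self):
        self.areas = [None, None]  # area A, area B
        self.write_idx = 0

    def write_frame(self, frame):
        self.areas[self.write_idx] = frame
        self.write_idx ^= 1  # the next frame goes to the other area

    def read_frame(self):
        # Read from the area NOT about to be rewritten,
        # i.e. the one that holds the most recent complete frame.
        return self.areas[self.write_idx ^ 1]
```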
- the image data read from the VRAM 50 is encoded by the video encoder 28 and output to a stereoscopic display liquid crystal monitor 30 provided on the back of the camera, whereby a 2D/3D subject image (live view image) is displayed on the display screen of the liquid crystal monitor 30.
- the liquid crystal monitor 30 is a stereoscopic display unit that can display stereoscopic images (a left viewpoint image and a right viewpoint image) as directional images each having a predetermined directivity by means of a parallax barrier. However, the stereoscopic display is not limited to this; the left viewpoint image and the right viewpoint image may be viewed separately by using a lenticular lens, or by wearing dedicated glasses such as polarized glasses or liquid crystal shutter glasses.
- when the shutter button is half-pressed, the AF operation and the AE operation are started, and the focus lens in the photographing optical system 12 is controlled via the lens driving unit 36 so as to come to the in-focus position.
- the image data output from the A / D converter 20 when the shutter button is half-pressed is taken into the AE detection unit 44.
- the AE detection unit 44 integrates the G signals of the entire screen or integrates the G signals that are weighted differently in the central portion and the peripheral portion of the screen, and outputs the integrated value to the CPU 40.
- the CPU 40 calculates the brightness of the subject (shooting EV value) from the integrated value input from the AE detection unit 44, and sets the aperture value of the diaphragm 14 and the electronic shutter (shutter speed) of the imaging device 1 based on the shooting EV value. It is determined according to a predetermined program diagram.
- a program diagram designs shooting (exposure) conditions, that is, combinations of aperture value and shutter speed, or combinations of these and shooting sensitivity (ISO sensitivity), according to the brightness of the subject. Therefore, by shooting under the shooting conditions determined according to the program diagram, an image with appropriate brightness can be taken regardless of the brightness of the subject.
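The mapping from subject brightness to exposure settings described above can be sketched as a simple table lookup. The EV breakpoints and aperture/shutter pairs below are hypothetical illustrations, not values taken from this specification:

```python
def program_diagram(shooting_ev):
    """Pick an (aperture F-number, shutter speed) pair for a shooting EV.

    The table is a hypothetical 'ordinary' program diagram: each entry is
    (minimum EV, F-number, shutter seconds), sorted from brightest down.
    """
    table = [
        (13.0, 8.0, 1 / 500),
        (11.0, 5.6, 1 / 250),
        (9.0, 4.0, 1 / 125),
        (7.0, 2.8, 1 / 60),
    ]
    for min_ev, f_number, shutter in table:
        if shooting_ev >= min_ev:
            return f_number, shutter
    # Below the table: open the aperture fully and use the slowest shutter.
    return 2.8, 1 / 30
```

For example, `program_diagram(12.0)` falls in the EV 11 band and returns F5.6 at 1/250 s; any EV below 7 falls through to the wide-open fallback.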
- the CPU 40 controls the aperture 14 via the aperture drive unit 34 based on the aperture value determined according to the program diagram, and controls the charge accumulation time in the imaging device 1 via the device control unit 32 based on the determined shutter speed.
- the AF processing unit 42 is a part that performs contrast AF processing or phase difference AF processing.
- in the contrast AF processing, for example, a high-frequency component of the image data in a predetermined focus area is extracted from the image data corresponding to the 1-pixel 1-microlens units 1A, and the high-frequency component is integrated to calculate an AF evaluation value indicating the in-focus state.
- AF control is performed by controlling the focus lens in the photographic optical system 12 so that the AF evaluation value is maximized.
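The contrast AF evaluation described above can be sketched as follows. This assumes the high-frequency component is approximated by absolute differences between horizontally adjacent pixels, which is one common choice rather than the method fixed by this specification:

```python
def af_evaluation(focus_area):
    """AF evaluation value: integrate a high-frequency component of the
    focus area, approximated here by absolute horizontal differences."""
    value = 0
    for row in focus_area:
        for a, b in zip(row, row[1:]):
            value += abs(a - b)
    return value


def contrast_af(images_by_lens_position):
    """Return the lens position whose focus-area image maximizes the
    AF evaluation value (the in-focus position)."""
    return max(images_by_lens_position,
               key=lambda pos: af_evaluation(images_by_lens_position[pos]))
```

A sharp image yields larger adjacent-pixel differences than a blurred one, so scanning the focus lens and keeping the position with the maximum evaluation value realizes the AF control described in the text.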
- in the phase difference AF processing, a phase difference of the image data in a predetermined focus area is detected among the plurality of parallax image data corresponding to the 4-pixel 1-microlens units 1B, and a defocus amount is obtained based on information indicating this phase difference.
- AF control is performed by controlling the focus lens in the photographing optical system 12 so that the defocus amount becomes zero.
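A rough sketch of this phase difference detection, assuming a sum-of-squared-differences search over integer shifts and a hypothetical conversion factor `k` from phase shift to defocus amount (the specification fixes neither):

```python
def phase_shift(left, right, max_shift=4):
    """Find the integer shift that best aligns two 1-D parallax signals,
    using the mean squared difference over the overlapping region."""
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(left[i], right[i + s])
                 for i in range(len(left))
                 if 0 <= i + s < len(right)]
        cost = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift


def defocus_amount(left, right, k=1.0):
    # k converts the phase shift (in pixels) to a defocus amount;
    # its value here is purely hypothetical.
    return k * phase_shift(left, right)
```

Driving the focus lens until `defocus_amount` reaches zero corresponds to the AF control described in the text.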
- when the shutter button is fully pressed, the image data output from the A/D converter 20 in response to the press is input from the image input controller 22 to the memory (SDRAM) 48 and temporarily stored.
- the image data temporarily stored in the memory 48 is appropriately read out by the digital signal processing unit 24.
- in the 2D shooting mode, image data corresponding to the pixel positions of the 4-pixel 1-microlens units 1B is missing. The missing image data is generated by linearly interpolating the image data corresponding to the 1-pixel 1-microlens units 1A. Thereafter, all of the image data, that is, the image data corresponding to the 1-pixel 1-microlens units 1A and the image data generated by interpolation, is subjected to predetermined signal processing including synchronization processing (color signal conversion processing for interpolating the spatial shift of the color signals associated with the primary color filter array) and YC processing (processing for generating luminance data and color difference data of the image data). The YC-processed image data (YC data) is stored in the memory 48 again.
- when the pixels of each 4-pixel 1-microlens unit 1B are denoted A, B, C, and D, image data for four images, one for each of A, B, C, and D, is generated. The image data of A and C are added to generate a left-eye display image (left parallax image), and the image data of B and D are added to generate a right-eye display image (right parallax image).
- the symbols L and R attached to the four pixels of the 4-pixel 1-microlens unit 1B indicate left-eye display pixels and right-eye display pixels, respectively, when the imaging device 10 is held horizontally.
- when shooting with the imaging device 10 held vertically, the image data of A and B are added to generate the left-eye display image (left parallax image), and the image data of C and D are added to generate the right-eye display image (right parallax image).
- that is, the imaging device 10 is provided with a sensor that detects its posture (vertical or horizontal), and the above pixels are selectively added based on the posture of the imaging device 10 during 3D shooting. Further, as described later, a 2D image can be generated by adding the image data of A, B, C, and D.
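The orientation-dependent pixel additions above can be summarized per microlens unit as follows, where `a`, `b`, `c`, and `d` are the four pixel values of one 4-pixel 1-microlens unit (A and C on the left, B and D on the right when held horizontally, as the text describes):

```python
def parallax_pair(a, b, c, d, orientation):
    """Add the four pixel values of a 4-pixel 1-microlens unit into a
    (left-eye, right-eye) pair according to the camera posture."""
    if orientation == "horizontal":
        return a + c, b + d       # A+C -> left eye, B+D -> right eye
    if orientation == "vertical":
        return a + b, c + d       # A+B -> left eye, C+D -> right eye
    raise ValueError("orientation must be 'horizontal' or 'vertical'")


def flat_2d(a, b, c, d):
    """For a 2D image, all four pixel values are simply added."""
    return a + b + c + d
```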
- the one piece of YC data generated in the 2D shooting mode as described above and stored in the memory 48 is output to the compression/decompression processing unit 26, subjected to predetermined compression processing such as JPEG (Joint Photographic Experts Group), and recorded on the memory card 54 via the media controller 52. Further, the two pieces of YC data (for the left and right viewpoints) generated in the 3D shooting mode and stored in the memory 48 are each output to the compression/decompression processing unit 26 and subjected to predetermined compression processing such as JPEG; a multi-picture file (MP file: a file in which a plurality of images are concatenated) is then generated, and the MP file is recorded on the memory card 54 via the media controller 52.
- in the 3D shooting mode, two left and right parallax images are generated as shown in FIG. 5B, but the present invention is not limited to this; images may be added in other combinations to output parallax images.
- FIG. 6 is a block diagram showing an internal configuration of the digital signal processing unit 24.
- the digital signal processing unit 24 includes an input / output processing circuit 241, an image determination unit 242, an image processing unit 243, and a control unit 244.
- the input/output processing circuit 241 inputs and outputs the image data stored in the memory 48 via the image input controller 22.
- the image determination unit 242 determines, from the image data acquired via the input/output processing circuit 241 (a mixture of image data corresponding to the 1-pixel 1-microlens units 1A and image data corresponding to the 4-pixel 1-microlens units 1B), whether to use the image data corresponding to the 1-pixel 1-microlens units 1A or the image data corresponding to the 4-pixel 1-microlens units 1B.
- the image processing unit 243 performs post-processing for generating image data for recording from the image data acquired according to the determination result of the image determination unit 242.
- the control unit 244 is a part that performs overall control of the input / output processing circuit 241, the image determination unit 242, and the image processing unit 243.
- FIG. 7 is a flowchart showing the operation of the imaging apparatus 10 according to the first embodiment of the present invention.
- the photographer operates the mode dial of the operation unit 38 to select the 2D shooting mode or the 3D shooting mode, and then determines the composition while viewing the live view image (through image) output to the liquid crystal monitor 30. Then, imaging is performed by half-pressing and fully-pressing the shutter button (step S10).
- the CPU 40 determines whether the 2D shooting mode is selected by the mode dial or the 3D shooting mode is selected (step S12). If the 2D shooting mode is selected, the process proceeds to step S14. If the 3D shooting mode is selected, the process proceeds to step S18.
- in step S14, the image determination unit 242 illustrated in FIG. 6 determines that the image data corresponding to the 1-pixel 1-microlens units 1A is to be used from among the image data acquired via the input/output processing circuit 241 (a mixture of image data corresponding to the 1-pixel 1-microlens units 1A and image data corresponding to the 4-pixel 1-microlens units 1B), and selects and outputs the image data corresponding to the 1-pixel 1-microlens units 1A to the image processing unit 243.
- the image processing unit 243 generates image data corresponding to the pixel positions of the 4-pixel 1-microlens units 1B by linearly interpolating the image data corresponding to the 1-pixel 1-microlens units 1A, and performs predetermined signal processing such as white balance correction, gamma correction, synchronization processing, and YC processing.
- the image data (YC data) YC-processed by the image processing unit 243 is stored in the memory 48 via the input/output processing circuit 241, compressed by the compression/decompression processing unit 26, and then recorded as a 2D image on the memory card 54 (step S16).
- in step S18, the image determination unit 242 illustrated in FIG. 6 determines that the image data corresponding to the 4-pixel 1-microlens units 1B is to be used from among the image data acquired via the input/output processing circuit 241 (a mixture of image data corresponding to the 1-pixel 1-microlens units 1A and image data corresponding to the 4-pixel 1-microlens units 1B), and selects and outputs the image data corresponding to the 4-pixel 1-microlens units 1B to the image processing unit 243.
- the image processing unit 243 generates, by linear interpolation, image data corresponding to the pixel positions of the 1-pixel 1-microlens units 1A from the image data corresponding to the 4-pixel 1-microlens units 1B, thereby generating image data for four viewpoints (four images). Two of these images are then added according to the orientation of the imaging device 10 at the time of shooting, whereby a left-eye display image (left parallax image) and a right-eye display image (right parallax image) are generated.
- predetermined signal processing such as white balance correction, gamma correction, synchronization processing, and YC processing is performed on these left and right viewpoint images.
- the image data (YC data) YC-processed by the image processing unit 243 is stored in the memory 48 via the input/output processing circuit 241, compressed by the compression/decompression processing unit 26, and then recorded as a 3D image on the memory card 54 (step S20).
- FIG. 8 is a flowchart showing the operation of the imaging apparatus 10 according to the second embodiment of the present invention.
- the second embodiment shown in FIG. 8 is different from the first embodiment in that steps S30, S32, S34, and S36 surrounded by an alternate long and short dash line are added.
- in step S30, the representative spatial frequency of the image photographed in step S10 is calculated.
- for example, an image obtained from the 1-pixel 1-microlens units 1A is converted into the spatial frequency domain, and the average spatial frequency of the entire screen in the converted spatial frequency domain (hereinafter referred to as the "representative spatial frequency") is calculated as a first representative spatial frequency; likewise, a representative spatial frequency of an image obtained from the 4-pixel 1-microlens units 1B is calculated as a second representative spatial frequency.
- as the pixels used for calculating the representative spatial frequency, the G pixel signal, which is close to the luminance signal, can be used.
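The specification does not fix a formula for the representative spatial frequency; one plausible concrete reading is the magnitude-weighted mean frequency of the discrete Fourier transform of the (G-pixel) signal, sketched here in pure Python as an assumption:

```python
import cmath


def representative_spatial_frequency(signal):
    """Magnitude-weighted mean frequency of a 1-D signal's DFT,
    DC component excluded. This particular definition is a
    hypothetical stand-in; the patent does not fix the formula."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]      # drop the DC component
    weighted, total = 0.0, 0.0
    for k in range(1, n // 2 + 1):             # positive frequencies only
        coeff = sum(centered[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        mag = abs(coeff)
        weighted += k * mag
        total += mag
    return weighted / total if total else 0.0
```

A rapidly alternating signal scores near the Nyquist frequency, while a slowly varying signal scores low, so comparing this value against a threshold reproduces the decision of step S32.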
- in step S32, it is determined whether or not the first representative spatial frequency exceeds a predetermined threshold value.
- for example, a difference between the first representative spatial frequency and the second representative spatial frequency is calculated, and it is determined whether the difference exceeds a predetermined value (for example, a value for determining whether there is an obvious difference between the representative spatial frequencies). The determination of whether the first representative spatial frequency exceeds the predetermined threshold is not limited to this example; it may instead be performed by comparison with a preset threshold (for example, the maximum value that the second representative spatial frequency can take).
- if it is determined that the first representative spatial frequency exceeds the predetermined threshold, the process proceeds to step S14; if it is determined to be equal to or lower than the predetermined threshold, the process proceeds to step S34. That is, when the first representative spatial frequency exceeds the predetermined threshold, the subject image contains many high-frequency components and is preferably recorded as a high-resolution 2D image, so the process proceeds to step S14. On the other hand, when the first representative spatial frequency is equal to or lower than the predetermined threshold, the subject image has few high-frequency components, so the process proceeds to step S34 in order to prioritize sensitivity over resolution.
- in step S34, the image determination unit 242 (FIG. 6) determines that the image data corresponding to the 4-pixel 1-microlens units 1B is to be used from among the image data acquired via the input/output processing circuit 241 (a mixture of image data corresponding to the 1-pixel 1-microlens units 1A and image data corresponding to the 4-pixel 1-microlens units 1B), and selects and outputs the image data corresponding to the 4-pixel 1-microlens units 1B to the image processing unit 243. In the 2D shooting mode, the analog gain for the image signal (analog signal) output from the 4-pixel 1-microlens units 1B is lowered (the sensitivity is lowered) in consideration of the four-pixel addition.
- the image processing unit 243 generates a 2D image from the image data corresponding to the 4-pixel 1-microlens units 1B. That is, the four image data of each 4-pixel 1-microlens unit 1B are added to generate image data for one pixel, and image data at the pixel positions of the 1-pixel 1-microlens units 1A is generated by linearly interpolating the generated image data. Thereafter, predetermined signal processing such as white balance correction, gamma correction, synchronization processing, and YC processing is performed on all of the image data, that is, the image data generated from the 4-pixel 1-microlens units 1B and the image data generated by interpolation.
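The four-pixel addition followed by expansion back to the full pixel grid can be sketched as below. Nearest-neighbor replication stands in for the linear interpolation mentioned in the text, and the `quads` data layout is an assumption for illustration:

```python
def low_light_2d(quads, width, height):
    """Build a full-resolution 2D image from 4-pixel 1-microlens data.

    quads[j][i] holds the four pixel values (a, b, c, d) under the
    microlens at unit position (i, j). Each unit's values are summed
    into one bright pixel, then expanded to a 2x2 block (nearest-neighbor
    replication stands in for the embodiment's linear interpolation).
    """
    image = [[0] * (2 * width) for _ in range(2 * height)]
    for j in range(height):
        for i in range(width):
            value = sum(quads[j][i])            # four-pixel addition
            for dy in (0, 1):
                for dx in (0, 1):
                    image[2 * j + dy][2 * i + dx] = value
    return image
```

Summing four photodiodes per output pixel is what allows the analog gain to be lowered, trading resolution for the reduced noise described in the text.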
- the image data (YC data) YC-processed by the image processing unit 243 is stored in the memory 48 via the input/output processing circuit 241, compressed by the compression/decompression processing unit 26, and then recorded as a 2D image on the memory card 54 via the media controller 52 (step S36).
- FIG. 9 is a flowchart showing the operation of the imaging apparatus 10 according to the third embodiment of the present invention.
- the third embodiment shown in FIG. 9 is different from the first embodiment in that steps S40, S42, S34, and S36 surrounded by an alternate long and short dash line are added.
- in step S40, the average luminance at the time of imaging in step S10 is acquired.
- as the average luminance, the brightness of the subject (shooting EV value) measured by the AE detection unit 44 (FIG. 4) can be used.
- the threshold value is, for example, a value of the average luminance (shooting EV value) below which the shooting sensitivity needs to be increased.
- when the average luminance exceeds the predetermined threshold (when it is not necessary to increase the shooting sensitivity), the process proceeds to step S14; when the average luminance is equal to or lower than the predetermined threshold (when the shooting sensitivity needs to be increased), the process proceeds to step S34.
- in steps S34 and S36, image data corresponding to the 4-pixel 1-microlens units 1B is selected as in the second embodiment shown in FIG. 8, and a 2D image is generated and recorded based on the selected image data.
- since the analog gain for the image signal (analog signal) output from the 4-pixel 1-microlens units 1B is lowered in consideration of the four-pixel addition (the sensitivity is reduced), a 2D image with less noise than one generated from the image signal of the 1-pixel 1-microlens units 1A can be obtained.
- FIG. 10 is a flowchart showing the operation of the imaging apparatus 10 according to the fourth embodiment of the present invention.
- the fourth embodiment shown in FIG. 10 differs from the first embodiment in that steps S30, S32, S34, S36, S40, and S42 surrounded by a one-dot chain line are added.
- only when it is determined in step S32 that the first representative spatial frequency exceeds the predetermined threshold and it is determined in step S42 that the average luminance exceeds the predetermined threshold is a 2D image generated and recorded based on the image data output from the 1-pixel 1-microlens units 1A. In all other cases, a 2D image is generated and recorded based on the image data output from the 4-pixel 1-microlens units 1B.
- that is, in the fourth embodiment, when the average luminance is equal to or lower than the predetermined threshold, a two-dimensional image is generated based on the second output signal output from the 4-pixel 1-microlens units 1B even if the first representative spatial frequency exceeds the predetermined threshold (even for an image containing many high-frequency components).
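The combined decision rule of the fourth embodiment reduces to a small predicate; the threshold values and return labels below are illustrative:

```python
def select_2d_source(rep_freq, avg_luminance,
                     freq_threshold, luminance_threshold):
    """Fourth-embodiment rule: use the high-resolution 1-pixel units only
    when the image is detailed (step S32) AND bright enough (step S42);
    otherwise favor the high-sensitivity 4-pixel units."""
    if rep_freq > freq_threshold and avg_luminance > luminance_threshold:
        return "1-pixel 1-microlens (1A)"
    return "4-pixel 1-microlens (1B)"
```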
- FIG. 11 is a flowchart showing the operation of the imaging apparatus 10 according to the fifth embodiment of the present invention.
- the fifth embodiment shown in FIG. 11 differs from the first embodiment in that steps S50 to S64 surrounded by a one-dot chain line are added.
- in the fifth embodiment, one captured image is divided into N × M areas, and a representative spatial frequency is calculated for each of the N × M divided areas.
- the size of the divided area is preferably as small as possible within the range in which the representative spatial frequency can be calculated. Then, for each divided area, it is determined whether the image data of the 1-pixel 1-microlens unit 1A or the 4-pixel 1-microlens unit 1B is selected.
- step S50 is the head of a loop that repeats through step S64 while changing the variable X, with the initial value of X being 1, the final value being N, and the increment being 1. Step S52 is the head of a loop that repeats through step S62 while changing the variable Y, with the initial value of Y being 1, the final value being M, and the increment being 1.
- in step S54, the representative spatial frequency of the divided area ZONE(X, Y) of the photographed image is calculated.
- in step S56, it is determined whether or not the calculated representative spatial frequency of the divided area ZONE(X, Y) exceeds a threshold value. This determination is performed in the same manner as in the second embodiment (step S32 in FIG. 8).
- if the representative spatial frequency exceeds the threshold, the image data of the 1-pixel 1-microlens units 1A in the divided area ZONE(X, Y) is selected and temporarily stored (step S58); otherwise, the image data of the 4-pixel 1-microlens units 1B in the divided area ZONE(X, Y) is selected and temporarily stored. By repeating the above, the image data of the 1-pixel 1-microlens units 1A or the image data of the 4-pixel 1-microlens units 1B is selected for all of the N × M divided areas ZONE(X, Y).
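The nested loop of steps S50 to S64 can be sketched as follows, with the representative-spatial-frequency function passed in as a parameter; the zone data layout and string labels are assumptions for illustration:

```python
def select_zone_sources(image_zones, rep_freq, threshold, n, m):
    """Steps S50-S64 as nested loops: for each of the N x M divided
    areas, pick the 1A data if its representative spatial frequency
    exceeds the threshold, otherwise the 1B data."""
    selection = {}
    for x in range(1, n + 1):            # step S50: X = 1 .. N
        for y in range(1, m + 1):        # step S52: Y = 1 .. M
            zone = image_zones[(x, y)]
            if rep_freq(zone) > threshold:       # step S56
                selection[(x, y)] = "1A"         # high resolution
            else:
                selection[(x, y)] = "1B"         # high sensitivity
    return selection
```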
- in step S16′, a 2D image is generated based on image data for one screen in which the image data of the 1-pixel 1-microlens units 1A and the image data of the 4-pixel 1-microlens units 1B selected as described above are mixed.
- here, the 2D image of a divided area generated based on the image data of the 4-pixel 1-microlens units 1B and the 2D image of a divided area generated based on the image data of the 1-pixel 1-microlens units 1A differ in the number of pixels. Therefore, one pixel of the 2D image of a divided area generated based on the 4-pixel 1-microlens units 1B is expanded to four pixels by interpolation or the like, so that the number of pixels in each divided area is made uniform. That is, step S16′ differs from step S16 in FIG. 7 of the first embodiment in that processing for equalizing the number of pixels in each divided area is added; otherwise the processing is the same as step S16, and a 2D image is generated and saved.
- according to the fifth embodiment, a 2D image can be generated using optimal image data according to the subject being photographed (whether the subject includes high-frequency components) within a single image.
- the method for selecting whether to use the image data of the 1-pixel 1-microlens units 1A or the image data of the 4-pixel 1-microlens units 1B in the 2D shooting mode is not limited to this embodiment. For example, when the image size to be recorded is set to one quarter or less of the maximum image size, the image data of the 4-pixel 1-microlens units 1B may be used.
- in the above embodiments, whether to use the image data of the 1-pixel 1-microlens units 1A or the image data of the 4-pixel 1-microlens units 1B is selected according to whether the representative spatial frequency of the image exceeds the threshold, but the present invention is not limited to this. For example, a high-frequency component contained in the image may be extracted by a high-pass filter, and whether to use the image data of the 1-pixel 1-microlens units 1A or the image data of the 4-pixel 1-microlens units 1B may be selected based on the magnitude of the extracted high-frequency component.
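A sketch of this high-pass-filter alternative, assuming a simple [-1, 2, -1] kernel; the specification does not name a particular filter, so the kernel and threshold are hypothetical:

```python
def high_pass_energy(signal):
    """Mean absolute response of a simple [-1, 2, -1] high-pass kernel,
    standing in for the unspecified high-pass filter in the text."""
    responses = [abs(-signal[i - 1] + 2 * signal[i] - signal[i + 1])
                 for i in range(1, len(signal) - 1)]
    return sum(responses) / len(responses)


def choose_by_high_pass(signal, threshold):
    """Select the 1A (high-resolution) data only when the extracted
    high-frequency component is large enough."""
    return "1A" if high_pass_energy(signal) > threshold else "1B"
```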
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Studio Devices (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Solid State Image Pick-Up Elements (AREA)
Abstract
Description
FIG. 1 is a plan view of the essential parts showing a first embodiment of an imaging device according to the present invention.
FIG. 4 is a block diagram showing an embodiment of an imaging apparatus 10 according to the present invention.
FIG. 7 is a flowchart showing the operation of the imaging apparatus 10 according to the first embodiment of the present invention.
FIG. 8 is a flowchart showing the operation of the imaging apparatus 10 according to the second embodiment of the present invention.
FIG. 9 is a flowchart showing the operation of the imaging apparatus 10 according to the third embodiment of the present invention.
FIG. 10 is a flowchart showing the operation of the imaging apparatus 10 according to the fourth embodiment of the present invention.
FIG. 11 is a flowchart showing the operation of the imaging apparatus 10 according to the fifth embodiment of the present invention.
Claims (11)
- An imaging device comprising:
a plurality of photoelectric conversion elements arrayed in row and column directions on a semiconductor substrate;
first microlenses, each of which is a single microlens disposed above one photoelectric conversion element and guides light incident on the microlens to the light-receiving surface of the one photoelectric conversion element; and
second microlenses, each of which is a single microlens disposed above n × n (n: an integer of 2 or more) photoelectric conversion elements adjacent in the vertical and horizontal directions and pupil-divides light incident on the microlens to guide it to the light-receiving surfaces of the respective n × n photoelectric conversion elements,
wherein the first microlenses and the second microlenses are disposed in a mixed manner so that a two-dimensional image and a three-dimensional image can be generated based on at least a first output signal of the photoelectric conversion elements corresponding to the first microlenses and a second output signal of the photoelectric conversion elements corresponding to the second microlenses, respectively.
- The imaging device according to claim 1, wherein a color filter of one of a plurality of colors is disposed above each of the plurality of photoelectric conversion elements, and color filters of the same color are disposed on the n × n photoelectric conversion elements corresponding to each second microlens.
- The imaging device according to claim 1 or 2, wherein the number of photoelectric conversion elements above which the first microlenses are disposed is equal to the number of photoelectric conversion elements above which the second microlenses are disposed.
- The imaging device according to any one of claims 1 to 3, wherein, with 4 × 4 photoelectric conversion elements as one block, first regions in which sixteen first microlenses are disposed per block and second regions in which four second microlenses are disposed per block are arranged in a checkered pattern.
- The imaging device according to any one of claims 1 to 3, wherein, with 2 × 2 photoelectric conversion elements as one block, first regions in which four first microlenses are disposed per block and second regions in which one second microlens is disposed per block are arranged in a checkered pattern.
- An imaging apparatus comprising:
a single photographing optical system;
the imaging device according to any one of claims 1 to 5, on which a subject image is formed via the photographing optical system;
a shooting mode selection unit that switches between a 2D shooting mode for shooting a two-dimensional image and a 3D shooting mode for shooting a three-dimensional image;
a first image generation unit that, when the 2D shooting mode is selected by the shooting mode selection unit, generates a two-dimensional image based on the first output signal output from the photoelectric conversion elements corresponding to the first microlenses of the imaging device;
a second image generation unit that, when the 3D shooting mode is selected by the shooting mode selection unit, generates a three-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device; and
a recording unit that records the two-dimensional image or the three-dimensional image generated by the first image generation unit or the second image generation unit on a recording medium.
- An imaging apparatus comprising:
a single photographing optical system;
the imaging device according to any one of claims 1 to 5, on which a subject image is formed via the photographing optical system;
a shooting mode selection unit that switches between a 2D shooting mode for shooting a two-dimensional image and a 3D shooting mode for shooting a three-dimensional image;
a determination unit that determines whether an image shot via the photographing optical system and the imaging device contains many high-frequency components;
a first image generation unit that, when the 2D shooting mode is selected by the shooting mode selection unit, generates a two-dimensional image based on the first output signal output from the photoelectric conversion elements corresponding to the first microlenses of the imaging device if the determination unit determines that the image contains many high-frequency components, and generates a two-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device if the determination unit determines that the image does not contain many high-frequency components;
a second image generation unit that, when the 3D shooting mode is selected by the shooting mode selection unit, generates a three-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device; and
a recording unit that records the two-dimensional image or the three-dimensional image generated by the first image generation unit or the second image generation unit on a recording medium.
- The imaging apparatus according to claim 7, further comprising a brightness detection unit that detects the brightness of the subject,
wherein the first image generation unit generates a two-dimensional image based on the first output signal output from the photoelectric conversion elements corresponding to the first microlenses of the imaging device when the 2D shooting mode is selected by the shooting mode selection unit, the determination unit determines that the image contains many high-frequency components, and the detected brightness of the subject exceeds a predetermined threshold, and generates a two-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device when the determination unit determines that the image does not contain many high-frequency components or the detected brightness of the subject is equal to or lower than the predetermined threshold.
- An imaging apparatus comprising:
a single photographing optical system;
the imaging device according to any one of claims 1 to 5, on which a subject image is formed via the photographing optical system;
a shooting mode selection unit that switches between a 2D shooting mode for shooting a two-dimensional image and a 3D shooting mode for shooting a three-dimensional image;
a brightness detection unit that detects the brightness of the subject;
a first image generation unit that, when the 2D shooting mode is selected by the shooting mode selection unit, generates a two-dimensional image based on the first output signal output from the photoelectric conversion elements corresponding to the first microlenses of the imaging device if the detected brightness of the subject exceeds a predetermined threshold, and generates a two-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device if the detected brightness of the subject is equal to or lower than the predetermined threshold;
a second image generation unit that, when the 3D shooting mode is selected by the shooting mode selection unit, generates a three-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device; and
a recording unit that records the two-dimensional image or the three-dimensional image generated by the first image generation unit or the second image generation unit on a recording medium.
- An imaging apparatus comprising:
a single photographing optical system;
the imaging device according to any one of claims 1 to 5, on which a subject image is formed via the photographing optical system;
a shooting mode selection unit that switches between a 2D shooting mode for shooting a two-dimensional image and a 3D shooting mode for shooting a three-dimensional image;
a determination unit that determines whether an image shot via the photographing optical system and the imaging device contains many high-frequency components, the determination being made for each of N × M divided areas of one screen;
a first image generation unit that, when the 2D shooting mode is selected by the shooting mode selection unit, acquires, for each divided area determined by the determination unit to contain many high-frequency components, the first output signal output from the photoelectric conversion elements corresponding to the first microlenses of the imaging device, acquires, for each divided area determined not to contain many high-frequency components, the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device, and generates a two-dimensional image based on the acquired first and second output signals;
a second image generation unit that, when the 3D shooting mode is selected by the shooting mode selection unit, generates a three-dimensional image based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device; and
a recording unit that records the two-dimensional image or the three-dimensional image generated by the first image generation unit or the second image generation unit on a recording medium.
- The imaging apparatus according to any one of claims 6 to 10, wherein the second image generation unit generates parallax images of four viewpoints (upper, lower, left, and right) or parallax images of two viewpoints (upper and lower, or left and right) based on the second output signal output from the photoelectric conversion elements corresponding to the second microlenses of the imaging device.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012534953A JPWO2012039180A1 (ja) | 2010-09-24 | 2011-07-05 | 撮像デバイス及び撮像装置 |
CN201180045871XA CN103155542A (zh) | 2010-09-24 | 2011-07-05 | 图像拾取装置和图像拾取设备 |
US13/846,550 US20130222553A1 (en) | 2010-09-24 | 2013-03-18 | Image pickup device and image pickup apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-214103 | 2010-09-24 | ||
JP2010214103 | 2010-09-24 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/846,550 Continuation US20130222553A1 (en) | 2010-09-24 | 2013-03-18 | Image pickup device and image pickup apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012039180A1 true WO2012039180A1 (ja) | 2012-03-29 |
Family
ID=45873676
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/065314 WO2012039180A1 (ja) | 2010-09-24 | 2011-07-05 | 撮像デバイス及び撮像装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130222553A1 (ja) |
JP (1) | JPWO2012039180A1 (ja) |
CN (1) | CN103155542A (ja) |
WO (1) | WO2012039180A1 (ja) |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013094121A1 (ja) * | 2011-12-21 | 2013-06-27 | シャープ株式会社 | 撮像装置および電子情報機器 |
WO2013147199A1 (ja) * | 2012-03-30 | 2013-10-03 | 株式会社ニコン | 撮像素子、撮影方法、および撮像装置 |
JP2013219736A (ja) * | 2012-03-16 | 2013-10-24 | Nikon Corp | 画像処理装置、撮像装置および画像処理プログラム |
WO2013161313A1 (ja) * | 2012-04-25 | 2013-10-31 | 株式会社ニコン | 画像処理装置、撮像装置および画像処理プログラム |
JP2013229765A (ja) * | 2012-04-25 | 2013-11-07 | Nikon Corp | 画像処理装置、撮像装置および画像処理プログラム |
JP2013229764A (ja) * | 2012-04-25 | 2013-11-07 | Nikon Corp | 画像処理装置、撮像装置および画像処理プログラム |
JP2013229762A (ja) * | 2012-04-25 | 2013-11-07 | Nikon Corp | 画像処理装置、撮像装置および画像処理プログラム |
JP2013229766A (ja) * | 2012-04-25 | 2013-11-07 | Nikon Corp | 画像処理装置、撮像装置および画像処理プログラム |
JP2014086984A (ja) * | 2012-10-26 | 2014-05-12 | Nikon Corp | 画像処理装置、撮影装置およびプログラム |
JP2015144416A (ja) * | 2013-12-25 | 2015-08-06 | キヤノン株式会社 | 撮像装置及び撮像装置の制御方法 |
JP2015207000A (ja) * | 2014-04-22 | 2015-11-19 | オプティツ インコーポレイテッド | カラーフィルタ及びフォトダイオードのパターニング構成 |
JP2016219982A (ja) * | 2015-05-19 | 2016-12-22 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法および画像処理プログラム |
JPWO2014112002A1 (ja) * | 2013-01-15 | 2017-01-19 | オリンパス株式会社 | 撮像素子、及び撮像装置 |
JP2017085484A (ja) * | 2015-10-30 | 2017-05-18 | 日本放送協会 | 撮像素子、合焦位置検出器及び撮像装置 |
CN108337419A (zh) * | 2012-07-12 | 2018-07-27 | 株式会社尼康 | 图像处理装置 |
CN115250320A (zh) * | 2021-04-28 | 2022-10-28 | 北京小米移动软件有限公司 | 图像获取方法及装置、电子设备、存储介质 |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8742309B2 (en) | 2011-01-28 | 2014-06-03 | Aptina Imaging Corporation | Imagers with depth sensing capabilities |
EP2645722A4 (en) * | 2011-03-11 | 2014-05-14 | Fujifilm Corp | IMAGING APPARATUS AND METHOD FOR CONTROLLING THEIR OPERATION |
KR101772458B1 (ko) * | 2011-06-28 | 2017-08-30 | 엘지전자 주식회사 | 영상표시 장치 및 그 제어방법 |
US10015471B2 (en) * | 2011-08-12 | 2018-07-03 | Semiconductor Components Industries, Llc | Asymmetric angular response pixels for single sensor stereo |
US9554115B2 (en) * | 2012-02-27 | 2017-01-24 | Semiconductor Components Industries, Llc | Imaging pixels with depth sensing capabilities |
KR20140038692A (ko) * | 2012-09-21 | 2014-03-31 | 포항공과대학교 산학협력단 | 색변환 엘리먼트 및 그 제조방법 |
KR102224489B1 (ko) * | 2013-11-12 | 2021-03-08 | 엘지전자 주식회사 | 디지털 디바이스 및 그의 3차원 영상 처리 방법 |
EP2871843B1 (en) * | 2013-11-12 | 2019-05-29 | LG Electronics Inc. -1- | Digital device and method for processing three dimensional image thereof |
JP6408372B2 (ja) * | 2014-03-31 | 2018-10-17 | ソニーセミコンダクタソリューションズ株式会社 | 固体撮像装置及びその駆動制御方法、並びに、電子機器 |
CN105812644A (zh) * | 2014-12-30 | 2016-07-27 | 联想(北京)有限公司 | 图像处理方法、成像装置和电子设备 |
JP6618271B2 (ja) | 2015-05-01 | 2019-12-11 | キヤノン株式会社 | 画像処理装置およびその制御方法、撮像装置 |
US10044959B2 (en) * | 2015-09-24 | 2018-08-07 | Qualcomm Incorporated | Mask-less phase detection autofocus |
US10469730B2 (en) * | 2016-05-18 | 2019-11-05 | Canon Kabushiki Kaisha | Imaging device and control method for simultaneously outputting an image pickup signal and a parallax image signal |
CN107135340A (zh) * | 2017-04-28 | 2017-09-05 | 广东欧珀移动通信有限公司 | 图像传感器、对焦控制方法、成像装置和移动终端 |
JP2019012968A (ja) * | 2017-06-30 | 2019-01-24 | ソニーセミコンダクタソリューションズ株式会社 | 固体撮像装置、及び電子機器 |
JP7079080B2 (ja) * | 2017-11-06 | 2022-06-01 | キヤノン株式会社 | 画像処理装置および画像処理方法 |
KR102632474B1 (ko) * | 2019-02-11 | 2024-02-01 | 삼성전자주식회사 | 이미지 센서의 픽셀 어레이 및 이를 포함하는 이미지 센서 |
CN112188181B (zh) * | 2019-07-02 | 2023-07-04 | 中强光电股份有限公司 | 图像显示设备、立体图像处理电路及其同步信号校正方法 |
CN114647092A (zh) * | 2020-12-18 | 2022-06-21 | 深圳光峰科技股份有限公司 | 一种立体显示装置与立体投影显示系统 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003523646A (ja) * | 1999-02-25 | 2003-08-05 | ヴィジョンセンス リミテッド | 光学装置 |
JP2009165115A (ja) * | 2007-12-12 | 2009-07-23 | Sony Corp | 撮像装置 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3513371B2 (ja) * | 1996-10-18 | 2004-03-31 | キヤノン株式会社 | マトリクス基板と液晶装置とこれらを用いた表示装置 |
US6219113B1 (en) * | 1996-12-17 | 2001-04-17 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for driving an active matrix display panel |
JP3199312B2 (ja) * | 1997-11-06 | 2001-08-20 | Canon Kabushiki Kaisha | Liquid crystal display device |
US6078371A (en) * | 1998-10-05 | 2000-06-20 | Canon Kabushiki Kaisha | Liquid crystal device and liquid crystal display apparatus |
TW591295B (en) * | 1999-04-13 | 2004-06-11 | Canon Kk | Liquid crystal device and liquid crystal display apparatus |
JP2005109968A (ja) * | 2003-09-30 | 2005-04-21 | Matsushita Electric Ind Co Ltd | Color solid-state imaging device |
JP4284282B2 (ja) * | 2004-07-21 | 2009-06-24 | Fujifilm Corp | Imaging apparatus and solid-state imaging element |
US7355154B2 (en) * | 2005-04-11 | 2008-04-08 | Canon Kabushiki Kaisha | Image sensing apparatus with movable light flux splitter and control method thereof |
JP4497022B2 (ja) * | 2005-04-26 | 2010-07-07 | Sony Corp | Solid-state imaging device, driving method therefor, and imaging apparatus |
JP5106870B2 (ja) * | 2006-06-14 | 2012-12-26 | Toshiba Corp | Solid-state imaging element |
US8169518B2 (en) * | 2007-08-14 | 2012-05-01 | Fujifilm Corporation | Image pickup apparatus and signal processing method |
JP5814626B2 (ja) * | 2011-05-27 | 2015-11-17 | Canon Kabushiki Kaisha | Photoelectric conversion apparatus and method for manufacturing photoelectric conversion apparatus |
JP5621056B2 (ja) * | 2011-12-27 | 2014-11-05 | Fujifilm Corp | Color imaging element |
US9420208B2 (en) * | 2012-07-13 | 2016-08-16 | Canon Kabushiki Kaisha | Driving method for image pickup apparatus and driving method for image pickup system |
2011
- 2011-07-05 CN CN201180045871XA patent/CN103155542A/zh active Pending
- 2011-07-05 WO PCT/JP2011/065314 patent/WO2012039180A1/ja active Application Filing
- 2011-07-05 JP JP2012534953A patent/JPWO2012039180A1/ja not_active Withdrawn
2013
- 2013-03-18 US US13/846,550 patent/US20130222553A1/en not_active Abandoned
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013094121A1 (ja) * | 2011-12-21 | 2013-06-27 | Sharp Corp | Imaging apparatus and electronic information device |
JP2013219736A (ja) * | 2012-03-16 | 2013-10-24 | Nikon Corp | Image processing apparatus, imaging apparatus, and image processing program |
WO2013147199A1 (ja) * | 2012-03-30 | 2013-10-03 | Nikon Corp | Image sensor, imaging method, and imaging apparatus |
JPWO2013147199A1 (ja) * | 2012-03-30 | 2015-12-14 | Nikon Corp | Image sensor, imaging method, and imaging apparatus |
US10341620B2 (en) | 2012-03-30 | 2019-07-02 | Nikon Corporation | Image sensor and image-capturing device |
US10560669B2 (en) | 2012-03-30 | 2020-02-11 | Nikon Corporation | Image sensor and image-capturing device |
JP2013229762A (ja) * | 2012-04-25 | 2013-11-07 | Nikon Corp | Image processing apparatus, imaging apparatus, and image processing program |
JP2013229766A (ja) * | 2012-04-25 | 2013-11-07 | Nikon Corp | Image processing apparatus, imaging apparatus, and image processing program |
CN104412586A (zh) * | 2012-04-25 | 2015-03-11 | Nikon Corp | Image processing apparatus, imaging apparatus, and image processing program |
JP2013229764A (ja) * | 2012-04-25 | 2013-11-07 | Nikon Corp | Image processing apparatus, imaging apparatus, and image processing program |
JP2013229765A (ja) * | 2012-04-25 | 2013-11-07 | Nikon Corp | Image processing apparatus, imaging apparatus, and image processing program |
WO2013161313A1 (ja) * | 2012-04-25 | 2013-10-31 | Nikon Corp | Image processing apparatus, imaging apparatus, and image processing program |
CN104412586B (zh) * | 2012-04-25 | 2016-12-28 | Nikon Corp | Image processing apparatus and imaging apparatus |
CN108337419A (zh) * | 2012-07-12 | 2018-07-27 | Nikon Corp | Image processing apparatus |
JP2014086984A (ja) * | 2012-10-26 | 2014-05-12 | Nikon Corp | Image processing apparatus, imaging apparatus, and program |
JPWO2014112002A1 (ja) * | 2013-01-15 | 2017-01-19 | Olympus Corp | Image sensor and imaging apparatus |
JP2015144416A (ja) * | 2013-12-25 | 2015-08-06 | Canon Kabushiki Kaisha | Imaging apparatus and control method therefor |
KR101770079B1 (ko) * | 2014-04-22 | 2017-08-21 | Optiz, Inc. | Color filter and photodiode patterning configuration |
US9985063B2 (en) | 2014-04-22 | 2018-05-29 | Optiz, Inc. | Imaging device with photo detectors and color filters arranged by color transmission characteristics and absorption coefficients |
JP2015207000A (ja) * | 2014-04-22 | 2015-11-19 | Optiz, Inc. | Color filter and photodiode patterning configuration |
JP2016219982A (ja) * | 2015-05-19 | 2016-12-22 | Canon Kabushiki Kaisha | Image processing apparatus, imaging apparatus, image processing method, and image processing program |
JP2017085484A (ja) * | 2015-10-30 | 2017-05-18 | NHK (Japan Broadcasting Corp) | Image sensor, focus position detector, and imaging apparatus |
CN115250320A (zh) * | 2021-04-28 | 2022-10-28 | Beijing Xiaomi Mobile Software Co., Ltd. | Image acquisition method and apparatus, electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN103155542A (zh) | 2013-06-12 |
JPWO2012039180A1 (ja) | 2014-02-03 |
US20130222553A1 (en) | 2013-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012039180A1 (ja) | Imaging device and imaging apparatus | |
JP5640143B2 (ja) | Imaging apparatus and imaging method | |
JP5621056B2 (ja) | Color imaging element | |
JP5192096B2 (ja) | Stereoscopic imaging apparatus | |
JP5628913B2 (ja) | Imaging apparatus and imaging method | |
US8593509B2 (en) | Three-dimensional imaging device and viewpoint image restoration method | |
US8786676B2 (en) | Imaging device for generating stereoscopic image | |
JP5722975B2 (ja) | Imaging apparatus, shading correction method for imaging apparatus, and program for imaging apparatus | |
JP5421829B2 (ja) | Imaging apparatus | |
JP5368350B2 (ja) | Stereoscopic imaging apparatus | |
JP5469258B2 (ja) | Imaging apparatus and imaging method | |
US20110234767A1 (en) | Stereoscopic imaging apparatus | |
WO2013069445A1 (ja) | Stereoscopic imaging apparatus and image processing method | |
WO2012169301A1 (ja) | Image sensor for capturing stereoscopic and planar moving images, and imaging apparatus equipped with the image sensor | |
JP5457240B2 (ja) | Stereoscopic imaging apparatus | |
JP2010204385A (ja) | Stereoscopic imaging apparatus and stereoscopic imaging method | |
JP2012124650A (ja) | Imaging apparatus and imaging method | |
JP5649837B2 (ja) | Stereoscopic imaging apparatus | |
JP2014063190A (ja) | Imaging apparatus | |
JP2011077680A (ja) | Stereoscopic imaging apparatus and imaging control method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201180045871.X; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11826626; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2012534953; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 11826626; Country of ref document: EP; Kind code of ref document: A1 |