US20240155261A1 - Imaging device and electronic apparatus
- Publication number: US20240155261A1
- Application number: US17/773,172
- Authority: US (United States)
- Prior art keywords: special, filter, pixel, color filter, unit
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N25/131 — Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
- H04N25/702 — SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout
- G02B5/20 — Filters (optical elements other than lenses)
- H04N23/12 — Cameras or camera modules comprising electronic image sensors, for generating image signals from different wavelengths with one sensor only
- H04N25/13 — Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/704 — Pixels specially adapted for focusing, e.g. phase difference pixel sets
- H04N25/76 — Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/706 — Pixels for exposure or ambient light measuring
Definitions
- the present disclosure relates to an imaging device and an electronic apparatus.
- In an imaging device such as a complementary metal oxide semiconductor (CMOS) image sensor, a special pixel such as a pixel for receiving infrared light or a pixel for detecting an image plane phase difference may be arranged by replacing a normal pixel.
- For example, there is known an imaging device in which pixels for detecting an image plane phase difference are arranged at predetermined intervals on a horizontal line of normal pixels arranged in an array (see, for example, Patent Literature 1).
- Patent Literature 1 JP 2008-312073 A
- When the special pixel replaces a normal pixel, a filter corresponding to the special pixel (for example, in the case of a pixel for receiving infrared light, an infrared light filter) is arranged in place of the color filter, creating a region where no color filter is formed.
- The color filter may then fail to be formed in a desired shape due to the influence of such a region. If the color filter cannot be formed in a desired shape, the light receiving sensitivity of the normal pixel fluctuates, and the accuracy of the imaging device may decrease.
- the present disclosure provides an imaging device and an electronic apparatus capable of suppressing a decrease in accuracy.
- According to the present disclosure, an imaging device includes a plurality of normal pixels arranged in a matrix, a special pixel arranged by replacing a part of the normal pixels, color filters corresponding to the normal pixels and arranged according to a predetermined rule, a special filter arranged corresponding to the special pixel, and a special pixel color filter arranged to surround at least a part of the periphery of the special filter.
- FIG. 1 is a block diagram illustrating a configuration of an example of an electronic apparatus that may be applied to an embodiment.
- FIG. 2 is a block diagram illustrating a schematic configuration example of an imaging device that may be applied to an embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating a part of a circuit configuration of a pixel array unit that may be applied to the embodiment.
- FIG. 4 is a diagram for explaining an arrangement example of pixels.
- FIG. 5 is a timing chart schematically illustrating reading of a pixel signal by the imaging device.
- FIG. 6 is a diagram illustrating a configuration example of color filters.
- FIG. 7 is a diagram for explaining an example of a method of forming color filters.
- FIG. 8 is a diagram for explaining an example of a method of forming color filters.
- FIG. 9 is a diagram for explaining a color filter layer of the imaging device according to the embodiment.
- FIG. 10 is a diagram for explaining the color filter layer of the imaging device according to the embodiment.
- FIG. 11 is a diagram for explaining another configuration example of a color filter layer of the imaging device according to the embodiment.
- FIG. 12 is a diagram for explaining another configuration example of the color filter layer of the imaging device according to the embodiment.
- FIG. 13 is a diagram for explaining another configuration example of a color filter layer of the imaging device according to the embodiment.
- FIG. 14 is a diagram illustrating a configuration example of a color filter layer according to an existing technology.
- FIG. 15 is a schematic view illustrating a cross section of the color filter layer taken along line A-A′ in FIG. 14 .
- FIG. 16 is a diagram illustrating a configuration example of the color filter layer according to the embodiment.
- FIG. 17 is a schematic view illustrating a cross section of the color filter layer taken along line B-B′ in FIG. 16 .
- FIG. 18 is a diagram illustrating a configuration example of a color filter layer according to a first modification of the embodiment.
- FIG. 19 is a diagram illustrating a configuration example of a color filter layer according to a second modification of the embodiment.
- FIG. 20 is a diagram illustrating a configuration example of a color filter layer according to a third modification of the embodiment.
- FIG. 21 is a diagram illustrating another configuration example of the color filter layer according to the third modification of the embodiment.
- FIG. 22 is a diagram illustrating a configuration example of a color filter layer according to a fourth modification of the embodiment.
- FIG. 23 is a diagram illustrating another configuration example of the color filter layer according to the fourth modification of the embodiment.
- FIG. 24 is a diagram illustrating a configuration example of the color filter layer according to the fourth modification of the embodiment.
- FIG. 25 is a diagram illustrating examples of using the imaging device according to the embodiment and the modifications.
- FIG. 26 is a block diagram illustrating an example of a schematic configuration of a patient in-vivo information acquisition system using a capsule endoscope to which the technology according to the present disclosure may be applied.
- FIG. 27 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure may be applied.
- FIG. 28 is a block diagram illustrating an example of functional configurations of a camera head and a CCU.
- FIG. 29 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a moving body control system to which the technology according to the present disclosure may be applied.
- FIG. 30 is a diagram illustrating an example of an installation position of an imaging unit.
- FIG. 1 is a block diagram illustrating a configuration of an example of an electronic apparatus that may be applied to an embodiment.
- an electronic apparatus 1 D includes an optical system 2 D, a control unit 3 D, an imaging device 1 , an image processing unit 5 D, a memory 6 D, a storage unit 7 D, a display unit 8 D, an interface (I/F) unit 9 D, and an input device 10 D.
- As the electronic apparatus 1 D, a digital still camera, a digital video camera, a mobile phone with an imaging function, a smartphone, or the like can be applied.
- A monitoring camera, an in-vehicle camera, a medical camera, or the like can also be applied.
- the imaging device 1 includes, for example, a plurality of photoelectric conversion elements arranged in a matrix array.
- the photoelectric conversion elements convert received light into electric charge by photoelectric conversion.
- the imaging device 1 includes a drive circuit that drives the plurality of photoelectric conversion elements, and a signal processing circuit that reads electric charge from each of the plurality of photoelectric conversion elements and generates image data based on the read electric charge.
- the optical system 2 D includes a main lens composed of one lens or a combination of a plurality of lenses and a mechanism for driving the main lens, and forms an image of image light (incident light) from a subject on a light receiving surface of the imaging device 1 via the main lens.
- the optical system 2 D includes an autofocus mechanism that adjusts the focus in accordance with a control signal and a zoom mechanism that changes the zoom ratio in accordance with a control signal.
- the optical system 2 D may be detachable and may be replaced with another optical system 2 D.
- the image processing unit 5 D executes predetermined image processing on the image data output from the imaging device 1 .
- the image processing unit 5 D is connected to the memory 6 D such as a frame memory, and writes the image data output from the imaging device 1 in the memory 6 D.
- the image processing unit 5 D executes predetermined image processing on the image data written in the memory 6 D, and writes the image data subjected to the image processing again in the memory 6 D.
- the storage unit 7 D is, for example, a non-volatile memory such as a flash memory or a hard disk drive, and stores the image data output from the image processing unit 5 D in a non-volatile manner.
- the display unit 8 D includes, for example, a display device such as a liquid crystal display (LCD) and a drive circuit that drives the display device, and can display an image based on the image data output from the image processing unit 5 D.
- the I/F unit 9 D is an interface for transmitting the image data output from the image processing unit 5 D to the outside.
- For example, a universal serial bus (USB) interface can be applied as the I/F unit 9 D.
- the present invention is not limited to this, and the I/F unit 9 D may be an interface connectable to a network by wired communication or wireless communication.
- the input device 10 D includes an operator for receiving a user input. If the electronic apparatus 1 D is, for example, a digital still camera, a digital video camera, a mobile phone or a smartphone with an imaging function, the input device 10 D can include a shutter button for giving instructions on imaging by the imaging device 1 or an operator for realizing the function of the shutter button.
- the control unit 3 D includes, for example, a processor such as a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM), and controls the entire operation of the electronic apparatus 1 D using the RAM as a work memory according to the program stored in the ROM in advance.
- the control unit 3 D can control the operation of the electronic apparatus 1 D according to the user input received by the input device 10 D.
- the control unit 3 D can control the autofocus mechanism in the optical system 2 D based on the image processing result of the image processing unit 5 D.
- FIG. 2 is a block diagram illustrating a schematic configuration example of an imaging device that may be applied to an embodiment of the present disclosure.
- the imaging device 1 includes a pixel array unit 11 , a vertical scanning unit 12 , an A/D conversion unit 13 , a reference signal generation unit 14 , a horizontal scanning unit 15 , a pixel signal line 16 , a vertical signal line 17 , an output unit 18 , and a control unit 19 .
- the pixel array unit 11 includes a plurality of pixels arranged in a two-dimensional matrix in a horizontal direction (row direction) and a vertical direction (column direction). Each pixel includes a photoelectric conversion unit that performs photoelectric conversion on received light.
- the photoelectric conversion unit includes a photodiode or the like.
- the pixel signal line 16 is connected for each row and the vertical signal line 17 is connected for each column.
- An end of the pixel signal line 16 that is not connected to the pixel array unit 11 is connected to the vertical scanning unit 12 .
- the pixel signal line 16 transmits, from the vertical scanning unit 12 to the pixel array unit 11 , a control signal such as a drive pulse used when a pixel signal is read from a pixel.
- An end of the vertical signal line 17 that is not connected to the pixel array unit 11 is connected to the analog to digital (A/D) conversion unit 13 .
- the vertical signal line 17 transmits the pixel signal read from the pixel to the A/D conversion unit 13 .
- the vertical scanning unit 12 supplies various signals including a drive pulse to the pixel signal line 16 corresponding to the selected pixel row of the pixel array unit 11 , causing the pixel signal and the like to be output to the vertical signal line 17 .
- the vertical scanning unit 12 includes, for example, a shift register, an address decoder, and the like.
- the A/D conversion unit 13 includes a column A/D conversion unit 131 provided for each vertical signal line 17 and a signal processing unit 132 .
- the column A/D conversion unit 131 executes counting processing for correlated double sampling (CDS) processing, which reduces noise in the pixel signal output from the pixel via the vertical signal line 17 .
- the column A/D conversion unit 131 includes a comparator 131 a and a counter unit 131 b.
- the comparator 131 a compares the pixel signal input from the pixel via the vertical signal line 17 with a ramp signal RAMP supplied from the reference signal generation unit 14 in a preset phase (P-phase) period, and outputs the comparison result to the counter unit 131 b .
- the P-phase period is a period in which the reset level of the pixel signal is detected in the CDS processing.
- the ramp signal RAMP is, for example, a signal in which the level (voltage value) decreases at a constant slope, or a sawtooth wave signal in which the level decreases stepwise.
- When the level of the ramp signal RAMP is higher than the level of the pixel signal, the comparator 131 a outputs a high difference signal to the counter unit 131 b .
- When the level of the ramp signal RAMP becomes equal to or lower than the level of the pixel signal, the comparator 131 a inverts its output and outputs a low difference signal to the counter unit 131 b . Note that the level of the ramp signal RAMP is reset to a predetermined value after the output of the comparator 131 a is inverted.
- In the P-phase period, the counter unit 131 b down-counts the time from the start of the voltage drop of the ramp signal RAMP until its level becomes equal to or lower than that of the pixel signal, and outputs the counting result to the signal processing unit 132 .
- In the D-phase period, the counter unit 131 b up-counts the time from the start of the voltage drop of the ramp signal RAMP until its level becomes equal to or lower than that of the pixel signal, and outputs the counting result to the signal processing unit 132 .
- the D-phase period is a detection period in which the signal level of the pixel signal is detected in the CDS processing.
- the signal processing unit 132 performs CDS processing and A/D conversion processing based on the counting result of the P-phase period and the counting result of the D-phase period input from the counter unit 131 b to generate digital image data, and outputs the digital image data to the output unit 18 .
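The P-phase/D-phase counting described above can be made concrete with a small numerical sketch. The following Python fragment is illustrative only (the ramp start level, ramp step, and pixel levels are invented for the example) and shows how down-counting during the P-phase and up-counting during the D-phase leaves the CDS difference, that is, the reset level minus the signal level, in the counter.

```python
def single_slope_cds(reset_level, signal_level, ramp_start=1.0, ramp_step=0.001):
    """Illustrative model of the column A/D conversion with CDS.

    The counter down-counts while the ramp falls to the reset level
    (P-phase) and up-counts while it falls to the signal level
    (D-phase), so the final count is proportional to
    (reset_level - signal_level), i.e. the pixel signal with the
    reset level removed. All levels are hypothetical volts.
    """
    count = 0

    # P-phase: ramp falls until it crosses the reset level; down-count.
    level = ramp_start
    while level > reset_level:
        count -= 1
        level -= ramp_step

    # D-phase: ramp is reset to the start value, then falls until it
    # crosses the signal level; up-count.
    level = ramp_start
    while level > signal_level:
        count += 1
        level -= ramp_step

    return count

# A brighter pixel pulls the signal level further below the reset level
# and therefore yields a larger count.
print(single_slope_cds(reset_level=0.9, signal_level=0.6))  # ~300
print(single_slope_cds(reset_level=0.9, signal_level=0.8))  # ~100
```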
- the reference signal generation unit 14 generates the ramp signal RAMP based on a control signal input from the control unit 19 , and outputs the generated ramp signal RAMP to the comparator 131 a of the A/D conversion unit 13 .
- the reference signal generation unit 14 includes, for example, a D/A conversion circuit or the like.
- the horizontal scanning unit 15 performs selective scanning for selecting each column A/D conversion unit 131 in a predetermined order, sequentially outputting the counting results temporarily held by each column A/D conversion unit 131 to the signal processing unit 132 .
- the horizontal scanning unit 15 includes, for example, a shift register, an address decoder, and the like.
- the output unit 18 performs predetermined signal processing on the image data input from the signal processing unit 132 and outputs the image data to the outside of the imaging device 1 .
- the control unit 19 performs drive control of the vertical scanning unit 12 , the A/D conversion unit 13 , the reference signal generation unit 14 , the horizontal scanning unit 15 , and the like.
- the control unit 19 includes, for example, a timing generator or the like.
- the control unit 19 generates various drive signals serving as references for the operations of the vertical scanning unit 12 , the A/D conversion unit 13 , the reference signal generation unit 14 , and the horizontal scanning unit 15 .
- the imaging device 1 configured as described above is a column AD type complementary metal oxide semiconductor (CMOS) image sensor in which the column A/D conversion unit 131 is arranged for each column.
- FIG. 3 is a diagram illustrating a part of a circuit configuration of the pixel array unit 11 that may be applied to the embodiment.
- the pixel array unit 11 includes a constant current source 2 , a pixel 3 (hereinafter, referred to as “normal pixel 3 ”), and a pixel 4 (hereinafter, referred to as “special pixel 4 ”).
- a plurality of normal pixels 3 and a plurality of special pixels 4 are arranged in a two-dimensional matrix in a predetermined arrangement pattern, and the special pixels 4 are arranged in a predetermined pixel row at predetermined intervals.
- a first transmission signal line 161 , a reset signal line 162 , and a row selection signal line 163 are connected to each normal pixel 3 as the pixel signal line 16 .
- the reset signal line 162 , the row selection signal line 163 , and a second transmission signal line 164 are connected to each special pixel 4 as the pixel signal line 16 .
- the constant current source 2 is provided in each vertical signal line 17 .
- the constant current source 2 includes an N-channel metal-oxide-semiconductor field-effect transistor (hereinafter, abbreviated as “NMOS”) or the like.
- One end side of the constant current source 2 is grounded, and the other end side is connected to the vertical signal line 17 .
- the normal pixels 3 are arranged in a two-dimensional matrix on the pixel array unit 11 .
- the normal pixel 3 includes a photoelectric conversion unit 31 , a transfer switch 32 , a floating diffusion 33 (hereinafter, abbreviated as “FD 33 ”), a reset switch 34 , an amplification transistor 35 , and a row selection switch 36 .
- the photoelectric conversion unit 31 performs photoelectric conversion on the received light to generate signal electric charge for images.
- the photoelectric conversion unit 31 includes a PN junction photodiode or the like.
- the photoelectric conversion unit 31 has an anode terminal grounded and a cathode terminal connected to the FD 33 via the transfer switch 32 .
- the photoelectric conversion unit 31 functions as a first photoelectric conversion unit.
- the transfer switch 32 has one end connected to the photoelectric conversion unit 31 and the other end connected to the FD 33 . Further, the transfer switch 32 is connected to the first transmission signal line 161 . When the transfer pulse TR is supplied via the first transmission signal line 161 , the transfer switch 32 is turned on (closed state), and transfers the signal electric charge photoelectrically converted by the photoelectric conversion unit 31 to the FD 33 .
- the FD 33 temporarily holds the signal electric charge transferred from the photoelectric conversion unit 31 and converts the signal electric charge into voltage corresponding to the electric charge amount.
- the reset switch 34 has one end connected to the FD 33 and the other end connected to the power source voltage. Further, the reset switch 34 is connected to the reset signal line 162 . In a case where the reset pulse RST is supplied via the reset signal line 162 , the reset switch 34 is turned on, and discharges the electric charge of the FD 33 to the power source voltage to reset the potential of the FD 33 to predetermined potential.
- One end of the amplification transistor 35 is connected to a power source voltage, and the other end is connected to the row selection switch 36 . Further, the FD 33 is connected to the gate end of the amplification transistor 35 .
- the amplification transistor 35 functions as a source follower together with the constant current source 2 connected via the vertical signal line 17 .
- the amplification transistor 35 outputs a reset signal (reset level) indicating a level corresponding to the potential of the FD 33 reset by the reset switch 34 to the vertical signal line 17 .
- the amplification transistor 35 outputs, to the vertical signal line 17 , an image pixel signal indicating a level corresponding to the electric charge amount of the signal electric charge held in the FD 33 after the signal electric charge is transferred from the photoelectric conversion unit 31 by the transfer switch 32 .
- the row selection switch 36 has one end connected to the amplification transistor 35 and the other end connected to the vertical signal line 17 . Further, the row selection switch 36 is connected to the row selection signal line 163 . When the row selection signal SEL is supplied from the row selection signal line 163 , the row selection switch 36 is turned on, and outputs a reset signal or a pixel signal (first signal) output from the amplification transistor 35 to the vertical signal line 17 .
- One end of the vertical signal line 17 is connected to the comparator 131 a or 131 a _S of the A/D conversion unit 13 .
- the comparator 131 a connected to the vertical signal line 17 to which the special pixel 4 is connected is illustrated as a comparator 131 a _S.
- the transfer switch 32 , the reset switch 34 , the amplification transistor 35 , and the row selection switch 36 of the normal pixel 3 configured as described above include, for example, an NMOS or P-channel MOS transistor (abbreviated as PMOS).
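As a rough behavioral sketch of the reset, transfer, and source-follower readout sequence described above — not the actual circuit of the present disclosure, and with a hypothetical floating diffusion capacitance, source-follower gain, and reset voltage — the normal pixel 3 can be modeled as follows.

```python
E_CHARGE = 1.602e-19  # elementary charge [C]

class NormalPixelModel:
    """Behavioral sketch of the normal pixel 3 (all values hypothetical)."""

    def __init__(self, c_fd=1.6e-15, sf_gain=0.85, v_reset=2.8):
        self.c_fd = c_fd        # floating diffusion (FD 33) capacitance [F]
        self.sf_gain = sf_gain  # gain of the amplification transistor 35
        self.v_reset = v_reset  # FD potential after the reset pulse RST [V]
        self.q_pd = 0.0         # charge accumulated in the photodiode [C]
        self.v_fd = 0.0         # potential held on the FD [V]

    def expose(self, n_electrons):
        # Photoelectric conversion: received light -> signal electrons.
        self.q_pd = n_electrons * E_CHARGE

    def reset(self):
        # Reset pulse RST: the FD is tied to the power source voltage.
        self.v_fd = self.v_reset
        return self.sf_gain * self.v_fd   # reset level on the vertical signal line

    def transfer_and_read(self):
        # Transfer pulse TR: charge moves from the photodiode to the FD,
        # lowering the FD potential by Q / C_FD.
        self.v_fd -= self.q_pd / self.c_fd
        self.q_pd = 0.0
        return self.sf_gain * self.v_fd   # signal level on the vertical signal line

pixel = NormalPixelModel()
pixel.expose(5000)                        # 5000 signal electrons (hypothetical)
reset_level = pixel.reset()
signal_level = pixel.transfer_and_read()
print(reset_level - signal_level)         # CDS difference, about 0.43 V
```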
- the normal pixel 3 includes any one color filter of a red (R) filter, a green (G) filter, and a blue (B) filter stacked on the light receiving surface of the photoelectric conversion unit 31 .
- the normal pixels 3 form, for example, a Bayer array on the pixel array unit 11 .
- the normal pixels 3 are not limited to the Bayer array, and may be arranged according to a predetermined rule.
- the arrangement may also be based on various other color filter arrays, such as an X-Trans (registered trademark) type color filter array having a unit pattern of 3×3 pixels, a quad-Bayer array having 4×4 pixels, and a white RGB type color filter array having 4×4 pixels that includes, in addition to the color filters of the three primary colors of RGB, a color filter (hereinafter, also referred to as clear or white) having a broad light transmission characteristic with respect to the visible light region.
- the photoelectric conversion unit 31 in which a green (G) filter is stacked on the light receiving surface will be referred to as a pixel G
- the photoelectric conversion unit 31 in which a red (R) filter is stacked on the light receiving surface will be referred to as a pixel R
- the photoelectric conversion unit 31 in which a blue (B) filter is stacked on the light receiving surface will be referred to as a pixel B.
- FIG. 4 is a diagram for explaining an arrangement example of pixels.
- In the normal pixels 3 , in a unit pixel of 2×2 pixels, two pixels at diagonal positions are the pixels G, and the remaining two pixels are the pixel R and the pixel B.
- a part of the normal pixels 3 arranged according to the Bayer array is further replaced with special pixels 4 (pixels S). More specifically, in the normal pixels 3 , the horizontal line in which the pixels B and pixels G are arranged is replaced with the special pixels 4 .
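A minimal sketch of this arrangement rule is shown below; the interval of one special pixel row per four B/G rows is an arbitrary choice for the illustration, not a value taken from the present disclosure.

```python
def build_pixel_map(rows, cols, special_row_interval=4):
    """Bayer array of 'R', 'G', 'B' in which some B/G rows are
    replaced by special pixels 'S' at a fixed (hypothetical) interval."""
    grid = []
    for r in range(rows):
        if r % 2 == 0:
            row = ['G' if c % 2 == 0 else 'R' for c in range(cols)]  # G/R row
        else:
            row = ['B' if c % 2 == 0 else 'G' for c in range(cols)]  # B/G row
            if (r // 2) % special_row_interval == 0:
                row = ['S'] * cols      # B/G row replaced with special pixels
        grid.append(row)
    return grid

for row in build_pixel_map(8, 8):
    print(' '.join(row))
```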
- the special pixel 4 has a configuration similar to that of the normal pixel 3 , and includes a photoelectric conversion unit 41 , a transfer switch 42 , a floating diffusion 43 (hereinafter, simply referred to as “FD 43 ”), a reset switch 44 , an amplification transistor 45 , and a row selection switch 46 .
- the special pixel 4 includes a special filter stacked on the light receiving surface of the photoelectric conversion unit 41 .
- the transfer switch 42 is connected to the second transmission signal line 164 , and the transfer pulse TR_S is supplied from the second transmission signal line 164 .
- Other configurations of the special pixel 4 are similar to those of the normal pixel 3 .
- the special pixel 4 is a pixel other than a pixel (normal pixel, for example, the pixel R, the pixel G, and the pixel B) for acquiring color information and luminance information in the visible light region in order to form a full-color image.
- Examples of the special pixel 4 include an infrared light pixel, a white pixel, a monochrome pixel, a black pixel, a polarization pixel, and an image plane phase difference pixel.
- In the infrared light pixel, an infrared filter capable of receiving infrared light is stacked on the light receiving surface of the photoelectric conversion unit 41 .
- In the white pixel, a white filter capable of receiving all visible light of red, green, and blue is stacked on the light receiving surface of the photoelectric conversion unit 41 .
- In the monochrome pixel, a transparent filter is stacked on the light receiving surface of the photoelectric conversion unit 41 .
- In the black pixel, a light shielding filter is stacked on the light receiving surface of the photoelectric conversion unit 41 .
- The polarization pixel is a pixel using a polarizing element for receiving polarized light.
- In the image plane phase difference pixel, an opening filter opened only in a predetermined region is stacked on the light receiving surface of the photoelectric conversion unit 41 .
- In the image plane phase difference pixels, two pixels are set as one set: a pixel in which an opening filter having an opening in, for example, the left half of the light receiving surface of the photoelectric conversion unit 41 is stacked, and a pixel in which an opening filter having an opening in the right half of the light receiving surface of another photoelectric conversion unit 41 is stacked. Distance measurement is performed based on the phase difference of light received by these two pixels.
- The pixel signal obtained by photoelectrically converting light received by the special pixel 4 can realize a function different from that of the pixel signal obtained by photoelectrically converting light received by the normal pixel 3 .
- the special pixel 4 or the photoelectric conversion unit 41 of the special pixel 4 is represented as “S”.
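The distance measurement principle of the image plane phase difference pixels can be illustrated numerically: the left-opening and right-opening pixels sample the scene through opposite halves of the exit pupil, and the lateral shift between their signals indicates the amount and direction of defocus. The toy example below, with invented one-dimensional signals, estimates that shift by minimizing the sum of absolute differences; it sketches only the principle, not any circuit of the present disclosure.

```python
def phase_shift(left_signal, right_signal, max_shift=4):
    """Estimate the lateral shift between the left-opening and
    right-opening pixel signals by minimizing the sum of absolute
    differences; the shift can be converted to a subject distance
    by a separate calibration."""
    best_shift, best_cost = 0, float('inf')
    n = len(left_signal)
    for s in range(-max_shift, max_shift + 1):
        cost = sum(abs(left_signal[i] - right_signal[i + s])
                   for i in range(max(0, -s), min(n, n - s)))
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

# Hypothetical line signals: the right signal is the left one shifted by 2.
left  = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0]
right = [0, 0, 0, 0, 1, 5, 9, 5, 1, 0]
print(phase_shift(left, right))  # 2
```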
- FIG. 5 is a timing chart schematically illustrating reading of a pixel signal by the imaging device 1 .
- the horizontal axis represents time.
- the output timing of the vertical synchronization pulse is illustrated in the upper part, and the output timing of the horizontal synchronization pulse in the vertical scanning unit 12 is illustrated in the middle part.
- FIG. 5 illustrates a case where the imaging device 1 reads pixel signals of one frame.
- the control unit 19 first sequentially reads pixel signals from the special pixels 4 of the pixel array unit 11 according to, for example, a vertical synchronization pulse and a horizontal synchronization pulse input from the outside of the imaging device 1 . After reading the pixel signals from the special pixels 4 of all the special pixel rows, the control unit 19 sequentially reads the pixel signals from each normal pixel 3 for each row of the pixel array unit 11 .
- In this manner, the imaging device 1 performs a reading method in which the pixel signals are first read from all the special pixels 4 , and then the pixel signals are sequentially read from each normal pixel 3 for each row of the pixel array unit 11 .
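A minimal scheduling sketch of this reading method is shown below; the frame size and the rows occupied by the special pixels are hypothetical.

```python
def readout_order(n_rows, special_rows):
    """Row readout sequence of the reading method described above:
    every special pixel row is read first, then every remaining
    (normal) row in order."""
    special_pass = [r for r in range(n_rows) if r in special_rows]
    normal_pass = [r for r in range(n_rows) if r not in special_rows]
    return special_pass + normal_pass

# Hypothetical frame: 12 rows, special pixels on rows 1, 5, and 9.
print(readout_order(12, special_rows={1, 5, 9}))
# [1, 5, 9, 0, 2, 3, 4, 6, 7, 8, 10, 11]
```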
- FIG. 6 is a diagram illustrating a configuration example of a color filter layer.
- FIGS. 7 and 8 are diagrams for explaining an example of a method of forming color filters.
- the green color filter formed corresponding to the pixel G is referred to as a green filter GF.
- the red color filter formed corresponding to the pixel R is referred to as a red filter RF.
- the blue color filter formed corresponding to the pixel B is referred to as a blue filter BF.
- the green filter GF, the red filter RF, and the blue filter BF are not distinguished, they are also simply referred to as a color filter CF.
- the special filter formed corresponding to the pixel S (special pixel 4 ) is referred to as a special filter SF.
- the color filter CF and the special filter SF are provided for each of the normal pixels 3 and the special pixels 4 . That is, one color filter CF or special filter SF is basically provided for each pixel; however, for example, two or more adjacent special filters SF may be integrally formed, and adjacent filters of the same type may be integrally formed.
- the color filters CF formed in a color filter layer CL 0 are also arranged according to the Bayer array similarly to the normal pixels 3 .
- the special filters SF are arranged by replacing a part (for example, a pixel row) of the color filters CF arranged according to the Bayer array.
- the special filter SF is a filter formed according to the function of the special pixel 4 , and examples include the above-described white filter, a transparent filter, and an opening filter opened only in a predetermined region.
- the color filter layer in which the color filter CF and the special filter SF are arranged is formed in the order of the green filter GF, the red filter RF, and the blue filter BF, for example.
- the green filter GF is formed on the pixel array unit 11 .
- In general, it is difficult to form, at a right angle, a corner portion of the color filter CF that is not in contact with another color filter CF. Therefore, as illustrated in FIG. 7 , since the corner portion of the green filter GF adjacent to the region where the special filter SF is formed is an end that is not in contact with another green filter GF, the corner portion does not form a right angle and is rounded. Note that FIG. 7 is an enlarged view of a part of the color filter layer CL 0 illustrated in FIG. 6 , and a boundary line of pixels is indicated by a dotted line.
- FIG. 8 is an enlarged view of a part of the color filter layer CL 0 illustrated in FIG. 6 , and a boundary line of pixels is indicated by a dotted line.
- In the Bayer array, the red filter RF is normally formed in a region whose four sides are surrounded by the green filter GF.
- However, adjacent to the region where the special filter SF is formed, the red filter RF is formed in a region surrounded by the green filter GF on only three sides. Therefore, as illustrated in FIG. 8 , the red filter RF is formed to protrude into the formation region of the special filter SF, and the sensitivity of the pixel R may fluctuate. Since the red filter RF protrudes into the formation region of the special filter SF in this manner, the film thickness of the red filter RF may become uneven.
- When the film thickness of the red filter RF becomes uneven, the light receiving sensitivity of the pixel R varies, and the imaging accuracy of the imaging device 1 may decrease.
- Therefore, in the embodiment, a color filter CF is formed in the region where the special filter SF is formed so as to surround at least a part of the periphery of the special filter SF, thereby suppressing a decrease in the imaging accuracy of the imaging device 1 .
- FIGS. 9 and 10 are diagrams for explaining the color filter layer of the imaging device 1 according to the embodiment.
- the color filter layer CL 1 illustrated in FIG. 9 has the same configuration as the color filter layer CL 0 illustrated in FIG. 6 except that the periphery of some special filters SF is surrounded by a color filter (hereinafter, also referred to as a special pixel color filter SCF).
- Although FIG. 9 illustrates a case where the special filters SF are arranged in one pixel row, since only a part of the color filter layer CL 1 is illustrated, the special filters SF may be arranged in a plurality of pixel rows at predetermined intervals.
- the color filter layer CL 1 includes a special pixel color filter SCF surrounding the periphery of the special filter SF in plan view.
- Although FIG. 9 illustrates a case where the special pixel color filter SCF is a green filter that passes green (hereinafter, referred to as a special pixel green filter SGF), the special pixel color filter SCF may be a red filter or a blue filter.
- a special filter (hereinafter, also referred to as a first special filter SF 1 ) surrounded by the special pixel green filter SGF and a special filter (hereinafter, also referred to as a second special filter SF 2 ) not surrounded by the special pixel green filter SGF are alternately arranged for each pixel to form one pixel row.
- FIG. 10 is a diagram illustrating the green filter GF and the special pixel green filter SGF among the filters formed in the color filter layer CL 1 illustrated in FIG. 9 .
- the special pixel green filter SGF is arranged to surround the periphery of the special filter SF adjacent to the color filter CF (the red filter RF in FIG. 9 ) other than the green filter GF.
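To make the rule of FIGS. 9 and 10 concrete, the sketch below renders a small neighborhood of the color filter layer CL 1 with one label per pixel. The special row position, the Bayer phase, and the label 'SF1+SGF' (a first special filter wrapped by the special pixel green filter) are choices made only for the illustration.

```python
def cl1_filter(row, col, special_row=3):
    """Filter assigned to pixel (row, col) in a sketch of CL 1.

    Rows follow the Bayer array (G/R rows and B/G rows) except the
    special row, where first special filters SF1 (surrounded by the
    special pixel green filter SGF) alternate with second special
    filters SF2. With this Bayer phase, the pixels above and below
    an odd column of the special row are red, so those positions
    receive the SGF ring.
    """
    if row == special_row:
        return 'SF2' if col % 2 == 0 else 'SF1+SGF'
    if row % 2 == 0:
        return 'G' if col % 2 == 0 else 'R'
    return 'B' if col % 2 == 0 else 'G'

for r in range(6):
    print(' '.join(f'{cl1_filter(r, c):7}' for c in range(6)))
```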
- In FIGS. 9 and 10 , an example in which the special pixels 4 and the special filters SF are arranged in a predetermined pixel row has been described, but the present invention is not limited to this.
- the special pixels 4 and the special filters SF may be arranged in a predetermined pixel column.
- FIGS. 11 and 12 are diagrams for explaining another configuration example of a color filter layer of the imaging device 1 according to the embodiment.
- the special filters SF of a color filter layer CL 2 are arranged in one pixel row.
- the first special filter SF 1 adjacent to the red filter RF is surrounded by the special pixel green filter SGF.
- the second special filter SF 2 adjacent to the green filter GF is not surrounded by the special pixel green filter SGF.
- Although FIG. 11 illustrates a case where the special filters SF are arranged in one pixel row, since only a part of the color filter layer CL 2 is illustrated, the special filters SF may be arranged in a plurality of pixel rows at predetermined intervals.
- FIG. 12 is a diagram illustrating the green filter GF and the special pixel green filter SGF among the filters formed in the color filter layer CL 2 illustrated in FIG. 11 .
- the special pixel green filter SGF is arranged to surround the periphery of the special filter SF adjacent to the color filter CF (the red filter RF in FIG. 11 ) other than the green filter GF.
- FIG. 13 is a diagram for explaining another configuration example of a color filter layer of the imaging device 1 according to the embodiment.
- FIG. 13 illustrates the green filter GF and the special pixel green filter SGF among the filters formed in the color filter layer CL 3 .
- Although FIG. 13 illustrates a case where the special filters SF are arranged in one pixel row and one pixel column, since only a part of the color filter layer is illustrated, the special filters SF may be arranged in a plurality of pixel rows or a plurality of pixel columns at predetermined intervals.
- the special pixel green filters SGF are arranged in a region where the pixels G are arranged in the Bayer array before replacement with the special pixels 4 .
- the first special filter SF 1 is formed inside the pixel region while its periphery is left as the green filter (the special pixel green filter SGF). Therefore, the corner portion of the green filter GF is arranged in contact with the corner portion of another green filter GF (including the special pixel green filter SGF).
- the green filter GF has a corner portion in contact with another green filter even in the region where the special filter SF is formed, and an end not in contact with another green filter is not formed. Therefore, the corner portion of the green filter GF adjacent to the region where the special filter SF is formed can be formed in a desired shape, and the adjacent red filter RF can be formed without protruding to the region where the green filter GF is formed. Therefore, the color filter CF can be formed in a desired shape, a decrease and variation in the light receiving sensitivity of each pixel can be suppressed, and a decrease in the imaging accuracy of the imaging device 1 can be suppressed.
- the red filter RF adjacent to the formation region of the special filter SF can also be surrounded by the green filter GF including the special pixel green filter SGF, similarly to the other red filters RF.
- the red filter RF adjacent to the formation region of the special filter SF can also be made less likely to be formed with an uneven film thickness. Such points will be described with reference to FIGS. 14 to 17 .
- FIG. 14 is a diagram illustrating a configuration example of a color filter layer according to an existing technology.
- FIG. 15 is a schematic view illustrating a cross section of the color filter layer taken along line A-A′ in FIG. 14 .
- FIG. 16 is a diagram illustrating a configuration example of the color filter layer according to the embodiment.
- FIG. 17 is a schematic view illustrating a cross section of the color filter layer taken along line B-B′ in FIG. 16 .
- The color filter layer CL 0 illustrated in FIG. 14 does not include the special pixel green filter SGF, and the red filters RF and the blue filters BF are arranged in contact with the special filter SF. As illustrated in FIGS. 14 and 16 , the special filters SF may be arranged in both the pixel row and the pixel column at predetermined intervals.
- FIGS. 14 and 16 are diagrams each illustrating a part of the color filter layer.
- the red filter RF and the blue filter BF adjacent to the special filter SF are surrounded by the green filter GF on three sides, and the other side is opened without being surrounded by the green filter GF.
- the color filters CF are formed in the order of green, red, and blue.
- the green filters GF and the special pixel green filters SGF are formed, and then, the red filters RF and the blue filters BF are formed.
- When the blue filter BF is formed, the green filter GF serves as a wall on the three surrounded sides, but since there is no green filter GF serving as a wall on the opened side, the blue filter BF easily flows toward the opened side.
- the film thickness on the special filter SF side becomes thinner than the film thickness on the green filter GF side, and the film thickness becomes uneven.
- the red filter RF and the blue filter BF adjacent to the special filter SF are surrounded by the green filter GF on three sides, and the other side is opened without being surrounded by the green filter GF.
- the color filters CF are formed in the order of green, red, and blue.
- the green filters GF and the special pixel green filters SGF are formed, and then, the red filters RF and the blue filters BF are formed.
- When the red filter RF is formed, the green filter GF serves as a wall on the three surrounded sides, but since there is no green filter GF serving as a wall on the opened side, the red filter RF easily flows toward the opened side.
- the film thickness on the special filter SF side becomes thinner than the film thickness on the green filter GF side, and the film thickness becomes uneven.
- On the other hand, in the embodiment, the special pixel green filter SGF is formed to surround the periphery of the special pixel region. Therefore, the red filter RF and the blue filter BF are formed in regions whose four sides are surrounded by the green filter GF including the special pixel green filter SGF.
- the blue filter BF is formed by using the green filters GF on all four sides as walls. Therefore, in the blue filter BF, the difference between the film thickness on the special filter SF side and the film thickness on the green filter GF side becomes small, and the film can be formed evenly as compared with FIG. 15 .
- the unevenness of the film thickness of the special filter SF can also be improved.
- the special filters SF are arranged corresponding to a pixel row (also referred to as a special pixel row) and a pixel column (also referred to as a special pixel column).
- When the film thickness of the special filter SF is uneven, the sensitivity of the special pixel 4 fluctuates, and the function corresponding to the special pixel 4 may decrease.
- For example, suppose the special pixel 4 is an image plane phase difference pixel, and the imaging device 1 realizes an autofocus function using the special pixel 4 .
- In this case, if the sensitivity of the special pixel 4 fluctuates, the imaging device 1 cannot appropriately adjust the focus, and the autofocus function may deteriorate.
- In the embodiment, when the special pixel green filter SGF is formed in the special pixel row (or the special pixel column), the special filter SF is divided for each pixel by the special pixel green filter SGF. Therefore, the special pixel green filter SGF serves as a wall, and the special filter SF hardly flows toward both end sides of the special pixel row (or the special pixel column), in a manner that the unevenness of the film thickness of the special filter SF can be improved. As a result, it is possible to suppress functional deterioration of the imaging device 1 .
- In the embodiment, the special pixel green filter SGF is formed to surround the periphery of the special filter SF, but the present invention is not limited to this.
- the special pixel green filter SGF may be formed to surround at least a part of a special pixel region SR including at least one of the special pixels 4 . Such points will be described with reference to FIG. 18 .
- FIG. 18 is a diagram illustrating a configuration example of a color filter layer according to a first modification of the embodiment. Note that FIG. 18 illustrates a part of the color filter layer of the imaging device 1 .
- In the color filter layer according to the first modification, the special pixel green filter SGF is formed on at least a part (the side in contact with the color filter CF) of the periphery of the special filter SF.
- the corner portion of the green filter GF comes into contact with another green filter GF or the special pixel green filter SGF, and the color filter CF can be formed in a desired shape.
- the unevenness of the film thickness of the color filter CF can be improved.
- the special filter SF is not divided by the special pixel green filter SGF. Therefore, for example, when evenness of the film thickness of the special filter SF is required, it is desirable to form the special pixel green filter SGF to surround the periphery of the special filter SF as illustrated in the color filter layers CL 1 to CL 3 in the embodiment. As a result, the special filter SF can be divided by the special pixel green filter SGF, and the unevenness of the film thickness of the special filter SF can be improved.
- FIG. 19 is a diagram illustrating a configuration example of a color filter layer CL 5 according to the second modification of the embodiment. Note that FIG. 19 illustrates a part of the color filter layer CL 5 of the imaging device 1 .
- In the embodiment, the special filter SF corresponding to the special pixel 4 is an opening filter opened only in a predetermined region; however, not all the special filters SF are necessarily opening filters.
- For example, a part of the special filters SF may be opening filters, and the other special filters SF may be color filters that pass a predetermined color.
- The color filter formed in the opening of the opening filter may be a color filter that passes the same color as the other special filters SF (hereinafter, also referred to as a special color filter SCF).
- In the color filter layer CL 5 , an opening filter OF and the special color filter SCF are alternately arranged.
- The special color filter SCF is, for example, a color filter that passes cyan (hereinafter, also referred to as a cyan filter). Since cyan gives higher sensitivity in the photoelectric conversion unit 41 than RGB, the sensitivity of the special pixel 4 can be improved by using the cyan filter as the special filter SF.
- The special color filter SCF may be a white filter that passes white instead of a cyan filter. Since the white filter gives higher sensitivity in the photoelectric conversion unit 41 than the cyan filter, the sensitivity of the special pixel 4 can be further improved.
- the special pixel green filter SGF is arranged to surround the periphery of the special color filter SCF. As a result, effects similar to those of the embodiment can be obtained. In addition, in the special pixel region where the special pixel green filter SGF is formed, the formation region of the special filter SF becomes small. Therefore, when the opening filter OF is formed in the special pixel region where the special pixel green filter SGF is formed, the sensitivity of the image plane phase difference pixel may decrease. Therefore, by forming the opening filter OF in the special pixel region where the special pixel green filter SGF is not formed, it is possible to suppress a decrease in the sensitivity of the image plane phase difference pixel.
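The placement rule of this modification can be stated compactly: the opening filters OF go to the special pixel regions where the special pixel green filter SGF is not formed, and the special color filters SCF, wrapped by the SGF, fill the remaining special pixel regions. The line below is a hypothetical rendering of one special pixel row under that rule; the phase is chosen arbitrarily.

```python
def cl5_special_row(n_pixels):
    """One special pixel row of a sketch of CL 5: opening filters OF
    (image plane phase difference pixels, without the SGF ring)
    alternate with cyan special color filters SCF wrapped by the
    special pixel green filter SGF."""
    return ['OF' if c % 2 == 0 else 'SCF+SGF' for c in range(n_pixels)]

print(cl5_special_row(8))
# ['OF', 'SCF+SGF', 'OF', 'SCF+SGF', 'OF', 'SCF+SGF', 'OF', 'SCF+SGF']
```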
- the pixel signal obtained by photoelectrically converting the light received by the special pixel 4 corresponding to the special color filter SCF may be used, for example, for correction of a captured image.
- This is because, when the normal pixels 3 are replaced with the special pixels 4 , the number of the normal pixels 3 decreases, and the image quality of the captured image deteriorates.
- Note that FIG. 19 illustrates the opening filter OF that shields light on the left side of the special pixel 4 , but the opening filter OF that shields light on the right side is also formed in the color filter layer CL 5 , although not illustrated.
- the arrangement of the opening filters OF illustrated in FIG. 19 is an example, and the present invention is not limited to this.
- the opening filter OF that shields light on the left side of the special pixel 4 and the opening filter OF that shields light on the right side may be formed in the same pixel row.
- FIG. 20 is a diagram illustrating a configuration example of a color filter layer CL 6 according to a third modification of the embodiment. Note that FIG. 20 illustrates a part of the color filter layer CL 6 of the imaging device 1 .
- Although FIG. 20 illustrates an example in which the special filters SF are arranged in a pixel row of the color filter layer CL 6 , the special filters SF may also be arranged in a pixel column in addition to the pixel row, or may be arranged only in a pixel column of the color filter layer CL 6 .
- FIG. 21 is a diagram illustrating another configuration example of the color filter layer CL 6 according to the third modification of the embodiment, and illustrates a part of the color filter layer CL 6 of the imaging device 1 .
- In the embodiment and the modifications described above, the special pixel green filter SGF is arranged to surround the periphery of the special filter SF corresponding to one special pixel 4 , but the present invention is not limited to this.
- the special pixel green filter SGF may be arranged to surround the periphery of the special filter SF corresponding to two or more special pixels 4 .
- the periphery of the special filter region including two or more special filters SF in plan view may be surrounded by the special pixel green filter SGF.
- FIG. 22 is a diagram illustrating a configuration example of the color filter layer CL 7 according to a fourth modification of the embodiment.
- FIG. 22 illustrates a part of the color filter layer CL 7 of the imaging device 1 .
- In the color filter layer CL 7 , the periphery of a special filter region including two special filters SF in plan view in the special pixel row is surrounded by the special pixel green filter SGF.
- Although FIG. 22 illustrates an example in which the special filters SF are arranged in a pixel row of the color filter layer CL 7 , the special filters SF may also be arranged in a pixel column in addition to the pixel row, or may be arranged only in a pixel column of the color filter layer CL 7 .
- FIG. 23 is a diagram illustrating another configuration example of the color filter layer CL 7 according to the fourth modification of the embodiment, and illustrates a part of the color filter layer CL 7 of the imaging device 1 .
- FIG. 24 is a diagram illustrating a configuration example of a color filter layer CL 8 according to the fourth modification of the embodiment.
- FIG. 24 illustrates a part of the color filter layer CL 8 of the imaging device 1 .
- In the color filter layer CL 8 , the green filter GF is replaced with the special filter SF at predetermined intervals, and the special pixel green filter SGF is arranged to surround the periphery of the special filter SF.
- the normal pixels 3 may be replaced with the special pixels 4 at predetermined intervals.
- FIG. 25 is a diagram illustrating examples of using the imaging device 1 according to the embodiment and the modifications described above.
- Each imaging device 1 described above can be used, for example, in various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays as described below.
- The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
- FIG. 26 is a block diagram illustrating an example of a schematic configuration of a patient in-vivo information acquisition system using a capsule endoscope to which the technology according to the present disclosure (present technology) may be applied.
- An in-vivo information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200 .
- The capsule endoscope 10100 is swallowed by the patient at the time of examination.
- The capsule endoscope 10100 has an imaging function and a wireless communication function, sequentially captures images of the inside of organs (hereinafter also referred to as in-vivo images) at predetermined intervals while moving inside the organs such as the stomach and the intestines by peristaltic movement or the like until being naturally discharged from the patient, and sequentially wirelessly transmits information regarding the in-vivo images to the external control device 10200 outside the body.
- The external control device 10200 integrally controls the operation of the in-vivo information acquisition system 10001. In addition, the external control device 10200 receives the information regarding the in-vivo image transmitted from the capsule endoscope 10100, and generates image data for displaying the in-vivo image on a display device (not illustrated) based on the received information.
- In this way, the in-vivo information acquisition system 10001 can obtain an in-vivo image obtained by imaging the state of the inside of the patient's body at any time from when the capsule endoscope 10100 is swallowed until the capsule endoscope 10100 is discharged.
- The capsule endoscope 10100 includes a casing 10101 of capsule type, and a light source unit 10111, an imaging unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power source unit 10116, and a control unit 10117 are housed in the casing 10101.
- The light source unit 10111 includes a light source such as a light emitting diode (LED), for example, and irradiates an imaging field of view of the imaging unit 10112 with light.
- The imaging unit 10112 includes an imaging device and an optical system including a plurality of lenses provided in front of the imaging device. Reflected light (hereinafter referred to as observation light) of light with which a body tissue to be observed is irradiated is condensed by the optical system and enters the imaging device. In the imaging unit 10112, the observation light that has entered the imaging device is photoelectrically converted, and an image signal corresponding to the observation light is generated. The image signal generated by the imaging unit 10112 is provided to the image processing unit 10113.
- The image processing unit 10113 includes a processor such as a CPU or a graphics processing unit (GPU), and performs various types of signal processing on the image signal generated by the imaging unit 10112. The image processing unit 10113 provides the image signal subjected to the signal processing to the wireless communication unit 10114 as RAW data.
- The wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal subjected to the signal processing by the image processing unit 10113, and transmits the image signal to the external control device 10200 via an antenna 10114A. In addition, the wireless communication unit 10114 receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A, and provides the received control signal to the control unit 10117.
- The power feeding unit 10115 includes a power receiving antenna coil, a power regeneration circuit that regenerates power from current generated in the antenna coil, a booster circuit, and the like. In the power feeding unit 10115, power is generated using a so-called non-contact charging principle.
- The power source unit 10116 includes a secondary battery, and stores the electric power generated by the power feeding unit 10115. Note that, in FIG. 26, in order to avoid complication of the drawing, illustration of an arrow or the like indicating the destination of power feeding from the power source unit 10116 is omitted; the power stored in the power source unit 10116 is supplied to the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and may be used for driving these units.
- The control unit 10117 includes a processor such as a CPU, and appropriately controls driving of the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 according to a control signal transmitted from the external control device 10200.
- The external control device 10200 includes a processor such as a CPU or a GPU, or a microcomputer, a control board, or the like on which a processor and a storage element such as a memory are mixedly mounted. The external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via an antenna 10200A.
- For example, the light irradiation condition for the observation target in the light source unit 10111 may be changed by the control signal from the external control device 10200. In addition, imaging conditions (for example, a frame rate, an exposure value, and the like) in the imaging unit 10112 may be changed by the control signal from the external control device 10200. Further, the contents of the processing in the image processing unit 10113 and the conditions (for example, a transmission interval, the number of transmitted images, and the like) under which the wireless communication unit 10114 transmits the image signal may be changed by the control signal from the external control device 10200.
- In addition, the external control device 10200 performs various types of image processing on the image signal transmitted from the capsule endoscope 10100, and generates image data for displaying the imaged in-vivo image on the display device.
- As the image processing, for example, various types of signal processing such as development processing (demosaic processing), high image quality processing (band emphasis processing, super-resolution processing, noise reduction processing, camera shake correction processing, and the like), and enlargement processing (electronic zoom processing) can be performed alone or in combination.
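- The following Python sketch illustrates how such stages might be chained; the stage implementations (a 3×3 box filter for noise reduction, an unsharp mask for band emphasis, and nearest-neighbor enlargement for electronic zoom) are simplified stand-ins chosen for illustration, not the processing actually performed by the external control device 10200.

    import numpy as np

    def box_denoise(img: np.ndarray) -> np.ndarray:
        """3x3 mean filter as a stand-in for noise reduction processing."""
        h, w = img.shape
        padded = np.pad(img, 1, mode="edge")
        out = np.zeros((h, w), dtype=np.float64)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out += padded[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
        return out / 9.0

    def unsharp_mask(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
        """Band emphasis: add back a fraction of the high-frequency component."""
        return img + amount * (img - box_denoise(img))

    def electronic_zoom(img: np.ndarray, factor: int = 2) -> np.ndarray:
        """Nearest-neighbor enlargement (electronic zoom)."""
        return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

    def process(img: np.ndarray) -> np.ndarray:
        # Stages applied in combination, as described above.
        return electronic_zoom(unsharp_mask(box_denoise(img)))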
- In addition, the external control device 10200 controls driving of the display device to display the imaged in-vivo image based on the generated image data. Alternatively, the external control device 10200 may cause a recording device (not illustrated) to record the generated image data, or cause a printing device (not illustrated) to print out the generated image data.
- The technology according to the present disclosure may be applied to, for example, the imaging unit 10112 among the above-described configurations.
- By applying the imaging device 1 according to the present disclosure to the imaging unit 10112, favorable autofocus can be performed even in a case where zooming or the like is performed, and a higher-quality in-vivo image or the like can be acquired.
- FIG. 27 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (present technology) may be applied.
- FIG. 27 illustrates a state in which an operator (doctor) 11131 is performing surgery on a patient 11132 on a patient bed 11133 using an endoscopic surgery system 11000 .
- The endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
- The endoscope 11100 includes a lens barrel 11101 of which a region of a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 configured as a so-called rigid scope having the rigid lens barrel 11101 is illustrated, but the endoscope 11100 may be configured as a so-called flexible scope having a flexible lens barrel.
- An opening portion into which an objective lens is fitted is provided at the distal end of the lens barrel 11101 .
- A light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101, and is emitted toward an observation target in the body cavity of the patient 11132 via the objective lens. Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an imaging device are provided inside the camera head 11102 , and reflected light (observation light) from the observation target is condensed on the imaging device by the optical system.
- The observation light is photoelectrically converted by the imaging device, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated. The image signal is transmitted to a camera control unit (CCU) 11201 as RAW data.
- The CCU 11201 includes a CPU, a GPU, and the like, and integrally controls operations of the endoscope 11100 and a display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102, and performs various types of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), on the image signal.
- The display device 11202 displays an image based on the image signal subjected to the image processing by the CCU 11201 under the control of the CCU 11201.
- The light source device 11203 includes a light source such as a light emitting diode (LED), for example, and supplies irradiation light for capturing an image of a surgical site or the like to the endoscope 11100.
- An input device 11204 is an input interface for the endoscopic surgery system 11000 .
- The user can input various types of information and instructions to the endoscopic surgery system 11000 via the input device 11204. For example, the user inputs an instruction or the like to change imaging conditions (type of irradiation light, magnification, focal length, and the like) of the endoscope 11100.
- A treatment tool control device 11205 controls driving of the energy treatment tool 11112 for cauterization and incision of tissue, sealing of a blood vessel, or the like.
- A pneumoperitoneum device 11206 feeds gas into the body cavity of the patient 11132 via the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing a visual field by the endoscope 11100 and securing a working space of the operator.
- A recorder 11207 is a device capable of recording various types of information regarding surgery. A printer 11208 is a device capable of printing various types of information regarding surgery in various formats such as text, image, or graph.
- The light source device 11203 that supplies the endoscope 11100 with the irradiation light at the time of capturing an image of the surgical site can include, for example, an LED, a laser light source, or a white light source including a combination of them. In a case where the white light source includes a combination of RGB laser light sources, the output intensity and the output timing of each color (each wavelength) can be controlled with high accuracy, and thus adjustment of the white balance of the captured image can be performed in the light source device 11203.
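- As a signal-side illustration of white balance (the passage above describes adjustment on the light-source side, which this sketch does not model), the classical gray-world method computes per-channel gains from the channel means; the function names are hypothetical.

    import numpy as np

    def gray_world_gains(rgb: np.ndarray) -> np.ndarray:
        """Per-channel gains that equalize the R, G, B channel means
        (gray-world assumption); rgb has shape (H, W, 3)."""
        means = rgb.reshape(-1, 3).mean(axis=0)
        return means.mean() / means

    def apply_white_balance(rgb: np.ndarray) -> np.ndarray:
        return np.clip(rgb * gray_world_gains(rgb), 0.0, 255.0)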
- In addition, the driving of the light source device 11203 may be controlled to change the intensity of light to be output every predetermined time. By controlling the driving of the imaging device of the camera head 11102 in synchronization with the timing of the change of the intensity of the light to acquire images in a time division manner and synthesizing the images, it is possible to generate an image of a high dynamic range without so-called blocked up shadows and blown out highlights.
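- A minimal sketch of such time-division HDR synthesis, assuming frames normalized to [0, 1] and known relative exposure factors; the hat-shaped weighting is one common simple choice, not a scheme specified by the present disclosure.

    import numpy as np

    def merge_hdr(frames, exposures):
        """Merge time-division frames captured at different light intensities.
        frames: list of float arrays in [0, 1]; exposures: relative exposure factors.
        Hat-shaped weights favor well-exposed pixels over shadows/highlights."""
        num = np.zeros_like(frames[0])
        den = np.zeros_like(frames[0])
        for img, ex in zip(frames, exposures):
            w = 1.0 - np.abs(2.0 * img - 1.0)  # low weight near 0 and 1
            num += w * img / ex                # back to a common radiance scale
            den += w
        return num / np.maximum(den, 1e-6)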
- In addition, the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
- In the special light observation, for example, so-called narrow band imaging is performed in which a predetermined tissue such as a blood vessel in a mucosal surface layer is imaged with high contrast by emitting light in a narrower band than the irradiation light (that is, white light) at the time of normal observation, using the wavelength dependency of light absorption in body tissue.
- Alternatively, in the special light observation, fluorescence observation for obtaining an image by fluorescence generated by irradiation with excitation light may be performed. The light source device 11203 may be configured to be able to supply narrowband light and/or excitation light corresponding to such special light observation.
- FIG. 28 is a block diagram illustrating an example of functional configurations of the camera head 11102 and the CCU 11201 illustrated in FIG. 27 .
- The camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.
- The lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken in from the distal end of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401. The lens unit 11401 includes a combination of a plurality of lenses including a zoom lens and a focus lens.
- The imaging unit 11402 includes an imaging device. The number of imaging devices forming the imaging unit 11402 may be one (so-called single-plate type) or plural (so-called multi-plate type). In the case of the multi-plate type, for example, image signals corresponding to RGB may be generated by the respective imaging devices, and a color image may be obtained by combining the image signals. Alternatively, the imaging unit 11402 may include a pair of imaging devices for acquiring right-eye and left-eye image signals corresponding to three-dimensional (3D) display. By performing 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site.
- Note that the imaging unit 11402 is not necessarily provided in the camera head 11102. For example, the imaging unit 11402 may be provided immediately after the objective lens inside the lens barrel 11101.
- The drive unit 11403 includes an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. As a result, the magnification and focus of the image captured by the imaging unit 11402 may be appropriately adjusted.
- The communication unit 11404 includes a communication device for transmitting and receiving various types of information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
- In addition, the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201, and supplies the control signal to the camera head control unit 11405. The control signal includes, for example, information regarding imaging conditions such as information for specifying a frame rate of a captured image, information for specifying an exposure value at the time of imaging, and/or information for specifying a magnification and a focus of a captured image.
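- As an illustration only, such a control signal could be modeled as a record with optional fields, so that only the conditions to be changed are specified; the type and field names below are hypothetical.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class CameraControlSignal:
        """Imaging conditions carried by a control signal; unset fields
        mean 'leave this condition unchanged'."""
        frame_rate_fps: Optional[float] = None
        exposure_value: Optional[float] = None
        magnification: Optional[float] = None
        focus_position: Optional[float] = None

    # e.g. change only frame rate and exposure:
    signal = CameraControlSignal(frame_rate_fps=60.0, exposure_value=0.5)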
- Note that the imaging conditions such as the frame rate, the exposure value, the magnification, and the focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, a so-called auto exposure (AE) function, an auto focus (AF) function, and an auto white balance (AWB) function are installed in the endoscope 11100.
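- A minimal sketch of one AE iteration, assuming a mean-luminance target and an exposure value in the log2 domain; the target and gain values are illustrative assumptions, not parameters of the endoscope 11100.

    import numpy as np

    def auto_exposure_step(luma: np.ndarray, current_ev: float,
                           target: float = 0.18, gain: float = 0.5) -> float:
        """One AE iteration: nudge the exposure value so that the mean
        luminance (luma in [0, 1]) approaches a mid-gray target."""
        mean = float(luma.mean())
        error = np.log2(target / max(mean, 1e-6))
        return current_ev + gain * error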
- The camera head control unit 11405 controls driving of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
- The communication unit 11411 includes a communication device for transmitting and receiving various types of information to and from the camera head 11102. The communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400. In addition, the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102. The image signal and the control signal can be transmitted by electric communication, optical communication, or the like.
- The image processing unit 11412 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 11102.
- The control unit 11413 performs various types of control related to imaging of a surgical site or the like by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site or the like. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
- In addition, the control unit 11413 causes the display device 11202 to display a captured image of a surgical site or the like based on the image signal subjected to the image processing by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition technologies. For example, the control unit 11413 can recognize a surgical tool such as forceps, a specific body part, bleeding, mist at the time of using the energy treatment tool 11112, and the like by detecting the shape, color, and the like of the edge of an object included in the captured image.
- When causing the display device 11202 to display the captured image, the control unit 11413 may superimpose and display various types of surgery support information on the image of the surgical site by using the recognition result. Since the surgery support information is superimposed, displayed, and presented to the operator 11131, the burden on the operator 11131 can be reduced and the operator 11131 can reliably proceed with the surgery.
- The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable of them. Here, in the illustrated example, wired communication is performed using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
- The technology according to the present disclosure may be applied to, for example, the endoscope 11100 and the imaging unit 11402 of the camera head 11102 among the above-described configurations. By applying the imaging device 1 according to the present disclosure to the imaging unit 11402, favorable autofocus can be performed even in a case where zooming or the like is performed, and a higher-quality captured image or the like can be acquired. As a result, the burden on the operator 11131 can be reduced, and the operator 11131 can reliably proceed with the surgery.
- Note that the endoscopic surgery system has been described here as an example, but the technology according to the present disclosure may be applied to, for example, a microscopic surgery system or the like.
- The technology according to the present disclosure may be further applied to devices mounted on various moving bodies such as an automobile, an electric car, a hybrid electric car, a motorcycle, a bicycle, a personal mobility, an airplane, a drone, a ship, and a robot.
- FIG. 29 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a moving body control system to which the technology according to the present disclosure may be applied.
- A vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example illustrated in FIG. 29, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. As a functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network interface (I/F) 12053 are illustrated.
- The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device of a driving force generation device for generating a driving force of the vehicle such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting a steering angle of the vehicle, a braking device for generating a braking force of the vehicle, and the like.
- The body system control unit 12020 controls operations of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device of a keyless entry system, a smart key system, a power window device, or various lamps such as a head lamp, a back lamp, a brake lamp, a blinker, or a fog lamp. In this case, radio waves transmitted from a portable device that substitutes for a key or signals of various switches may be input to the body system control unit 12020. The body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
- The vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle, and receives the captured image. The vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing of a person, a vehicle, an obstacle, a sign, a character on a road surface, or the like based on the received image. In addition, the vehicle exterior information detection unit 12030 performs image processing on the received image, and performs object detection processing and distance detection processing based on a result of the image processing.
- The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The imaging unit 12031 can output the electric signal as an image or can output the electric signal as distance measurement information. In addition, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
- The vehicle interior information detection unit 12040 detects information inside the vehicle. For example, a driver state detection unit 12041 that detects a state of a driver is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and the vehicle interior information detection unit 12040 may calculate the degree of fatigue or the degree of concentration of the driver, or may determine whether or not the driver is dozing off, based on the detection information input from the driver state detection unit 12041.
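- One widely used drowsiness indicator is PERCLOS, the fraction of recent frames in which the eyes are nearly closed. The sketch below assumes a per-frame eye-openness signal in [0, 1] already extracted from the driver camera; the thresholds are illustrative, not values from the present disclosure.

    import numpy as np

    def perclos(eye_openness: np.ndarray, closed_threshold: float = 0.2) -> float:
        """PERCLOS: fraction of frames in which the eyes are nearly closed."""
        return float(np.mean(eye_openness < closed_threshold))

    def is_dozing(eye_openness: np.ndarray, limit: float = 0.3) -> bool:
        return perclos(eye_openness) > limit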
- The microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing functions of an advanced driver assistance system (ADAS) including collision avoidance or impact mitigation of the vehicle, follow-up traveling based on the distance between vehicles, vehicle speed maintenance traveling, vehicle collision warning, vehicle lane departure warning, or the like.
- In addition, the microcomputer 12051 can control the driving force generation device, the steering mechanism, the braking device, or the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, thereby performing cooperative control for the purpose of automatic driving or the like in which the vehicle autonomously travels without depending on the operation of the driver.
- In addition, the microcomputer 12051 can output a control command to the body system control unit 12020 based on the vehicle exterior information acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from a high beam to a low beam, by controlling the headlamp according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
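- A toy decision rule for such high/low beam switching might look as follows; the cutoff distance is an illustrative assumption, not a value from the present disclosure.

    def select_beam(other_vehicle_distances_m, high_beam_cutoff_m: float = 120.0) -> str:
        """Switch to low beam whenever a preceding or oncoming vehicle is
        detected closer than the cutoff distance; otherwise allow high beam."""
        if any(d < high_beam_cutoff_m for d in other_vehicle_distances_m):
            return "low"
        return "high"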
- The audio image output unit 12052 transmits an output signal of at least one of audio and an image to an output device capable of visually or audibly notifying an occupant of the vehicle or the outside of the vehicle of information. In the example of FIG. 29, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as the output device. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
- FIG. 30 is a diagram illustrating an example of an installation position of the imaging unit 12031 .
- In FIG. 30, a vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031. The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, side mirrors, a rear bumper, a back door, and an upper portion of a windshield in a vehicle interior of the vehicle 12100. The imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the upper portion of the windshield in the vehicle interior mainly acquire images in front of the vehicle 12100. The imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the sides of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image behind the vehicle 12100. The front images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 30 illustrates an example of an imaging range of the imaging units 12101 to 12104 .
- An imaging range 12111 indicates the imaging range of the imaging unit 12101 provided at the front nose, imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided at the side mirrors, respectively, and an imaging range 12114 indicates the imaging range of the imaging unit 12104 provided at the rear bumper or the back door. For example, an overhead image of the vehicle 12100 viewed from above can be obtained by superimposing image data captured by the imaging units 12101 to 12104.
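- A minimal sketch of the warping step behind such an overhead view, assuming a grayscale image and that each camera's ground-plane homography has already been calibrated (h_inv maps canvas coordinates to image coordinates); combining the four warped canvases, for example with np.maximum.reduce, yields the composite.

    import numpy as np

    def warp_to_ground(img: np.ndarray, h_inv: np.ndarray, out_shape) -> np.ndarray:
        """Fill a ground-plane canvas by inverse-mapping each canvas pixel
        through h_inv (canvas -> image homography), nearest neighbor."""
        hh, ww = out_shape
        ys, xs = np.mgrid[0:hh, 0:ww]
        pts = np.stack([xs, ys, np.ones_like(xs)], axis=-1) @ h_inv.T
        u = (pts[..., 0] / pts[..., 2]).round().astype(int)
        v = (pts[..., 1] / pts[..., 2]).round().astype(int)
        valid = (u >= 0) & (u < img.shape[1]) & (v >= 0) & (v < img.shape[0])
        canvas = np.zeros(out_shape, dtype=img.dtype)
        canvas[valid] = img[v[valid], u[valid]]
        return canvas

    # overhead = np.maximum.reduce(
    #     [warp_to_ground(im, Hi, (500, 500)) for im, Hi in zip(images, h_invs)])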
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging devices, or may be an imaging device having pixels for phase difference detection.
- For example, the microcomputer 12051 can obtain a distance to each three-dimensional object in the imaging ranges 12111 to 12114 and a temporal change of the distance (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can thereby extract, as a preceding vehicle, a three-dimensional object traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100, in particular, the closest three-dimensional object on the traveling path of the vehicle 12100.
- Further, the microcomputer 12051 can set, in advance, an inter-vehicle distance to be secured with respect to the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. As described above, it is possible to perform cooperative control for the purpose of automatic driving or the like in which the vehicle autonomously travels without depending on the operation of the driver.
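- A toy longitudinal controller for such follow-up traveling could be a PD law on the gap error; the gains and target gap below are illustrative assumptions, not values from the present disclosure.

    def follow_command(gap_m: float, rel_speed_mps: float,
                       target_gap_m: float = 30.0,
                       kp: float = 0.1, kd: float = 0.4) -> float:
        """PD-style longitudinal command for follow-up traveling.
        Positive -> accelerate, negative -> brake.
        rel_speed_mps is preceding-vehicle speed minus ego speed."""
        return kp * (gap_m - target_gap_m) + kd * rel_speed_mps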
- For example, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into a two-wheeled vehicle, an ordinary vehicle, a large vehicle, a pedestrian, and other three-dimensional objects such as a utility pole, extract the three-dimensional object data, and use the three-dimensional object data for automatic avoidance of an obstacle. For example, the microcomputer 12051 identifies obstacles around the vehicle 12100 as obstacles that can be visually recognized by the driver of the vehicle 12100 and obstacles that are difficult to visually recognize. Then, the microcomputer 12051 determines a collision risk indicating the risk of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, the microcomputer 12051 can perform driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
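- One common way to quantify such a collision risk is via time-to-collision (TTC); the sketch below maps TTC to a [0, 1] risk score with an illustrative alarm threshold.

    def collision_risk(distance_m: float, closing_speed_mps: float,
                       ttc_alarm_s: float = 3.0) -> float:
        """Time-to-collision based risk in [0, 1]; 0 when not closing in."""
        if closing_speed_mps <= 0.0:
            return 0.0
        ttc = distance_m / closing_speed_mps
        return min(1.0, ttc_alarm_s / ttc)

    # e.g. risk >= 1.0 -> forced deceleration, risk >= 0.5 -> audible alarm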
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in images captured by the imaging units 12101 to 12104. Such recognition of a pedestrian is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 to superimpose and display a square contour line for emphasis on the recognized pedestrian. In addition, the audio image output unit 12052 may control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
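- As a rough stand-in for the pattern matching procedure described above, the following sketch slides an outline template over a feature-point map and scores normalized correlation; real systems use far more robust features and classifiers, and all names here are hypothetical.

    import numpy as np

    def match_template(feature_map: np.ndarray, template: np.ndarray,
                       thresh: float = 0.7):
        """Return top-left positions where the normalized correlation between
        the template and the local window exceeds thresh."""
        th, tw = template.shape
        t = (template - template.mean()) / (template.std() + 1e-6)
        h, w = feature_map.shape
        hits = []
        for y in range(h - th + 1):
            for x in range(w - tw + 1):
                win = feature_map[y:y + th, x:x + tw]
                wz = (win - win.mean()) / (win.std() + 1e-6)
                if float((wz * t).mean()) > thresh:
                    hits.append((y, x))
        return hits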
- The technology according to the present disclosure may be applied to, for example, the imaging unit 12031 among the above-described configurations. By applying the imaging device 1 according to the present disclosure to the imaging unit 12031, favorable autofocus can be performed even in a case where zooming or the like is performed, and a higher-quality captured image can be acquired.
- As described above, the imaging device 1 includes the plurality of normal pixels 3 arranged in a matrix, the special pixel 4 arranged by replacing a part of the normal pixels 3, the color filter CF corresponding to the normal pixels 3 and arranged according to a predetermined rule, the special filter SF arranged corresponding to the special pixel 4, and the special pixel color filter (corresponding to the special pixel green filter SGF) arranged to surround at least a part of the periphery of the special filter SF.
- With this configuration, the color filter CF can be arranged in a manner that the end of the color filter CF is in contact with the special pixel color filter, and the color filter CF can be formed in a desired shape. Therefore, the sensitivity variation of the normal pixels 3 can be suppressed, and the decrease in the imaging accuracy of the imaging device 1 can be suppressed.
- An imaging device comprising:
- An electronic apparatus comprising:
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Solid State Image Pick-Up Elements (AREA)
Abstract
An imaging device according to the present disclosure includes a plurality of normal pixels arranged in a matrix, a special pixel arranged by replacing a part of the normal pixels, color filters corresponding to the normal pixels and arranged according to a predetermined rule, a special filter arranged corresponding to the special pixel, and a special pixel color filter arranged to surround at least a part of the periphery of the special filter.
Description
- The present disclosure relates to an imaging device and an electronic apparatus.
- In an imaging device using a complementary metal oxide semiconductor (CMOS) or the like, a special pixel such as a pixel for receiving infrared light or a pixel for detecting an image plane phase difference may be arranged by replacing a normal pixel. For example, there is known an imaging device in which pixels for detecting an image plane phase difference are arranged at predetermined intervals on a horizontal line of normal pixels arranged on an array (see, for example, Patent Literature 1).
- Patent Literature 1: JP 2008-312073 A
- As in the above technology, when the special pixel replaces a normal pixel, for example, a filter (for example, in the case of a pixel for receiving infrared light, an infrared light filter) corresponding to the function of the special pixel is formed in the special pixel instead of a color filter. As described above, when there is a region where no color filter is formed in the pixel array unit, there is a possibility that the color filter cannot be formed in a desired shape due to the influence of such a region. If the color filter cannot be formed in a desired shape, the light receiving sensitivity of the normal pixel fluctuates, and the accuracy of the imaging device may decrease.
- Therefore, the present disclosure provides an imaging device and an electronic apparatus capable of suppressing a decrease in accuracy.
- According to the present disclosure, an imaging device is provided. The imaging device includes a plurality of normal pixels arranged in a matrix, a special pixel arranged by replacing a part of the normal pixels, color filters corresponding to the normal pixels and arranged according to a predetermined rule, a special filter arranged corresponding to the special pixel, and a special pixel color filter arranged to surround at least a part of the periphery of the special filter.
- FIG. 1 is a block diagram illustrating a configuration of an example of an electronic apparatus that may be applied to an embodiment.
- FIG. 2 is a block diagram illustrating a schematic configuration example of an imaging device that may be applied to an embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating a part of a circuit configuration of a pixel array unit that may be applied to the embodiment.
- FIG. 4 is a diagram for explaining an arrangement example of pixels.
- FIG. 5 is a timing chart schematically illustrating reading of a pixel signal by the imaging device.
- FIG. 6 is a diagram illustrating a configuration example of color filters.
- FIG. 7 is a diagram for explaining an example of a method of forming color filters.
- FIG. 8 is a diagram for explaining an example of a method of forming color filters.
- FIG. 9 is a diagram for explaining a color filter layer of the imaging device according to the embodiment.
- FIG. 10 is a diagram for explaining the color filter layer of the imaging device according to the embodiment.
- FIG. 11 is a diagram for explaining another configuration example of a color filter layer of the imaging device according to the embodiment.
- FIG. 12 is a diagram for explaining another configuration example of the color filter layer of the imaging device according to the embodiment.
- FIG. 13 is a diagram for explaining another configuration example of a color filter layer of the imaging device according to the embodiment.
- FIG. 14 is a diagram illustrating a configuration example of a color filter layer according to an existing technology.
- FIG. 15 is a schematic view illustrating a cross section of the color filter layer taken along line A-A′ in FIG. 14.
- FIG. 16 is a diagram illustrating a configuration example of the color filter layer according to the embodiment.
- FIG. 17 is a schematic view illustrating a cross section of the color filter layer taken along line B-B′ in FIG. 16.
- FIG. 18 is a diagram illustrating a configuration example of a color filter layer according to a first modification of the embodiment.
- FIG. 19 is a diagram illustrating a configuration example of a color filter layer according to a second modification of the embodiment.
- FIG. 20 is a diagram illustrating a configuration example of a color filter layer according to a third modification of the embodiment.
- FIG. 21 is a diagram illustrating another configuration example of the color filter layer according to the third modification of the embodiment.
- FIG. 22 is a diagram illustrating a configuration example of a color filter layer according to a fourth modification of the embodiment.
- FIG. 23 is a diagram illustrating another configuration example of the color filter layer according to the fourth modification of the embodiment.
- FIG. 24 is a diagram illustrating a configuration example of the color filter layer according to the fourth modification of the embodiment.
- FIG. 25 is a diagram illustrating examples of using the imaging device according to the embodiment and the modifications.
- FIG. 26 is a block diagram illustrating an example of a schematic configuration of a patient in-vivo information acquisition system using a capsule endoscope to which the technology according to the present disclosure may be applied.
- FIG. 27 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure may be applied.
- FIG. 28 is a block diagram illustrating an example of functional configurations of a camera head and a CCU.
- FIG. 29 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a moving body control system to which the technology according to the present disclosure may be applied.
- FIG. 30 is a diagram illustrating an example of an installation position of an imaging unit.
- Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in the present specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted. In addition, the size of each member in the drawings is appropriately emphasized for ease of description, and does not indicate actual dimensions and ratios between members.
- Note that the description will be given in the following order.
- 1. Introduction
- 1.1. Configuration examples applicable to embodiment
- 1.2. Accuracy decrease in existing technology
- 2. Embodiment
- 2.1. Configuration examples of color filter layer
- 2.2. Effects of embodiment
- 3. Modifications
- 3.1. First modification
- 3.2. Second modification
- 3.3. Third modification
- 3.4. Fourth modification
- 3.5. Fifth modification
- 4. Adaptation example
- 1. Introduction
- FIG. 1 is a block diagram illustrating a configuration of an example of an electronic apparatus that may be applied to an embodiment. In FIG. 1, an electronic apparatus 1D includes an optical system 2D, a control unit 3D, an imaging device 1, an image processing unit 5D, a memory 6D, a storage unit 7D, a display unit 8D, an interface (I/F) unit 9D, and an input device 10D.
- Here, as the electronic apparatus 1D, a digital still camera, a digital video camera, a mobile phone with an imaging function, a smartphone, or the like can be applied. In addition, as the electronic apparatus 1D, a monitoring camera, an in-vehicle camera, a medical camera, or the like can also be applied.
- The imaging device 1 includes, for example, a plurality of photoelectric conversion elements arranged in a matrix array. The photoelectric conversion elements convert received light into electric charge by photoelectric conversion. The imaging device 1 includes a drive circuit that drives the plurality of photoelectric conversion elements, and a signal processing circuit that reads electric charge from each of the plurality of photoelectric conversion elements and generates image data based on the read electric charge.
- The optical system 2D includes a main lens including one or a combination of a plurality of lenses and a mechanism for driving the main lens, and forms an image of image light (incident light) from a subject on a light receiving surface of the imaging device 1 via the main lens. In addition, the optical system 2D includes an autofocus mechanism that adjusts the focus in accordance with a control signal and a zoom mechanism that changes the zoom ratio in accordance with a control signal. In addition, in the electronic apparatus 1D, the optical system 2D may be detachable and may be replaced with another optical system 2D.
- The image processing unit 5D executes predetermined image processing on the image data output from the imaging device 1. For example, the image processing unit 5D is connected to the memory 6D such as a frame memory, and writes the image data output from the imaging device 1 in the memory 6D. The image processing unit 5D executes predetermined image processing on the image data written in the memory 6D, and writes the image data subjected to the image processing again in the memory 6D.
- The storage unit 7D is, for example, a non-volatile memory such as a flash memory or a hard disk drive, and stores the image data output from the image processing unit 5D in a non-volatile manner. The display unit 8D includes, for example, a display device such as a liquid crystal display (LCD) and a drive circuit that drives the display device, and can display an image based on the image data output from the image processing unit 5D. The I/F unit 9D is an interface for transmitting the image data output from the image processing unit 5D to the outside. For example, a universal serial bus (USB) can be applied as the I/F unit 9D. The present invention is not limited to this, and the I/F unit 9D may be an interface connectable to a network by wired communication or wireless communication.
- The input device 10D includes an operator for receiving a user input. If the electronic apparatus 1D is, for example, a digital still camera, a digital video camera, or a mobile phone or a smartphone with an imaging function, the input device 10D can include a shutter button for giving instructions on imaging by the imaging device 1 or an operator for realizing the function of the shutter button.
- The control unit 3D includes, for example, a processor such as a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM), and controls the entire operation of the electronic apparatus 1D using the RAM as a work memory according to the program stored in the ROM in advance. For example, the control unit 3D can control the operation of the electronic apparatus 1D according to the user input received by the input device 10D. In addition, the control unit 3D can control the autofocus mechanism in the optical system 2D based on the image processing result of the image processing unit 5D.
- FIG. 2 is a block diagram illustrating a schematic configuration example of an imaging device that may be applied to an embodiment of the present disclosure. In FIG. 2, the imaging device 1 includes a pixel array unit 11, a vertical scanning unit 12, an A/D conversion unit 13, a reference signal generation unit 14, a horizontal scanning unit 15, a pixel signal line 16, a vertical signal line 17, an output unit 18, and a control unit 19.
- The pixel array unit 11 includes a plurality of pixels arranged in a two-dimensional matrix in a horizontal direction (row direction) and a vertical direction (column direction). Each pixel includes a photoelectric conversion unit that performs photoelectric conversion on received light. The photoelectric conversion unit includes a photodiode or the like.
- In addition, to the pixel array unit 11, the pixel signal line 16 is connected for each row and the vertical signal line 17 is connected for each column. An end of the pixel signal line 16 that is not connected to the pixel array unit 11 is connected to the vertical scanning unit 12. The pixel signal line 16 transmits a control signal, such as a drive pulse used when a pixel signal is read from a pixel, from the vertical scanning unit 12 to the pixel array unit 11. An end of the vertical signal line 17 that is not connected to the pixel array unit 11 is connected to the analog to digital (A/D) conversion unit 13. The vertical signal line 17 transmits the pixel signal read from the pixel to the A/D conversion unit 13.
- Under the control of the control unit 19, the vertical scanning unit 12 supplies various signals including a drive pulse to the pixel signal line 16 corresponding to the selected pixel row of the pixel array unit 11, thereby outputting the pixel signal and the like to the vertical signal line 17. The vertical scanning unit 12 includes, for example, a shift register, an address decoder, and the like.
- The A/D conversion unit 13 includes a column A/D conversion unit 131 provided for each vertical signal line 17 and a signal processing unit 132.
- The column A/D conversion unit 131 executes counting processing for correlated double sampling (CDS) processing for performing noise reduction on the pixel signal output from the pixel via the vertical signal line 17. The column A/D conversion unit 131 includes a comparator 131a and a counter unit 131b.
- The comparator 131a compares the pixel signal input from the pixel via the vertical signal line 17 with a ramp signal RAMP supplied from the reference signal generation unit 14 in a preset phase (P-phase) period, and outputs the comparison result to the counter unit 131b. Here, the P-phase period is a period in which the reset level of the pixel signal is detected in the CDS processing. In addition, the ramp signal RAMP is, for example, a signal in which the level (voltage value) decreases at a constant slope, or a sawtooth wave signal in which the level decreases stepwise. When the level of the ramp signal RAMP is higher than the level of the pixel signal, the comparator 131a outputs a high difference signal to the counter unit 131b. When the level of the ramp signal RAMP becomes equal to or lower than the level of the pixel signal, the comparator 131a inverts the output and outputs a low difference signal to the counter unit 131b. Note that the level of the ramp signal RAMP is reset to a predetermined value after the output of the comparator 131a is inverted.
- In the P-phase period, according to the difference signal input from the comparator 131a, the counter unit 131b down-counts the time from the start of the voltage drop of the ramp signal RAMP until the level becomes equal to or lower than that of the pixel signal, and outputs the counting result to the signal processing unit 132. In addition, in the data phase (D-phase) period, according to the difference signal input from the comparator 131a, the counter unit 131b up-counts the time from the start of the voltage drop of the ramp signal RAMP until the level becomes equal to or lower than that of the pixel signal, and outputs the counting result to the signal processing unit 132. Here, the D-phase period is a detection period in which the signal level of the pixel signal is detected in the CDS processing.
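- As a rough illustration of this down-count/up-count scheme (a behavioral model, not the circuit itself), the following sketch shows how a single counter ends up holding the noise-cancelled difference: counting down during the P-phase leaves the counter at minus the reset-level ticks, and counting up during the D-phase adds the signal-level ticks. The ramp start level and quantization step are illustrative assumptions.

    def ramp_crossing_counts(level: float, ramp_start: float, step: float) -> int:
        """Number of counter ticks until the falling ramp reaches `level`."""
        ticks, ramp = 0, ramp_start
        while ramp > level:
            ramp -= step
            ticks += 1
        return ticks

    def cds_digital_value(reset_level: float, signal_level: float,
                          ramp_start: float = 1.0, step: float = 0.001) -> int:
        counter = 0
        counter -= ramp_crossing_counts(reset_level, ramp_start, step)   # P-phase: down-count
        counter += ramp_crossing_counts(signal_level, ramp_start, step)  # D-phase: up-count
        return counter  # proportional to (reset_level - signal_level)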
- The signal processing unit 132 performs CDS processing and A/D conversion processing based on the counting result of the P-phase period and the counting result of the D-phase period input from the counter unit 131b to generate digital image data, and outputs the digital image data to the output unit 18.
- The reference signal generation unit 14 generates the ramp signal RAMP based on a control signal input from the control unit 19, and outputs the generated ramp signal RAMP to the comparator 131a of the A/D conversion unit 13. The reference signal generation unit 14 includes, for example, a D/A conversion circuit or the like.
- Under the control of the control unit 19, the horizontal scanning unit 15 performs selective scanning for selecting each column A/D conversion unit 131 in a predetermined order, thereby sequentially outputting the counting results temporarily held by each column A/D conversion unit 131 to the signal processing unit 132. The horizontal scanning unit 15 includes, for example, a shift register, an address decoder, and the like.
- The output unit 18 performs predetermined signal processing on the image data input from the signal processing unit 132 and outputs the image data to the outside of the imaging device 1.
- The control unit 19 performs drive control of the vertical scanning unit 12, the A/D conversion unit 13, the reference signal generation unit 14, the horizontal scanning unit 15, and the like. The control unit 19 includes, for example, a timing generator or the like. The control unit 19 generates various drive signals serving as references for the operations of the vertical scanning unit 12, the A/D conversion unit 13, the reference signal generation unit 14, and the horizontal scanning unit 15.
- The imaging device 1 configured as described above is a column AD type complementary metal oxide semiconductor (CMOS) image sensor in which the column A/D conversion unit 131 is arranged for each column. Note that, although there is one A/D conversion unit 13 in FIG. 2, for example, two A/D conversion units 13 may be provided in the vertical direction of the pixel array unit 11, and the odd-numbered columns and the even-numbered columns of the pixel array unit 11 may be divided in the vertical direction to output pixel signals.
FIG. 3 is a diagram illustrating a part of a circuit configuration of thepixel array unit 11 that may be applied to the embodiment. - As illustrated in
FIG. 3 , thepixel array unit 11 includes a constant current source 2, a pixel 3 (hereinafter, referred to as “normal pixel 3”), and a pixel 4 (hereinafter, referred to as “special pixel 4”). In thepixel array unit 11, a plurality of normal pixels 3 and a plurality of special pixels 4 are arranged in a two-dimensional matrix in a predetermined arrangement pattern, and the special pixels 4 are arranged in a predetermined pixel row at predetermined intervals. A firsttransmission signal line 161, areset signal line 162, and a rowselection signal line 163 are connected to each normal pixel 3 as thepixel signal line 16. In addition, thereset signal line 162, the rowselection signal line 163, and a secondtransmission signal line 164 are connected to each special pixel 4 as thepixel signal line 16. - The constant current source 2 is provided in each
vertical signal line 17. The constant current source 2 includes an N-channel MOS (metal-oxide-semiconductor field-effect) transistor (hereinafter, abbreviated as “NMOS”) or the like. One end side of the constant current source 2 is grounded, and the other end side is connected to thevertical signal line 17. - The normal pixels 3 are arranged in a two-dimensional matrix on the
pixel array unit 11. The normal pixel 3 includes aphotoelectric conversion unit 31, atransfer switch 32, a floating diffusion 33 (hereinafter, abbreviated as “FD33”), areset switch 34, anamplification transistor 35, and arow selection switch 36. - The
photoelectric conversion unit 31 performs photoelectric conversion on the received light to generate signal electric charge for images. Thephotoelectric conversion unit 31 includes a PN junction photodiode or the like. Thephotoelectric conversion unit 31 has an anode terminal grounded and a cathode terminal connected to the FD33 via thetransfer switch 32. In the embodiment, thephotoelectric conversion unit 31 functions as a first photoelectric conversion unit. - The
transfer switch 32 has one end connected to thephotoelectric conversion unit 31 and the other end connected to the FD33. Further, thetransfer switch 32 is connected to the firsttransmission signal line 161. When the transfer pulse TR is supplied via the firsttransmission signal line 161, thetransfer switch 32 is turned on (closed state), and transfers the signal electric charge photoelectrically converted by thephotoelectric conversion unit 31 to the FD33. - The FD33 temporarily holds the signal electric charge transferred from the
photoelectric conversion unit 31 and converts the signal electric charge into voltage corresponding to the electric charge amount. - The
reset switch 34 has one end connected to the FD33 and the other end connected to the power source voltage. Further, thereset switch 34 is connected to thereset signal line 162. In a case where the reset pulse RST is supplied via thereset signal line 162, thereset switch 34 is turned on, and discharges the electric charge of the FD33 to the power source voltage to reset the potential of the FD33 to predetermined potential. - One end of the
amplification transistor 35 is connected to a power source voltage, and the other end is connected to therow selection switch 36. Further, the FD33 is connected to the gate end of theamplification transistor 35. Theamplification transistor 35 functions as a source follower together with the constant current source 2 connected via thevertical signal line 17. Theamplification transistor 35 outputs a reset signal (reset level) indicating a level corresponding to the potential of the FD33 reset by thereset switch 34 to thevertical signal line 17. In addition, theamplification transistor 35 outputs, to thevertical signal line 17, an image pixel signal indicating a level corresponding to the electric charge amount of the signal electric charge held in the FD33 after the signal electric charge is transferred from thephotoelectric conversion unit 31 by thetransfer switch 32. - The
- The row selection switch 36 has one end connected to the amplification transistor 35 and the other end connected to the vertical signal line 17. Further, the row selection switch 36 is connected to the row selection signal line 163. When the row selection signal SEL is supplied from the row selection signal line 163, the row selection switch 36 is turned on and outputs the reset signal or the pixel signal (first signal) output from the amplification transistor 35 to the vertical signal line 17.
- One end of the vertical signal line 17 is connected to the AD conversion unit 13. In the example of FIG. 3, the comparator 131a connected to the vertical signal line 17 to which the special pixel 4 is connected is illustrated as a comparator 131a_S.
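The drive sequence described above can be summarized in code. The following Python sketch is not taken from the specification; the class, the normalized voltage levels, and the final subtraction (a correlated double sampling step) are assumptions introduced only to show the order in which the reset pulse RST, the transfer pulse TR, and the row selection signal SEL would act on one normal pixel 3.

```python
# Illustrative sketch of the reset -> transfer -> read sequence described
# above. The class, the normalized levels, and the CDS subtraction are
# assumptions, not circuitry from the specification.

class NormalPixel:
    def __init__(self):
        self.pd_charge = 0.0   # charge in the photoelectric conversion unit 31
        self.fd_voltage = 0.0  # voltage on the floating diffusion FD33

    def expose(self, photons, gain=0.01):
        self.pd_charge += photons * gain   # idealized photoelectric conversion

    def reset(self):                       # RST via the reset signal line 162
        self.fd_voltage = 0.0              # FD33 reset to a predetermined potential

    def transfer(self):                    # TR via the first transmission signal line 161
        self.fd_voltage += self.pd_charge  # signal charge moved onto FD33
        self.pd_charge = 0.0

def read_pixel(pixel):
    """Signal level minus reset level, each placed on the vertical
    signal line 17 while SEL turns on the row selection switch 36."""
    pixel.reset()
    reset_level = pixel.fd_voltage
    pixel.transfer()
    signal_level = pixel.fd_voltage
    return signal_level - reset_level

p = NormalPixel()
p.expose(photons=500)
print(read_pixel(p))  # 5.0
```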
- The transfer switch 32, the reset switch 34, the amplification transistor 35, and the row selection switch 36 of the normal pixel 3 configured as described above include, for example, an NMOS or a P-channel MOS transistor (abbreviated as "PMOS"). In addition, the normal pixel 3 includes any one color filter of a red (R) filter, a green (G) filter, and a blue (B) filter stacked on the light receiving surface of the photoelectric conversion unit 31. The normal pixels 3 form, for example, a Bayer array on the pixel array unit 11.
- Note that the normal pixels 3 are not limited to the Bayer array and may be arranged according to a predetermined rule. For example, it is possible to use various color filter arrays as a base, such as an X-Trans (registered trademark) type color filter array having a unit pattern of 3×3 pixels, a quad-Bayer array having 4×4 pixels, and a white RGB type color filter array having 4×4 pixels that includes, in addition to the color filters of the three primary colors of RGB, a color filter (hereinafter also referred to as clear or white) having a broad light transmission characteristic with respect to the visible light region.
- In the following description, the photoelectric conversion unit 31 in which a green (G) filter is stacked on the light receiving surface will be referred to as a pixel G, the photoelectric conversion unit 31 in which a red (R) filter is stacked on the light receiving surface will be referred to as a pixel R, and the photoelectric conversion unit 31 in which a blue (B) filter is stacked on the light receiving surface will be referred to as a pixel B.
- FIG. 4 is a diagram for explaining an arrangement example of pixels. As illustrated in FIG. 4, regarding the normal pixels 3, in a unit pixel of 2×2 pixels, two pixels at diagonal positions are the pixels G, and the remaining pixels are the pixel R and the pixel B. In FIG. 4, a part of the normal pixels 3 arranged according to the Bayer array is further replaced with special pixels 4 (pixels S). More specifically, in the normal pixels 3, the horizontal line in which the pixels B and the pixels G are arranged is replaced with the special pixels 4.
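The following Python sketch renders this arrangement as a symbolic map: a Bayer base in which one B/G line is replaced with special pixels S. The pitch and position of the special rows are illustrative assumptions, not values from the specification.

```python
# Illustrative map of the FIG. 4 arrangement: Bayer base, with one B/G line
# replaced by special pixels "S". The pitch of the special rows (here 8) and
# their position are assumptions for the sake of the example.

def build_pattern(rows, cols, special_row_pitch=8):
    grid = []
    for r in range(rows):
        if r % special_row_pitch == 1:     # an odd (B/G) row, replaced with S
            grid.append(["S"] * cols)
            continue
        row = []
        for c in range(cols):
            if r % 2 == 0:                 # G/R line of the Bayer unit
                row.append("G" if c % 2 == 0 else "R")
            else:                          # B/G line of the Bayer unit
                row.append("B" if c % 2 == 0 else "G")
        grid.append(row)
    return grid

for line in build_pattern(8, 8):
    print(" ".join(line))
```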
- The special pixel 4 has a configuration similar to that of the normal pixel 3 and includes a photoelectric conversion unit 41, a transfer switch 42, a floating diffusion 43 (hereinafter simply referred to as "FD43"), a reset switch 44, an amplification transistor 45, and a row selection switch 46. The special pixel 4 includes a special filter stacked on the light receiving surface of the photoelectric conversion unit 41. In addition, in the special pixel 4, the transfer switch 42 is connected to the second transmission signal line 164, and the transfer pulse TR_S is supplied from the second transmission signal line 164. Other configurations of the special pixel 4 are similar to those of the normal pixel 3.
- The special pixel 4 is a pixel other than a pixel (normal pixel, for example, the pixel R, the pixel G, and the pixel B) for acquiring color information and luminance information in the visible light region in order to form a full-color image. Examples of the special pixel 4 include an infrared light pixel, a white pixel, a monochrome pixel, a black pixel, a polarization pixel, and an image plane phase difference pixel. In the infrared light pixel, an infrared filter capable of receiving infrared light is stacked on the light receiving surface of the photoelectric conversion unit 41. In the white pixel, a white filter capable of receiving all visible light of red, green, and blue is stacked on the light receiving surface of the photoelectric conversion unit 41. In the monochrome pixel, a transparent filter is stacked on the light receiving surface of the photoelectric conversion unit 41. In the black pixel, a light shielding filter is stacked on the light receiving surface of the photoelectric conversion unit 41. The polarization pixel is a pixel using a polarizing element for receiving polarized light.
- In the image plane phase difference pixel, an opening filter opened only in a predetermined region is stacked on the light receiving surface of the photoelectric conversion unit 41. More specifically, two image plane phase difference pixels are set as one set: a pixel in which an opening filter having an opening in, for example, the left half of the light receiving surface of the photoelectric conversion unit 41 is stacked, and a pixel in which an opening filter having an opening in the right half of the light receiving surface of another photoelectric conversion unit 41 is stacked. Distance measurement is performed based on the phase difference of the light received by these two pixels.
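The specification states only that distance measurement is based on this phase difference; it does not give a computation. As one hedged illustration, the Python sketch below estimates such a phase difference by shifting the one-dimensional signal of the left-opening pixels against that of the right-opening pixels and taking the shift with the smallest mean absolute difference. The signals, the search range, and the function name are all assumptions.

```python
# One common way a phase difference between left-opening and right-opening
# image plane phase difference pixels could be estimated: shift one signal
# against the other and pick the shift with the smallest mean absolute
# difference. The data and search range are illustrative assumptions.

def phase_difference(left, right, max_shift=4):
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        cost, n = 0.0, 0
        for i in range(len(left)):
            j = i + s
            if 0 <= j < len(right):
                cost += abs(left[i] - right[j])
                n += 1
        cost /= n
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift  # sign and magnitude would map to defocus direction/amount

left  = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0]
right = [0, 0, 0, 0, 1, 5, 9, 5, 1, 0]  # the same edge, shifted by 2 samples
print(phase_difference(left, right))     # -> 2
```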
- As described above, the pixel signal obtained by photoelectrically converting the light received by the special pixel 4 can realize a function different from that of the pixel signal obtained by photoelectrically converting the light received by the normal pixel 3. Note that, in the drawings, the special pixel 4 or the photoelectric conversion unit 41 of the special pixel 4 is represented as "S".
- Next, a method of reading the pixel signal in the above-described imaging device 1 will be described. FIG. 5 is a timing chart schematically illustrating reading of a pixel signal by the imaging device 1. In FIG. 5, the horizontal axis represents time. In addition, in FIG. 5, the output timing of the vertical synchronization pulse is illustrated in the upper part, and the output timing of the horizontal synchronization pulse in the vertical scanning unit 12 is illustrated in the middle part. FIG. 5 illustrates a case where the imaging device 1 reads pixel signals of one frame.
- As illustrated in FIG. 5, the control unit 19 first sequentially reads the pixel signals from the special pixels 4 of the pixel array unit 11 according to, for example, a vertical synchronization pulse and a horizontal synchronization pulse input from the outside of the imaging device 1. After reading the pixel signals from the special pixels 4 of all the special pixel rows, the control unit 19 sequentially reads the pixel signals from each normal pixel 3 for each row of the pixel array unit 11.
- As described above, for example, the imaging device 1 uses a reading method of first reading the pixel signals from all the special pixels 4 and then sequentially reading the pixel signals from each normal pixel 3 for each row of the pixel array unit 11.
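This reading order can be stated in a few lines of Python. The sketch below is illustrative only: the special row indices are assumed values, and the tuples stand in for the actual row-by-row driving performed by the vertical scanning unit 12.

```python
# Sketch of the FIG. 5 reading order: all special pixel rows are read first,
# then the normal rows are read one by one. The row indices are assumptions.

def frame_read_order(n_rows, special_rows):
    order = [("special", r) for r in sorted(special_rows)]
    order += [("normal", r) for r in range(n_rows) if r not in special_rows]
    return order

for kind, row in frame_read_order(8, special_rows={1, 5}):
    print(kind, row)
# special 1, special 5, then normal 0, 2, 3, 4, 6, 7
```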
- Next, a decrease in imaging accuracy of an imaging device according to the existing technology will be described with reference to FIGS. 6 to 8. FIG. 6 is a diagram illustrating a configuration example of a color filter layer. FIGS. 7 and 8 are diagrams for explaining an example of a method of forming color filters.
- Note that the green color filter formed corresponding to the pixel G is referred to as a green filter GF. The red color filter formed corresponding to the pixel R is referred to as a red filter RF. The blue color filter formed corresponding to the pixel B is referred to as a blue filter BF. When the green filter GF, the red filter RF, and the blue filter BF are not distinguished, they are also simply referred to as a color filter CF. In addition, the special filter formed corresponding to the pixel S (special pixel 4) is referred to as a special filter SF.
- In the following description, the color filter CF and the special filter SF are provided for each of the normal pixel 3 and the special pixel 4. That is, one color filter CF or special filter SF is provided in each pixel. However, for example, two or more adjacent special filters SF may be integrally formed, and the same type of adjacent filters may be integrally formed.
- As described above, since the normal pixels 3 are arranged according to the Bayer array, the color filters CF formed in a color filter layer CL0 are, as illustrated in FIG. 6, also arranged according to the Bayer array similarly to the normal pixels 3. In addition, the special filters SF are arranged by replacing a part (for example, a pixel row) of the color filters CF arranged according to the Bayer array.
- Note that the special filter SF is a filter formed according to the function of the special pixel 4, and examples include the above-described white filter, transparent filter, and opening filter opened only in a predetermined region.
- In this manner, it is assumed that the color filter layer in which the color filters CF and the special filters SF are arranged is formed in the order of, for example, the green filter GF, the red filter RF, and the blue filter BF. In this case, as illustrated in FIG. 7, first, the green filter GF is formed on the pixel array unit 11.
- Here, the color filter CF has a problem in that it is difficult to form, at a right angle, a corner portion that is not in contact with another color filter CF. Therefore, as illustrated in FIG. 7, since the corner portion of the green filter GF adjacent to the region where the special filter SF is to be formed is an end that is not in contact with another green filter GF, the corner portion does not form a right angle and is rounded. Note that FIG. 7 is an enlarged view of a part of the color filter layer CL0 illustrated in FIG. 6, and a boundary line of pixels is indicated by a dotted line.
- Next, when the red filter RF is formed after the green filter GF is formed, as illustrated in FIG. 8, a part of the red filter RF adjacent to the region where the special filter SF is to be formed protrudes into a region where the green filter GF is originally to be formed. Note that FIG. 8 is an enlarged view of a part of the color filter layer CL0 illustrated in FIG. 6, and a boundary line of pixels is indicated by a dotted line.
- As described above, since a part of the red filter RF is formed in the region where the green filter GF is to be formed, there is a possibility that the light receiving sensitivity of the pixel G fluctuates and the imaging accuracy of the imaging device 1 decreases.
- In addition, in a red filter formation region where the special filter SF is not formed, the red filter RF is formed in a region surrounded by the green filter GF on all four sides. On the other hand, in a red filter formation region adjacent to the special filter SF, the red filter RF is formed in a region surrounded by the green filter GF on only three sides. Therefore, as illustrated in FIG. 8, the red filter RF is formed to protrude into the formation region of the special filter SF, and the sensitivity of the pixel R may fluctuate. Since the red filter RF protrudes into the formation region of the special filter SF in this manner, there is a possibility that the film thickness of the red filter RF becomes uneven.
- As described above, when the film thickness of the red filter RF becomes uneven, the light receiving sensitivity of the pixel R varies, and the imaging accuracy of the imaging device 1 may decrease.
- Therefore, in the imaging device 1 according to the embodiment, the color filter CF is formed in the region where the special filter SF is formed so as to surround at least a part of the periphery of the special filter SF, which suppresses a decrease in the imaging accuracy of the imaging device 1. Such points will be described below.
- FIGS. 9 and 10 are diagrams for explaining the color filter layer of the imaging device 1 according to the embodiment. The color filter layer CL1 illustrated in FIG. 9 has the same configuration as the color filter layer CL0 illustrated in FIG. 6 except that the periphery of some special filters SF is surrounded by a color filter (hereinafter also referred to as a special pixel color filter SCF). Note that, although FIG. 9 illustrates a case where the special filters SF are arranged in one pixel row since only a part of the color filter layer CL1 is illustrated, the special filters SF may be arranged in a plurality of pixel rows at predetermined intervals.
- The color filter layer CL1 includes a special pixel color filter SCF surrounding the periphery of the special filter SF in plan view. Although FIG. 9 illustrates a case where the special pixel color filter SCF is a green filter (hereinafter referred to as a special pixel green filter SGF) that passes green light, the special pixel color filter SCF may be a red filter or a blue filter.
- As illustrated in FIG. 9, a special filter surrounded by the special pixel green filter SGF (hereinafter also referred to as a first special filter SF1) and a special filter not surrounded by the special pixel green filter SGF (hereinafter also referred to as a second special filter SF2) are alternately arranged for each pixel to form one pixel row.
- FIG. 10 is a diagram illustrating the green filters GF and the special pixel green filters SGF among the filters formed in the color filter layer CL1 illustrated in FIG. 9. As illustrated in FIG. 10, the special pixel green filter SGF is arranged to surround the periphery of the special filter SF that is adjacent to a color filter CF other than the green filter GF (the red filter RF in FIG. 9).
- Here, in FIGS. 9 and 10, an example in which the special pixels 4 and the special filters SF are arranged in a predetermined pixel row has been described, but the present invention is not limited to this. For example, as illustrated in FIGS. 11 and 12, the special pixels 4 and the special filters SF may be arranged in a predetermined pixel column. FIGS. 11 and 12 are diagrams for explaining another configuration example of a color filter layer of the imaging device 1 according to the embodiment.
- As illustrated in FIG. 11, the special filters SF of a color filter layer CL2 are arranged in one pixel column. In addition, the first special filter SF1 adjacent to the red filter RF is surrounded by the special pixel green filter SGF. On the other hand, the second special filter SF2 adjacent to the green filter GF is not surrounded by the special pixel green filter SGF. Note that, although FIG. 11 illustrates a case where the special filters SF are arranged in one pixel column since only a part of the color filter layer CL2 is illustrated, the special filters SF may be arranged in a plurality of pixel columns at predetermined intervals.
- FIG. 12 is a diagram illustrating the green filters GF and the special pixel green filters SGF among the filters formed in the color filter layer CL2 illustrated in FIG. 11. As illustrated in FIG. 12, the special pixel green filter SGF is arranged to surround the periphery of the special filter SF that is adjacent to a color filter CF other than the green filter GF (the red filter RF in FIG. 11).
- Alternatively, as illustrated in FIG. 13, the special pixels 4 and the special filters SF may be arranged in both a pixel row and a pixel column. FIG. 13 is a diagram for explaining another configuration example of a color filter layer of the imaging device 1 according to the embodiment. FIG. 13 illustrates the green filters GF and the special pixel green filters SGF among the filters formed in the color filter layer CL3. Note that, although FIG. 13 illustrates a case where the special filters SF are arranged in one pixel row and one pixel column since only a part of the color filter layer is illustrated, the special filters SF may be arranged in a plurality of pixel rows or a plurality of pixel columns at predetermined intervals.
- As illustrated in FIGS. 9 to 13, the special pixel green filters SGF are arranged in the regions where the pixels G were arranged in the Bayer array before the replacement with the special pixels 4. In other words, the first special filter SF1 is formed inside while leaving the periphery of the green filter GF. Therefore, the corner portion of the green filter GF is arranged in contact with the corner portion of another green filter GF (including the special pixel green filter SGF).
- As a result, the green filter GF has a corner portion in contact with another green filter even in the region where the special filter SF is formed, and an end that is not in contact with another green filter is not formed. Therefore, the corner portion of the green filter GF adjacent to the region where the special filter SF is formed can be formed in a desired shape, and the adjacent red filter RF can be formed without protruding into the region where the green filter GF is formed. Therefore, the color filter CF can be formed in a desired shape, a decrease and variation in the light receiving sensitivity of each pixel can be suppressed, and a decrease in the imaging accuracy of the imaging device 1 can be suppressed.
- In addition, by forming the special pixel green filter SGF, the four sides of the red filter RF adjacent to the formation region of the special filter SF can also be surrounded by the green filter GF including the special pixel green filter SGF, similarly to the other red filters RF. As a result, similarly to the other red filters RF, the red filter RF adjacent to the formation region of the special filter SF can also be made less likely to be formed with an uneven film thickness. Such points will be described with reference to FIGS. 14 to 17.
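The condition that the special pixel green filter SGF is introduced to restore can be stated as a simple check: every red or blue filter, including those adjacent to a special filter region, should have green on all four sides. The Python sketch below runs that check on a made-up miniature filter map; the map and the function are illustrative assumptions, not data from the figures.

```python
# Illustrative check of the design rule discussed above: every red or blue
# filter should be surrounded on all four sides by green (an ordinary GF or
# the special pixel green filter SGF, both written "G" here). The map is a
# made-up miniature, not a figure from the specification.

def four_sides_green(cfa, r, c):
    rows, cols = len(cfa), len(cfa[0])
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols and cfa[rr][cc] != "G":
            return False
    return True

cfa = [
    ["G", "R", "G", "R"],
    ["B", "G", "B", "G"],
    ["G", "S", "G", "S"],   # special row: S regions separated by SGF ("G")
    ["B", "G", "B", "G"],
]
violations = [(r, c) for r, row in enumerate(cfa) for c, f in enumerate(row)
              if f in "RB" and not four_sides_green(cfa, r, c)]
print(violations)  # empty list: every R/B filter has green walls on four sides
```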
- FIG. 14 is a diagram illustrating a configuration example of a color filter layer according to the existing technology. FIG. 15 is a schematic view illustrating a cross section of the color filter layer taken along line A-A′ in FIG. 14. FIG. 16 is a diagram illustrating another configuration example of the color filter layer according to the existing technology. FIG. 17 is a schematic view illustrating a cross section of the color filter layer taken along line B-B′ in FIG. 16.
- The color filter layer CL0 illustrated in FIGS. 14 and 16 does not include the special pixel color filter SCF, and the red filters RF and the blue filters BF are arranged in contact with the special filters SF. As illustrated in FIGS. 14 and 16, the special filters SF may be arranged in both the pixel rows and the pixel columns at predetermined intervals. FIGS. 14 and 16 illustrate a part of the color filter layer CL0.
- As illustrated in FIG. 14, the red filter RF and the blue filter BF adjacent to the special filter SF are surrounded by the green filter GF on three sides, and the remaining side is open without being surrounded by the green filter GF.
- As described above, it is assumed that the color filters CF are formed in the order of, for example, green, red, and blue. In this case, as illustrated in FIGS. 10 and 12, first, the green filters GF and the special pixel green filters SGF are formed, and then the red filters RF and the blue filters BF are formed.
- For example, in the case of forming the blue filter BF, the green filter GF serves as a wall on the three surrounded sides, but since there is no green filter GF serving as a wall on the one open side, the blue filter BF easily flows to the open side.
- Therefore, as illustrated in FIG. 15, the film thickness of the blue filter BF on the special filter SF side becomes thinner than the film thickness on the green filter GF side, and the film thickness becomes uneven.
- Similarly, in the color filter layer CL0 illustrated in FIG. 16, the red filter RF and the blue filter BF adjacent to the special filter SF are surrounded by the green filter GF on three sides, and the remaining side is open without being surrounded by the green filter GF.
- In this case as well, the red filters RF and the blue filters BF are formed after the green filters GF. For example, in the case of forming the red filter RF, the green filter GF serves as a wall on the three surrounded sides, but since there is no green filter GF serving as a wall on the one open side, the red filter RF easily flows to the open side.
- Therefore, as illustrated in FIG. 17, the film thickness of the red filter RF on the special filter SF side becomes thinner than the film thickness on the green filter GF side, and the film thickness becomes uneven.
- On the other hand, in the color filter layer according to the present embodiment, the special pixel green filter SGF is formed to surround the periphery of the special pixel region. Therefore, the red filter RF and the blue filter BF are formed in a region surrounded on all four sides by the green filter GF including the special pixel green filter SGF.
- For example, in the case of forming the blue filter BF, the blue filter BF is formed by using the green filters GF on the four sides as walls. Therefore, the difference between the film thickness of the blue filter BF on the special filter SF side and that on the green filter GF side becomes small, and the film can be formed evenly as compared with FIG. 15.
- Note that, here, the suppression of the unevenness of the film thickness has been described using the blue filter BF as an example, but the same effect can be obtained for the red filter RF. In addition, the unevenness of the film thickness of the special filter SF can also be improved.
- For example, in the examples of FIGS. 14 and 16, the special filters SF are arranged corresponding to a pixel row (also referred to as a special pixel row) and a pixel column (also referred to as a special pixel column). Here, if the special pixel color filter SCF is not formed, the film thickness of the special filter SF at the central portion of the special pixel row becomes thinner than the film thickness on both end sides. This is because the special filter SF flows to both end sides of the special pixel row (or the special pixel column) due to the centrifugal force of the coater that forms the special filter SF.
- As described above, when the film thickness of the special filter SF becomes uneven, the sensitivity of the special pixel 4 fluctuates, and the function corresponding to the special pixel 4 degrades. For example, it is assumed that the special pixel 4 is an image plane phase difference pixel and the imaging device 1 realizes an autofocus function using the special pixels 4. In this case, when the sensitivity of the special pixel 4 fluctuates, the imaging device 1 cannot appropriately adjust the focus, and the autofocus function may degrade.
- On the other hand, as illustrated in FIGS. 9 to 13, when the special pixel color filter SCF is formed in the special pixel row (or the special pixel column), the special filter SF is divided for each pixel by the special pixel color filter SCF. Therefore, the special pixel color filter SCF serves as a wall, and the special filter SF hardly flows to both end sides of the special pixel row (or the special pixel column), so that the unevenness of the film thickness of the special filter SF can be improved. As a result, it is possible to suppress functional degradation of the imaging device 1.
- In the above-described embodiment, the case where the special pixel green filter SGF is formed to surround the periphery of the special filter SF has been described, but the present invention is not limited to this. The special pixel green filter SGF may be formed to surround at least a part of a special pixel region SR including at least one special pixel 4. Such points will be described with reference to FIG. 18.
- FIG. 18 is a diagram illustrating a configuration example of a color filter layer according to a first modification of the embodiment. Note that FIG. 18 illustrates a part of the color filter layer of the imaging device 1.
- In the color filter layer CL4 illustrated in FIG. 18, the special pixel green filter SGF is formed on at least a part (the side in contact with the color filter CF) of the periphery of the special filter SF.
- As a result, the corner portion of the green filter GF comes into contact with another green filter GF or the special pixel green filter SGF, and the color filter CF can be formed in a desired shape. In addition, the unevenness of the film thickness of the color filter CF can be improved.
- Note that, in the example illustrated in FIG. 18, the special filter SF is not divided by the special pixel green filter SGF. Therefore, for example, when evenness of the film thickness of the special filter SF is required, it is desirable to form the special pixel green filter SGF to surround the periphery of the special filter SF as in the color filter layers CL1 to CL3 of the embodiment. As a result, the special filter SF can be divided by the special pixel green filter SGF, and the unevenness of the film thickness of the special filter SF can be improved.
- Next, a second modification of the embodiment will be described with reference to FIG. 19. FIG. 19 is a diagram illustrating a configuration example of a color filter layer CL5 according to the second modification of the embodiment. Note that FIG. 19 illustrates a part of the color filter layer CL5 of the imaging device 1.
- As described above, in a case where the special pixel 4 is an image plane phase difference pixel, the special filter SF corresponding to the special pixel 4 is an opening filter opened only in a predetermined region; however, not all the special filters SF are necessarily opening filters. For example, as illustrated in FIG. 19, some of the special filters SF may be opening filters, and the other special filters SF may be color filters that pass a predetermined color. In this case, the color filter formed in the opening of the opening filter may be a color filter that passes the same color as the other special filters SF (hereinafter also referred to as a special color filter SCF).
- In the example of FIG. 19, opening filters OF and special color filters SCF are alternately arranged. The special color filter SCF is, for example, a color filter that passes cyan (hereinafter also referred to as a cyan filter). Since cyan yields higher sensitivity in the photoelectric conversion unit 41 than RGB, the sensitivity of the special pixel 4 can be improved by using the cyan filter as the special filter SF. Note that the special color filter SCF may be a white filter that passes white light instead of a cyan filter. Since the white filter yields higher sensitivity in the photoelectric conversion unit 41 than the cyan filter, the sensitivity of the special pixel 4 can be further improved.
- The special pixel green filter SGF is arranged to surround the periphery of the special color filter SCF. As a result, effects similar to those of the embodiment can be obtained. In addition, in a special pixel region where the special pixel green filter SGF is formed, the formation region of the special filter SF becomes small. Therefore, if the opening filter OF were formed in a special pixel region where the special pixel green filter SGF is formed, the sensitivity of the image plane phase difference pixel might decrease. Accordingly, by forming the opening filter OF in the special pixel regions where the special pixel green filter SGF is not formed, it is possible to suppress a decrease in the sensitivity of the image plane phase difference pixel.
- Note that the pixel signal obtained by photoelectrically converting the light received by the special pixel 4 corresponding to the special color filter SCF may be used, for example, for correction of a captured image. When a part of the normal pixels 3 is replaced with the special pixels 4, the number of normal pixels 3 decreases, and the image quality of the captured image deteriorates. By correcting the captured image using the special pixels 4 corresponding to the special color filters SCF, it is possible to suppress deterioration in the image quality of the captured image while realizing functions such as autofocus.
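The specification does not state how this correction would be computed. As one hedged possibility, the Python sketch below fills a replaced row by blending a vertical interpolation of the neighboring normal rows with the luminance actually measured at the special pixels under the SCF; the interpolation, the blend weight, and all names are assumptions.

```python
# A deliberately simple sketch of the correction idea: the pixel value for a
# replaced (special) row is estimated from the rows above and below, nudged
# by the luminance measured by the special pixel under the SCF. The 1/2
# interpolation and the 0.25 blend weight are arbitrary assumptions.

def correct_special_row(above, below, special_luma, weight=0.25):
    corrected = []
    for a, b, s in zip(above, below, special_luma):
        interp = (a + b) / 2.0            # vertical interpolation of normal rows
        corrected.append((1 - weight) * interp + weight * s)
    return corrected

above        = [100, 104, 98, 102]
below        = [ 96, 100, 94,  98]
special_luma = [ 99, 103, 97, 101]   # measured through the cyan/white SCF
print(correct_special_row(above, below, special_luma))
```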
- Note that although FIG. 19 illustrates the opening filters OF that shield light on the left side of the special pixels 4, opening filters OF that shield light on the right side are also formed in the color filter layer CL5, although they are not illustrated. In addition, the arrangement of the opening filters OF illustrated in FIG. 19 is an example, and the present invention is not limited to this. For example, the opening filter OF that shields light on the left side of the special pixel 4 and the opening filter OF that shields light on the right side may be formed in the same pixel row.
- In the above-described embodiment, the first special filters SF1 surrounded by the special pixel green filter SGF and the second special filters SF2 not surrounded by the special pixel green filter SGF are alternately arranged, but the present invention is not limited to this. For example, as illustrated in FIG. 20, all the special filters SF may be surrounded by the special pixel green filter SGF. Note that FIG. 20 is a diagram illustrating a configuration example of a color filter layer CL6 according to a third modification of the embodiment and illustrates a part of the color filter layer CL6 of the imaging device 1.
- As illustrated in FIG. 20, even if all the special filters SF are surrounded by the special pixel green filter SGF, the same effects as those of the embodiment can be obtained. Note that although the area of each special filter SF is reduced, all the special filters SF have substantially the same area. Therefore, by surrounding all the special filters SF with the special pixel green filter SGF, the sensitivity of the special pixels 4 can be made uniform.
- Note that, although FIG. 20 illustrates an example in which the special filters SF are arranged in a pixel row of the color filter layer CL6, as illustrated in FIG. 21, the special filters SF may also be arranged in a pixel column in addition to the pixel row of the color filter layer CL6. Alternatively, the special filters SF may be arranged only in a pixel column of the color filter layer CL6. Note that FIG. 21 is a diagram illustrating another configuration example of the color filter layer CL6 according to the third modification of the embodiment and illustrates a part of the color filter layer CL6 of the imaging device 1.
- In the above-described embodiment, the special pixel green filter SGF is arranged to surround the periphery of the special filter SF corresponding to one special pixel 4, but the present invention is not limited to this. For example, the special pixel green filter SGF may be arranged to surround the periphery of the special filters SF corresponding to two or more special pixels 4. In other words, in a color filter layer CL7 according to a fourth modification, the periphery of a special filter region including two or more special filters SF in plan view may be surrounded by the special pixel green filter SGF.
- FIG. 22 is a diagram illustrating a configuration example of the color filter layer CL7 according to the fourth modification of the embodiment. FIG. 22 illustrates a part of the color filter layer CL7 of the imaging device 1.
- As illustrated in FIG. 22, in the color filter layer CL7, the periphery of a special filter region including two special filters SF in plan view in the special pixel row is surrounded by the special pixel green filter SGF.
- Note that, although FIG. 22 illustrates an example in which the special filters SF are arranged in a pixel row of the color filter layer CL7, as illustrated in FIG. 23, the special filters SF may also be arranged in a pixel column in addition to the pixel row of the color filter layer CL7. Alternatively, the special filters SF may be arranged only in a pixel column of the color filter layer CL7. Note that FIG. 23 is a diagram illustrating another configuration example of the color filter layer CL7 according to the fourth modification of the embodiment and illustrates a part of the color filter layer CL7 of the imaging device 1.
- In the above-described embodiment, the case where the special filters SF are arranged in a pixel row or a pixel column of the color filter layer has been described, but the present invention is not limited to this. For example, as illustrated in FIG. 24, a part of the color filters CF may be replaced with the special filters SF. FIG. 24 is a diagram illustrating a configuration example of a color filter layer CL8 according to a fifth modification of the embodiment. FIG. 24 illustrates a part of the color filter layer CL8 of the imaging device 1.
- In the color filter layer CL8 illustrated in FIG. 24, the green filters GF are replaced with the special filters SF at predetermined intervals, and the special pixel green filter SGF is arranged to surround the periphery of each special filter SF. In this manner, the normal pixels 3 may be replaced with the special pixels 4 at predetermined intervals.
- Next, application examples of the imaging device 1 according to the embodiment and the modifications will be described. FIG. 25 is a diagram illustrating examples of using the imaging device 1 according to the embodiment and the modifications described above.
- Each imaging device 1 described above can be used, for example, in various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays, as described below.
- A device that captures images used for viewing, such as a digital camera or a portable device with a camera function.
- A device used for traffic, such as an in-vehicle sensor that captures images of the front, rear, periphery, inside, and the like of an automobile for safe driving such as automatic stop, recognition of a driver's condition, and the like, a monitoring camera that monitors traveling vehicles and roads, and a distance measuring sensor that measures a distance between vehicles and the like.
- A device used for home appliances such as a TV, a refrigerator, and an air conditioner in order to capture an image of a gesture of a user and operate the device according to the gesture.
- A device used for medical care or health care, such as an endoscope or a device that performs angiography by receiving infrared light.
- A device used for security, such as a monitoring camera for crime prevention or a camera for person authentication.
- A device used for beauty care, such as a skin measuring instrument for capturing images of skin or a microscope for capturing images of a scalp.
- A device used for sports, such as an action camera or a wearable camera.
- A device used for agriculture, such as a camera for monitoring conditions of fields and crops.
- The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
-
FIG. 26 is a block diagram illustrating an example of a schematic configuration of a patient in-vivo information acquisition system using a capsule endoscope to which the technology according to the present disclosure (present technology) may be applied. - An in-vivo
information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200.
- The capsule endoscope 10100 is swallowed by the patient at the time of examination. The capsule endoscope 10100 has an imaging function and a wireless communication function; it sequentially captures images of the inside of organs such as the stomach and the intestines (hereinafter also referred to as in-vivo images) at predetermined intervals while moving inside the organs by peristaltic movement or the like until being naturally discharged from the patient, and sequentially transmits information regarding the in-vivo images wirelessly to the external control device 10200 outside the body.
- The external control device 10200 integrally controls the operation of the in-vivo information acquisition system 10001. In addition, the external control device 10200 receives the information regarding the in-vivo images transmitted from the capsule endoscope 10100 and, based on the received information, generates image data for displaying the in-vivo images on a display device (not illustrated).
- In this manner, the in-vivo information acquisition system 10001 can obtain an in-vivo image of the state of the inside of the patient's body at any time from when the capsule endoscope 10100 is swallowed until it is discharged.
- Configurations and functions of the capsule endoscope 10100 and the external control device 10200 will be described in more detail.
- The capsule endoscope 10100 includes a capsule-type casing 10101, in which a light source unit 10111, an imaging unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power source unit 10116, and a control unit 10117 are housed.
- The light source unit 10111 includes a light source such as a light emitting diode (LED), for example, and irradiates the imaging field of view of the imaging unit 10112 with light.
- The imaging unit 10112 includes an imaging device and an optical system including a plurality of lenses provided in front of the imaging device. Reflected light (hereinafter referred to as observation light) of the light with which a body tissue to be observed is irradiated is condensed by the optical system and enters the imaging device. In the imaging unit 10112, the observation light that has entered the imaging device is photoelectrically converted, and an image signal corresponding to the observation light is generated. The image signal generated by the imaging unit 10112 is provided to the image processing unit 10113.
- The image processing unit 10113 includes a processor such as a CPU or a graphics processing unit (GPU), and performs various types of signal processing on the image signal generated by the imaging unit 10112. The image processing unit 10113 provides the image signal subjected to the signal processing to the wireless communication unit 10114 as RAW data.
- The wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal subjected to the signal processing by the image processing unit 10113, and transmits the image signal to the external control device 10200 via an antenna 10114A. In addition, the wireless communication unit 10114 receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A. The wireless communication unit 10114 provides the control unit 10117 with the control signal received from the external control device 10200.
- The power feeding unit 10115 includes a power receiving antenna coil, a power regeneration circuit that regenerates power from current generated in the antenna coil, a booster circuit, and the like. In the power feeding unit 10115, power is generated using a so-called non-contact charging principle.
- The power source unit 10116 includes a secondary battery and stores the electric power generated by the power feeding unit 10115. In FIG. 26, in order to avoid complicating the drawing, illustration of arrows or the like indicating the destinations of power feeding from the power source unit 10116 is omitted; the power stored in the power source unit 10116 is supplied to the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117 and may be used for driving these units.
- The control unit 10117 includes a processor such as a CPU, and appropriately controls driving of the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 according to a control signal transmitted from the external control device 10200.
- The external control device 10200 includes a processor such as a CPU or a GPU, or a microcomputer, a control board, or the like on which a processor and a storage element such as a memory are mixedly mounted. The external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via an antenna 10200A. In the capsule endoscope 10100, for example, the light irradiation condition for the observation target in the light source unit 10111 may be changed by the control signal from the external control device 10200. In addition, the imaging conditions (for example, the frame rate, the exposure value, and the like in the imaging unit 10112) may be changed by the control signal from the external control device 10200. In addition, the contents of the processing in the image processing unit 10113 and the conditions (for example, the transmission interval, the number of transmitted images, and the like) under which the wireless communication unit 10114 transmits the image signal may be changed by the control signal from the external control device 10200.
- In addition, the external control device 10200 performs various types of image processing on the image signal transmitted from the capsule endoscope 10100, and generates image data for displaying the captured in-vivo image on the display device. As the image processing, for example, various types of signal processing such as development processing (demosaic processing), high image quality processing (band emphasis processing, super-resolution processing, noise reduction processing, camera shake correction processing, and the like), and enlargement processing (electronic zoom processing) can be performed alone or in combination. The external control device 10200 controls driving of the display device to display the captured in-vivo image based on the generated image data. Alternatively, the external control device 10200 may cause a recording device (not illustrated) to record the generated image data or cause a printing device (not illustrated) to print out the generated image data.
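Of the processing listed above, development processing (demosaic processing) is the step that turns the single-color-per-pixel RAW data into a full-color image. The following Python sketch shows a minimal bilinear demosaic for a Bayer mosaic; it is a toy illustration under an assumed 2×2 unit pattern, not the processing actually implemented in the external control device 10200.

```python
# Minimal bilinear demosaic sketch: each missing color at a pixel is the
# average of the nearest pixels of that color. Toy-sized and unoptimized;
# the G/R/B/G unit pattern below is an assumption.

def neighbors_avg(mosaic, pattern, r, c, color):
    rows, cols = len(mosaic), len(mosaic[0])
    vals = [mosaic[rr][cc]
            for rr in range(max(r - 1, 0), min(r + 2, rows))
            for cc in range(max(c - 1, 0), min(c + 2, cols))
            if pattern[rr % 2][cc % 2] == color]
    return sum(vals) / len(vals)

def demosaic(mosaic):
    pattern = [["G", "R"], ["B", "G"]]  # assumed Bayer unit pattern
    rgb = []
    for r, row in enumerate(mosaic):
        out_row = []
        for c, v in enumerate(row):
            out_row.append(tuple(
                v if pattern[r % 2][c % 2] == col
                else neighbors_avg(mosaic, pattern, r, c, col)
                for col in ("R", "G", "B")))
        rgb.append(out_row)
    return rgb

mosaic = [[10, 200, 12, 210],
          [50, 14, 60, 16],
          [11, 220, 13, 230],
          [55, 15, 65, 17]]
print(demosaic(mosaic)[1][1])  # interpolated (R, G, B) at one pixel
```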
- An example of the in-vivo information acquisition system to which the technology according to the present disclosure may be applied has been described above. The technology according to the present disclosure may be applied to, for example, the imaging unit 10112 among the above-described configurations. By applying the imaging device 1 according to the present disclosure to the imaging unit 10112, favorable autofocus can be performed even in a case where zooming or the like is performed, and a higher-quality in-vivo image or the like can be acquired.
- The technology according to the present disclosure may be further applied to an endoscopic surgery system. FIG. 27 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (present technology) may be applied.
- FIG. 27 illustrates a state in which an operator (doctor) 11131 is performing surgery on a patient 11132 on a patient bed 11133 using an endoscopic surgery system 11000. As illustrated, the endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
- The endoscope 11100 includes a lens barrel 11101, a region of a predetermined length from the distal end of which is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having the rigid lens barrel 11101, but the endoscope 11100 may be configured as a so-called flexible scope having a flexible lens barrel.
- An opening portion into which an objective lens is fitted is provided at the distal end of the lens barrel 11101. A light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101 and is emitted toward an observation target in the body cavity of the patient 11132 via the objective lens. Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an imaging device are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed on the imaging device by the optical system. The observation light is photoelectrically converted by the imaging device, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated. The image signal is transmitted to a camera control unit (CCU) 11201 as RAW data.
- The CCU 11201 includes a CPU, a GPU, and the like, and integrally controls the operations of the endoscope 11100 and a display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs, on the image signal, various types of image processing for displaying an image based on the image signal, such as development processing (demosaic processing).
- The display device 11202 displays an image based on the image signal subjected to the image processing by the CCU 11201, under the control of the CCU 11201.
- The light source device 11203 includes a light source such as a light emitting diode (LED), for example, and supplies the endoscope 11100 with irradiation light for capturing an image of a surgical site or the like.
- An input device 11204 is an input interface for the endoscopic surgery system 11000. The user can input various types of information and instructions to the endoscopic surgery system 11000 via the input device 11204. For example, the user inputs an instruction or the like to change the imaging conditions (type of irradiation light, magnification, focal length, and the like) of the endoscope 11100.
- A treatment tool control device 11205 controls driving of the energy treatment tool 11112 for cauterization and incision of tissue, sealing of a blood vessel, or the like. A pneumoperitoneum device 11206 feeds gas into the body cavity of the patient 11132 via the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing a visual field for the endoscope 11100 and securing a working space for the operator. A recorder 11207 is a device capable of recording various types of information regarding the surgery. A printer 11208 is a device capable of printing various types of information regarding the surgery in various formats such as text, images, or graphs.
- Note that the light source device 11203 that supplies the endoscope 11100 with the irradiation light at the time of capturing an image of the surgical site can include, for example, an LED, a laser light source, or a white light source including a combination thereof. In a case where the white light source includes a combination of RGB laser light sources, since the output intensity and the output timing of each color (each wavelength) can be controlled with high accuracy, the white balance of the captured image can be adjusted in the light source device 11203.
- In addition, the driving of the light source device 11203 may be controlled to change the intensity of the light to be output every predetermined time. By controlling the driving of the imaging device of the camera head 11102 in synchronization with the timing of the change of the light intensity to acquire images in a time-division manner and synthesizing the images, it is possible to generate an image of a high dynamic range without so-called blocked-up shadows and blown-out highlights.
- In addition, the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation. In the special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in a mucosal surface layer is imaged with high contrast by emitting light in a band narrower than that of the irradiation light (that is, white light) at the time of normal observation, using the wavelength dependency of light absorption in body tissue. Alternatively, in the special light observation, fluorescence observation for obtaining an image by fluorescence generated by irradiation with excitation light may be performed. In the fluorescence observation, it is possible, for example, to irradiate body tissue with excitation light and observe fluorescence from the body tissue (autofluorescence observation), or to locally inject a reagent such as indocyanine green (ICG) into body tissue and irradiate the body tissue with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescent image. The light source device 11203 may be configured to be able to supply narrowband light and/or excitation light corresponding to such special light observation.
- FIG. 28 is a block diagram illustrating an example of functional configurations of the camera head 11102 and the CCU 11201 illustrated in FIG. 27.
- The camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.
- The lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken in from the distal end of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401. The lens unit 11401 includes a combination of a plurality of lenses including a zoom lens and a focus lens.
- The imaging unit 11402 includes an imaging device. The number of imaging devices forming the imaging unit 11402 may be one (a so-called single-plate type) or plural (a so-called multi-plate type). In a case where the imaging unit 11402 is of the multi-plate type, for example, image signals corresponding to RGB may be generated by the respective imaging devices, and a color image may be obtained by combining the image signals. Alternatively, the imaging unit 11402 may include a pair of imaging devices for acquiring right-eye and left-eye image signals corresponding to three-dimensional (3D) display. By performing 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site. Note that, in a case where the imaging unit 11402 is of the multi-plate type, a plurality of systems of lens units 11401 may be provided corresponding to the imaging devices.
- In addition, the imaging unit 11402 is not necessarily provided in the camera head 11102. For example, the imaging unit 11402 may be provided immediately behind the objective lens inside the lens barrel 11101.
- The drive unit 11403 includes an actuator and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. As a result, the magnification and focus of the image captured by the imaging unit 11402 can be appropriately adjusted.
- The communication unit 11404 includes a communication device for transmitting and receiving various types of information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
- In addition, the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405. The control signal includes, for example, information regarding imaging conditions, such as information for specifying the frame rate of the captured image, information for specifying the exposure value at the time of imaging, and/or information for specifying the magnification and focus of the captured image.
- Note that the imaging conditions such as the frame rate, the exposure value, the magnification, and the focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, a so-called auto exposure (AE) function, an auto focus (AF) function, and an auto white balance (AWB) function are installed in the endoscope 11100.
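As a hedged illustration of the latter case, the Python sketch below shows how a controller might derive the next exposure setting from the acquired image signal (the AE function). The target level, the gain constant, and the simulated scene response are assumptions, not values from the specification.

```python
# Minimal sketch of deriving the next exposure value from the acquired
# image signal (AE). Target level and gain constant are assumptions.

def next_exposure(current_exposure, frame, target_mean=0.45, k=0.8):
    mean = sum(frame) / len(frame)   # measured brightness, normalized 0..1
    if mean <= 0:
        return current_exposure * 2  # fully dark frame: open up quickly
    # proportional correction, damped by k to avoid oscillation
    return current_exposure * (target_mean / mean) ** k

exposure = 1.0  # arbitrary units
for _ in range(5):
    frame = [min(1.0, 0.2 * exposure)] * 100  # fake linear scene response
    exposure = next_exposure(exposure, frame)
print(round(exposure, 3))  # converges toward the exposure giving ~0.45 mean
```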
- The camera head control unit 11405 controls driving of the camera head 11102 based on the control signal received from the CCU 11201 via the communication unit 11404.
- The communication unit 11411 includes a communication device for transmitting and receiving various types of information to and from the camera head 11102. The communication unit 11411 receives the image signal transmitted from the camera head 11102 via the transmission cable 11400.
- In addition, the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102. The image signal and the control signal can be transmitted by electric communication, optical communication, or the like.
- The image processing unit 11412 performs various types of image processing on the image signal that is the RAW data transmitted from the camera head 11102.
- The control unit 11413 performs various types of control related to imaging of a surgical site or the like by the endoscope 11100 and to display of a captured image obtained by the imaging of the surgical site or the like. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
- In addition, the control unit 11413 causes the display device 11202 to display a captured image of the surgical site or the like based on the image signal subjected to the image processing by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition technologies. For example, the control unit 11413 can recognize a surgical tool such as forceps, a specific body part, bleeding, mist at the time of using the energy treatment tool 11112, and the like by detecting the shape, color, and the like of the edges of the objects included in the captured image. When displaying the captured image on the display device 11202, the control unit 11413 may superimpose and display various types of surgery support information on the image of the surgical site by using the recognition results. Since the surgery support information is displayed superimposed and presented to the operator 11131, the burden on the operator 11131 can be reduced, and the operator 11131 can reliably proceed with the surgery.
- The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable thereof.
- Here, in the example of FIG. 28, wired communication is performed using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
- An example of the endoscopic surgery system to which the technology according to the present disclosure may be applied has been described above. The technology according to the present disclosure may be applied to, for example, the endoscope 11100 and the imaging unit 11402 of the camera head 11102 among the above-described configurations. By applying the imaging device 1 according to the present disclosure to the imaging unit 11402, favorable autofocus can be performed even in a case where zooming or the like is performed, and a higher-quality captured image or the like can be acquired. As a result, the burden on the operator 11131 can be reduced, and the operator 11131 can reliably proceed with the surgery.
- Note that, here, the endoscopic surgery system has been described as an example, but the technology according to the present disclosure may be applied to, for example, a microscopic surgery system or the like.
- The technology according to the present disclosure may be further applied to devices mounted on various moving bodies such as an m-automobile, an electric car, a hybrid electric car, a motorcycle, a bicycle, a personal mobility, an airplane, a drone, a ship, and a robot.
-
FIG. 29 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a moving body control system to which the technology according to the present disclosure may be applied. - A
vehicle control system 12000 includes a plurality of electronic control units connected via acommunication network 12001. In the example illustrated inFIG. 29 , thevehicle control system 12000 includes a drivesystem control unit 12010, a bodysystem control unit 12020, a vehicle exteriorinformation detection unit 12030, a vehicle interiorinformation detection unit 12040, and anintegrated control unit 12050. In addition, as a functional configuration of theintegrated control unit 12050, amicrocomputer 12051, an audioimage output unit 12052, and an in-vehicle network interface (I/F) 12053 are illustrated. - The drive
system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drivesystem control unit 12010 functions as a control device of a driving force generation device for generating a driving force of the vehicle such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting a steering angle of the vehicle, a braking device for generating a braking force of the vehicle, and the like. - The body
system control unit 12020 controls operations of various devices mounted on the vehicle body according to various programs. For example, the bodysystem control unit 12020 functions as a control device of a keyless entry system, a smart key system, a power window device, or various lamps such as a head lamp, a back lamp, a brake lamp, a blinker, or a fog lamp. In this case, radio waves transmitted from a portable device that substitutes for a key or signals of various switches may be input to the bodysystem control unit 12020. The bodysystem control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle. - The vehicle exterior
information detection unit 12030 detects information outside the vehicle on which thevehicle control system 12000 is mounted. For example, animaging unit 12031 is connected to the vehicle exteriorinformation detection unit 12030. The vehicle exteriorinformation detection unit 12030 causes theimaging unit 12031 to capture an image of the outside of the vehicle, and receives the captured image. The vehicle exteriorinformation detection unit 12030 may perform object detection processing or distance detection processing of a person, a vehicle, an obstacle, a sign, a character on a road surface, or the like based on the received image. For example, the vehicle exteriorinformation detection unit 12030 performs image processing on the received image, and performs object detection processing and distance detection processing based on a result of the image processing. - The
imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. Theimaging unit 12031 can output the electric signal as an image or can output the electric signal as distance measurement information. In addition, the light received by theimaging unit 12031 may be visible light or invisible light such as infrared rays. - The vehicle interior
information detection unit 12040 detects information inside the vehicle. For example, a driverstate detection unit 12041 that detects a state of a driver is connected to the vehicle interiorinformation detection unit 12040. The driverstate detection unit 12041 includes, for example, a camera that images the driver, and the vehicle interiorinformation detection unit 12040 may calculate the degree of fatigue or the degree of concentration of the driver or may determine whether or not the driver is dozing off based on the detection information input from the driverstate detection unit 12041. - The
microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exteriorinformation detection unit 12030 or the vehicle interiorinformation detection unit 12040, and output a control command to the drivesystem control unit 12010. For example, themicrocomputer 12051 can perform cooperative control for the purpose of realizing functions of an advanced driver assistance system (ADAS) including collision avoidance or impact mitigation of the vehicle, follow-up traveling based on the distance between vehicles, vehicle speed maintenance traveling, vehicle collision warning, vehicle lane departure warning, or the like. - In addition, the
microcomputer 12051 controls the driving force generation device, the steering mechanism, the braking device, or the like based on the information around the vehicle acquired by the vehicle exteriorinformation detection unit 12030 or the vehicle interiorinformation detection unit 12040, performing cooperative control for the purpose of automatic driving or the like in which the vehicle autonomously travels without depending on the operation of the driver. - In addition, the
microcomputer 12051 can output a control command to the bodysystem control unit 12020 based on the vehicle exterior information acquired by the vehicle exteriorinformation detection unit 12030. For example, themicrocomputer 12051 can perform cooperative control for the purpose of preventing the glare, such as switching from a high beam to a low beam, by controlling the headlamp according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exteriorinformation detection unit 12030. - The audio
image output unit 12052 transmits an output signal of at least one of audio and an image to an output device capable of visually or audibly notifying an occupant of the vehicle or the outside of the vehicle of the information. In the example ofFIG. 29 , anaudio speaker 12061, adisplay unit 12062, and aninstrument panel 12063 are illustrated as the output device. Thedisplay unit 12062 may include, for example, at least one of an on-board display and a head-up display. -
FIG. 30 is a diagram illustrating an example of an installation position of theimaging unit 12031. InFIG. 30 , avehicle 12100 includesimaging units imaging unit 12031. - The
imaging units vehicle 12100. Theimaging unit 12101 provided at the front nose and theimaging unit 12105 provided at the upper portion of a windshield in a vehicle interior mainly acquire images in front of thevehicle 12100. Theimaging units vehicle 12100. Theimaging unit 12104 provided on the rear bumper or the back door mainly acquires an image behind thevehicle 12100. The front images acquired by theimaging units - Note that
FIG. 30 illustrates an example of an imaging range of theimaging units 12101 to 12104. Animaging range 12111 indicates an imaging range of theimaging unit 12101 provided at the front nose, imaging ranges 12112 and 12113 indicate imaging ranges of theimaging units imaging range 12114 indicates an imaging range of theimaging unit 12104 provided at the rear bumper or the back door. For example, an overhead image of thevehicle 12100 viewed from above can be obtained by superimposing image data captured by theimaging units 12101 to 12104. - At least one of the
imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of theimaging units 12101 to 12104 may be a stereo camera including a plurality of imaging devices, or may be an imaging device having pixels for phase difference detection. - For example, the
microcomputer 12051 obtains a distance to each three-dimensional object in the imaging ranges 12111 to 12114 and a temporal change of the distance (relative speed with respect to the vehicle 12100) based on the distance information obtained from theimaging units 12101 to 12104, extracting, as a preceding vehicle, a three-dimensional object traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as thevehicle 12100, in particular, the closest three-dimensional object on a traveling path of thevehicle 12100. Further, themicrocomputer 12051 can set the distance between vehicles to be secured in advance in front of the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. As described above, it is possible to perform cooperative control for the purpose of automatic driving or the like in which the vehicle autonomously travels without depending on the operation of the driver. - For example, based on the distance information obtained from the
imaging units 12101 to 12104, themicrocomputer 12051 can classify three-dimensional object data related to a three-dimensional object into a two-wheeled vehicle, an ordinary vehicle, a large vehicle, a pedestrian, and other three-dimensional objects such as a utility pole, extract the three-dimensional object data, and use the three-dimensional object data for automatic avoidance of an obstacle. For example, themicrocomputer 12051 identifies obstacles around thevehicle 12100 as obstacles that can be visually recognized by the driver of thevehicle 12100 and obstacles that are difficult to visually recognize. Then, themicrocomputer 12051 determines a collision risk indicating a risk of collision with each obstacle, and when the collision risk is a set value or more and there is a possibility of collision, themicrocomputer 12051 can perform driving assistance for collision avoidance by outputting an alarm to the driver via theaudio speaker 12061 or thedisplay unit 12062 or performing forced deceleration or avoidance steering via the drivesystem control unit 12010. - At least one of the
imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, themicrocomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in images captured by theimaging units 12101 to 12104. Such recognition of a pedestrian is performed by, for example, a procedure of extracting feature points in captured images by theimaging units 12101 to 12104 as an infrared camera and a procedure of performing pattern matching processing on a series of feature points indicating an outline of an object to determine whether or not the object is a pedestrian. When themicrocomputer 12051 determines that a pedestrian is present in the images captured by theimaging units 12101 to 12104 and recognizes the pedestrian, the audioimage output unit 12052 controls thedisplay unit 12062 to superimpose and display a square contour line for emphasis on the recognized pedestrian. In addition, the audioimage output unit 12052 may control thedisplay unit 12062 to display an icon or the like indicating a pedestrian at a desired position. - An example of the vehicle control system to which the technology according to the present disclosure may be applied has been described above. The technology according to the present disclosure may be applied to, for example, the
imaging unit 12031 among the above-described configurations. By applying theimaging device 1 according to the present disclosure to theimaging unit 12031, favorable autofocus can be performed even in a case where zooming or the like is performed, and a higher-quality captured image is acquired. - The embodiment and the modifications of the present disclosure have been described in detail above with reference to
FIGS. 1 to 30 . As described above, theimaging device 1 according to the embodiment and the modifications includes the plurality of normal pixels 3 arranged in a matrix, the special pixel 4 arranged by replacing a part of the normal pixels 3, the color filter CF corresponding to the normal pixels 3 and arranged according to a predetermined rule, the special filter SF arranged corresponding to the special pixel 4, and the special pixel color filter (corresponding to the special pixel green filter SGF) arranged to surround at least a part of the periphery of the special filter SF. - As a result, the color filter CF can be arranged in a manner that the end of the color filter CF is in contact with the special pixel color filter, and the color filter CF can be formed in a desired shape. Therefore, the sensitivity variation of the normal pixels 4 can be suppressed, and the decrease in the imaging accuracy of the imaging device can be suppressed.
- Although the preferred embodiments of the present disclosure have been described in detail with reference to the accompanying drawings, the technical scope of the present disclosure is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive various changes or modifications within the scope of the technical idea described in the claims, and it is naturally understood that these also belong to the technical scope of the present disclosure.
- In addition, the effects described in the present specification are merely illustrative or exemplary, and are not restrictive. That is, the technology according to the present disclosure can exhibit other effects obvious to those skilled in the art from the description of the present specification together with or instead of the above effects.
- Note that the following configurations also belong to the technical scope of the present disclosure.
- (1)
- An imaging device comprising:
-
- a plurality of normal pixels arranged in a matrix;
- a special pixel arranged by replacing a part of the normal pixels;
- a color filter corresponding to the normal pixels and arranged according to a predetermined rule;
- a special filter arranged corresponding to the special pixel; and a special pixel color filter arranged to surround at least a part of a periphery of the special filter.
(2)
- The imaging device according to (1), wherein
-
- the color filter includes a plurality of filters that passes through different colors, and
- the special pixel color filter is a color filter that passes through a same color as any one of the plurality of filters.
(3)
- The imaging device according to (1) or (2), wherein
-
- the special pixel color filter is arranged to be in contact with a corner portion of the color filter that passes through the same color.
(4)
- the special pixel color filter is arranged to be in contact with a corner portion of the color filter that passes through the same color.
- The imaging device according to any one of (1) to (3), wherein
-
- the special pixel color filter is arranged to surround a periphery of the one special filter.
(5)
- the special pixel color filter is arranged to surround a periphery of the one special filter.
- The imaging device according to (4), wherein
-
- the special pixel color filter is arranged to surround the special filter adjacent to the color filter that passes through a color different from a color passed through by the special pixel color filter.
(6)
- the special pixel color filter is arranged to surround the special filter adjacent to the color filter that passes through a color different from a color passed through by the special pixel color filter.
- The imaging device according to any one of (1) to (3), wherein
-
- the special pixel color filter is arranged to surround the peripheries of the two or more adjacent special filters.
(7)
- the special pixel color filter is arranged to surround the peripheries of the two or more adjacent special filters.
- The imaging device according to any one of (1) to (6), wherein
-
- the special filter is any one of an infrared light filter, an image plane phase difference filter, a white filter, a monochrome filter, and a black filter.
(8)
- the special filter is any one of an infrared light filter, an image plane phase difference filter, a white filter, a monochrome filter, and a black filter.
- An electronic apparatus comprising:
-
- an imaging device;
- an optical system that forms an image of incident light on a light receiving surface of the imaging device; and
- a processor that controls the imaging device, wherein
- the imaging device includes:
- a plurality of normal pixels arranged in a matrix;
- a special pixel arranged by replacing a part of the normal pixels;
- a color filter corresponding to the normal pixels and arranged according to a predetermined rule;
- a special filter arranged corresponding to the special pixel; and
- a special pixel color filter arranged to surround at least a part of a periphery of the special filter.
-
-
- 1 IMAGING DEVICE
- 3 NORMAL PIXEL
- 4 SPECIAL PIXEL
- 31, 41 PHOTOELECTRIC CONVERSION UNIT
- 32, 42 TRANSFER SWITCH
- 33, 43 FLOATING DIFFUSION
- 35, 45 AMPLIFICATION TRANSISTOR
- 36, 46 ROW SELECTION SWITCH
- CF COLOR FILTER
- SF SPECIAL FILTER
- SGF SPECIAL PIXEL GREEN FILTER
- RF RED FILTER
- GF GREEN FILTER
- BF BLUE FILTER
Claims (8)
1. An imaging device comprising:
a plurality of normal pixels arranged in a matrix;
a special pixel arranged by replacing a part of the normal pixels;
a color filter corresponding to the normal pixels and arranged according to a predetermined rule;
a special filter arranged corresponding to the special pixel; and
a special pixel color filter arranged to surround at least a part of a periphery of the special filter.
2. The imaging device according to claim 1 , wherein
the color filter includes a plurality of filters that passes through different colors, and
the special pixel color filter is a color filter that passes through a same color as any one of the plurality of filters.
3. The imaging device according to claim 2 , wherein
the special pixel color filter is arranged to be in contact with a corner portion of the color filter that passes through the same color.
4. The imaging device according to claim 3 , wherein
the special pixel color filter is arranged to surround a periphery of the one special filter.
5. The imaging device according to claim 4 , wherein
the special pixel color filter is arranged to surround the special filter adjacent to the color filter that passes through a color different from a color passed through by the special pixel color filter.
6. The imaging device according to claim 3 , wherein
the special pixel color filter is arranged to surround the peripheries of the two or more adjacent special filters.
7. The imaging device according to claim 3 , wherein
the special filter is any one of an infrared light filter, an image plane phase difference filter, a white filter, a monochrome filter, and a black filter.
8. An electronic apparatus comprising:
an imaging device;
an optical system that forms an image of incident light on a light receiving surface of the imaging device; and
a processor that controls the imaging device, wherein
the imaging device includes:
a plurality of normal pixels arranged in a matrix;
a special pixel arranged by replacing a part of the normal pixels;
a color filter corresponding to the normal pixels and arranged according to a predetermined rule;
a special filter arranged corresponding to the special pixel; and
a special pixel color filter arranged to surround at least a part of a periphery of the special filter.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-202415 | 2019-11-07 | ||
JP2019202415 | 2019-11-07 | ||
PCT/JP2020/040148 WO2021090727A1 (en) | 2019-11-07 | 2020-10-26 | Imaging device and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20240155261A1 true US20240155261A1 (en) | 2024-05-09 |
Family
ID=75848370
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/773,172 Pending US20240155261A1 (en) | 2019-11-07 | 2020-10-26 | Imaging device and electronic apparatus |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240155261A1 (en) |
JP (1) | JPWO2021090727A1 (en) |
CN (1) | CN114556906A (en) |
TW (1) | TW202138848A (en) |
WO (1) | WO2021090727A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050270594A1 (en) * | 2004-05-24 | 2005-12-08 | Matsushita Electric Industrial Co., Ltd. | Solid-state imaging device, method for manufacturing the same and camera |
US20140346629A1 (en) * | 2012-02-13 | 2014-11-27 | Fujifilm Corporation | Imaging element |
US20190019820A1 (en) * | 2016-01-29 | 2019-01-17 | Sony Corporation | Solid-state imaging device and electronic apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5483917B2 (en) * | 2009-03-31 | 2014-05-07 | ローム株式会社 | Endoscope |
JP2016009813A (en) * | 2014-06-26 | 2016-01-18 | ソニー株式会社 | Solid-state image pickup device, electronic apparatus and method of manufacturing solid-state image pickup device |
US10462431B2 (en) * | 2015-04-10 | 2019-10-29 | Visera Technologies Company Limited | Image sensors |
-
2020
- 2020-09-29 TW TW109133773A patent/TW202138848A/en unknown
- 2020-10-26 US US17/773,172 patent/US20240155261A1/en active Pending
- 2020-10-26 WO PCT/JP2020/040148 patent/WO2021090727A1/en active Application Filing
- 2020-10-26 CN CN202080066134.7A patent/CN114556906A/en active Pending
- 2020-10-26 JP JP2021554900A patent/JPWO2021090727A1/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050270594A1 (en) * | 2004-05-24 | 2005-12-08 | Matsushita Electric Industrial Co., Ltd. | Solid-state imaging device, method for manufacturing the same and camera |
US20140346629A1 (en) * | 2012-02-13 | 2014-11-27 | Fujifilm Corporation | Imaging element |
US20190019820A1 (en) * | 2016-01-29 | 2019-01-17 | Sony Corporation | Solid-state imaging device and electronic apparatus |
US10872919B2 (en) * | 2016-01-29 | 2020-12-22 | Sony Corporation | Solid-state imaging device and electronic apparatus |
US11322534B2 (en) * | 2016-01-29 | 2022-05-03 | Sony Corporation | Solid-state imaging device and electronic apparatus |
Also Published As
Publication number | Publication date |
---|---|
TW202138848A (en) | 2021-10-16 |
JPWO2021090727A1 (en) | 2021-05-14 |
CN114556906A (en) | 2022-05-27 |
WO2021090727A1 (en) | 2021-05-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110050459B (en) | Solid-state imaging element and electronic device | |
US10841520B2 (en) | Solid-state imaging device and electronic device | |
US11930287B2 (en) | Imaging element and electronic device | |
US11943549B2 (en) | Imaging apparatus and electronic equipment | |
US20240064438A1 (en) | Imaging device | |
US11218658B2 (en) | Solid-state imaging device, method of controlling the solid-state imaging device, and electronic apparatus to generate high dynamic range image | |
US20220272292A1 (en) | Solid-state imaging device, method for driving the same, and electronic device | |
JP2018113637A (en) | Solid state imaging device and electronic apparatus | |
US10893224B2 (en) | Imaging element and electronic device | |
US20240155261A1 (en) | Imaging device and electronic apparatus | |
WO2021100446A1 (en) | Solid-state imaging device and electronic apparatus | |
JP2020123795A (en) | Solid-state imaging device and electronic device | |
TWI853915B (en) | Photographic components and electronic equipment | |
WO2024062813A1 (en) | Imaging device and electronic equipment | |
JP2019022020A (en) | Solid state imaging device, driving method of solid state imaging device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY SEMICONDUCTOR SOLUTIONS CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ISERI, YUJI;KIRA, HODAKA;HAGIHARA, DAISUKE;AND OTHERS;SIGNING DATES FROM 20220524 TO 20220615;REEL/FRAME:060248/0654 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |