WO2019176302A1 - Imaging element and method for manufacturing imaging element
- Publication number: WO2019176302A1 (PCT/JP2019/002025)
- Authority: WIPO (PCT)
- Prior art keywords: light, base layer, lens, pixel, image
Classifications
- G — PHYSICS
- G02 — OPTICS
- G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B 3/00 — Simple or compound lenses
- H01L 27/146
- H — ELECTRICITY
- H04 — ELECTRIC COMMUNICATION TECHNIQUE
- H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N 25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N 25/70 — SSIS architectures; Circuits associated therewith
Definitions
- The present technology relates to an imaging element in which a lens that condenses incident light is arranged, and to a method for manufacturing the imaging element.
- Conventionally, an imaging element that captures an image of a subject has pixels each comprising an on-chip lens that condenses incident light from the subject, a color filter that transmits light of a predetermined wavelength out of the condensed incident light, and a photoelectric conversion unit that converts the incident light that has passed through the color filter into an electric signal. These pixels are arranged in a two-dimensional grid to form the imaging element.
- A light shielding body is disposed around each pixel; it blocks light incident obliquely from adjacent pixels.
- The color filter described above transmits light of a different wavelength for each pixel. For example, color filters corresponding to red light, green light, and blue light are arranged in the respective pixels.
- In some cases, an imaging element is used whose pixels include a second lens that further condenses the incident light that has been condensed by the on-chip lens and transmitted through the color filter.
- This second lens is called an in-layer lens and has a convex shape.
- As a method of forming this in-layer lens, a method of forming it by self-alignment using the above-described light shielding body as a mask has been proposed (see, for example, Patent Document 1).
- In this method, a light shielding body is formed on the base layer, and a recess is formed by selectively etching the base layer using the light shielding body as a mask.
- Next, an insulator is deposited in the recess by high-density plasma film formation.
- The deposited insulating film is thick in the vicinity of the light shielding body and becomes thinner with distance from it, so that it takes a downward-convex shape.
- The above-mentioned conventional technology has the problem of a low light collection rate. This is because the lens is a hemisphere that is convex only downward, so there is only one curved surface to refract the incident light. Forming convex portions on both surfaces of the lens would improve the light collection rate; however, a lens with convex portions on both sides is more difficult to form than a lens with a convex portion on one side.
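- As a rough illustration of why two refracting surfaces condense better (standard thin-lens optics, not taken from this publication), the lensmaker's equation gives a biconvex lens twice the refractive power of a plano-convex lens with the same surface curvature:

```latex
% Lensmaker's equation (thin-lens approximation) for a lens of index n
% in a surrounding medium of index n0:
\frac{1}{f} = \frac{n - n_0}{n_0}\left(\frac{1}{R_1} - \frac{1}{R_2}\right)
% Plano-convex (one curved surface):  R_1 = R,  R_2 = \infty
%   => 1/f = (n - n_0) / (n_0 R)
% Biconvex (convex on both surfaces): R_1 = R,  R_2 = -R
%   => 1/f = 2 (n - n_0) / (n_0 R)
% Twice the power, i.e. a shorter focal length and stronger condensing
% for the same curvature.
```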
- The present technology has been devised to solve the above-described problems.
- The first aspect of the present technology is an imaging element comprising: a pixel that generates an image signal corresponding to incident light; a base layer that is disposed in the pixel, transmits the incident light, and has a recess formed for each pixel; and a lens that is formed adjacent to the recess of the base layer, has a convex portion formed based on the recess of the base layer, condenses the irradiated light, and makes it incident on the pixel through the base layer. As a result, the convex portion of the lens is formed based on the recess of the base layer, that is, by self-alignment with the recess of the base layer.
- In this first aspect, the lens may include a light transmissive member that is disposed adjacent to the base layer and has, on its surface, a second recess formed based on the recess of the base layer, and a resist disposed in the second recess; the light transmissive member and the resist may be etched at different etching rates to form the convex portion based on the recess of the base layer.
- Also in this first aspect, the base layer may include the recess formed by isotropic etching.
- Also in this first aspect, a light shielding wall disposed around the pixel may further be provided, and the base layer may include the recess based on the shape of the region surrounded by the light shielding wall.
- Also in this first aspect, a color filter that transmits light of a predetermined wavelength out of the condensed incident light may further be provided. In this case, the lens is arranged between the subject and the color filter.
- Alternatively, a color filter that transmits light of a predetermined wavelength out of the incident light may further be provided, and the lens may condense the light transmitted through the color filter. In this case, the color filter is arranged between the subject and the lens.
- The second aspect of the present technology is a method for manufacturing an imaging element, comprising: a base layer recess forming step of forming, in a pixel that generates an image signal corresponding to incident light, a base layer that transmits the incident light and includes a recess; a lens arranging step of arranging a light transmissive member adjacent to the base layer; and a lens convex portion forming step of forming, based on the recess of the base layer, the convex portion of a lens that condenses the irradiated light and makes it incident on the pixel through the base layer. As a result, the convex portion of the lens is formed based on the recess of the base layer, that is, by self-alignment with the recess of the base layer.
- FIG. 1 is a diagram illustrating a configuration example of an image sensor according to an embodiment of the present technology.
- The image pickup device 1 of FIG. 1 includes a pixel array unit 10, a vertical drive unit 20, a column signal processing unit 30, and a control unit 40.
- The pixel array unit 10 is configured by arranging the pixels 100 in a two-dimensional grid.
- The pixel 100 generates an image signal corresponding to the irradiated light. For this purpose, the pixel 100 includes a photoelectric conversion unit that generates charge according to the irradiated light.
- The pixel 100 further includes a pixel circuit, which generates the image signal based on the charge generated by the photoelectric conversion unit. The generation of the image signal is controlled by a control signal generated by the vertical drive unit 20 described later.
- In the pixel array unit 10, signal lines 11 and 12 are arranged in an XY matrix.
- The signal line 11 transmits the control signals for the pixel circuits in the pixels 100; one is arranged for each row of the pixel array unit 10 and wired in common to the pixels 100 of that row.
- The signal line 12 transmits the image signals generated by the pixel circuits of the pixels 100; one is arranged for each column of the pixel array unit 10 and wired in common to the pixels 100 of that column.
- The vertical drive unit 20 generates the control signals for the pixel circuits of the pixels 100 and transmits them to the pixels 100 via the signal lines 11 in FIG. 1.
- The column signal processing unit 30 processes the image signals generated by the pixels 100 and transmitted from them via the signal lines 12 shown in FIG. 1.
- This processing corresponds to, for example, analog-digital conversion, which converts the analog image signal generated in the pixel 100 into a digital image signal.
- The image signal processed by the column signal processing unit 30 is output as the image signal of the image sensor 1.
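- The publication does not specify the converter type; as a minimal sketch under that caveat, a single-slope (ramp) ADC, a scheme commonly used in column-parallel readout, would digitize each column like this (the function name and all values are illustrative assumptions):

```python
def single_slope_adc(v_pixel, v_ref_max=1.0, n_bits=10):
    """Digitize one column's analog pixel voltage with a ramp ADC.

    A counter advances while a reference ramp rises; the count at the
    moment the ramp crosses the pixel voltage is the digital code.
    """
    n_steps = 2 ** n_bits
    for code in range(n_steps):
        v_ramp = v_ref_max * code / (n_steps - 1)
        if v_ramp >= v_pixel:            # comparator fires: latch the count
            return code
    return n_steps - 1                   # full scale

# A half-scale pixel voltage maps to roughly the mid code.
print(single_slope_adc(0.5))             # -> 512 for 10 bits
```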
- The control unit 40 controls the entire image sensor 1 by generating and outputting control signals that control the vertical drive unit 20 and the column signal processing unit 30.
- The control signals generated by the control unit 40 are transmitted to the vertical drive unit 20 and the column signal processing unit 30 through signal lines 41 and 42, respectively.
- FIG. 2 is a circuit diagram illustrating a configuration example of the pixel 100 according to the embodiment of the present technology.
- The pixel 100 in the figure includes a photoelectric conversion unit 101, a charge holding unit 102, and MOS transistors 103 to 106.
- The anode of the photoelectric conversion unit 101 is grounded, and its cathode is connected to the source of the MOS transistor 103.
- The drain of the MOS transistor 103 is connected to the source of the MOS transistor 104, the gate of the MOS transistor 105, and one end of the charge holding unit 102; the other end of the charge holding unit 102 is grounded.
- The drains of the MOS transistors 104 and 105 are connected in common to the power supply line Vdd, and the source of the MOS transistor 105 is connected to the drain of the MOS transistor 106.
- The source of the MOS transistor 106 is connected to the signal line 12.
- The gates of the MOS transistors 103, 104, and 106 are connected to the transfer signal line TR, the reset signal line RST, and the selection signal line SEL, respectively. The transfer signal line TR, the reset signal line RST, and the selection signal line SEL constitute the signal line 11.
- The photoelectric conversion unit 101 generates charge corresponding to the irradiated light, as described above; a photodiode can be used for it.
- The charge holding unit 102 and the MOS transistors 103 to 106 constitute the pixel circuit.
- The MOS transistor 103 transfers the charge generated by photoelectric conversion in the photoelectric conversion unit 101 to the charge holding unit 102; this transfer is controlled by a signal transmitted through the transfer signal line TR.
- The charge holding unit 102 is a capacitor that holds the charge transferred by the MOS transistor 103.
- The MOS transistor 105 generates a signal based on the charge held in the charge holding unit 102, and the MOS transistor 106, controlled by a signal transmitted through the selection signal line SEL, outputs that signal to the signal line 12 as the image signal.
- The MOS transistor 104 resets the charge holding unit 102 by discharging its held charge to the power supply line Vdd. This reset is controlled by a signal transmitted through the reset signal line RST and is executed before the charge transfer by the MOS transistor 103. By turning on the MOS transistor 103 at the same time, the photoelectric conversion unit 101 can also be reset.
- In this way, the pixel circuit converts the charge generated by the photoelectric conversion unit 101 into an image signal.
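- The reset, transfer, and selection sequence above can be condensed into a short sketch; the charge-to-voltage model and the numbers below are illustrative assumptions, not taken from the publication:

```python
def read_pixel(photo_charge, conversion_gain=1.0):
    """One readout of the pixel circuit of FIG. 2 (illustrative model)."""
    # 1. RST: MOS transistor 104 resets the charge holding unit 102
    #    by discharging it to the power supply line Vdd.
    held_charge = 0.0
    # 2. TR: MOS transistor 103 transfers the charge generated by the
    #    photoelectric conversion unit 101 to the charge holding unit 102.
    held_charge += photo_charge
    # 3. MOS transistor 105 generates a signal from the held charge, and
    #    SEL (MOS transistor 106) outputs it to the signal line 12.
    return conversion_gain * held_charge

print(read_pixel(photo_charge=1200.0))   # arbitrary charge, in electrons
```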
- FIG. 3 is a cross-sectional view illustrating a configuration example of a pixel according to the first embodiment of the present technology.
- This figure is a schematic cross-sectional view illustrating a configuration example of the pixel 100 arranged in the pixel array unit 10.
- The pixel 100 includes an on-chip lens 151, a base layer 144, a light shielding wall 143, a color filter 142, an insulating film 141, a semiconductor substrate 111, a wiring region composed of the wiring layer 122 and the insulating layer 121, and a support substrate 131.
- The semiconductor substrate 111 is a semiconductor substrate on which the photoelectric conversion portion of the pixel 100 described in FIG. 1 and the semiconductor portion of the pixel circuit are formed. The semiconductor portions of the vertical drive unit 20, the column signal processing unit 30, and the control unit 40 are also formed on the semiconductor substrate 111.
- A p-type well region is formed in the semiconductor substrate 111, and the photoelectric conversion portion of the pixel 100 and the like are formed in this well region; the semiconductor substrate 111 in the figure is depicted as the p-type well region.
- The n-type semiconductor region 112 formed in this well region constitutes the photoelectric conversion unit 101 of the pixel circuit described in FIG. 2: a photodiode is formed by the pn junction at the interface between the n-type semiconductor region 112 and the p-type well region, and photoelectric conversion takes place there.
- A p-type semiconductor region 113 is disposed adjacent to the n-type semiconductor region 112. The p-type semiconductor region 113 has a relatively high impurity concentration and pins the interface states on the surface of the semiconductor substrate 111, which reduces their influence. A p-type semiconductor region for pinning can also be disposed on the other surface of the semiconductor substrate 111.
- The wiring layer 122 transmits the image signals generated in the pixel 100 and the control signals that control the pixel circuit. The wiring layer 122 can be made of a metal such as copper (Cu).
- The insulating layer 121 insulates the wiring layer 122 and can be made of an oxide such as silicon oxide (SiO2).
- The image pickup device 1 comprising the pixels 100 in the figure is a back-illuminated image pickup device, in which the wiring region is formed on the surface (back surface) of the semiconductor substrate 111 opposite to the surface on which light is incident.
- The support substrate 131 supports the image sensor 1 and is used to improve its strength during the manufacturing process.
- The on-chip lens 151 is a lens that condenses light from the subject.
- The on-chip lens 151 in the figure has an elliptical cross section; that is, it has convex portions on both the top and the bottom. This improves the collection rate of incident light.
- The on-chip lens 151 can be composed of a light transmissive member, for example silicon nitride (SiN).
- The base layer 144 is a film that is disposed below the on-chip lens 151 and serves as the base when the on-chip lens 151 is formed. The base layer 144 can be made of a light transmissive member having a lower refractive index than the on-chip lens 151, for example silicon oxide (SiO2).
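- The index step between the lens and the base layer is what makes the lower convex surface refract. As a hedged illustration with typical values (the publication only states that the base layer has the lower index; SiN ≈ 2.0 and SiO2 ≈ 1.45 are assumptions):

```python
import math

N_LENS, N_BASE = 2.0, 1.45   # assumed indices for SiN and SiO2

def refraction_angle(theta_in_deg, n1, n2):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2)."""
    s = n1 * math.sin(math.radians(theta_in_deg)) / n2
    if s > 1.0:
        return None              # total internal reflection
    return math.degrees(math.asin(s))

# A ray leaving the lens through its convex back surface into the
# lower-index base layer bends away from the surface normal, so the
# curved exit surface also helps condense the light onto the pixel.
print(refraction_angle(20.0, N_LENS, N_BASE))   # -> about 28.2 degrees
```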
- The on-chip lens 151 is formed by disposing a light transmissive member adjacent to the base layer 144 so that a second recess is formed on its surface based on the recess of the base layer 144, disposing a resist in the second recess, and then etching the light transmissive member and the resist. For this etching, dry etching can be used.
- The light transmissive member and the resist are etched at different etching rates, specifically under a condition in which the light transmissive member is etched faster than the resist. The amount of light transmissive member removed therefore decreases where the resist is thicker, and a convex portion is formed on the light transmissive member; that is, an on-chip lens 151 having a convex portion based on the recess of the base layer 144 can be formed.
- In other words, the on-chip lens 151 is formed in a self-aligned manner with respect to the recess of the base layer 144. Details of the manufacturing method of the on-chip lens 151 will be described later.
- The on-chip lens 151 is an example of the lens described in the claims.
- The color filter 142 is an optical filter that transmits light of a predetermined wavelength out of the light condensed by the on-chip lens 151. For example, color filters 142 that transmit red, green, or blue visible light, or infrared light, can be used.
- The light shielding wall 143 is a wall-shaped film that shields light incident obliquely from the adjacent pixels 100. It prevents light that has passed through the color filter 142 of an adjacent pixel 100 from entering, and thus prevents color mixing. The light shielding wall 143 can be made of, for example, aluminum (Al) or tungsten (W).
- The insulating film 141 insulates the semiconductor substrate 111 and can be made of, for example, SiO2.
- FIG. 4 is a diagram illustrating light collection by the on-chip lens according to the first embodiment of the present technology.
- A in the figure is a simplified version of the pixel 100 described in FIG. 3 and illustrates how incident light 301 enters the n-type semiconductor region 112 of the pixel 100.
- The on-chip lens 151 has an elliptical cross section, so the incident light 301 of the pixel 100 is refracted by the two convex surfaces of the on-chip lens 151. For this reason, even incident light with a relatively large incident angle can be made to irradiate the n-type semiconductor region 112. The technology can therefore be applied to imaging with a photographic lens having a large image height.
- B in the figure represents a comparative example in which a hemispherical on-chip lens 311 is used. Incident light 301 with the same incident angle as in A cannot be condensed onto the n-type semiconductor region 112, and the collection rate becomes low.
- C in the figure shows an example of an on-chip lens 312 whose convex portion on the upper surface side is formed by a method different from that of the on-chip lens 151, namely a dry etching method: a film of light transmissive member serving as the lens material is formed, a resist shaped into a convex form is placed on it, and the resist and the light transmissive member film are dry etched, which transfers the convex shape of the resist onto the surface of the light transmissive member.
- In this way, the collection rate can be improved by using an on-chip lens 151 with an elliptical cross section. In addition, since the focal length can be shortened, the imaging device 1 can be made thinner.
- Next, a light transmissive member 405 serving as the material of the on-chip lens 151 is formed on the surface of the base layer 144. On its surface, a recess 406 is formed by transfer of the shape of the recess 404 of the base layer 144; this recess 406 corresponds to the second recess described in FIG. 3.
- For the light transmissive member 405, SiN, for example, can be used, and it can be formed by CVD (d in FIG. 6).
- The above process is an example of the lens arranging step described in the claims.
- Next, the resist 407 and the light transmissive member 405 are etched. This etching can be performed by anisotropic dry etching, for example using NF3 or SF6 to which CH2F2 or the like is added as the etching gas.
- The etching is performed under a condition in which the etching rate of the light transmissive member 405 is higher than that of the resist 407, and is continued until the resist 407 disappears.
- As a result, a convex portion whose shape is the inverted transfer of the downward-convex portion of the resist 407 is formed on the surface of the light transmissive member 405, completing the on-chip lens 151 with an elliptical cross section (f in FIG. 6). Since the on-chip lens 151 is formed by self-alignment based on the position of the recess 404 of the base layer 144, it can be formed without problems such as misalignment during manufacturing.
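- A minimal sketch of this selective etch-back, assuming a linear etch model and a selectivity of 2 (the publication only states that the light transmissive member etches faster than the resist; all numbers are illustrative):

```python
def etch_back(resist_thickness, resist_top=2.0,
              rate_member=2.0, rate_resist=1.0):
    """Final surface height of the light transmissive member, per position.

    The resist top is planar at `resist_top`; under it, the member
    surface sits at resist_top - resist_thickness, because the resist
    fills the second recess and so is thickest over the recess centre.
    Where the resist is thicker, the member is protected longer, so the
    recess re-emerges as a convex bump: an inverted transfer.
    """
    t_end = max(resist_thickness) / rate_resist   # etch until resist gone
    heights = []
    for r in resist_thickness:
        surface = resist_top - r                  # initial member surface
        t_exposed = t_end - r / rate_resist       # time the member etches
        heights.append(surface - rate_member * t_exposed)
    return heights

# Resist 0.3 um thicker over the recess centre than at the edges:
print(etch_back([0.2, 0.3, 0.5, 0.3, 0.2]))
# -> [1.2, 1.3, 1.5, 1.3, 1.2]: the centre ends (selectivity - 1) * 0.3
#    = 0.3 um above the edges, i.e. a convex lens surface.
```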
- The etching method for the resist 407 and the light transmissive member 405 is not limited to this example; for example, wet etching can also be used.
- The above process is an example of the lens convex portion forming step described in the claims.
- Through these steps, the on-chip lens 151 is formed with convex portions based on the shape of the recess 404 of the base layer 144 on both surfaces (front and back). Since the on-chip lens 151 is formed by self-alignment based on the position of the recess 404 of the base layer 144, its manufacturing process can be simplified.
- FIG. 7 is a cross-sectional view illustrating a configuration example of a pixel according to the second embodiment of the present technology.
- The pixel 100 in the figure differs from the pixel 100 described in FIG. 3 in that it further includes a base layer 146, an in-layer lens 152, and a planarizing film 147, and in that it includes a light shielding wall 145 instead of the light shielding wall 143.
- The in-layer lens 152 is a lens disposed in an inner layer of the pixel 100, between the color filter 142 and the semiconductor substrate 111. It can be made of SiN, for example. Since the pixel 100 in the figure includes two lenses, the on-chip lens 151 and the in-layer lens 152, its light collection rate can be higher than that of the pixel 100 described in FIG. 3; even incident light with a larger incident angle than the incident light 301 of A in FIG. 4 can be condensed onto the n-type semiconductor region 112.
- The light shielding wall 145 is disposed so as to surround the region where the color filter 142 and the in-layer lens 152 are disposed, and can be made of Al or W like the light shielding wall 143.
- The in-layer lens 152 has its back surface formed adjacent to the recess formed in the base layer 146, and a convex portion formed on its surface based on the shape of that recess.
- The in-layer lens 152 can be formed by the same method as the on-chip lens 151: first, a light transmissive member is disposed adjacent to the base layer 146; at this time, a second recess, to which the shape of the recess of the base layer 146 is transferred, is formed on the surface of the arranged light transmissive member. The in-layer lens 152 is then formed by disposing a resist in the second recess and etching the light transmissive member and the resist at different etching rates.
- The in-layer lens 152 is an example of the lens described in the claims.
- First, the light shielding wall 145 is formed on the surface of the semiconductor substrate 111 on which the wiring region and the insulating film 141 have been formed. This can be done, for example, by forming a film of W or the like as the material of the light shielding wall 145 and etching away the W in regions other than the pixel 100 boundaries (a in FIG. 8).
- Next, an insulator film 411 serving as the material of the base layer 146 is formed (b in FIG. 8).
- Next, anisotropic dry etching is performed to etch the insulator film 411. At this time, the insulator film 411 adjacent to the wall surface of the light shielding wall 145 remains without being etched, which forms the recess 413 of the base layer 146.
- The above process is an example of the base layer recess forming step described in the claims.
- Next, a light transmissive member 414 is formed on the surface of the base layer 146. On the surface of the formed light transmissive member 414, a recess 415 is formed, to which the shape of the recess 413 of the base layer 146 is transferred; this recess 415 corresponds to the second recess (d in FIG. 8).
- The above process is an example of the lens arranging step described in the claims.
- Next, a resist 416 with a flat surface is disposed on the surface of the light transmissive member 414 (e in FIG. 9), and the resist 416 and the light transmissive member 414 are etched by anisotropic dry etching, under a condition in which the etching rate of the light transmissive member 414 is higher than that of the resist 416.
- As a result, a convex portion is formed on the surface of the light transmissive member 414; the convex portion of the in-layer lens 152 is thus formed by self-alignment based on the position of the recess 413 of the base layer 146 (f in FIG. 9).
- The above process is an example of the lens convex portion forming step described in the claims.
- Next, the planarizing film 147 is disposed (g in FIG. 9). Thereafter, the color filter 142 and the on-chip lens 151 are formed.
- Otherwise, the configuration of the image sensor 1 is the same as that described in the first embodiment of the present technology, so its description is omitted.
- As described above, the imaging element 1 according to the second embodiment of the present technology can further improve the light collection rate by including the in-layer lens 152.
- In the first embodiment described above, the recess 404 of the base layer 144 is formed by isotropic dry etching. In contrast, the imaging device 1 according to the third embodiment of the present technology differs in that a recess based on the shape of the region surrounded by the light shielding wall is formed in the base layer.
- FIG. 10 is a cross-sectional view illustrating a configuration example of a pixel according to the third embodiment of the present technology.
- The pixel 100 in the figure differs from the pixel 100 described in FIG. 3 in that it includes an on-chip lens 153, a base layer 148, and a light shielding wall 149 instead of the on-chip lens 151, the base layer 144, and the light shielding wall 143.
- The light shielding wall 149 is disposed from the surface of the insulating film 141 up to the region adjacent to the on-chip lens 153.
- The base layer 148 and the on-chip lens 153 are formed by the same manufacturing method as for the pixel 100 described in FIG. 3; that is, except that they are formed adjacent to the surface of the color filter 142, they can be formed by the manufacturing process described for the first embodiment. Since the recess of the base layer 148 can be formed using the light shielding wall 149, the manufacturing method of the on-chip lens 153 can be simplified.
- The on-chip lens 153 is an example of the lens described in the claims.
- Otherwise, the configuration of the image sensor 1 is the same as that described in the first embodiment of the present technology, so its description is omitted.
- As described above, the imaging device 1 according to the third embodiment of the present technology can further simplify the manufacturing process of the imaging device 1 by forming the recess of the base layer 148 using the light shielding wall 149.
- In the image sensor 1 of the first embodiment described above, pixels 100 of identical configuration are arranged. The imaging device 1 according to the fourth embodiment of the present technology differs from the first embodiment in that phase difference pixels for autofocus are additionally arranged.
- FIG. 11 is a diagram illustrating a configuration example of the pixel array unit according to the fourth embodiment of the present technology.
- This figure is a top view showing the arrangement of the pixels 100 in the pixel array unit 10. The pixels 100a and 100b correspond to phase difference pixels.
- A phase difference pixel detects, as a phase difference, the image shift produced by light that has passed through different regions of the photographing lens that condenses light from the subject onto the pixel array unit 10 of the image sensor 1; it is a pixel used for autofocus.
- In the figure, a solid circle represents the on-chip lens 151, and a dotted rectangle represents the effective region of the n-type semiconductor region 112 formed in the semiconductor substrate 111.
- A plurality of such pixels 100a and 100b are arranged in a specific row of the pixel array unit 10.
- In the pixel 100, the entire region where the n-type semiconductor region 112 is formed is the effective region. In the pixels 100a and 100b, approximately half of the region where the n-type semiconductor region 112 is formed is the effective region: in the pixel 100a, the right half in the figure; in the pixel 100b, the left half. Light that has passed through the left and right sides of the photographic lens is incident on the pixels 100a and 100b, respectively.
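- How the shift between the two half-shielded images is measured is not given in the publication; a common approach is a correlation search, sketched here with illustrative data:

```python
def phase_difference(sig_a, sig_b, max_shift=8):
    """Shift (in pixels) that best aligns the 100a-row signal against
    the 100b-row signal; larger defocus gives a larger shift."""
    def sad(shift):   # mean absolute difference at a trial shift
        pairs = [(a, sig_b[i + shift]) for i, a in enumerate(sig_a)
                 if 0 <= i + shift < len(sig_b)]
        return sum(abs(a - b) for a, b in pairs) / len(pairs)
    return min(range(-max_shift, max_shift + 1), key=sad)

# Out of focus, the two images are displaced; in focus the shift is 0.
row_a = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0, 0, 0]
row_b = [0, 0, 0, 0, 0, 1, 5, 9, 5, 1, 0, 0]
print(phase_difference(row_a, row_b))   # -> 3 pixels of image shift
```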
- FIG. 12 is a cross-sectional view illustrating a configuration example of a pixel according to the fourth embodiment of the present technology.
- This figure illustrates a configuration example of the pixel 100 and the pixel 100a. These pixels differ from the pixel 100 described in FIG. 3 in that a base layer 161 and a planarizing film 162 are provided between the color filter 142 and the insulating film 141, and in that the light shielding wall 143 is disposed below the base layer 161.
- The light shielding wall 143 in the pixel 100a is disposed at a position covering the left half of the n-type semiconductor region 112 and shields that half. This realizes the effective region in the right half of the n-type semiconductor region 112 described in FIG. 11.
- The pixel 100a further includes an in-layer lens 154. In the pixel 100, incident light is condensed onto the n-type semiconductor region 112 by the on-chip lens 151; in the pixel 100a, the in-layer lens 154 is additionally disposed, and the condensing position is adjusted to be the end 321.
- The in-layer lens 154 is an example of the lens described in the claims.
- FIGS. 13 and 14 are diagrams illustrating an example of a method of manufacturing the image sensor according to the fourth embodiment of the present technology.
- First, the insulating film 141 is formed on the surface of the semiconductor substrate 111, and the light shielding wall 143 is formed on it. At this time, the shape of the light shielding wall 143 is adapted to the region of the n-type semiconductor region 112 to be shielded (a in FIG. 13).
- Next, an insulator film 421 serving as the material of the base layer 161 is formed (b in FIG. 13).
- Next, a resist 422 is formed on the surface of the insulator film 421, with an opening 423 in the region where the in-layer lens 154 is to be formed (c in FIG. 13). By etching the insulator film 421 through this opening, the base layer 161 including the recess 425 is formed.
- Next, a light transmissive member 426 is formed on the surface of the base layer 161. On the surface of the formed light transmissive member 426, a recess 427 is formed, to which the shape of the recess 425 of the base layer 161 is transferred (e in FIG. 14).
- The above process is an example of the lens arranging step described in the claims.
- Next, a resist 428 with a flat surface is formed (f in FIG. 14), and anisotropic dry etching is performed; this can be done in the same manner as the dry etching described above. In this way, the in-layer lens 154 can be formed selectively in the pixel 100a (g in FIG. 14).
- The above process is an example of the lens convex portion forming step described in the claims.
- Thereafter, the imaging element 1 can be manufactured by forming the planarizing film 162 and the like.
- In this way, the in-layer lens 154 can be arranged in the pixels 100a and 100b, which are phase difference pixels. The other manufacturing steps of the pixel 100a, for example the formation of the on-chip lens 151, can be shared with the pixel 100, so the influence of variations in the manufacturing process can be reduced.
- The configuration of the image sensor 1 in the fourth embodiment of the present technology is not limited to this example. For example, a color filter 142 that transmits red light or infrared light may be disposed, and the in-layer lens 154 may be disposed in the pixels 100 that detect red light or infrared light; this makes it possible to change the focal position of those pixels.
- Otherwise, the configuration of the image sensor 1 is the same as that described in the first embodiment of the present technology, so its description is omitted.
- As described above, in the imaging element 1 according to the fourth embodiment of the present technology, the in-layer lens 154 is selectively disposed in some pixels 100 of the pixel array unit 10. This simplifies the manufacturing process of an imaging element 1 provided with phase difference pixels.
- FIG. 15 is a cross-sectional view illustrating a configuration example of a pixel according to a modification of the embodiment of the present technology.
- The pixel 100 in the figure differs from the pixel 100 described in FIG. 3 in that the on-chip lens 151, the base layer 144, the light shielding wall 143, the color filter 142, and the insulating film 141 are disposed adjacent to the wiring region composed of the wiring layer 122 and the insulating layer 121.
- That is, incident light is irradiated onto the n-type semiconductor region 112 from the front surface of the semiconductor substrate 111, the surface on which the wiring region is formed.
- The on-chip lens 151 in the figure can be formed by the same method as the on-chip lens 151 described in FIG. 3. Even in such a front-illuminated imaging device 1, the on-chip lens 151 with an elliptical cross section can be formed easily.
- Otherwise, the configuration of the image sensor 1 is the same as that described in the first embodiment of the present technology, so its description is omitted.
- As described above, the present technology can simplify the method for manufacturing the on-chip lens in an imaging device configured as a front-illuminated type.
- The present technology described above can be applied to various products. For example, it may be realized as an imaging element mounted on an imaging device such as a camera.
- FIG. 16 is a block diagram illustrating a schematic configuration example of a camera, which is an example of an imaging apparatus to which the present technology can be applied.
- The camera 1000 in the figure includes a lens 1001, an image sensor 1002, an imaging control unit 1003, a lens driving unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, a display unit 1008, and a recording unit 1009.
- The lens 1001 is the photographing lens of the camera 1000. It condenses light from the subject and makes it incident on the image sensor 1002 described later to form an image of the subject.
- The image sensor 1002 is a semiconductor element that captures the light from the subject condensed by the lens 1001; it generates an analog image signal corresponding to the irradiated light, converts it into a digital image signal, and outputs it.
- The imaging control unit 1003 controls imaging in the image sensor 1002 by generating control signals and outputting them to the image sensor 1002. It can also perform autofocus in the camera 1000 based on the image signal output from the image sensor 1002.
- Autofocus is a system that detects the focal position of the lens 1001 and automatically adjusts it. As the autofocus method, image plane phase difference autofocus, in which the focal position is detected from the image plane phase difference measured by phase difference pixels arranged in the image sensor 1002, or contrast autofocus, in which the position where the contrast of the image is highest is detected as the focal position, can be applied. The imaging control unit 1003 adjusts the position of the lens 1001 via the lens driving unit 1004 based on the detected focal position, thereby performing autofocus.
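- As a hedged sketch of the contrast method just mentioned (the metric and search loop are illustrative assumptions, not the camera's actual firmware):

```python
def contrast(image_row):
    """Simple sharpness metric: sum of squared neighbour differences."""
    return sum((b - a) ** 2 for a, b in zip(image_row, image_row[1:]))

def contrast_autofocus(capture_at, positions):
    """capture_at(pos) returns an image row taken at lens position pos;
    keep the position where the contrast metric peaks."""
    return max(positions, key=lambda p: contrast(capture_at(p)))

# Toy data: lens position 2 yields the sharpest image.
frames = {0: [5, 5, 5, 5], 1: [3, 6, 6, 3],
          2: [0, 9, 9, 0], 3: [3, 6, 6, 3]}
print(contrast_autofocus(frames.get, positions=range(4)))   # -> 2
```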
- The imaging control unit 1003 can be configured by, for example, a DSP (Digital Signal Processor) equipped with firmware.
- The lens driving unit 1004 drives the lens 1001 under the control of the imaging control unit 1003; it can change the position of the lens 1001 using a built-in motor.
- The image processing unit 1005 processes the image signal generated by the image sensor 1002. This processing includes, for example, demosaicing, which generates for each pixel the image signals of the missing colors from the image signals corresponding to red, green, and blue; noise reduction, which removes noise from the image signal; and encoding of the image signal.
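- Since each pixel records only the one colour its filter transmits, the missing colours are interpolated from neighbours. Bilinear interpolation on a Bayer mosaic is one common choice (the publication does not specify the algorithm; the helper below is hypothetical):

```python
def green_at(mosaic, y, x):
    """Bilinear estimate of green at a red or blue site of a Bayer
    mosaic, where the four direct neighbours are green samples."""
    return (mosaic[y - 1][x] + mosaic[y + 1][x]
            + mosaic[y][x - 1] + mosaic[y][x + 1]) / 4.0

# 3x3 patch around a red site: up/down/left/right are green samples,
# the corners are blue sites.
patch = [[200,  90, 200],
         [ 80, 150,  84],
         [200,  95, 200]]
print(green_at(patch, 1, 1))   # -> (90 + 95 + 80 + 84) / 4 = 87.25
```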
- The image processing unit 1005 can be configured by, for example, a microcomputer equipped with firmware.
- The operation input unit 1006 receives operation input from the user of the camera 1000; for example, a push button or a touch panel can be used. An operation input received by the operation input unit 1006 is transmitted to the imaging control unit 1003 and the image processing unit 1005, after which processing corresponding to the input, for example imaging of a subject, is started.
- The frame memory 1007 is a memory that stores frames, that is, the image signals for one screen. It is controlled by the image processing unit 1005 and holds frames in the course of image processing.
- The display unit 1008 displays the image processed by the image processing unit 1005; a liquid crystal panel, for example, can be used.
- The recording unit 1009 records the image processed by the image processing unit 1005; a memory card or a hard disk, for example, can be used.
- A camera to which the present technology can be applied has been described above. Among the configurations described, the present technology can be applied to the image sensor 1002: the image sensor 1 described in FIG. 1 can be applied to the image sensor 1002. By applying it, the manufacturing method of an image sensor 1002 in which pixels comprising an on-chip lens with an elliptical cross section are arranged can be simplified.
- The technology according to the present disclosure can be applied to various products; for example, it may be applied to an endoscopic surgery system.
- FIG. 17 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
- FIG. 17 shows an operator (doctor) 11131 performing surgery on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000.
- As shown, the endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as an insufflation tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
- The endoscope 11100 includes a lens barrel 11101, of which a region of predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
- In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope with a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible scope with a flexible lens barrel.
- An opening into which an objective lens is fitted is provided at the tip of the lens barrel 11101. A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated through the objective lens toward the observation target in the body cavity of the patient 11132.
- The endoscope 11100 may be a forward-viewing, oblique-viewing, or side-viewing endoscope.
- An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed onto the image sensor by the optical system. The observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated. The image signal is transmitted to a camera control unit (CCU) 11201 as RAW data.
- The CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. The CCU 11201 also receives the image signal from the camera head 11102 and applies to it various kinds of image processing for displaying an image based on that signal, such as development processing (demosaic processing).
- Under the control of the CCU 11201, the display device 11202 displays an image based on the image signal on which the CCU 11201 has performed the image processing.
- The light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies the endoscope 11100 with irradiation light for photographing the surgical site or the like.
- The input device 11204 is an input interface for the endoscopic surgery system 11000. The user can input various kinds of information and instructions to the endoscopic surgery system 11000 via the input device 11204; for example, an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
- The treatment instrument control device 11205 controls the driving of the energy treatment instrument 11112 for tissue ablation, incision, blood vessel sealing, and the like.
- The pneumoperitoneum device 11206 sends gas into the body cavity via the insufflation tube 11111.
- The recorder 11207 is an apparatus capable of recording various types of information related to the surgery, and the printer 11208 is a device that can print such information in various formats such as text, images, or graphs.
- The light source device 11203, which supplies the irradiation light for imaging the surgical site to the endoscope 11100, can be composed of a white light source configured by, for example, an LED, a laser light source, or a combination thereof.
- When the white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
- In this case, it is also possible to capture images corresponding to each of R, G, and B in a time-division manner by irradiating the observation target with laser light from each of the RGB laser light sources in time division and controlling the driving of the image sensor of the camera head 11102 in synchronization with the irradiation timing. With this method, a color image can be obtained without providing a color filter in the image sensor.
- The driving of the light source device 11203 may also be controlled so as to change the intensity of the output light at predetermined intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the intensity changes, acquiring images in a time-division manner, and synthesizing them, an image with a high dynamic range, free from blocked-up shadows and blown-out highlights, can be generated.
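- A minimal sketch of such a synthesis, assuming a known intensity ratio between the two illuminations and a simple saturation fallback (the weighting scheme is an illustrative assumption, not the publication's):

```python
def merge_hdr(low_frame, high_frame, gain, saturation=255):
    """Merge frames captured under low and high illumination, pixel-wise.

    `gain` is the intensity ratio between the two illuminations. Bright-
    frame pixels are preferred (better signal-to-noise in dark regions);
    saturated ones fall back to the low-illumination frame, avoiding
    blown-out highlights, while the scaling keeps one common unit.
    """
    merged = []
    for lo, hi in zip(low_frame, high_frame):
        if hi < saturation:
            merged.append(hi / gain)   # usable bright-frame sample
        else:
            merged.append(float(lo))   # blown out: use the dim frame
    return merged

low  = [10, 40, 120]                   # captured under 1x illumination
high = [40, 160, 255]                  # under 4x; last pixel saturated
print(merge_hdr(low, high, gain=4))    # -> [10.0, 40.0, 120.0]
```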
- The light source device 11203 may also be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
- In special light observation, for example, so-called narrow band imaging is performed: by exploiting the wavelength dependence of light absorption in body tissue, light in a narrower band than the irradiation light used during normal observation (that is, white light) is irradiated, and predetermined tissue such as blood vessels in the mucosal surface layer is thereby imaged with high contrast.
- Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiating excitation light. In fluorescence observation, the body tissue can be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
- The light source device 11203 can be configured to be able to supply the narrowband light and/or excitation light corresponding to such special light observation.
- FIG. 18 is a block diagram showing an example of the functional configuration of the camera head 11102 and the CCU 11201 shown in FIG. 17.
- The camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are connected by a transmission cable 11400 so that they can communicate with each other.
- The lens unit 11401 is an optical system provided at the connection with the lens barrel 11101. Observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401, which is configured by combining a plurality of lenses including a zoom lens and a focus lens.
- The imaging unit 11402 includes an imaging element. The imaging unit 11402 may include one imaging element (a so-called single-plate type) or a plurality of imaging elements (a so-called multi-plate type).
- In the case of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective imaging elements and combined to obtain a color image.
- Alternatively, the imaging unit 11402 may include a pair of imaging elements for acquiring right-eye and left-eye image signals for 3D (dimensional) display. 3D display enables the operator 11131 to grasp the depth of living tissue in the surgical site more accurately.
- The imaging unit 11402 does not necessarily have to be provided in the camera head 11102; for example, it may be provided inside the lens barrel 11101, immediately behind the objective lens.
- The drive unit 11403 is configured by an actuator and, under the control of the camera head control unit 11405, moves the zoom lens and the focus lens of the lens unit 11401 by predetermined distances along the optical axis. The magnification and focus of the image captured by the imaging unit 11402 can thereby be adjusted as appropriate.
- The communication unit 11404 is configured by a communication device for transmitting and receiving various kinds of information to and from the CCU 11201; it transmits the image signal obtained from the imaging unit 11402 to the CCU 11201 as RAW data via the transmission cable 11400.
- The imaging conditions, such as frame rate, exposure value, magnification, and focus, may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 11100.
- The camera head control unit 11405 controls the driving of the camera head 11102 based on control signals from the CCU 11201 received via the communication unit 11404.
- The communication unit 11411 is configured by a communication device for transmitting and receiving various kinds of information to and from the camera head 11102; it receives the image signal transmitted from the camera head 11102 via the transmission cable 11400.
- The control unit 11413 performs various kinds of control related to imaging of the surgical site by the endoscope 11100 and to the display of the captured image obtained by that imaging. For example, the control unit 11413 generates control signals for controlling the driving of the camera head 11102.
- The control unit 11413 also causes the display device 11202 to display the captured image showing the surgical site or the like, based on the image signal processed by the image processing unit 11412. In doing so, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
- For example, by detecting the edge shapes and colors of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment instrument 11112, and the like. When causing the display device 11202 to display the captured image, the control unit 11413 may use the recognition results to superimpose various kinds of surgery support information on the image of the surgical site. Superimposing the surgery support information and presenting it to the operator 11131 reduces the operator's burden and allows the operator to proceed with the surgery reliably.
- The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electric signal cable for electric signal communication, an optical fiber for optical communication, or a composite cable of both.
- In the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may also be performed wirelessly.
- The technology according to the present disclosure can be applied to the imaging unit 11402 among the configurations described above; specifically, the imaging device 1 of FIG. 1 can be applied to the imaging unit 11402. By applying the technology, the cost of the endoscopic surgery system can be reduced.
- The technology according to the present disclosure can further be applied to various other products. For example, it may be realized as a device mounted on any type of mobile body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, or robot.
- FIG. 19 is a block diagram illustrating a schematic configuration example of a vehicle control system, which is an example of a moving body control system to which the technology according to the present disclosure can be applied.
- the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
- for example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
- radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 receives these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
- the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
- the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
- the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
- based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for a person, a car, an obstacle, a sign, a character on a road surface, or the like.
- the vehicle interior information detection unit 12040 detects vehicle interior information.
- a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
- the driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
- the microcomputer 12051 calculates a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
- for example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, follow-up traveling based on the inter-vehicle distance, vehicle-speed-maintaining traveling, vehicle collision warning, vehicle lane departure warning, and the like.
- in addition, the microcomputer 12051 can perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
- the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle outside information detection unit 12030.
- for example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
- the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
- the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
- FIG. 20 is a diagram illustrating an example of an installation position of the imaging unit 12031.
- the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
- the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
- the imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the sides of the vehicle 12100.
- the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
- the forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 20 shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above is obtained.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
- based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object that is located on the traveling path of the vehicle 12100 and travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more). Further, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance behind the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
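- The follow-up behavior described above can be pictured with a minimal distance-based speed command, as in the sketch below; the gains, the target gap, and the clamping range are hypothetical values chosen for illustration, not parameters of the microcomputer 12051.

# Minimal sketch of inter-vehicle gap keeping, assuming per-frame distance
# measurements from the imaging units. Gains and limits are illustrative only.
def follow_control(distance_m, prev_distance_m, dt_s, target_gap_m=30.0):
    relative_speed = (distance_m - prev_distance_m) / dt_s   # > 0: gap opening
    gap_error = distance_m - target_gap_m
    # proportional-derivative command: positive -> accelerate (follow-up start),
    # negative -> brake (follow-up stop when the gap closes)
    command = 0.05 * gap_error + 0.5 * relative_speed
    return max(-1.0, min(1.0, command))                      # clamp to actuator range

# Example: gap shrinking from 25 m to 24 m within 0.1 s -> full brake command
print(follow_control(24.0, 25.0, 0.1))                       # -> -1.0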
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the sound image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The sound image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating the pedestrian is displayed at a desired position.
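- As an off-the-shelf analogue of this feature-extraction-plus-pattern-matching procedure, the sketch below uses OpenCV's pretrained HOG people detector and superimposes a rectangular contour for emphasis; HOG is a stand-in chosen for illustration, since this disclosure does not specify the concrete matching algorithm.

# Illustrative stand-in for the described pedestrian recognition: HOG feature
# extraction plus a pretrained people classifier, with a rectangular contour
# drawn for emphasis. Not the concrete algorithm of this disclosure.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def highlight_pedestrians(frame_bgr):
    rects, _weights = hog.detectMultiScale(frame_bgr, winStride=(8, 8))
    for (x, y, w, h) in rects:
        cv2.rectangle(frame_bgr, (x, y), (x + w, y + h), (0, 0, 255), 2)
    return frame_bgr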
- the technology according to the present disclosure can be applied to the imaging unit 12031 and the like among the configurations described above.
- the imaging device 1 in FIG. 1 can be applied to the imaging unit 12031.
- the present technology can also have the following configurations.
(1) An imaging element including: a pixel that generates an image signal according to incident light; a base layer that is disposed on the pixel, transmits the incident light, and has a recess; and a lens that has, for each of the pixels, a protrusion that is disposed adjacent to the recess of the base layer and formed based on the recess of the base layer, and that condenses irradiated light and causes it to enter the pixel through the base layer.
- the imaging element according to (1), in which the lens includes a light-transmitting member that is disposed adjacent to the base layer and has, on its surface, a second recess formed based on the recess of the base layer, and the protrusion is formed by etching the light-transmitting member and a resist further disposed in the second recess at different etching rates.
- the imaging element further including a color filter that transmits light of a predetermined wavelength out of the condensed incident light.
- the imaging element according to any one of (1) to (4), in which the lens condenses the light transmitted through the color filter.
- a method of manufacturing an imaging element, including: a base layer recess forming step of forming, on a pixel that generates an image signal according to incident light, a base layer that transmits the incident light and has a recess; a lens disposing step of disposing, adjacent to the recess of the base layer for each pixel, a lens that condenses irradiated light and causes it to enter the pixel through the base layer; and a lens protrusion forming step of forming, on a surface of the disposed lens, a protrusion formed based on the recess of the base layer.
Abstract
In order to simplify a manufacturing process for a lens having convex parts on both surfaces in a pixel, this imaging element is provided with pixels, a base layer, and lenses. The pixels generate image signals in response to incident light. The base layer is arranged on the pixels, passes the incident light therethrough, and is provided with recessed parts. The lenses are provided with protruding parts that are arranged adjacent to the recessed parts of the base layer for each pixel and are formed on the basis of the respective recessed parts of the base layer, and the lenses condense irradiated light and cause the light to enter the respective pixels through the base layer.
Description
The present technology relates to an imaging element and a method of manufacturing the imaging element, and more particularly to an imaging element in which a lens that condenses incident light is disposed, and to a method of manufacturing that imaging element.
Conventionally, an imaging element that captures an image of a subject includes pixels each having an on-chip lens that condenses incident light from the subject, a color filter that transmits light of a predetermined wavelength out of the condensed incident light, and a photoelectric conversion unit that converts the incident light transmitted through the color filter into an electric signal. These pixels are arranged in a two-dimensional lattice to constitute the imaging element. A light shield is disposed around each pixel to block light incident obliquely from adjacent pixels. The color filter transmits light of a different wavelength for each pixel; for example, color filters corresponding to red light, green light, and blue light are arranged in the respective pixels. When incident light that has obliquely passed through the color filter of one pixel reaches the photoelectric conversion unit of an adjacent pixel without passing through that pixel's own color filter, an electric signal is generated based on incident light of a wavelength different from that of the pixel's own color filter, and an error occurs in the image signal. This phenomenon is called color mixing. Disposing the above-described light shield around each pixel can prevent the occurrence of color mixing.
In recent years, pixels have been miniaturized as the resolution of imaging elements has improved. To improve the light collection efficiency of a miniaturized pixel, imaging elements are used that include pixels in which a second lens is disposed to further condense the incident light that has been condensed by the on-chip lens and transmitted through the color filter. This second lens is called an in-layer lens and has a convex shape. As a method of forming this in-layer lens, for example, a method of forming it by self-alignment using the above-described light shield as a mask has been proposed (see, for example, Patent Document 1). In this method, a light shield is formed on a base layer, and the base layer is selectively etched using the formed light shield as a mask to form a recess. Next, an insulator is deposited in the recess by high-density plasma deposition. The deposited insulating film is thick in the vicinity of the light shield and becomes thinner with increasing distance from the light shield, resulting in a downward-convex shape. By forming a second insulator film with a planarized surface on this insulating film, an in-layer lens made of the second insulator and having a downward-convex shape can be formed.
The above-described conventional technique has a problem of low light collection efficiency, because the lens has a downward-convex hemispherical shape with only one curved surface that refracts incident light. Forming convex portions on both surfaces of the lens makes it possible to improve the light collection efficiency. However, a lens having convex portions on both surfaces is more difficult to form than a lens having a convex portion on only one surface.
The present technology has been made to solve the above-described problems. A first aspect of the present technology is an imaging element including: a pixel that generates an image signal according to incident light; a base layer that is disposed on the pixel, transmits the incident light, and has a recess; and a lens that has, for each pixel, a protrusion disposed adjacent to the recess of the base layer and formed based on the recess of the base layer, and that condenses irradiated light and causes it to enter the pixel through the base layer. This brings about the effect that the protrusion of the lens is formed based on the recess of the base layer; formation of the lens protrusion by self-alignment with the position of the recess of the base layer is assumed.
In this first aspect, the lens may include a light-transmitting member that is disposed adjacent to the base layer and has, on its surface, a second recess formed based on the recess of the base layer, and the protrusion based on the recess of the base layer may be formed by etching the light-transmitting member and a resist further disposed in the second recess at different etching rates. This brings about the effect that the protrusion of the lens is formed according to the difference between the etching rates of the resist and the light-transmitting member during etching.
In this first aspect, the base layer may have the recess formed by isotropic etching. This brings about the effect that the recess of the base layer is formed by isotropic etching.
In this first aspect, a light-shielding wall disposed around the pixel may further be provided, and the base layer may have the recess based on the shape of the region surrounded by the light-shielding wall. This brings about the effect that the recess of the base layer is formed based on the shape of the light-shielding wall.
In this first aspect, a color filter that transmits light of a predetermined wavelength out of the condensed incident light may further be provided. This results in a configuration in which the lens is disposed between the subject and the color filter.
In this first aspect, a color filter that transmits light of a predetermined wavelength out of the incident light may further be provided, and the lens may condense the light transmitted through the color filter. This results in a configuration in which the color filter is disposed between the subject and the lens.
A second aspect of the present technology is a method of manufacturing an imaging element, including: a base layer recess forming step of forming, on a pixel that generates an image signal according to incident light, a base layer that transmits the incident light and has a recess; a lens disposing step of disposing, adjacent to the recess of the base layer for each pixel, a lens that condenses irradiated light and causes it to enter the pixel through the base layer; and a lens protrusion forming step of forming, on a surface of the disposed lens, a protrusion formed based on the recess of the base layer. This brings about the effect that the protrusion of the lens is formed based on the recess of the base layer; formation of the lens protrusion by self-alignment with the position of the recess of the base layer is assumed.
According to the present technology, an excellent effect of simplifying the manufacturing process of a lens having convex portions on both surfaces in a pixel is achieved.
Next, modes for carrying out the present technology (hereinafter referred to as embodiments) will be described with reference to the drawings. In the following drawings, the same or similar parts are denoted by the same or similar reference numerals. However, the drawings are schematic, and the dimensional ratios of the respective parts do not necessarily match the actual ones; it also goes without saying that the drawings include portions whose dimensional relationships and ratios differ between drawings. The embodiments will be described in the following order.
1. First embodiment
2. Second embodiment
3. Third embodiment
4. Fourth embodiment
5. Modifications
6. Application example to a camera
7. Application example to an endoscopic surgery system
8. Application examples to mobile bodies
<1. First Embodiment>
[Configuration of the imaging element]
FIG. 1 is a diagram illustrating a configuration example of an imaging element according to an embodiment of the present technology. The imaging element 1 in the figure includes a pixel array unit 10, a vertical drive unit 20, a column signal processing unit 30, and a control unit 40.
The pixel array unit 10 is configured by arranging the pixels 100 in a two-dimensional lattice. Here, each pixel 100 generates an image signal according to the irradiated light. The pixel 100 includes a photoelectric conversion unit that generates charge according to the irradiated light, and further includes a pixel circuit. This pixel circuit generates an image signal based on the charge generated by the photoelectric conversion unit; the generation of the image signal is controlled by a control signal generated by the vertical drive unit 20 described later. In the pixel array unit 10, signal lines 11 and 12 are arranged in an XY matrix. The signal lines 11 transmit the control signals for the pixel circuits in the pixels 100; one is arranged for each row of the pixel array unit 10 and is wired in common to the pixels 100 arranged in that row. The signal lines 12 transmit the image signals generated by the pixel circuits of the pixels 100; one is arranged for each column of the pixel array unit 10 and is wired in common to the pixels 100 arranged in that column. These photoelectric conversion units and pixel circuits are formed on a semiconductor substrate.
The vertical drive unit 20 generates the control signals for the pixel circuits of the pixels 100 and transmits the generated control signals to the pixels 100 via the signal lines 11 in the figure. The column signal processing unit 30 processes the image signals generated by the pixels 100, which are transmitted from the pixels 100 via the signal lines 12 in the figure. The processing in the column signal processing unit 30 corresponds, for example, to analog-to-digital conversion, which converts the analog image signals generated in the pixels 100 into digital image signals. The image signals processed by the column signal processing unit 30 are output as the image signals of the imaging element 1. The control unit 40 controls the entire imaging element 1 by generating and outputting control signals that control the vertical drive unit 20 and the column signal processing unit 30. The control signals generated by the control unit 40 are transmitted to the vertical drive unit 20 and the column signal processing unit 30 via signal lines 41 and 42, respectively.
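The row-by-row drive and column-parallel analog-to-digital conversion described here can be summarized in a toy model, shown below as a minimal sketch; the array size, bit depth, and function names are assumptions for illustration only and mirror only the signal flow (the vertical drive unit selects one row at a time, and the column signal processing unit digitizes all columns of that row).

# Toy model of the readout flow: row select via signal lines 11, then
# column-parallel digitization via signal lines 12. Sizes are illustrative.
import numpy as np

def read_out(analog_pixels, bits=10):
    rows, cols = analog_pixels.shape
    full_scale = 2 ** bits - 1
    frame = np.zeros((rows, cols), dtype=np.int32)
    for r in range(rows):                       # vertical drive: select row r
        row_signals = analog_pixels[r, :]       # all columns read in parallel
        frame[r, :] = np.clip(np.round(row_signals * full_scale), 0, full_scale)
    return frame

frame = read_out(np.random.rand(4, 6))          # 4x6 toy array, values in [0, 1)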
[Configuration of the pixel]
FIG. 2 is a diagram illustrating a configuration example of a pixel according to an embodiment of the present technology; it is a circuit diagram illustrating a configuration example of the pixel 100. The pixel 100 in the figure includes a photoelectric conversion unit 101, a charge holding unit 102, and MOS transistors 103 to 106.
The anode of the photoelectric conversion unit 101 is grounded, and its cathode is connected to the source of the MOS transistor 103. The drain of the MOS transistor 103 is connected to the source of the MOS transistor 104, the gate of the MOS transistor 105, and one end of the charge holding unit 102; the other end of the charge holding unit 102 is grounded. The drains of the MOS transistors 104 and 105 are connected in common to the power supply line Vdd, and the source of the MOS transistor 105 is connected to the drain of the MOS transistor 106. The source of the MOS transistor 106 is connected to the signal line 12. The gates of the MOS transistors 103, 104, and 106 are connected to the transfer signal line TR, the reset signal line RST, and the selection signal line SEL, respectively. Note that the transfer signal line TR, the reset signal line RST, and the selection signal line SEL constitute the signal line 11.
The photoelectric conversion unit 101 generates charge according to the irradiated light, as described above; a photodiode can be used as the photoelectric conversion unit 101. The charge holding unit 102 and the MOS transistors 103 to 106 constitute the pixel circuit.
The MOS transistor 103 transfers the charge generated by the photoelectric conversion of the photoelectric conversion unit 101 to the charge holding unit 102; this charge transfer is controlled by a signal transmitted through the transfer signal line TR. The charge holding unit 102 is a capacitor that holds the charge transferred by the MOS transistor 103. The MOS transistor 105 generates a signal based on the charge held in the charge holding unit 102. The MOS transistor 106 outputs the signal generated by the MOS transistor 105 to the signal line 12 as an image signal; it is controlled by a signal transmitted through the selection signal line SEL.
The MOS transistor 104 resets the charge holding unit 102 by discharging the charge held in the charge holding unit 102 to the power supply line Vdd. This reset is controlled by a signal transmitted through the reset signal line RST and is executed before the charge transfer by the MOS transistor 103. At the time of this reset, the photoelectric conversion unit 101 can also be reset by making the MOS transistor 103 conductive. In this way, the pixel circuit converts the charge generated by the photoelectric conversion unit 101 into an image signal.
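The order of operations described for this pixel circuit (reset via RST before the transfer via TR, then output while SEL is active) can be summarized in the following toy model; all names and numeric values are placeholders for illustration, not part of this disclosure.

# Toy model of one pixel readout, mirroring the described sequence:
# 1) RST resets the charge holding unit (and, with TR on, the photodiode),
# 2) TR transfers the photogenerated charge, 3) the signal is read while SEL is active.
class Pixel:
    def __init__(self):
        self.pd_charge = 0.0        # photoelectric conversion unit 101
        self.fd_charge = 0.0        # charge holding unit 102

    def expose(self, photons, qe=0.6):
        self.pd_charge += photons * qe

    def reset(self, also_pd=False):        # MOS transistor 104 (RST)
        self.fd_charge = 0.0
        if also_pd:                        # conduct MOS transistor 103 during reset
            self.pd_charge = 0.0

    def transfer(self):                    # MOS transistor 103 (TR)
        self.fd_charge, self.pd_charge = self.pd_charge, 0.0

    def read(self, gain=1.0):              # MOS transistors 105/106 (amplify + SEL)
        return gain * self.fd_charge

p = Pixel()
p.reset(also_pd=True)   # reset before exposure
p.expose(1000)
p.reset()               # reset the holding node before transfer
p.transfer()
print(p.read())         # -> 600.0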
[Cross-sectional configuration of the pixel]
FIG. 3 is a cross-sectional view illustrating a configuration example of a pixel according to the first embodiment of the present technology; it is a schematic cross-sectional view of the pixel 100 arranged in the pixel array unit 10. The pixel 100 includes an on-chip lens 151, a base layer 144, a light-shielding wall 143, a color filter 142, an insulating film 141, a semiconductor substrate 111, a wiring region composed of a wiring layer 122 and an insulating layer 121, and a support substrate 131.
The semiconductor substrate 111 is a semiconductor substrate on which the photoelectric conversion units of the pixels 100 described in FIG. 1 and the semiconductor portions of the pixel circuits are formed. The semiconductor portions of the vertical drive unit 20, the column signal processing unit 30, and the control unit 40 are also formed on the semiconductor substrate 111. For example, a p-type well region is formed in the semiconductor substrate 111, and the photoelectric conversion units and the like of the pixels 100 are formed in this well region. For convenience, the semiconductor substrate 111 in the figure is assumed to be configured as a p-type well region. The figure also shows the photoelectric conversion unit 101 of the pixel circuit. The photoelectric conversion unit 101 in the figure is composed of an n-type semiconductor region 112 and the p-type well region around the n-type semiconductor region 112; a photodiode is formed by the pn junction at the interface between the n-type semiconductor region 112 and the p-type well region, and photoelectric conversion is performed there.
In the semiconductor substrate 111, a p-type semiconductor region 113 is disposed adjacent to the n-type semiconductor region 112. The p-type semiconductor region 113 is a region with a relatively high impurity concentration that pins the interface states on the surface of the semiconductor substrate 111, thereby reducing the influence of those interface states. Note that a p-type semiconductor region for pinning can also be disposed on the other surface of the semiconductor substrate 111.
The wiring layer 122 is wiring that transmits the image signals generated in the pixels 100 and the control signals that control the pixel circuits. The signal lines 11 and 12 described in FIG. 1 are constituted by the wiring layer 122. The wiring layer 122 can be made of a metal such as copper (Cu). The insulating layer 121 insulates the wiring layer 122 and can be made of an oxide such as silicon oxide (SiO2). Note that the imaging element 1 including the pixel 100 in the figure is a back-illuminated imaging element, in which the wiring region is formed on the surface (back surface) of the semiconductor substrate 111 opposite to the surface on which light is incident. The support substrate 131 is a substrate that supports the imaging element 1 and is used to increase the strength of the imaging element 1 during the manufacturing process.
The on-chip lens 151 is a lens that condenses light from the subject. The on-chip lens 151 in the figure has an elliptical cross section, that is, a shape with convex portions on the top and bottom, which improves the collection efficiency for incident light. The on-chip lens 151 can be made of a light-transmitting member, for example, silicon nitride (SiN). The base layer 144 is a film disposed under the on-chip lens 151 that serves as the base when the on-chip lens 151 is formed. The base layer 144 can be made of a light-transmitting member with a lower refractive index than the on-chip lens 151, for example, silicon oxide (SiO2).
As shown in the figure, the on-chip lens 151 is shaped so that half of it is embedded in the base layer 144. This is because the convex portion on the back surface of the on-chip lens 151 is formed adjacent to the recess formed in the base layer 144, while a convex portion formed based on the shape of that recess is formed on the front surface of the on-chip lens 151. An on-chip lens 151 of this shape can be formed, for example, as follows. First, a light-transmitting member that will become the material of the on-chip lens 151 is disposed adjacent to the base layer 144. At this time, a second recess, to which the shape of the recess of the base layer 144 is transferred, is formed on the surface of the disposed light-transmitting member.
A resist is disposed in this second recess, and the light-transmitting member and the resist are etched; dry etching can be used for this etching. The light-transmitting member and the resist are etched at different etching rates, specifically under a condition in which the light-transmitting member is etched faster than the resist. Then, the thicker the resist is in a region, the smaller the amount by which the light-transmitting member is etched there, so a convex portion is formed on the light-transmitting member. That is, an on-chip lens 151 having a convex portion based on the recess of the base layer 144 can be formed. In this way, the on-chip lens 151 is formed in self-alignment with the recess of the base layer 144. Details of the method of manufacturing the on-chip lens 151 will be described later. Note that the on-chip lens 151 is an example of the lens described in the claims.
The color filter 142 is an optical filter that transmits light of a predetermined wavelength out of the light condensed by the on-chip lens 151. As the color filter 142, for example, a filter that transmits any of visible light (red, green, or blue light) or infrared light can be used. The light-shielding wall 143 is a wall-shaped film that blocks light incident obliquely from adjacent pixels 100. The light-shielding wall 143 can prevent light that has passed through the color filter 142 of an adjacent pixel 100 from entering, and can thus prevent color mixing. The light-shielding wall 143 can be made of, for example, aluminum (Al) or tungsten (W). The insulating film 141 insulates the semiconductor substrate 111 and can be made of, for example, SiO2.
[Light collection by the on-chip lens]
FIG. 4 is a diagram explaining light collection by the on-chip lens according to the first embodiment of the present technology. Part a of the figure is a simplified version of the pixel 100 described in FIG. 3, showing incident light 301 entering the n-type semiconductor region 112 of the pixel 100. As shown in part a, the on-chip lens 151 has an elliptical cross section, and the incident light 301 of the pixel 100 is refracted by the two convex portions of the on-chip lens 151. Therefore, even when the incident angle of the light incident on the pixel 100 is relatively large, the light can be made to irradiate the n-type semiconductor region 112, so the configuration can be applied to imaging using a photographic lens with a large image height.
Part b of the figure shows a comparative example using a hemispherical on-chip lens 311. With the on-chip lens 311 of part b, incident light 301 at the same incident angle as in part a cannot be condensed onto the n-type semiconductor region 112, and the light collection efficiency is low.
Part c of the figure shows an example of an on-chip lens 312 whose upper convex portion is formed by a method different from that of the on-chip lens 151, specifically by a dry etching method. Here, the dry etching method refers to forming a film of the light-transmitting member that will become the material of the on-chip lens, placing on it a resist shaped to have a convex portion, and dry etching the resist and the film of the light-transmitting member so that the convex portion of the resist is transferred to the surface of the light-transmitting member. With this dry etching method, misalignment of the resist and etching residue may occur. Region 313 in part c represents a region where etching residue has occurred and the lens is connected to the adjacent on-chip lens 312. When incident light 302 enters this region 313, the incident light is transmitted to the adjacent pixel 100, causing color mixing.
As shown in part a of the figure, using the on-chip lens 151 with an elliptical cross section improves the light collection efficiency. This reduces the generation of stray light that enters the pixel 100 but does not contribute to photoelectric conversion in that pixel 100, so a light-shielding wall or the like for preventing stray light from entering the pixel 100 can be omitted. In addition, since the focal length can be shortened, the imaging element 1 can be made thinner.
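The remark about the shorter focal length can be made concrete with the thin-lens lensmaker's equation; the following is a textbook approximation added here for illustration, not a formula taken from this disclosure.

1/f = (n - 1) (1/R1 - 1/R2)

For a plano-convex lens (R2 -> infinity), 1/f = (n - 1)/R1, while for a symmetric biconvex lens (R2 = -R1), 1/f = 2 (n - 1)/R1. Under this approximation, giving the lens a second curved surface of the same magnitude roughly halves the focal length, which is consistent with the statement that the two convex portions of the on-chip lens 151 permit a shorter focal length and a thinner imaging element 1.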
[Method of manufacturing the imaging element]
FIGS. 5 and 6 are diagrams illustrating an example of a method of manufacturing the imaging element according to the first embodiment of the present technology. First, semiconductor regions such as the n-type semiconductor region 112 are formed in the semiconductor substrate 111, the wiring region is stacked, and the support substrate 131 is bonded. Next, the back surface of the semiconductor substrate 111 is ground and thinned. The insulating film 141, the light-shielding wall 143, and the color filter 142 are formed on this back surface of the semiconductor substrate 111. An insulator film 401, which will become the material of the base layer 144, is deposited on the surface of the color filter 142. For the insulator film 401, for example, SiO2 can be used, and it can be deposited by, for example, CVD (Chemical Vapor Deposition) (a in FIG. 5). Next, a resist 402 is formed. An opening 403 is formed in the resist 402 at each position where a recess of the base layer 144 is to be formed; the positions of the openings 403 correspond to the positions where the on-chip lenses 151 are disposed (b in FIG. 5).
Next, the insulator film 401 is etched. This etching can be performed by isotropic dry etching; by using NF3 and SF6 as etching gases, isotropic dry etching can be performed. This isotropic dry etching forms recesses 404 in the insulator film 401. Thereafter, the resist 402 is removed; the broken-line rectangles in part c of FIG. 5 represent the resist 402 before removal. Through this step, the base layer 144 can be formed (c in FIG. 5). Note that the method of etching the insulator film 401 is not limited to this example; for example, isotropic wet etching can also be used. This step is an example of the base layer recess forming step described in the claims.
Next, a light-transmitting member 405, which will become the material of the on-chip lenses 151, is deposited on the surface of the base layer 144. Recesses 406, to which the shapes of the recesses 404 of the base layer 144 are transferred, are formed on the surface of the deposited light-transmitting member 405. These recesses 406 correspond to the second recess described in FIG. 3. For the light-transmitting member 405, for example, SiN can be used, and it can be deposited by, for example, CVD (d in FIG. 6). This step is an example of the lens disposing step described in the claims.
Next, a resist 407 is disposed on the surface of the light-transmitting member 405. The resist 407 is given a flat surface; for example, a resist 407 with a flat surface can be obtained by leveling the surface (e in FIG. 6). As shown in e of FIG. 6, the resist 407 has a thickness corresponding to the shapes of the recesses 406 of the light-transmitting member 405.
Next, the resist 407 and the light-transmitting member 405 are etched. This etching can be performed by anisotropic dry etching; by using NF3 and SF6 with CH2F2 or the like added as etching gases, anisotropic dry etching can be performed. In this anisotropic dry etching, the etching is performed under a condition in which the etching rate of the light-transmitting member 405 is higher than that of the resist 407, and the etching is continued until the resist 407 disappears.
As a result, in regions where the resist 407 is thick, such as the central portion of each recess 406 of the light-transmitting member 405, it takes time for the resist 407 to disappear, and the amount by which the light-transmitting member 405 is etched is relatively small. On the other hand, in regions where the resist 407 is thin, such as near the edges of each recess 406, the resist 407 is removed early in the etching process; thereafter, because the etching rate of the light-transmitting member 405 is high, the amount by which the light-transmitting member 405 is etched is large. Consequently, convex portions shaped like the downward convex portions of the resist 407, inverted and transferred, are formed on the surface of the light-transmitting member 405, and on-chip lenses 151 with elliptical cross sections are formed (f in FIG. 6). Since the on-chip lenses 151 are formed by self-alignment based on the positions of the recesses 404 of the base layer 144, the on-chip lenses 151 can be formed without problems such as misalignment during manufacturing. Note that the method of etching the resist 407 and the light-transmitting member 405 is not limited to this example; for example, wet etching can also be used. This step is an example of the lens protrusion forming step described in the claims.
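The way the different etching rates turn the leveled resist profile into a lens protrusion can be checked with a one-dimensional numerical sketch, shown below; the profile shape, etching rates, and step count are illustrative assumptions, not process values from this disclosure.

# 1-D sketch of the described self-aligned transfer: where the leveled resist
# is thick (over the recess center) the member is protected longer; where it
# is thin (recess edges) the faster-etching member is cut deeper, leaving a
# convex lens surface. Rates and geometry are illustrative assumptions.
import numpy as np

x = np.linspace(-1.0, 1.0, 201)                 # position across one pixel
member = np.full_like(x, 2.0)                   # light-transmitting member with a flat top
resist = 0.5 * (1.0 - x**2)                     # leveled resist: thicker over recess center

rate_resist, rate_member, dt = 1.0, 2.0, 0.01   # member etches faster than resist
for _ in range(100):
    exposed = resist <= 0.0                     # resist gone -> member is etched
    resist = np.maximum(resist - rate_resist * dt, 0.0)
    member[exposed] -= rate_member * dt

print(member.max() - member.min())              # > 0: a convex protrusion remains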
As described above, in the imaging element 1 according to the first embodiment of the present technology, on-chip lenses 151 having convex portions based on the shapes of the recesses 404 of the base layer 144 on both surfaces (front and back) are formed. Since the on-chip lenses 151 are formed by self-alignment based on the positions of the recesses 404 of the base layer 144, the manufacturing process of the on-chip lenses 151 can be simplified.
<2. Second Embodiment>
The imaging element 1 of the first embodiment described above uses the on-chip lens 151 having a convex portion formed based on the recess of the base layer 144. In contrast, the imaging element 1 according to the second embodiment of the present technology differs from the first embodiment in that it further includes an in-layer lens having a convex portion formed based on a recess of a base layer.
[Cross-sectional configuration of the pixel]
FIG. 7 is a cross-sectional view illustrating a configuration example of a pixel according to the second embodiment of the present technology. The pixel 100 in the figure differs from the pixel 100 described in FIG. 3 in that it further includes a base layer 146, an in-layer lens 152, and a planarizing film 147, and in that it includes a light-shielding wall 145 instead of the light-shielding wall 143.
The in-layer lens 152 is a lens disposed in an inner layer of the pixel 100, between the color filter 142 and the semiconductor substrate 111. The in-layer lens 152 can be made of, for example, SiN. Since the pixel 100 in the figure includes two lenses, the on-chip lens 151 and the in-layer lens 152, its light collection efficiency can be higher than that of the pixel 100 described in FIG. 3. That is, even incident light with a larger incident angle than the incident light 301 of a in FIG. 4 can be made to irradiate the n-type semiconductor region 112.
Like the base layer 144, the base layer 146 is a film disposed under the in-layer lens 152 that serves as the base when the in-layer lens 152 is manufactured; it can be made of, for example, SiO2. The planarizing film 147 is a film disposed adjacent to the light-shielding wall 145 and the in-layer lens 152 that planarizes the surface on which the color filter 142 is formed; it can also be made of, for example, SiO2.
The light-shielding wall 145 is disposed so as to surround the region in which the color filter 142 and the in-layer lens 152 are disposed. Like the light-shielding wall 143, the light-shielding wall 145 can be made of Al or W.
In the base layer 146, a recess based on the shape of the region surrounded by the light-shielding wall 145 is formed. The back surface of the in-layer lens 152 is formed adjacent to the recess formed in the base layer 146, and a convex portion formed based on the shape of that recess is formed on its front surface. This in-layer lens 152 can be formed by the same method as the on-chip lens 151. First, a light-transmitting member is disposed adjacent to the base layer 146. At this time, a second recess, to which the shape of the recess of the base layer 146 is transferred, is formed on the surface of the disposed light-transmitting member. The in-layer lens 152 can then be formed by disposing a resist in this second recess and etching the light-transmitting member and the resist at different etching rates. Note that the in-layer lens 152 is an example of the lens described in the claims.
[Method for Manufacturing Image Sensor]
FIGS. 8 and 9 are diagrams illustrating an example of a method for manufacturing the imaging element according to the second embodiment of the present technology. First, the light shielding wall 145 is formed on the surface of the semiconductor substrate 111 on which the wiring region and the insulating film 141 have been formed. This can be done, for example, by depositing a film of W or another material for the light shielding wall 145 and then etching away the W in regions other than the boundaries of the pixels 100 (a in FIG. 8). Next, an insulator film 411 that will become the material of the base layer 146 is deposited (b in FIG. 8). Next, anisotropic dry etching is performed to etch the insulator film 411. At this time, the insulator film 411 adjacent to the wall surfaces of the light shielding wall 145 remains unetched. A base layer 146 having a sidewall shape and including a recess 413 can thereby be formed (c in FIG. 8). This step is an example of the base layer recess formation step described in the claims.
Next, a light-transmissive member 414 is deposited on the surface of the base layer 146. A recess 415, to which the shape of the recess 413 in the base layer 146 is transferred, is formed in the surface of the deposited light-transmissive member 414. This recess 415 corresponds to the second recess (d in FIG. 8). This step is an example of the lens arrangement step described in the claims.
Next, a resist 416 having a flat surface is disposed on the surface of the light-transmissive member 414 (e in FIG. 9). Next, the resist 416 and the light-transmissive member 414 are etched by anisotropic dry etching. This anisotropic dry etching is performed under conditions in which the etching rate of the light-transmissive member 414 is higher than the etching rate of the resist 416. A convex portion is thereby formed on the surface of the light-transmissive member 414, and the convex portion of the in-layer lens 152 is formed by self-alignment based on the position of the recess 413 in the base layer 146 (f in FIG. 9). This step is an example of the lens convex portion formation step described in the claims.
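As a rough illustration of this selectivity mechanism, the following one-dimensional sketch models the etch-back numerically; the surface profile, etch rates, and etch time are made-up values for illustration, not process parameters of this disclosure. With a member-to-resist etch rate ratio S greater than 1, the final surface is an inverted copy of the initial recess scaled by (S - 1), which is why a concave recess yields a convex lens.

```python
# Illustrative 1-D model of the etch-back lens-transfer step (an assumption-laden
# sketch, not the patented process parameters). A recess in the surface of the
# light-transmissive member, planarized by a flat-topped resist, is inverted
# into a convex lens when the member etches faster than the resist.
import numpy as np

x = np.linspace(-1.0, 1.0, 201)          # lateral position (arbitrary units)
h0 = -0.3 * np.exp(-(x / 0.4) ** 2)      # member surface: recess transferred from base layer
resist_top = 0.0                         # flat resist surface after coating
t_resist = resist_top - h0               # local resist thickness (thick over the recess)

r_resist = 1.0                           # resist etch rate (assumed)
r_member = 2.0                           # member etch rate; selectivity S = 2 (assumed)
T = 0.7                                  # total etch time: enough to clear all resist

breakthrough = t_resist / r_resist       # when the member is first exposed at each x
h_final = h0 - r_member * np.clip(T - breakthrough, 0.0, None)

# The final profile is an inverted, (S-1)-scaled copy of the recess: a convex bump.
print(f"center height {h_final[100]:.3f} vs edge {h_final[0]:.3f}")
assert h_final[100] > h_final[0]         # convex: highest where the recess was deepest
```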
Next, the planarizing film 147 is disposed (g in FIG. 9). Thereafter, the color filter 142 and the on-chip lens 151 are formed.
The rest of the configuration of the imaging element 1 is the same as that of the imaging element 1 described in the first embodiment of the present technology, and its description is therefore omitted.
As described above, the imaging element 1 according to the second embodiment of the present technology includes the in-layer lens 152 and can thereby further improve its light collection efficiency.
<3. Third Embodiment>
In the imaging element 1 of the first embodiment described above, the recess 404 in the base layer 144 was formed by isotropic dry etching. The imaging element 1 according to the third embodiment of the present technology differs from the first embodiment in that a recess based on the shape of the region surrounded by the light shielding wall is formed in the base layer.
[Configuration of pixel cross section]
FIG. 10 is a cross-sectional view illustrating a configuration example of a pixel according to the third embodiment of the present technology. The pixel 100 in this figure differs from the pixel 100 described with reference to FIG. 3 in that it includes an on-chip lens 153, a base layer 148, and a light shielding wall 149 instead of the on-chip lens 151, the base layer 144, and the light shielding wall 143.
The light shielding wall 149 is disposed in a region extending from the surface of the insulating film 141 to a position adjacent to the on-chip lens 153. The base layer 148 and the on-chip lens 153 are formed by the same manufacturing method as for the pixel 100 described with reference to FIG. 7. That is, except that they are formed adjacent to the surface of the color filter 142, the base layer 148 and the on-chip lens 153 can be formed by the manufacturing steps described with reference to FIGS. 8 and 9. Since the recess in the base layer 148 can be formed using the light shielding wall 149, the method for manufacturing the on-chip lens 153 can be simplified. The on-chip lens 153 is an example of the lens described in the claims.
The rest of the configuration of the imaging element 1 is the same as that of the imaging element 1 described in the first embodiment of the present technology, and its description is therefore omitted.
As described above, the imaging element 1 according to the third embodiment of the present technology forms the recess in the base layer 148 using the light shielding wall 149, which further simplifies the manufacturing process of the imaging element 1.
<4. Fourth Embodiment>
In the imaging element 1 of the first embodiment described above, pixels 100 of identical configuration were arranged. The imaging element 1 according to the fourth embodiment of the present technology differs from the first embodiment in that phase difference pixels for autofocus are additionally arranged.
[Configuration of pixel array section]
FIG. 11 is a diagram illustrating a configuration example of the pixel array unit according to the fourth embodiment of the present technology. The figure is a top view showing the arrangement of the pixels 100 in the pixel array unit 10. In the figure, the pixels 100a and 100b correspond to phase difference pixels. Here, a phase difference pixel is a pixel for detecting, as a phase difference, the shift between images formed by light that has passed through different regions of the photographing lens that condenses light from the subject onto the pixel array unit 10 of the imaging element 1; such pixels are used for autofocus. In the figure, the solid circles represent the on-chip lenses 151, and the dotted rectangles represent the effective regions of the n-type semiconductor regions 112 formed in the semiconductor substrate 111. A plurality of such pixels 100a and 100b are arranged in specific rows of the pixel array unit 10.
In the pixel 100, the entire region in which the n-type semiconductor region 112 is formed is an effective region. In the pixels 100a and 100b, by contrast, approximately half of the region in which the n-type semiconductor region 112 is formed is effective. Specifically, in the pixel 100a the right half in the figure is the effective region, and in the pixel 100b the left half is the effective region. Light that has passed through the left and right sides of the photographing lens is incident on the pixels 100a and 100b, respectively. By detecting the phase difference between an image based on the image signals generated by the plurality of pixels 100a and an image based on the image signals generated by the plurality of pixels 100b, the focal position of the photographing lens with respect to the subject can be detected. Autofocus can then be performed by adjusting the position of the photographing lens based on the detected focal position.
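As a hedged illustration of how such an image shift might be measured, the sketch below aligns two one-dimensional line signals from the pixels 100a and 100b by minimizing the sum of absolute differences; the function name, signal model, and search range are assumptions, not the algorithm of this disclosure.

```python
# Minimal sketch of image-plane phase difference detection (illustrative only;
# the SAD search below stands in for whatever correlation a real AF unit uses).
# `left` and `right` are 1-D line signals from rows of pixels 100a and 100b.
import numpy as np

def phase_difference(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Return the integer shift (in pixels) that best aligns the two signals."""
    best_shift, best_sad = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s): len(left) + min(0, s)]
        b = right[max(0, -s): len(right) + min(0, -s)]
        sad = np.abs(a.astype(float) - b.astype(float)).sum() / len(a)
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift

# A defocused edge appears shifted between the two images; the detected shift
# maps (via lens calibration) to the focus correction to apply.
x = np.arange(64)
left = (x > 30).astype(float)
right = (x > 34).astype(float)
print(phase_difference(left, right))   # -> -4 under this sign convention
```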
[Configuration of pixel cross section]
FIG. 12 is a cross-sectional view illustrating a configuration example of pixels according to the fourth embodiment of the present technology, showing the pixel 100 and the pixel 100a. The pixel 100 and the pixel 100a in the figure differ from the pixel 100 described with reference to FIG. 3 in that a base layer 161 and a planarizing film 162 are provided between the color filter 142 and the insulating film 141. In addition, the light shielding wall 143 is disposed below the base layer 161. The light shielding wall 143 in the pixel 100a is disposed at a position covering the left half of the n-type semiconductor region 112 and shields that half of the n-type semiconductor region 112 from light. The effective region in the right half of the n-type semiconductor region 112 described with reference to FIG. 11 can thereby be set.
The pixel 100a further includes an in-layer lens 154. In the pixel 100, incident light is condensed onto the n-type semiconductor region 112 by the on-chip lens 151. In the pixel 100a, by contrast, in order to detect the phase difference, the incident light must be condensed at the edge of the light shielding wall 143 that shields the n-type semiconductor region 112 (the end 321 in the figure). For this reason, the in-layer lens 154 is disposed in the pixel 100a, and the condensing position is adjusted to fall on the end 321. The in-layer lens 154 is an example of the lens described in the claims.
[Method for Manufacturing Image Sensor]
FIGS. 13 and 14 are diagrams illustrating an example of a method for manufacturing the imaging element according to the fourth embodiment of the present technology. First, the insulating film 141 is deposited on the surface of the semiconductor substrate 111, and the light shielding wall 143 is formed. At this time, the shape of the light shielding wall 143 is changed according to the region of the n-type semiconductor region 112 to be shielded from light (a in FIG. 13). Next, an insulator film 421 that will become the material of the base layer 161 is deposited (b in FIG. 13). Next, a resist 422 is formed on the surface of the insulator film 421. An opening 423 is formed in the resist 422 in the region where the in-layer lens 154 is to be formed (c in FIG. 13). Next, isotropic dry etching is performed on the insulator film 421 to form a recess 425 in it. The base layer 161 including the recess 425 can thereby be formed. The resist 422 is then removed (d in FIG. 13). This step is an example of the base layer recess formation step described in the claims.
Next, a light-transmissive member 426 is deposited on the surface of the base layer 161. A recess 427, to which the shape of the recess 425 in the base layer 161 is transferred, is formed in the surface of the deposited light-transmissive member 426 (e in FIG. 14). This step is an example of the lens arrangement step described in the claims.
Next, a resist 428 having a flat surface is formed (f in FIG. 14), and anisotropic dry etching is performed. This anisotropic dry etching can be performed in the same manner as the dry etching described for f in FIG. 6. The in-layer lens 154 can thereby be formed selectively for the pixel 100a (g in FIG. 14). This step is an example of the lens convex portion formation step described in the claims. Thereafter, the imaging element 1 can be manufactured by forming the planarizing film 162 and the like.
In this way, the in-layer lens 154 can be disposed in the pixels 100a and 100b, which are the phase difference pixels. The manufacturing steps for the other regions of the pixel 100a and the like, for example the manufacturing step of the on-chip lens 151, can be shared with the pixel 100, which reduces the influence of manufacturing process variations.
Note that the configuration of the imaging element 1 in the fourth embodiment of the present technology is not limited to this example. For example, a color filter 142 that transmits red light or infrared light may be disposed, and the in-layer lens 154 may be disposed in a pixel 100 that detects red light or infrared light. This makes it possible to change the focal position of a pixel 100 that detects red light or infrared light.
The rest of the configuration of the imaging element 1 is the same as that of the imaging element 1 described in the first embodiment of the present technology, and its description is therefore omitted.
As described above, the imaging element 1 according to the fourth embodiment of the present technology selectively disposes the in-layer lens 154 in some of the pixels 100 of the pixel array unit 10. The manufacturing process of an imaging element 1 including phase difference pixels can thereby be simplified.
<5. Modification>
The imaging element 1 described above was a back-illuminated imaging element, but the present technology may also be applied to a front-illuminated imaging element.
[Configuration of pixel cross section]
FIG. 15 is a cross-sectional view illustrating a configuration example of a pixel according to a modification of the embodiments of the present technology. The pixel 100 in this figure differs from the pixel 100 described with reference to FIG. 3 in that the on-chip lens 151, the base layer 144, the light shielding wall 143, the color filter 142, and the insulating film 141 are disposed adjacent to the wiring region composed of the wiring layer 122 and the insulating layer 121. In the imaging element 1 in which this pixel 100 is arranged, incident light is irradiated onto the n-type semiconductor region 112 from the front surface of the semiconductor substrate 111, that is, the surface on which the wiring region is formed. The on-chip lens 151 in this figure can be formed by the same method as the on-chip lens 151 described with reference to FIG. 3. Even in such a front-illuminated imaging element 1, an on-chip lens 151 with an elliptical cross section can be formed easily.
The rest of the configuration of the imaging element 1 is the same as that of the imaging element 1 described in the first embodiment of the present technology, and its description is therefore omitted.
As described above, the imaging element 1 according to this modification of the embodiments of the present technology simplifies the method for manufacturing the on-chip lens in an imaging element of the front-illuminated type.
<6. Application examples for cameras>
The present technology can be applied to various products. For example, the present technology may be realized as an imaging element mounted on an imaging device such as a camera.
FIG. 16 is a block diagram illustrating a schematic configuration example of a camera, which is an example of an imaging device to which the present technology can be applied. The camera 1000 in the figure includes a lens 1001, an imaging element 1002, an imaging control unit 1003, a lens driving unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, a display unit 1008, and a recording unit 1009.
The lens 1001 is the photographing lens of the camera 1000. It condenses light from the subject and makes the light incident on the imaging element 1002 described below to form an image of the subject.
The imaging element 1002 is a semiconductor element that captures the light from the subject condensed by the lens 1001. It generates an analog image signal corresponding to the irradiated light, converts it into a digital image signal, and outputs it.
The imaging control unit 1003 controls imaging by the imaging element 1002. It does so by generating control signals and outputting them to the imaging element 1002. The imaging control unit 1003 can also perform autofocus in the camera 1000 based on the image signals output from the imaging element 1002. Here, autofocus is a system that detects the focal position of the lens 1001 and automatically adjusts it. A method in which phase difference pixels arranged in the imaging element 1002 detect the image-plane phase difference to find the focal position (image-plane phase difference autofocus) can be used. Alternatively, a method that detects, as the focal position, the lens position at which the image contrast is highest (contrast autofocus) can be applied. The imaging control unit 1003 adjusts the position of the lens 1001 via the lens driving unit 1004 based on the detected focal position and thereby performs autofocus. The imaging control unit 1003 can be implemented, for example, by a DSP (Digital Signal Processor) running firmware.
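The contrast method lends itself to a brief sketch: scan lens positions, score each captured frame with a sharpness metric, and pick the maximum. The capture interface and the toy blur model below are hypothetical stand-ins for the camera's real lens-drive and readout paths, not part of this disclosure.

```python
# Minimal sketch of contrast autofocus (search over lens positions).
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Contrast metric: mean squared gradient; defocus lowers it."""
    gy, gx = np.gradient(img.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def contrast_autofocus(capture_frame, positions) -> int:
    """Scan candidate lens positions and return the sharpest one."""
    scores = {p: sharpness(capture_frame(p)) for p in positions}
    return max(scores, key=scores.get)

# Usage with a toy "camera": blur grows with distance from the true focus at 12.
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
def capture_frame(pos):
    k = abs(pos - 12) + 1                      # defocus -> box blur width
    kernel = np.ones(k) / k
    return np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, scene)

print(contrast_autofocus(capture_frame, range(0, 25, 4)))   # -> 12
```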
The lens driving unit 1004 drives the lens 1001 under the control of the imaging control unit 1003. It can drive the lens 1001 by changing its position using a built-in motor.
The image processing unit 1005 processes the image signals generated by the imaging element 1002. This processing includes, for example, demosaicing, which generates the missing color image signals among the red, green, and blue image signals for each pixel; noise reduction, which removes noise from the image signals; and encoding of the image signals. The image processing unit 1005 can be implemented, for example, by a microcomputer running firmware.
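As an illustration of the demosaicing step mentioned above, the following sketch performs simple bilinear interpolation on an assumed RGGB Bayer layout; production pipelines use edge-aware methods, so this is only a minimal model.

```python
# Minimal bilinear demosaicing sketch for an RGGB Bayer mosaic (illustrative).
import numpy as np
from scipy.ndimage import convolve

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True          # R samples
    masks[0::2, 1::2, 1] = True          # G samples (even rows)
    masks[1::2, 0::2, 1] = True          # G samples (odd rows)
    masks[1::2, 1::2, 2] = True          # B samples
    k = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]]) / 4.0
    for c in range(3):
        chan = np.where(masks[..., c], raw, 0.0)
        num = convolve(chan, k, mode="mirror")               # weighted neighbor sum
        den = convolve(masks[..., c].astype(float), k, mode="mirror")
        rgb[..., c] = num / den                              # normalized interpolation
    return rgb

raw = np.random.default_rng(1).random((8, 8))
print(demosaic_rggb(raw).shape)    # -> (8, 8, 3)
```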
The operation input unit 1006 receives operation inputs from the user of the camera 1000. For example, a push button or a touch panel can be used as the operation input unit 1006. An operation input received by the operation input unit 1006 is transmitted to the imaging control unit 1003 and the image processing unit 1005, after which processing corresponding to the input, for example imaging of the subject, is started.
The frame memory 1007 is a memory that stores frames, which are image signals for one screen. It is controlled by the image processing unit 1005 and holds frames during the course of image processing.
The display unit 1008 displays the images processed by the image processing unit 1005. For example, a liquid crystal panel can be used as the display unit 1008.
The recording unit 1009 records the images processed by the image processing unit 1005. For example, a memory card or a hard disk can be used as the recording unit 1009.
A camera to which the present invention can be applied has been described above. Among the configurations described above, the present technology can be applied to the imaging element 1002. Specifically, the imaging element 1 described with reference to FIG. 1 can be applied as the imaging element 1002. Applying the imaging element 1 as the imaging element 1002 simplifies the method for manufacturing an imaging element 1002 in which pixels including on-chip lenses with elliptical cross sections are arranged.
Although a camera has been described here as an example, the technology according to the present invention may also be applied to other devices, for example monitoring devices.
<7. Application example to endoscopic surgery system>
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
FIG. 17 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
FIG. 17 shows an operator (surgeon) 11131 performing surgery on a patient 11132 on a patient bed 11133 using an endoscopic surgery system 11000. As shown in the figure, the endoscopic surgery system 11000 is composed of an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
The endoscope 11100 is composed of a lens barrel 11101, a region of predetermined length from the distal end of which is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may also be configured as a so-called flexible scope having a flexible lens barrel.
An opening into which an objective lens is fitted is provided at the distal end of the lens barrel 11101. A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens. The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
An optical system and an imaging element are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed onto the imaging element by the optical system. The observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated. The image signal is transmitted as RAW data to a camera control unit (CCU) 11201.
The CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operation of the endoscope 11100 and a display device 11202. The CCU 11201 also receives the image signal from the camera head 11102 and applies to it various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing).
The display device 11202 displays an image based on the image signal processed by the CCU 11201, under the control of the CCU 11201.
The light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies the endoscope 11100 with irradiation light for imaging the surgical site or the like.
The input device 11204 is an input interface for the endoscopic surgery system 11000. Through the input device 11204, a user can input various kinds of information and instructions to the endoscopic surgery system 11000. For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, and the like).
The treatment instrument control device 11205 controls the driving of the energy treatment instrument 11112 for tissue cauterization, incision, blood vessel sealing, and the like. The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the cavity, in order to secure the field of view of the endoscope 11100 and the operator's working space. The recorder 11207 is a device capable of recording various kinds of information related to the surgery. The printer 11208 is a device capable of printing various kinds of information related to the surgery in various formats such as text, images, or graphs.
Note that the light source device 11203, which supplies the endoscope 11100 with irradiation light for imaging the surgical site, can be composed of, for example, an LED, a laser light source, or a white light source configured as a combination of these. When a white light source is configured as a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the light source device 11203 can adjust the white balance of the captured image. In this case, it is also possible to irradiate the observation target with laser light from each of the RGB laser light sources in a time-division manner and to control the driving of the imaging element of the camera head 11102 in synchronization with the irradiation timing, thereby capturing images corresponding to R, G, and B in a time-division manner. According to this method, a color image can be obtained without providing a color filter in the imaging element.
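A minimal sketch of this frame-sequential color scheme follows, assuming a hypothetical capture_under() hook that exposes one monochrome frame per synchronized illumination color:

```python
# Minimal sketch of frame-sequential color: three monochrome frames captured
# under time-divided R, G, and B laser illumination are stacked into one color
# image. capture_under() is a hypothetical stand-in for the synchronized readout.
import numpy as np

def frame_sequential_color(capture_under) -> np.ndarray:
    frames = [capture_under(c) for c in ("R", "G", "B")]   # one exposure per color
    return np.stack(frames, axis=-1)                       # (H, W, 3) color image

# Toy usage: a fixed scene whose reflectance differs per illumination color.
rng = np.random.default_rng(2)
scene = {c: rng.random((4, 4)) for c in "RGB"}
print(frame_sequential_color(lambda c: scene[c]).shape)    # -> (4, 4, 3)
```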
The driving of the light source device 11203 may also be controlled so that the intensity of the output light is changed at predetermined intervals. By controlling the driving of the imaging element of the camera head 11102 in synchronization with the timing of the light intensity changes to acquire images in a time-division manner and then combining those images, an image with a high dynamic range free of so-called crushed blacks and blown-out highlights can be generated.
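One plausible way to combine the time-divided images is a confidence-weighted fusion that backs out the known illumination gain of each frame; the weighting scheme and gain values below are illustrative assumptions, not the system's actual synthesis.

```python
# Minimal sketch of combining time-divided exposures into a high-dynamic-range
# image (an illustrative weighted fusion).
import numpy as np

def fuse_exposures(frames, gains):
    """frames: list of images in [0, 1] taken at known relative light gains."""
    num = np.zeros_like(frames[0], dtype=float)
    den = np.zeros_like(frames[0], dtype=float)
    for img, g in zip(frames, gains):
        w = 1.0 - np.abs(img - 0.5) * 2.0      # trust mid-tones, not clipped pixels
        num += w * img / g                     # back out the illumination gain
        den += w
    return num / np.maximum(den, 1e-6)         # radiance-like HDR estimate

# Usage: the same scene lit at full and at quarter intensity, clipped to [0, 1].
radiance = np.linspace(0.0, 2.0, 5)
frames = [np.clip(radiance * g, 0.0, 1.0) for g in (1.0, 0.25)]
print(fuse_exposures(frames, [1.0, 0.25]).round(2))   # recovers [0, 0.5, 1, 1.5, 2]
```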
The light source device 11203 may also be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation. In special light observation, for example, so-called narrow band imaging is performed: by exploiting the wavelength dependence of light absorption in body tissue and irradiating light in a band narrower than the irradiation light used during normal observation (that is, white light), predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast. Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from the fluorescence generated by irradiating excitation light. In fluorescence observation, the body tissue can be irradiated with excitation light and the fluorescence from that tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue, which is then irradiated with excitation light corresponding to the fluorescence wavelength of that reagent to obtain a fluorescence image. The light source device 11203 can be configured to be able to supply narrow band light and/or excitation light corresponding to such special light observation.
FIG. 18 is a block diagram showing an example of the functional configuration of the camera head 11102 and the CCU 11201 shown in FIG. 17.
The camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are connected so as to be able to communicate with each other by a transmission cable 11400.
The lens unit 11401 is an optical system provided at the connection with the lens barrel 11101. Observation light taken in from the distal end of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401. The lens unit 11401 is configured as a combination of a plurality of lenses including a zoom lens and a focus lens.
The imaging unit 11402 is composed of an imaging element. The imaging unit 11402 may be composed of one imaging element (a so-called single-plate type) or a plurality of imaging elements (a so-called multi-plate type). When the imaging unit 11402 is of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective imaging elements and combined to obtain a color image. Alternatively, the imaging unit 11402 may include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to 3D (dimensional) display. 3D display enables the operator 11131 to grasp the depth of living tissue in the surgical site more accurately. When the imaging unit 11402 is of the multi-plate type, a plurality of lens units 11401 may also be provided, one for each imaging element.
The imaging unit 11402 does not necessarily have to be provided in the camera head 11102. For example, the imaging unit 11402 may be provided inside the lens barrel 11101, immediately behind the objective lens.
The drive unit 11403 is composed of an actuator and, under the control of the camera head control unit 11405, moves the zoom lens and the focus lens of the lens unit 11401 by predetermined distances along the optical axis. The magnification and focus of the image captured by the imaging unit 11402 can thereby be adjusted as appropriate.
The communication unit 11404 is composed of a communication device for transmitting and receiving various kinds of information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 through the transmission cable 11400.
The communication unit 11404 also receives from the CCU 11201 a control signal for controlling the driving of the camera head 11102 and supplies it to the camera head control unit 11405. The control signal includes information on imaging conditions, for example information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
The imaging conditions such as the frame rate, exposure value, magnification, and focus described above may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are implemented in the endoscope 11100.
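As a minimal illustration of deriving such a setting from the acquired image signal, the sketch below computes AWB gains under the gray-world assumption; real endoscope pipelines are considerably more elaborate, so this is illustrative only.

```python
# Minimal sketch of an AWB estimate from the acquired image signal
# (gray-world assumption: the scene averages to neutral gray).
import numpy as np

def gray_world_gains(rgb: np.ndarray) -> np.ndarray:
    """Per-channel gains that map the channel means to the overall mean."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    return means.mean() / means

rgb = np.random.default_rng(3).random((32, 32, 3)) * np.array([1.2, 1.0, 0.8])
balanced = rgb * gray_world_gains(rgb)
print(balanced.reshape(-1, 3).mean(axis=0).round(3))   # channel means now equal
```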
The camera head control unit 11405 controls the driving of the camera head 11102 based on the control signal received from the CCU 11201 via the communication unit 11404.
The communication unit 11411 is composed of a communication device for transmitting and receiving various kinds of information to and from the camera head 11102. The communication unit 11411 receives the image signal transmitted from the camera head 11102 through the transmission cable 11400.
The communication unit 11411 also transmits to the camera head 11102 a control signal for controlling the driving of the camera head 11102. The image signal and the control signal can be transmitted by electric communication, optical communication, or the like.
The image processing unit 11412 applies various kinds of image processing to the image signal, which is RAW data transmitted from the camera head 11102.
The control unit 11413 performs various kinds of control related to imaging of the surgical site and the like by the endoscope 11100 and to display of the captured image obtained by that imaging. For example, the control unit 11413 generates control signals for controlling the driving of the camera head 11102.
The control unit 11413 also causes the display device 11202 to display the captured image showing the surgical site and the like, based on the image signal processed by the image processing unit 11412. In doing so, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the edge shapes, colors, and the like of objects included in the captured image, the control unit 11413 can recognize surgical instruments such as forceps, specific body parts, bleeding, mist during use of the energy treatment instrument 11112, and so on. When causing the display device 11202 to display the captured image, the control unit 11413 may use the recognition results to superimpose various kinds of surgery support information on the image of the surgical site. Superimposing surgery support information and presenting it to the operator 11131 can reduce the burden on the operator 11131 and enable the operator 11131 to proceed with the surgery reliably.
The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electric signal cable supporting electric signal communication, an optical fiber supporting optical communication, or a composite cable of these.
In the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may also be performed wirelessly.
An example of an endoscopic surgery system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 11402. Specifically, the imaging element 1 of FIG. 1 can be applied as the imaging unit 11402. Applying the technology according to the present disclosure to the imaging unit 11402 can reduce the cost of the endoscopic surgery system.
Although an endoscopic surgery system has been described here as an example, the technology according to the present disclosure may also be applied to, for example, a microscopic surgery system or the like.
<8. Application example to mobile objects>
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
FIG. 19 is a block diagram illustrating a schematic configuration example of a vehicle control system, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 19, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, or fog lamps. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
The vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like.
The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The imaging unit 12031 can output the electric signal as an image or as distance measurement information. The light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
The vehicle interior information detection unit 12040 detects information inside the vehicle. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
The microcomputer 12051 can calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output control commands to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control aimed at realizing ADAS (Advanced Driver Assistance System) functions including vehicle collision avoidance or impact mitigation, following travel based on inter-vehicle distance, constant-speed travel, vehicle collision warning, vehicle lane departure warning, and the like.
The microcomputer 12051 can also perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
The microcomputer 12051 can also output control commands to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control aimed at preventing glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly conveying information to the vehicle occupants or to the outside of the vehicle. In the example of FIG. 19, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
FIG. 20 is a diagram illustrating an example of the installation positions of the imaging unit 12031.
In FIG. 20, the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100. The imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the upper part of the windshield mainly acquire images ahead of the vehicle 12100. The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the sides of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires images behind the vehicle 12100. The forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
FIG. 20 also shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided at the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as seen from above is obtained.
At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100). In particular, it can extract, as a preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Further, the microcomputer 12051 can set an inter-vehicle distance to be maintained behind the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control aimed at automated driving, in which the vehicle travels autonomously without depending on the driver's operation, can be performed.
For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic obstacle avoidance. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of a collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize pedestrians by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points from the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching on a series of feature points representing the contour of an object to determine whether it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 12031 and the like. Specifically, the imaging element 1 in FIG. 1 can be applied to the imaging unit 12031. By applying the technology according to the present disclosure to the imaging unit 12031, the cost of the vehicle control system can be reduced.
Finally, the description of each embodiment above is an example of the present technology, and the present technology is not limited to the embodiments described above. Accordingly, it goes without saying that various modifications other than the embodiments described above are possible in accordance with the design and the like, as long as they do not depart from the technical idea of the present technology.
Note that the present technology may also be configured as follows.
(1) An imaging element comprising:
a pixel that generates an image signal according to incident light;
a base layer that is disposed on the pixel, transmits the incident light, and has a recess; and
a lens that is disposed adjacent to the recess of the base layer for each pixel, has a convex portion formed on the basis of the recess of the base layer, and condenses irradiated light to make it incident on the pixel through the base layer.
(2) The imaging element according to (1), wherein the lens comprises a light-transmitting member that is disposed adjacent to the base layer and has a second recess formed on its surface on the basis of the recess of the base layer, and a resist further disposed in the second recess, the light-transmitting member and the resist being etched at different etching rates to form the convex portion based on the recess of the base layer.
(3) The imaging element according to (1) or (2), wherein the base layer has the recess formed by isotropic etching.
(4) The imaging element according to (1) or (2), further comprising a light-shielding wall disposed around the pixel, wherein the base layer has the recess based on the shape of the region surrounded by the light-shielding wall.
(5) The imaging element according to any one of (1) to (4), further comprising a color filter that transmits light of a predetermined wavelength out of the condensed incident light.
(6) The imaging element according to any one of (1) to (4), further comprising a color filter that transmits light of a predetermined wavelength out of the incident light, wherein the lens condenses the light transmitted through the color filter.
(7) A method of manufacturing an imaging element, comprising:
a base layer recess forming step of forming, on a pixel that generates an image signal according to incident light, a base layer that transmits the incident light and has a recess;
a lens arrangement step of arranging, adjacent to the recess of the base layer for each pixel, a lens that condenses irradiated light and makes it incident on the pixel through the base layer; and
a lens convex portion forming step of forming, on the surface of the arranged lens, a convex portion based on the recess of the base layer.
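Configuration (2) relies on the etch-back transfer effect: when the resist and the light-transmitting member etch at different rates, a surface profile in the resist is replicated into the member with its height scaled by the etch selectivity. A minimal worked sketch of that scaling, with all numerical values hypothetical:

```python
def transferred_height(resist_step_nm: float, member_etch_rate: float,
                       resist_etch_rate: float) -> float:
    """Height of the feature transferred into the light-transmitting member.

    A step of height h in the resist clears at different times across the
    surface; during that extra time the member is etched instead, so the
    replicated height is h * (member rate / resist rate)."""
    selectivity = member_etch_rate / resist_etch_rate
    return resist_step_nm * selectivity

# Hypothetical numbers: a 200 nm resist step with 1.5:1 selectivity
# yields a 300 nm convex portion in the light-transmitting member.
print(transferred_height(200.0, 150.0, 100.0))  # 300.0
```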
Description of Reference Signs
1 imaging element
100, 100a, 100b pixel
101 photoelectric conversion unit
142 color filter
143, 145, 149 light-shielding wall
144, 146, 148, 161 base layer
151, 153 on-chip lens
152, 154 in-layer lens
402, 407, 416, 422, 428 resist
405, 414, 426 light-transmitting member
1002 imaging element
11402, 12031 imaging unit
Claims (7)
- An imaging element comprising:
a pixel including a photoelectric conversion unit that performs photoelectric conversion of incident light;
a base layer that is disposed on the pixel, transmits the incident light, and has a recess; and
a lens that is disposed adjacent to the recess of the base layer, has a convex portion formed on the basis of the recess of the base layer, and condenses the incident light to make it incident on the photoelectric conversion unit through the base layer.
- The imaging element according to claim 1, wherein the lens comprises a light-transmitting member that is disposed adjacent to the base layer and has a second recess formed on its surface on the basis of the recess of the base layer, and a resist further disposed in the second recess, the light-transmitting member and the resist being etched at different etching rates to form the convex portion based on the recess of the base layer.
- The imaging element according to claim 1, wherein the base layer has the recess formed by isotropic etching.
- The imaging element according to claim 1, further comprising a light-shielding wall disposed at a peripheral portion of the pixel, wherein the base layer has the recess based on the shape of the region surrounded by the light-shielding wall.
- The imaging element according to claim 1, further comprising a color filter that transmits light of a predetermined wavelength out of the condensed incident light.
- The imaging element according to claim 1, further comprising a color filter that transmits light of a predetermined wavelength out of the incident light, wherein the lens condenses the light transmitted through the color filter.
- A method of manufacturing an imaging element, comprising:
a base layer recess forming step of forming, on a pixel including a photoelectric conversion unit that performs photoelectric conversion of incident light, a base layer that transmits the incident light and has a recess;
a lens arrangement step of arranging, adjacent to the recess of the base layer, a lens that condenses the incident light and makes it incident on the photoelectric conversion unit through the base layer; and
a lens convex portion forming step of forming, on the surface of the arranged lens, a convex portion based on the recess of the base layer.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018045137A | 2018-03-13 | 2018-03-13 | Imaging element and manufacturing method thereof |
JP2018-045137 | 2018-03-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019176302A1 (en) | 2019-09-19 |
Family
ID=67908242
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/002025 WO2019176302A1 (en) | 2018-03-13 | 2019-01-23 | Imaging element and method for manufacturing imaging element |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2019160986A (en) |
WO (1) | WO2019176302A1 (en) |
- 2018-03-13: JP JP2018045137A patent JP2019160986A (en), active, pending
- 2019-01-23: WO PCT/JP2019/002025 patent WO2019176302A1 (en), active, application filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004253573A (en) * | 2003-02-19 | 2004-09-09 | Sharp Corp | Semiconductor device and its manufacturing method |
JP2010239076A (en) * | 2009-03-31 | 2010-10-21 | Sony Corp | Solid-state imaging device and method of manufacturing the same, and electronic apparatus |
JP2012108327A (en) * | 2010-11-17 | 2012-06-07 | Sharp Corp | Lens and method for manufacturing the same, solid-state image sensor and method for manufacturing the same, and electronic information equipment |
JP2018072757A (en) * | 2016-11-04 | 2018-05-10 | セイコーエプソン株式会社 | Microlens array substrate and method for manufacturing the same, electro-optical device, and method for manufacturing the same, and electronic apparatus |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114664876A (en) * | 2022-05-25 | 2022-06-24 | 合肥晶合集成电路股份有限公司 | Image sensor and manufacturing method thereof |
CN114664876B (en) * | 2022-05-25 | 2022-08-23 | 合肥晶合集成电路股份有限公司 | Image sensor and manufacturing method thereof |
Also Published As
Publication number | Publication date |
---|---|
JP2019160986A (en) | 2019-09-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110199394B (en) | Image sensor and method for manufacturing the same | |
WO2019220945A1 (en) | Imaging element and electronic device | |
WO2018051604A1 (en) | Solid-state imaging element, imaging device, and method for manufacturing solid-state imaging element | |
WO2020137285A1 (en) | Imaging element and method for manufacturing imaging element | |
WO2020137370A1 (en) | Solid-state imaging apparatus and electronic device | |
WO2019155782A1 (en) | Semiconductor device and method for manufacturing semiconductor device | |
US20230008784A1 (en) | Solid-state imaging device and electronic device | |
JP7544601B2 (en) | Image sensor and image pickup device | |
JP2019091745A (en) | Imaging element and imaging device | |
WO2019207978A1 (en) | Image capture element and method of manufacturing image capture element | |
US20240088189A1 (en) | Imaging device | |
US11417696B2 (en) | Imaging element comprising polarization unit with a conductive member as an electrode for a charge holder | |
WO2019181466A1 (en) | Imaging element and electronic device | |
WO2019176302A1 (en) | Imaging element and method for manufacturing imaging element | |
WO2023042462A1 (en) | Light detecting device, method for manufacturing light detecting device, and electronic instrument | |
WO2021045139A1 (en) | Imaging element and imaging device | |
WO2019235230A1 (en) | Imaging element and electronic device | |
WO2024166667A1 (en) | Light detection device and electronic apparatus | |
WO2022249678A1 (en) | Solid-state imaging device and method for manufacturing same | |
WO2024057805A1 (en) | Imaging element and electronic device | |
WO2023119840A1 (en) | Imaging element, method for manufacturing imaging element, and electronic device | |
WO2024057814A1 (en) | Light-detection device and electronic instrument | |
WO2024084991A1 (en) | Photodetector, electronic apparatus, and optical element | |
WO2024057806A1 (en) | Imaging device and electronic apparatus | |
WO2023021740A1 (en) | Imaging element, imaging device and production method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19766947; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19766947; Country of ref document: EP; Kind code of ref document: A1 |