WO2019176302A1 - Imaging element and method for manufacturing imaging element
- Publication number
- WO2019176302A1 (application PCT/JP2019/002025)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- base layer
- lens
- pixel
- image
- Prior art date
Classifications
- G02B3/00 Simple or compound lenses (G: Physics; G02: Optics; G02B: Optical elements, systems or apparatus)
- H01L27/146 Imager structures (H: Electricity; H01L: Semiconductor devices; H01L27/144: Devices controlled by radiation)
- H04N25/70 SSIS architectures; circuits associated therewith (H04N: Pictorial communication, e.g. television; H04N25/00: Circuitry of solid-state image sensors [SSIS]; control thereof)
Definitions
- The present technology relates to an imaging element and a method for manufacturing the imaging element.
- In particular, the present technology relates to an imaging element in which a lens that collects incident light is arranged, and to a method for manufacturing such an imaging element.
- Conventionally, an imaging element that captures an image of a subject has pixels each including an on-chip lens that collects incident light from the subject, a color filter that transmits light of a predetermined wavelength out of the collected incident light, and a photoelectric conversion unit that converts the incident light that has passed through the color filter into an electric signal.
- These pixels are arranged in a two-dimensional grid to form the imaging element.
- A light shielding body is disposed around each pixel; it blocks light incident obliquely from adjacent pixels.
- The color filter described above transmits light of a different wavelength for each pixel. For example, color filters corresponding to red light, green light, and blue light are arranged in the respective pixels.
- In addition, an imaging element is used that includes pixels in which a second lens is disposed to further collect the incident light that has been collected by the on-chip lens and transmitted through the color filter.
- This second lens is called an in-layer lens and is configured in a convex shape.
- As a method of forming this in-layer lens, a method of forming it by self-alignment using the above-described light shielding body as a mask has been proposed (see, for example, Patent Document 1).
- In this method, a light shielding body is formed on a base layer, and a concave portion is formed by selectively etching the base layer using the formed light shielding body as a mask.
- Next, an insulator is deposited in the concave portion by a high-density plasma film forming method.
- The deposited insulator film is thick in the vicinity of the light shielding body and becomes thinner with increasing distance from it, resulting in a downwardly convex lens shape.
- However, the above-described conventional technology has a problem of a low light collection rate. This is because the lens is hemispherical and convex downward, so there is only one curved surface that refracts incident light. Forming convex portions on both surfaces of the lens would improve the light collection rate, but a lens with convex portions on both sides is more difficult to form than one with a convex portion on a single side.
- The present technology has been made to solve the above-described problems.
- A first aspect of the present technology is an imaging element comprising: a pixel that generates an image signal corresponding to incident light; a base layer that is disposed in the pixel, transmits the incident light, and includes a concave portion formed for each pixel; and a lens that has a convex portion formed adjacent to the concave portion of the base layer and shaped on the basis of that concave portion, and that collects irradiated light and makes it incident on the pixel through the base layer.
- This brings about the effect that the convex portion of the lens is formed on the basis of the concave portion of the base layer. It is assumed that the convex portion of the lens is created by self-alignment with the concave portion of the base layer.
- In the first aspect, the lens may include a light transmissive member that is disposed adjacent to the base layer and has, on its surface, a second concave portion formed on the basis of the concave portion of the base layer.
- The convex portion based on the concave portion of the base layer may be formed by disposing a resist in the second concave portion and etching the light transmissive member and the resist at different etching rates.
- In the first aspect, the base layer may include the concave portion formed by isotropic etching. This brings about the effect that the concave portion of the base layer is formed by isotropic etching.
- In the first aspect, a light shielding wall disposed around the pixel may further be provided, and the base layer may include the concave portion based on the shape of the region surrounded by the light shielding wall.
- In the first aspect, a color filter that transmits light of a predetermined wavelength out of the collected incident light may further be provided.
- In this case, the lens is arranged between the subject and the color filter.
- Alternatively, a color filter that transmits light of a predetermined wavelength out of the incident light may further be provided, and the lens may collect the light transmitted through the color filter.
- In this case, the color filter is arranged between the subject and the lens.
- A second aspect of the present technology is a method of manufacturing an imaging element, comprising: a base layer concave portion forming step of forming a concave portion in a base layer that transmits incident light to a pixel generating an image signal corresponding to the incident light; a lens arranging step of disposing a light transmissive member adjacent to the base layer; and a lens convex portion forming step of forming, on the basis of the concave portion, a convex portion of a lens that collects irradiated light and makes it incident on the pixel through the base layer.
- This brings about the effect that the convex portion of the lens is formed on the basis of the concave portion of the base layer. It is assumed that the convex portion of the lens is created by self-alignment with the concave portion of the base layer.
- FIG. 1 is a diagram illustrating a configuration example of an image sensor according to an embodiment of the present technology.
- the image pickup device 1 of FIG. 1 includes a pixel array unit 10, a vertical drive unit 20, a column signal processing unit 30, and a control unit 40.
- the pixel array unit 10 is configured by arranging the pixels 100 in a two-dimensional grid.
- the pixel 100 generates an image signal corresponding to the irradiated light.
- the pixel 100 includes a photoelectric conversion unit that generates charges according to the irradiated light.
- the pixel 100 further includes a pixel circuit. This pixel circuit generates an image signal based on the charges generated by the photoelectric conversion unit. The generation of the image signal is controlled by a control signal generated by the vertical drive unit 20 described later.
- signal lines 11 and 12 are arranged in an XY matrix.
- the signal line 11 is a signal line that transmits a control signal of the pixel circuit in the pixel 100, and is arranged for each row of the pixel array unit 10, and is wired in common to the pixels 100 arranged in each row.
- the signal line 12 is a signal line that transmits an image signal generated by the pixel circuit of the pixel 100, and is arranged for each column of the pixel array unit 10, and is wired in common to the pixels 100 arranged in each column.
- the vertical drive unit 20 generates a control signal for the pixel circuit of the pixel 100.
- the vertical drive unit 20 transmits the generated control signal to the pixel 100 via the signal line 11 in FIG.
- the column signal processing unit 30 processes the image signal generated by the pixel 100.
- the column signal processing unit 30 processes the image signal transmitted from the pixel 100 via the signal line 12 shown in FIG.
- the processing in the column signal processing unit 30 corresponds to, for example, analog-digital conversion that converts an analog image signal generated in the pixel 100 into a digital image signal.
- the image signal processed by the column signal processing unit 30 is output as an image signal of the image sensor 1.
- the control unit 40 controls the entire image sensor 1.
- the control unit 40 controls the image sensor 1 by generating and outputting a control signal for controlling the vertical driving unit 20 and the column signal processing unit 30.
- the control signal generated by the control unit 40 is transmitted to the vertical drive unit 20 and the column signal processing unit 30 through signal lines 41 and 42, respectively.
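- The row and column organization above can be sketched in a few lines of code. The following Python model is illustrative only: the function names and the 10-bit ADC depth are assumptions, not taken from the patent. It shows the flow in which the vertical drive unit 20 selects one row at a time over signal line 11 and the column signal processing unit 30 digitizes all columns of that row over signal line 12.

```python
import numpy as np

def readout_frame(pixel_charges: np.ndarray, adc_bits: int = 10) -> np.ndarray:
    """Row-by-row readout: the vertical drive unit selects one row at a time
    (signal line 11); the column signal processing unit then digitizes every
    column of that row in parallel (signal line 12)."""
    rows, cols = pixel_charges.shape
    full_scale = float(pixel_charges.max()) or 1.0   # avoid divide-by-zero
    frame = np.zeros((rows, cols), dtype=np.int32)
    for row in range(rows):                          # vertical drive: row select
        analog = pixel_charges[row, :]               # all columns in parallel
        digital = np.round(analog / full_scale * (2 ** adc_bits - 1))
        frame[row, :] = digital.astype(np.int32)     # column ADC output
    return frame

# Example: a tiny 4x4 pixel array with random photo-generated charge.
rng = np.random.default_rng(0)
print(readout_frame(rng.random((4, 4))))
```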
- FIG. 2 is a diagram illustrating a configuration example of a pixel according to the embodiment of the present technology.
- FIG. 2 is a circuit diagram illustrating a configuration example of the pixel 100.
- a pixel 100 in the figure includes a photoelectric conversion unit 101, a charge holding unit 102, and MOS transistors 103 to 106.
- the anode of the photoelectric conversion unit 101 is grounded, and the cathode is connected to the source of the MOS transistor 103.
- the drain of the MOS transistor 103 is connected to the source of the MOS transistor 104, the gate of the MOS transistor 105, and one end of the charge holding unit 102. The other end of the charge holding unit 102 is grounded.
- the drains of MOS transistors 104 and 105 are commonly connected to power supply line Vdd, and the source of MOS transistor 105 is connected to the drain of MOS transistor 106.
- the source of the MOS transistor 106 is connected to the signal line 12.
- MOS transistors 103, 104 and 106 have their gates connected to transfer signal line TR, reset signal line RST and selection signal line SEL, respectively. Note that the transfer signal line TR, the reset signal line RST, and the selection signal line SEL constitute a signal line 11.
- the photoelectric conversion unit 101 generates a charge corresponding to the irradiated light as described above.
- a photodiode can be used for the photoelectric conversion unit 101.
- the charge holding unit 102 and the MOS transistors 103 to 106 constitute a pixel circuit.
- the MOS transistor 103 is a transistor that transfers the charge generated by the photoelectric conversion of the photoelectric conversion unit 101 to the charge holding unit 102.
- the charge transfer in the MOS transistor 103 is controlled by a signal transmitted through the transfer signal line TR.
- the charge holding unit 102 is a capacitor that holds the charge transferred by the MOS transistor 103.
- the MOS transistor 105 is a transistor that generates a signal based on the charge held in the charge holding unit 102.
- the MOS transistor 106 is a transistor that outputs a signal generated by the MOS transistor 105 to the signal line 12 as an image signal.
- the MOS transistor 106 is controlled by a signal transmitted through the selection signal line SEL.
- the MOS transistor 104 is a transistor that resets the charge holding unit 102 by discharging the charge held in the charge holding unit 102 to the power supply line Vdd.
- the reset by the MOS transistor 104 is controlled by a signal transmitted through the reset signal line RST, and is executed before the charge transfer by the MOS transistor 103.
- the photoelectric conversion unit 101 can also be reset by conducting the MOS transistor 103 at the time of resetting.
- the pixel circuit converts the charge generated by the photoelectric conversion unit 101 into an image signal.
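- As a reading aid, here is a minimal Python model of the control sequence just described: reset via RST (MOS transistor 104), transfer via TR (MOS transistor 103), and readout via SEL (MOS transistors 105 and 106). The class layout, charge units, and conversion-gain value are illustrative assumptions; only the order of operations follows the text.

```python
from dataclasses import dataclass

@dataclass
class FourTransistorPixel:
    """Toy model of the pixel circuit of FIG. 2 (values are illustrative)."""
    pd_charge: float = 0.0         # charge in photoelectric conversion unit 101
    fd_charge: float = 0.0         # charge in charge holding unit 102
    conversion_gain: float = 1e-4  # volts per unit charge (assumed)

    def expose(self, photons: float, qe: float = 0.6) -> None:
        self.pd_charge += photons * qe   # photoelectric conversion

    def reset(self) -> None:             # RST high: MOS transistor 104 conducts
        self.fd_charge = 0.0

    def transfer(self) -> None:          # TR high: MOS transistor 103 conducts
        self.fd_charge += self.pd_charge
        self.pd_charge = 0.0

    def read(self) -> float:             # SEL high: MOS 105/106 drive line 12
        return self.fd_charge * self.conversion_gain

# Order described in the text: reset first, then transfer, then read.
px = FourTransistorPixel()
px.expose(photons=10_000)
px.reset()
px.transfer()
print(f"image signal: {px.read():.3f} V")
```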
- FIG. 3 is a cross-sectional view illustrating a configuration example of a pixel according to the first embodiment of the present technology.
- The figure is a schematic cross-sectional view illustrating a configuration example of the pixel 100 arranged in the pixel array unit 10.
- The pixel 100 includes an on-chip lens 151, a base layer 144, a light shielding wall 143, a color filter 142, an insulating film 141, a semiconductor substrate 111, a wiring region composed of the wiring layer 122 and the insulating layer 121, and a support substrate 131.
- the semiconductor substrate 111 is a semiconductor substrate on which the photoelectric conversion portion of the pixel 100 described in FIG. 1 and the semiconductor portion of the pixel circuit are formed.
- semiconductor portions of the vertical driving unit 20, the column signal processing unit 30, and the control unit 40 are further formed on the semiconductor substrate 111.
- a p-type well region is formed in the semiconductor substrate 111, and the photoelectric conversion portion of the pixel 100 and the like are formed in this well region.
- the semiconductor substrate 111 in the figure is configured as a p-type well region.
- Of the pixel circuit described in FIG. 2, the photoelectric conversion unit 101 is shown in the figure.
- a photodiode is constituted by a pn junction formed at the interface between the n-type semiconductor region 112 and the p-type well region, and photoelectric conversion is performed.
- a p-type semiconductor region 113 is disposed adjacent to the n-type semiconductor region 112.
- the p-type semiconductor region 113 is a region configured with a relatively high impurity concentration, and is a region for pinning an interface state on the surface of the semiconductor substrate 111. Thereby, the influence of the interface state on the surface of the semiconductor substrate 111 can be reduced.
- the p-type semiconductor region for pinning can also be disposed on the other surface of the semiconductor substrate 111.
- the wiring layer 122 is a wiring that transmits an image signal generated in the pixel 100 and a control signal for controlling the pixel circuit.
- the wiring layer 122 can be made of a metal such as copper (Cu).
- the insulating layer 121 insulates the wiring layer 122.
- the insulating layer 121 can be made of an oxide such as silicon oxide (SiO 2 ), for example.
- the image pickup device 1 including the pixel 100 in FIG. 1 is a back-illuminated image pickup device in which a wiring region is formed on a surface (back surface) different from the surface on which light is incident on the semiconductor substrate 111.
- the support substrate 131 is a substrate that supports the image sensor 1 and is a substrate that is used for improving the strength of the image sensor 1 in the manufacturing process.
- the on-chip lens 151 is a lens that collects light from the subject.
- The on-chip lens 151 in the figure has an elliptical cross section; that is, it is shaped with convex portions on both the top and the bottom. This improves the light collection rate for incident light.
- the on-chip lens 151 can be composed of a light transmissive member, for example, silicon nitride (SiN).
- The base layer 144 is a film that is disposed below the on-chip lens 151 and serves as a base when the on-chip lens 151 is formed.
- The base layer 144 can be made of a light transmissive member having a lower refractive index than that of the on-chip lens 151, for example, silicon oxide (SiO2).
- The on-chip lens 151 is formed by disposing a light transmissive member adjacent to the base layer 144, placing a resist in the second concave portion formed on the surface of that member, and then etching the light transmissive member and the resist. For this etching, dry etching can be used.
- the light transmissive member and the resist are etched at different etching rates. Specifically, the etching is performed under the condition that the light transmissive member is etched faster than the resist. Then, the etching amount of the light transmissive member decreases as the resist thickness increases, and a convex portion is formed on the light transmissive member. That is, the on-chip lens 151 having a convex portion based on the concave portion of the base layer 144 can be formed.
- the on-chip lens 151 is formed in a self-aligned manner with the concave portion of the base layer 144. Details of the manufacturing method of the on-chip lens 151 will be described later.
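- The self-aligned shape transfer can be illustrated numerically. In the 1-D Python sketch below, all dimensions and the 3:1 etch-rate selectivity are assumed, not taken from the patent. A flat-topped resist over the second recess is thickest directly above the recess; etching both materials until the resist is consumed leaves a convex bump in the light transmissive member exactly there.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 201)              # position across one pixel [um]
recess = 0.30 * np.exp(-(x / 0.4) ** 2)      # depth of the second recess [um]
member = 1.00 - recess                       # SiN surface height [um]
resist = 1.00 - member                       # flat-top resist: thickest in recess

rate_member, rate_resist = 3.0, 1.0          # member etches 3x faster (assumed)
dt = 0.001
while np.any(resist > 0):                    # etch until the resist is gone
    resist -= np.minimum(resist, rate_resist * dt)
    # wherever the resist is already consumed, the member itself is etched
    member -= np.where(resist <= 0, rate_member * dt, 0.0)

print(f"convex height transferred to lens: {member.max() - member.min():.2f} um")
```

- Because the etch selectivity multiplies the resist-thickness differences, the transferred bump can even be taller than the original recess, which is one way to read the statement that the etching amount of the member decreases where the resist is thicker.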
- the on-chip lens 151 is an example of a lens described in the claims.
- the color filter 142 is an optical filter that transmits light having a predetermined wavelength out of the light collected by the on-chip lens 151.
- For example, a color filter 142 that transmits visible light of red, green, or blue, or infrared light, can be used.
- the light shielding wall 143 is a wall-shaped film that shields light incident obliquely from the adjacent pixels 100.
- the light shielding wall 143 can prevent light that has passed through the color filter 142 of the adjacent pixel 100 from entering, and can prevent color mixing.
- the light shielding wall 143 can be made of, for example, aluminum (Al) or tungsten (W).
- the insulating film 141 is a film that insulates the semiconductor substrate 111. This insulating film 141 can be made of, for example, SiO 2 .
- FIG. 4 is a diagram illustrating light collection by the on-chip lens according to the first embodiment of the present technology.
- A in the figure is a simplified view of the pixel 100 described in FIG. 3, illustrating how incident light 301 enters the n-type semiconductor region 112 of the pixel 100.
- The on-chip lens 151 has an elliptical cross section, and the incident light 301 on the pixel 100 is refracted by its two convex surfaces. For this reason, the n-type semiconductor region 112 can be irradiated even when the incident angle of the light is relatively large. The lens is therefore suited to imaging with a photographic lens having a large image height.
- B in the figure shows, as a comparative example, a pixel using a hemispherical on-chip lens 311.
- In this case, incident light 301 at the same incident angle as in A of the figure cannot be condensed onto the n-type semiconductor region 112, and the light collection rate is low.
- C in the figure shows an example of an on-chip lens 312 whose upper-surface convex portion is formed by a method different from that of the on-chip lens 151.
- This on-chip lens 312 is formed by a dry etching method: a film of a light transmissive member serving as the lens material is formed, a resist shaped into a convex form is placed on it, and the resist and the light transmissive member film are dry etched, transferring the convex shape of the resist onto the surface of the light transmissive member.
- In this way, the light collection rate can be improved by using the on-chip lens 151 with an elliptical cross section. In addition, since the focal length can be shortened, the imaging element 1 can be made thinner.
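- The focal-length claim follows from standard thin-lens optics rather than from anything specific to the patent. By the lensmaker's equation, giving the lens a second curved surface adds refractive power, which shortens f:

```latex
% Thin-lens lensmaker's equation (standard optics, not from the patent):
% n is the lens index, n_0 the surrounding index, R_1 and R_2 the surface
% radii (with R_2 < 0 for a biconvex lens).
\[
  \frac{1}{f} = \left(\frac{n}{n_0} - 1\right)
                \left(\frac{1}{R_1} - \frac{1}{R_2}\right)
\]
% A plano-convex lens (R_2 -> infinity) gives 1/f = (n/n_0 - 1)/R_1; curving
% the second surface adds a further (n/n_0 - 1)/|R_2|, so f decreases and the
% optical stack above the photodiode can be made thinner.
```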
- a light transmissive member 405 serving as a material for the on-chip lens 151 is formed on the surface of the base layer 144.
- On the surface of the formed light transmissive member 405, a recess 406 is formed, to which the shape of the recess 404 of the base layer 144 is transferred.
- The recess 406 corresponds to the second recess described above.
- For the light transmissive member 405, for example, SiN can be used, and CVD can be used to form it (d in FIG. 6).
- This process is an example of the lens arranging step described in the claims.
- the resist 407 and the light transmissive member 405 are etched.
- This etching can be performed by anisotropic dry etching.
- By using NF3 or SF6 to which CH2F2 or the like is added as the etching gas, anisotropic dry etching can be performed.
- The etching is performed under the condition that the etching rate of the light transmissive member 405 is higher than that of the resist 407, and it continues until the resist 407 disappears.
- As a result, a convex portion, whose shape is the inverted transfer of the downwardly convex portion of the resist 407, is formed on the surface of the light transmissive member 405, and the on-chip lens 151 having an elliptical cross section is obtained (f in FIG. 6). Since the on-chip lens 151 is formed by self-alignment based on the position of the concave portion 404 of the base layer 144, it can be formed without problems such as misalignment during manufacturing.
- the etching method of the resist 407 and the light transmissive member 405 is not limited to this example. For example, wet etching can be performed.
- This process is an example of the lens convex portion forming step described in the claims.
- the on-chip lens 151 including the convex portions based on the shape of the concave portion 404 of the base layer 144 on both surfaces (the front surface and the back surface) is formed. Since the on-chip lens 151 is formed by self-alignment based on the position of the concave portion 404 of the base layer 144, the manufacturing process of the on-chip lens 151 can be simplified.
- FIG. 7 is a cross-sectional view illustrating a configuration example of a pixel according to the second embodiment of the present technology.
- The pixel 100 in the figure differs from the pixel 100 described in FIG. 3 in that it further includes a base layer 146, an in-layer lens 152, and a planarizing film 147.
- The pixel 100 in the figure also includes a light shielding wall 145 instead of the light shielding wall 143.
- the in-layer lens 152 is a lens disposed in the inner layer of the pixel 100 and is a lens disposed between the color filter 142 and the semiconductor substrate 111.
- This in-layer lens 152 can be made of SiN, for example. Since the pixel 100 in the figure includes two lenses, the on-chip lens 151 and the in-layer lens 152, its light collection rate can be higher than that of the pixel 100 described in FIG. 3. That is, even incident light with an incident angle larger than that of the incident light 301 in A of FIG. 4 can be guided to the n-type semiconductor region 112.
- the light shielding wall 145 is disposed so as to surround a region where the color filter 142 and the in-layer lens 152 are disposed.
- the light shielding wall 145 can be made of Al or W like the light shielding wall 143.
- The in-layer lens 152 has a back surface formed adjacent to the concave portion of the base layer 146, and a convex portion formed on its front surface on the basis of the shape of that concave portion.
- This in-layer lens 152 can also be formed by the same method as the on-chip lens 151. First, a light transmissive member is disposed adjacent to the base layer 146. At this time, a second concave portion to which the shape of the concave portion of the base layer 146 is transferred is formed on the surface of the light transmissive member arranged.
- the in-layer lens 152 can be formed by disposing a resist in the second recess and etching the light transmissive member and the resist at different etching rates.
- the in-layer lens 152 is an example of a lens described in the claims.
- the light shielding wall 145 is formed on the surface of the semiconductor substrate 111 on which the wiring region and the insulating film 141 are formed. This can be performed, for example, by forming a film such as W as a material of the light shielding wall 145 and etching and removing W in a region other than the boundary of the pixel 100 (a in FIG. 8).
- an insulator film 411 serving as a material for the base layer 146 is formed (b in FIG. 8).
- Next, anisotropic dry etching is performed to etch back the insulator film 411.
- At this time, the insulator film 411 adjacent to the wall surfaces of the light shielding wall 145 remains without being etched, so that the base layer 146 having the concave portion 413 is formed.
- This process is an example of the base layer concave portion forming step described in the claims.
- a light transmissive member 414 is formed on the surface of the base layer 146.
- On the surface of the formed light transmissive member 414, a concave portion 415 is formed, to which the shape of the concave portion 413 of the base layer 146 is transferred.
- The concave portion 415 corresponds to the second concave portion described above (d in FIG. 8).
- This process is an example of the lens arranging step described in the claims.
- a resist 416 having a flat surface shape is disposed on the surface of the light transmitting member 414 (e in FIG. 9).
- the resist 416 and the light transmissive member 414 are etched by anisotropic dry etching.
- The etching is performed under the condition that the etching rate of the light transmissive member 414 is higher than that of the resist 416.
- a convex part is formed on the surface of the light transmissive member 414, and the convex part of the in-layer lens 152 is formed by self-alignment based on the position of the concave part 413 of the base layer 146 (f in FIG. 9).
- This process is an example of the lens convex portion forming step described in the claims.
- a planarizing film 147 is disposed (g in FIG. 9). Thereafter, the color filter 142 and the on-chip lens 151 are formed.
- the configuration of the image sensor 1 is the same as the configuration of the image sensor 1 described in the first embodiment of the present technology, and thus the description thereof is omitted.
- the imaging element 1 according to the second embodiment of the present technology can further improve the light collection rate by including the in-layer lens 152.
- the recess 404 of the base layer 144 is formed by isotropic dry etching.
- the imaging device 1 according to the third embodiment of the present technology is different from the above-described first embodiment in that a concave portion based on the shape of the region surrounded by the light shielding wall is formed in the base layer. Different.
- FIG. 10 is a cross-sectional view illustrating a configuration example of a pixel according to the third embodiment of the present technology.
- The pixel 100 in the figure differs from the pixel 100 described in FIG. 3 in that it includes an on-chip lens 153, a base layer 148, and a light shielding wall 149 instead of the on-chip lens 151, the base layer 144, and the light shielding wall 143.
- the light shielding wall 149 is disposed in a region adjacent to the on-chip lens 153 from the surface of the insulating film 141.
- The base layer 148 and the on-chip lens 153 are formed by the same manufacturing method as for the pixel 100 described in FIG. 3. That is, except that they are formed adjacent to the surface of the color filter 142, the base layer 148 and the on-chip lens 153 can be formed by the manufacturing process described above. Since the concave portion of the base layer 148 can be formed using the light shielding wall 149, the method of manufacturing the on-chip lens 153 can be simplified.
- the on-chip lens 153 is an example of a lens described in the claims.
- the configuration of the image sensor 1 is the same as the configuration of the image sensor 1 described in the first embodiment of the present technology, and thus the description thereof is omitted.
- The imaging element 1 according to the third embodiment of the present technology can further simplify its manufacturing process by forming the concave portion of the base layer 148 using the light shielding wall 149.
- In the image sensor 1 of the first embodiment described above, pixels 100 having the same configuration are arranged.
- the imaging device 1 according to the fourth embodiment of the present technology is different from the above-described first embodiment in that a phase difference pixel for autofocus is further arranged.
- FIG. 11 is a diagram illustrating a configuration example of the pixel array unit according to the fourth embodiment of the present technology.
- The figure is a top view showing the arrangement of the pixels 100 in the pixel array unit 10.
- a pixel 100a and a pixel 100b correspond to phase difference pixels.
- the phase difference pixel is a pixel for detecting, as a phase difference, an image shift caused by light that has passed through different areas of the photographing lens that collects light from the subject on the pixel array unit 10 of the image sensor 1. This is a pixel used for autofocus.
- a solid circle represents the on-chip lens 151, and a dotted rectangle represents an effective area of the n-type semiconductor region 112 formed on the semiconductor substrate 111.
- a plurality of such pixels 100 a and pixels 100 b are arranged in a specific row of the pixel array unit 10.
- In the pixel 100, the entire region where the n-type semiconductor region 112 is formed is an effective region.
- In the pixels 100a and 100b, approximately half of the region where the n-type semiconductor region 112 is formed is an effective region.
- In the pixel 100a, the right half in the figure is the effective area, and in the pixel 100b, the left half in the figure is the effective area.
- Light passing through the left and right sides of the photographic lens is incident on these pixels 100a and 100b, respectively.
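- A short Python sketch shows how such left/right signals can be turned into a focus error. The synthetic signals and the correlation-based shift estimate are illustrative assumptions; the patent describes only the pixel layout, not the detection algorithm.

```python
import numpy as np

def phase_shift(left: np.ndarray, right: np.ndarray) -> int:
    """Estimate the lateral shift between the line signals of pixels 100a
    (right half effective) and 100b (left half effective). Out of focus,
    light through the left and right halves of the photographic lens forms
    shifted images; the correlation peak gives the shift."""
    left = (left - left.mean()) / (left.std() + 1e-12)
    right = (right - right.mean()) / (right.std() + 1e-12)
    corr = np.correlate(left, right, mode="full")
    return int(np.argmax(corr)) - (len(right) - 1)

# Synthetic example: the "right" signal is the "left" one shifted by 3 samples.
base = np.sin(np.linspace(0, 6 * np.pi, 64))
print(phase_shift(base, np.roll(base, 3)))   # -> -3 (sign is a convention)
```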
- FIG. 12 is a cross-sectional view illustrating a configuration example of a pixel according to the fourth embodiment of the present technology.
- The figure illustrates a configuration example of the pixel 100 and the pixel 100a. These differ from the pixel 100 described in FIG. 3 in that a base layer 161 and a planarizing film 162 are provided between the color filter 142 and the insulating film 141.
- the light shielding wall 143 is disposed below the base layer 161.
- The light shielding wall 143 in the pixel 100a is disposed at a position covering the left half of the n-type semiconductor region 112 and shields that half from light. This makes the right half of the n-type semiconductor region 112 the effective region described in FIG. 11.
- the pixel 100a further includes an in-layer lens 154.
- Incident light is condensed on the n-type semiconductor region 112 by the on-chip lens 151.
- In the pixel 100a, the condensing position is further adjusted by the in-layer lens 154 so as to be the end portion 321.
- the in-layer lens 154 is an example of a lens described in the claims.
- FIG. 13 and 14 are diagrams illustrating an example of a method of manufacturing an image sensor according to the fourth embodiment of the present technology.
- the insulating film 141 is formed on the surface of the semiconductor substrate 111, and the light shielding wall 143 is formed.
- At this time, the shape of the light shielding wall 143 is changed in accordance with the region of the n-type semiconductor region 112 to be shielded (a in FIG. 13).
- an insulator film 421 serving as a material for the base layer 161 is formed (b in FIG. 13).
- Next, a resist 422 is formed on the surface of the insulator film 421.
- An opening 423 is formed in the resist 422 in the region where the in-layer lens 154 is to be formed (c in FIG. 13).
- a light transmissive member 426 is formed on the surface of the base layer 161.
- a concave portion 427 to which the shape of the concave portion 425 of the base layer 161 is transferred is formed on the surface of the formed light transmitting member 426 (e in FIG. 14).
- This process is an example of the lens arranging step described in the claims.
- a resist 428 having a flat surface is formed (f in FIG. 14), and anisotropic dry etching is performed.
- This anisotropic dry etching can be performed in the same manner as the dry etching described in FIG.
- Thereby, the in-layer lens 154 can be formed selectively in the pixel 100a (g in FIG. 14).
- This process is an example of the lens convex portion forming step described in the claims.
- the imaging element 1 can be manufactured by forming the planarization film 162 and the like.
- the in-layer lens 154 can be arranged on the pixels 100a and 100b which are phase difference pixels.
- The manufacturing processes of the remaining parts of the pixel 100a, for example the process of forming the on-chip lens 151, can be made common with the pixel 100, so the influence of manufacturing variations can be reduced.
- the configuration of the image sensor 1 in the fourth embodiment of the present technology is not limited to this example.
- For example, a color filter 142 that transmits red light or infrared light may be disposed, and the in-layer lens 154 may be disposed in the pixels 100 that detect red light or infrared light. This makes it possible to change the focal position of the pixels 100 that detect red light or infrared light.
- the configuration of the image sensor 1 is the same as the configuration of the image sensor 1 described in the first embodiment of the present technology, and thus the description thereof is omitted.
- the in-layer lens 154 is selectively disposed on some pixels 100 of the pixel array unit 10. Thereby, the manufacturing process of the image pick-up element 1 provided with a phase difference pixel can be simplified.
- FIG. 15 is a cross-sectional view illustrating a configuration example of a pixel according to a modification of the embodiment of the present technology.
- The pixel 100 in the figure differs from the pixel 100 described in FIG. 3 in that the on-chip lens 151, the base layer 144, the light shielding wall 143, the color filter 142, and the insulating film 141 are disposed adjacent to the wiring region composed of the wiring layer 122 and the insulating layer 121.
- That is, incident light enters the n-type semiconductor region 112 from the front surface of the semiconductor substrate 111, the surface on which the wiring region is formed.
- The on-chip lens 151 in the figure can be formed by the same method as the on-chip lens 151 described above. Even in such a front-illuminated imaging element 1, the on-chip lens 151 having an elliptical cross section can be easily formed.
- the configuration of the image sensor 1 is the same as the configuration of the image sensor 1 described in the first embodiment of the present technology, and thus the description thereof is omitted.
- In this way, the imaging element 1 according to the modification of the embodiment of the present technology simplifies the method of manufacturing the on-chip lens in an imaging element configured as a front-illuminated type.
- the present technology can be applied to various products.
- the present technology may be realized as an imaging element mounted on an imaging device such as a camera.
- FIG. 16 is a block diagram illustrating a schematic configuration example of a camera that is an example of an imaging apparatus to which the present technology can be applied.
- The camera 1000 shown in the figure includes a lens 1001, an image sensor 1002, an imaging control unit 1003, a lens driving unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, a display unit 1008, and a recording unit 1009.
- the lens 1001 is a photographing lens of the camera 1000.
- the lens 1001 collects light from the subject and makes it incident on an image sensor 1002 described later to form an image of the subject.
- the imaging element 1002 is a semiconductor element that images light from the subject condensed by the lens 1001.
- the image sensor 1002 generates an analog image signal corresponding to the irradiated light, converts it into a digital image signal, and outputs it.
- the imaging control unit 1003 controls imaging in the imaging element 1002.
- the imaging control unit 1003 controls the imaging element 1002 by generating a control signal and outputting the control signal to the imaging element 1002.
- the imaging control unit 1003 can perform autofocus in the camera 1000 based on the image signal output from the imaging element 1002.
- Autofocus is a system that detects the focal position of the lens 1001 and adjusts it automatically.
- As the autofocus method, a method in which an image plane phase difference detected by phase difference pixels arranged in the image sensor 1002 is used to detect the focal position (image plane phase difference autofocus), or a method in which the position where the contrast of the image is highest is detected as the focal position (contrast autofocus), can be applied.
- the imaging control unit 1003 adjusts the position of the lens 1001 via the lens driving unit 1004 based on the detected focal position, and performs autofocus.
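- Of the two methods, contrast autofocus is easy to sketch in a few lines of Python. Everything below, the Laplacian focus measure, the striped test target, and the hypothetical capture_at lens interface, is an illustrative assumption, not the camera 1000's actual firmware.

```python
import numpy as np

def contrast_metric(image: np.ndarray) -> float:
    """Focus measure: variance of a discrete Laplacian (sharper -> larger)."""
    lap = (-4 * image[1:-1, 1:-1] + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return float(lap.var())

def contrast_autofocus(capture_at, positions):
    """Drive the lens through 'positions' and keep the one whose frame has
    the highest contrast, as in the contrast autofocus described above."""
    return max(positions, key=lambda p: contrast_metric(capture_at(p)))

# Hypothetical stand-in for "move the lens to p and capture a frame": a
# striped target blurred more the farther p is from the true focus at 5.
def fake_capture(p):
    img = np.zeros((32, 32))
    img[:, 8:24:4] = 1.0                   # vertical stripes
    sigma = abs(p - 5) + 0.1
    k = np.exp(-np.arange(-4, 5) ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    return np.apply_along_axis(lambda row: np.convolve(row, k, "same"), 1, img)

print(contrast_autofocus(fake_capture, positions=range(11)))  # -> 5
```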
- the imaging control unit 1003 can be configured by, for example, a DSP (Digital Signal Processor) equipped with firmware.
- the lens driving unit 1004 drives the lens 1001 based on the control of the imaging control unit 1003.
- the lens driving unit 1004 can drive the lens 1001 by changing the position of the lens 1001 using a built-in motor.
- The image processing unit 1005 processes the image signal generated by the image sensor 1002. This processing includes, for example, demosaicing, which generates the missing color signals so that each pixel has image signals corresponding to red, green, and blue; noise reduction, which removes noise from the image signal; and encoding of the image signal.
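- As an illustration of the demosaicing step, here is a minimal bilinear demosaic in Python. The RGGB Bayer layout and the interpolation kernel are common choices assumed for the sketch; the patent does not specify the algorithm used by the image processing unit 1005.

```python
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(raw: np.ndarray) -> np.ndarray:
    """Fill in each pixel's two missing colors by averaging the nearest
    samples of that color (an RGGB Bayer layout is assumed)."""
    h, w = raw.shape
    masks = np.zeros((h, w, 3))
    masks[0::2, 0::2, 0] = 1                 # R sites
    masks[0::2, 1::2, 1] = 1                 # G sites (even rows)
    masks[1::2, 0::2, 1] = 1                 # G sites (odd rows)
    masks[1::2, 1::2, 2] = 1                 # B sites
    kernel = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
    out = np.empty((h, w, 3))
    for c in range(3):
        num = convolve2d(raw * masks[:, :, c], kernel, mode="same")
        den = convolve2d(masks[:, :, c], kernel, mode="same")
        out[:, :, c] = num / np.maximum(den, 1e-12)   # mean of known samples
    return out

# Example: a flat gray scene seen through the mosaic demosaics back to gray.
print(bilinear_demosaic(np.full((8, 8), 0.5))[2:4, 2:4])
```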
- the image processing unit 1005 can be configured by, for example, a microcomputer equipped with firmware.
- the operation input unit 1006 receives an operation input from the user of the camera 1000.
- the operation input unit 1006 for example, a push button or a touch panel can be used.
- the operation input received by the operation input unit 1006 is transmitted to the imaging control unit 1003 and the image processing unit 1005. Thereafter, processing according to the operation input, for example, processing such as imaging of a subject is started.
- the frame memory 1007 is a memory for storing frames that are image signals for one screen.
- the frame memory 1007 is controlled by the image processing unit 1005 and holds a frame in the course of image processing.
- the display unit 1008 displays the image processed by the image processing unit 1005.
- a liquid crystal panel can be used for the display unit 1008.
- the recording unit 1009 records the image processed by the image processing unit 1005.
- For the recording unit 1009, for example, a memory card or a hard disk can be used.
- The camera to which the present technology can be applied has been described above.
- the present technology can be applied to the image sensor 1002 among the configurations described above.
- the image sensor 1 described in FIG. 1 can be applied to the image sensor 1002.
- a manufacturing method of the image sensor 1002 in which pixels including an on-chip lens having an elliptical cross section are arranged can be simplified.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- FIG. 17 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology (present technology) according to the present disclosure can be applied.
- FIG. 17 shows a state in which an operator (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic operation system 11000.
- The endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
- the endoscope 11100 includes a lens barrel 11101 in which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
- In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
- An opening into which the objective lens is fitted is provided at the tip of the lens barrel 11101.
- A light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens.
- Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
- the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
- the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example.
- the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
- the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light to the endoscope 11100 when photographing a surgical site or the like.
- the input device 11204 is an input interface for the endoscopic surgery system 11000.
- a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
- For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (the type of irradiation light, the magnification, the focal length, and the like) via the input device 11204.
- the treatment instrument control device 11205 controls the drive of the energy treatment instrument 11112 for tissue ablation, incision, blood vessel sealing, or the like.
- the pneumoperitoneum device 11206 passes gas into the body cavity via the pneumoperitoneum tube 11111.
- the recorder 11207 is an apparatus capable of recording various types of information related to surgery.
- the printer 11208 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
- the light source device 11203 that supplies the irradiation light when imaging the surgical site to the endoscope 11100 can be configured from a white light source configured by, for example, an LED, a laser light source, or a combination thereof.
- When the white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
- In this case, laser light from each of the RGB laser light sources is irradiated onto the observation target in a time-division manner, and the driving of the image sensor of the camera head 11102 is controlled in synchronization with the irradiation timing, so that images corresponding to R, G, and B can be captured in time division. According to this method, a color image can be obtained without providing color filters in the image sensor.
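- This field-sequential color capture can be sketched in Python: three monochrome frames, each exposed while one laser fires, are stacked into an RGB image. The scene and channel weights below are synthetic stand-ins for real exposures.

```python
import numpy as np

def field_sequential_color(frames: dict) -> np.ndarray:
    """Assemble an RGB image from three monochrome exposures captured while
    the R, G, and B lasers fire one at a time, so no on-chip color filter
    is required."""
    return np.stack([frames["R"], frames["G"], frames["B"]], axis=-1)

# Synthetic stand-in for three synchronized exposures of the same scene.
scene = np.random.default_rng(0).random((16, 16))
exposures = {c: scene * w for c, w in zip("RGB", (0.9, 0.7, 0.4))}
print(field_sequential_color(exposures).shape)   # -> (16, 16, 3)
```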
- The driving of the light source device 11203 may also be controlled so as to change the intensity of the output light at predetermined intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the intensity changes, acquiring images in time division, and combining them, a high-dynamic-range image free of so-called blocked-up shadows and blown-out highlights can be generated.
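- The synthesis step can be sketched as a weighted merge in linear units. The hat-shaped weighting and the illumination gains below are illustrative assumptions, not the actual algorithm of the CCU 11201; the idea is simply that saturated or near-black samples get low weight, so neither blown-out highlights nor blocked-up shadows dominate.

```python
import numpy as np

def merge_hdr(frames, gains, full_scale=1.0):
    """Merge frames captured under different illumination intensities into
    one high-dynamic-range image (simple weighted average, a sketch only)."""
    num = np.zeros_like(frames[0], dtype=float)
    den = np.zeros_like(frames[0], dtype=float)
    for frame, gain in zip(frames, gains):
        w = 1.0 - np.abs(2.0 * frame / full_scale - 1.0) ** 2  # hat weighting
        num += w * frame / gain       # normalize back to a common exposure
        den += w
    return num / np.maximum(den, 1e-12)

# Example: the same scene lit at full and at quarter intensity, clipped at 1.
scene = np.linspace(0.0, 3.0, 8)
frames = [np.clip(scene * g, 0.0, 1.0) for g in (1.0, 0.25)]
print(merge_hdr(frames, gains=(1.0, 0.25)).round(2))   # recovers values > 1
```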
- the light source device 11203 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
- In special light observation, for example, so-called narrow band imaging is performed: by exploiting the wavelength dependence of light absorption in body tissue and irradiating light in a narrower band than the irradiation light used during normal observation (that is, white light), predetermined tissue such as blood vessels in the surface layer of the mucous membrane is imaged with high contrast.
- fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating excitation light.
- In fluorescence observation, the body tissue can be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally administered to the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
- the light source device 11203 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
- FIG. 18 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG.
- the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
- the CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413.
- the camera head 11102 and the CCU 11201 are connected to each other by a transmission cable 11400 so that they can communicate with each other.
- the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
- the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
- the imaging unit 11402 includes an imaging element.
- The imaging unit 11402 may include one image sensor (a so-called single-plate type) or a plurality of image sensors (a so-called multi-plate type).
- In the case of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective image sensors and combined to obtain a color image.
- the imaging unit 11402 may be configured to include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to 3D (Dimensional) display. By performing the 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site.
- the imaging unit 11402 is not necessarily provided in the camera head 11102.
- the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
- the driving unit 11403 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Thereby, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
- the communication unit 11404 is configured by a communication device for transmitting and receiving various types of information to and from the CCU 11201.
- the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
- The imaging conditions such as the frame rate, exposure value, magnification, and focus may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 11100.
- the camera head control unit 11405 controls driving of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
- the communication unit 11411 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 11102.
- the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
- the control unit 11413 performs various types of control related to imaging of the surgical site by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
- The control unit 11413 also causes the display device 11202 to display a captured image showing the surgical site or the like, based on the image signal that has undergone image processing by the image processing unit 11412.
- the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
- For example, by detecting the shapes, colors, edges, and the like of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and so on.
- The control unit 11413 may use the recognition result to display various types of surgery support information superimposed on the image of the surgical site. Presenting the superimposed surgery support information to the operator 11131 reduces the operator's burden and allows the operator to proceed with the surgery reliably.
- the transmission cable 11400 for connecting the camera head 11102 and the CCU 11201 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
- In the illustrated example, communication between the camera head 11102 and the CCU 11201 is performed by wire using the transmission cable 11400, but it may also be performed wirelessly.
- the technology according to the present disclosure can be applied to the imaging unit 11402 among the configurations described above.
- the imaging device 1 in FIG. 1 can be applied to the imaging unit 11402.
- the cost of the endoscopic surgery system can be reduced.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- FIG. 19 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
- the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
- for example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
- radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
- the body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
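As a rough sketch of the input routing just described, with a hypothetical `vehicle` interface and key IDs (nothing here comes from the disclosure):

```python
AUTHORIZED_KEY_IDS = {"key-01", "key-02"}  # hypothetical paired key fobs

def on_body_input(source: str, payload: str, vehicle) -> None:
    """Dispatch radio/switch inputs to door locks, windows, and lamps.

    `vehicle` is a hypothetical object exposing door_lock, power_window,
    and lamps controllers; this is illustrative routing only.
    """
    if source == "radio" and payload in AUTHORIZED_KEY_IDS:
        vehicle.door_lock.unlock()
    elif source == "switch" and payload == "window_up":
        vehicle.power_window.raise_window()
    elif source == "switch" and payload == "headlamp":
        vehicle.lamps.toggle("headlamp")
```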
- the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
- the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
- the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
- the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, or the like based on the received image.
- the vehicle interior information detection unit 12040 detects vehicle interior information.
- a driver state detection unit 12041 that detects the driver's state is connected to the vehicle interior information detection unit 12040.
- the driver state detection unit 12041 includes, for example, a camera that images the driver. Based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
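One common way to turn per-frame eye-openness measurements into such a dozing decision is a PERCLOS-style rolling ratio; the sketch below assumes a per-frame measurement is available, and all thresholds are illustrative.

```python
from collections import deque

class DrowsinessEstimator:
    """PERCLOS-style estimate: fraction of recent frames with eyes closed."""

    def __init__(self, window_frames: int = 900, threshold: float = 0.3):
        self.closed = deque(maxlen=window_frames)  # ~30 s at 30 fps
        self.threshold = threshold

    def update(self, eye_openness: float) -> bool:
        """eye_openness in [0, 1] from the driver camera; True means dozing."""
        self.closed.append(eye_openness < 0.2)  # count frame as eyes-closed
        perclos = sum(self.closed) / len(self.closed)
        return perclos > self.threshold
```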
- the microcomputer 12051 can calculate a control target value for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010.
- the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, following traveling based on the inter-vehicle distance, vehicle-speed-maintaining traveling, collision warning of the vehicle, lane departure warning of the vehicle, and the like.
- the microcomputer 12051 can perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
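A minimal sketch of computing one such control target value, here a longitudinal drive/brake command from the measured gap to a preceding vehicle; the PD gains and the interface are assumptions, not disclosed values.

```python
def longitudinal_command(gap_m: float, closing_speed_mps: float,
                         desired_gap_m: float = 25.0) -> float:
    """PD-style control target in [-1, 1]: positive = drive force, negative = brake."""
    kp, kd = 0.05, 0.2  # illustrative gains, not calibrated values
    u = kp * (gap_m - desired_gap_m) - kd * closing_speed_mps
    return max(-1.0, min(1.0, u))

# e.g. longitudinal_command(15.0, 2.0) -> -0.9 (brake: too close and still closing)
```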
- the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle outside information detection unit 12030.
- the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
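The anti-glare rule above reduces to a simple decision; in this sketch the detection record format and the 400 m range are assumptions.

```python
def select_beam(detections: list[dict]) -> str:
    """Return 'low' when a preceding or oncoming vehicle is within range."""
    for d in detections:  # each dict: {"type": str, "distance_m": float}
        if d["type"] in ("oncoming_vehicle", "preceding_vehicle") and d["distance_m"] < 400:
            return "low"
    return "high"
```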
- the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
- the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
- FIG. 20 is a diagram illustrating an example of an installation position of the imaging unit 12031.
- the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
- the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
- the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
- the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
- the forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 20 shows an example of the shooting range of the imaging units 12101 to 12104.
- the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above is obtained (see the sketch below).
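Such an overhead image is typically produced by warping each camera image onto a common ground plane and compositing; a minimal sketch, assuming pre-calibrated 3x3 homographies (the calibration itself is outside the disclosure):

```python
import cv2
import numpy as np

def birds_eye_view(frames: list[np.ndarray],
                   homographies: list[np.ndarray],
                   size: tuple[int, int] = (800, 800)) -> np.ndarray:
    """Warp each camera image to the ground plane and composite (illustrative)."""
    canvas = np.zeros((size[1], size[0], 3), dtype=np.uint8)
    for frame, H in zip(frames, homographies):
        warped = cv2.warpPerspective(frame, H, size)
        canvas = np.maximum(canvas, warped)  # naive overlap blending
    return canvas
```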
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
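For the stereo-camera case, distance follows from triangulation, Z = f·B/d; a sketch with illustrative calibration values:

```python
def stereo_distance_m(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Triangulated distance Z = f * B / d for a calibrated stereo pair."""
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: no valid match")
    return focal_px * baseline_m / disparity_px

# e.g. stereo_distance_m(12.5, focal_px=1400.0, baseline_m=0.3) == 33.6 (meters)
```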
- based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that lies on the traveling path of the vehicle 12100 and travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more).
- furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance behind the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation; a sketch of the preceding-vehicle selection follows.
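The selection rule described above (nearest on-path object moving in the same direction at or above a minimum speed) can be sketched as follows; the object-record fields are assumptions made for illustration.

```python
def extract_preceding_vehicle(objects: list[dict], min_speed_mps: float = 0.0):
    """Return the nearest qualifying three-dimensional object, or None."""
    candidates = [o for o in objects
                  if o["on_path"] and o["same_direction"]
                  and o["ground_speed_mps"] >= min_speed_mps]
    return min(candidates, key=lambda o: o["distance_m"], default=None)
```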
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the captured images of the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the captured images of the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether the object is a pedestrian.
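As a stand-in for the feature-extraction plus pattern-matching procedure above, OpenCV's stock HOG+SVM people detector gives a comparable pipeline; it is not the disclosed method, and the confidence threshold is an assumption.

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(frame):
    """Return candidate pedestrian boxes as (x, y, w, h) tuples."""
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    return [tuple(b) for b, w in zip(boxes, weights) if float(w) > 0.5]
```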
- when the microcomputer 12051 determines that a pedestrian is present in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the sound image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed and displayed on the recognized pedestrian.
- the sound image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
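The emphasis overlay then reduces to drawing the contour rectangle on the display frame; a minimal sketch reusing the box format assumed above:

```python
import cv2

def overlay_pedestrians(frame, boxes):
    """Superimpose an emphasizing rectangle on each recognized pedestrian."""
    out = frame.copy()
    for (x, y, w, h) in boxes:
        cv2.rectangle(out, (x, y), (x + w, y + h), (0, 0, 255), 2)
    return out
```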
- the technology according to the present disclosure can be applied to the imaging unit 12031 and the like among the configurations described above.
- the imaging device 1 in FIG. 1 can be applied to the imaging unit 12031.
- this technology can also have the following configurations. (1) An imaging device comprising: a pixel that generates an image signal according to incident light; a base layer that is disposed on the pixel, transmits the incident light, and has a recess; and a lens that includes, for each pixel, a convex portion disposed adjacent to the recess of the base layer and formed based on the recess of the base layer, and that condenses irradiated light and makes it incident on the pixel through the base layer.
- the lens includes a light-transmitting member that is disposed adjacent to the base layer and has a second recess formed on its surface based on the recess of the base layer, and a resist that is further disposed in the second recess.
- the imaging device further including a color filter that transmits light of a predetermined wavelength out of the condensed incident light.
- the imaging element according to any one of (1) to (4), in which the lens condenses the light that has passed through the color filter.
- a method of manufacturing an imaging element, including a lens convex portion forming step of forming, on the surface of the disposed lens, a convex portion based on the recess of the base layer.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Power Engineering (AREA)
- Signal Processing (AREA)
- Electromagnetism (AREA)
- Condensed Matter Physics & Semiconductors (AREA)
- Multimedia (AREA)
- Computer Hardware Design (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Optics & Photonics (AREA)
- Solid State Image Pick-Up Elements (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
Abstract
To simplify a method of manufacturing a lens having convex portions on both surfaces within a pixel, the present imaging element is provided with pixels, a base layer, and lenses. The pixels generate image signals in response to incident light. The base layer is disposed on the pixels, transmits the incident light, and is provided with recessed portions. The lenses are provided with protruding portions that are disposed adjacent to the recessed portions of the base layer for each pixel and are formed based on the respective recessed portions of the base layer; the lenses collect irradiated light and cause it to enter the respective pixels through the base layer.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018045137A JP2019160986A (ja) | 2018-03-13 | 2018-03-13 | 撮像素子および撮像素子の製造方法 |
JP2018-045137 | 2018-03-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019176302A1 true WO2019176302A1 (fr) | 2019-09-19 |
Family
ID=67908242
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/002025 WO2019176302A1 (fr) | 2018-03-13 | 2019-01-23 | Élément d'imagerie et procédé de fabrication d'élément d'imagerie |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2019160986A (fr) |
WO (1) | WO2019176302A1 (fr) |
- 2018-03-13: JP application JP2018045137A filed, published as JP2019160986A (ja), status: active, Pending
- 2019-01-23: WO application PCT/JP2019/002025 filed, published as WO2019176302A1 (fr), status: active, Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004253573A (ja) * | 2003-02-19 | 2004-09-09 | Sharp Corp | 半導体装置およびその製造方法 |
JP2010239076A (ja) * | 2009-03-31 | 2010-10-21 | Sony Corp | 固体撮像装置とその製造方法、及び電子機器 |
JP2012108327A (ja) * | 2010-11-17 | 2012-06-07 | Sharp Corp | レンズおよびその製造方法、固体撮像素子およびその製造方法、電子情報機器 |
JP2018072757A (ja) * | 2016-11-04 | 2018-05-10 | セイコーエプソン株式会社 | マイクロレンズアレイ基板およびその製造方法、電気光学装置およびその製造方法、ならびに電子機器 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114664876A (zh) * | 2022-05-25 | 2022-06-24 | 合肥晶合集成电路股份有限公司 | 一种图像传感器及其制作方法 |
CN114664876B (zh) * | 2022-05-25 | 2022-08-23 | 合肥晶合集成电路股份有限公司 | 一种图像传感器及其制作方法 |
Also Published As
Publication number | Publication date |
---|---|
JP2019160986A (ja) | 2019-09-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110199394B (zh) | 图像传感器及图像传感器的制造方法 | |
WO2019220945A1 (fr) | Élément d'imagerie et dispositif électronique | |
WO2018051604A1 (fr) | Élément imageur à semi-conducteurs, dispositif imageur, et procédé de fabrication d'élément imageur à semi-conducteurs | |
WO2020137285A1 (fr) | Élément d'imagerie et procédé de fabrication d'élément d'imagerie | |
WO2020137370A1 (fr) | Appareil d'imagerie à semi-conducteur et dispositif électronique | |
WO2019155782A1 (fr) | Dispositif à semi-conducteur et procédé de fabrication de dispositif à semi-conducteur | |
US20230008784A1 (en) | Solid-state imaging device and electronic device | |
JP7544601B2 (ja) | 撮像素子および撮像装置 | |
JP2019091745A (ja) | 撮像素子および撮像装置 | |
WO2019207978A1 (fr) | Élément de capture d'image et procédé de fabrication d'élément de capture d'image | |
US20240088189A1 (en) | Imaging device | |
US11417696B2 (en) | Imaging element comprising polarization unit with a conductive member as an electrode for a charge holder | |
WO2019181466A1 (fr) | Élément d'imagerie et dispositif électronique | |
WO2019176302A1 (fr) | Élément d'imagerie et procédé de fabrication d'élément d'imagerie | |
WO2023042462A1 (fr) | Dispositif de détection de lumière, procédé de fabrication de dispositif de détection de lumière et instrument électronique | |
WO2021045139A1 (fr) | Élément d'imagerie et dispositif d'imagerie | |
WO2019235230A1 (fr) | Élément d'imagerie et dispositif électronique | |
WO2024166667A1 (fr) | Dispositif de détection de lumière et appareil électronique | |
WO2022249678A1 (fr) | Dispositif d'imagerie à semi-conducteurs et son procédé de fabrication | |
WO2024057805A1 (fr) | Élément d'imagerie et dispositif électronique | |
WO2023119840A1 (fr) | Élément d'imagerie, procédé de fabrication d'élément d'imagerie et dispositif électronique | |
WO2024057814A1 (fr) | Dispositif de détection de lumière et instrument électronique | |
WO2024084991A1 (fr) | Photodétecteur, appareil électronique et élément optique | |
WO2024057806A1 (fr) | Dispositif d'imagerie et appareil électronique | |
WO2023021740A1 (fr) | Élément d'imagerie, dispositif d'imagerie et procédé de production |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19766947; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19766947; Country of ref document: EP; Kind code of ref document: A1 |