
WO2022102549A1 - Solid-state imaging device

Solid-state imaging device

Info

Publication number
WO2022102549A1
WO2022102549A1 (PCT/JP2021/040862)
Authority
WO
WIPO (PCT)
Prior art keywords
solid-state image
pixel
layer
semiconductor layer
Prior art date
Application number
PCT/JP2021/040862
Other languages
French (fr)
Japanese (ja)
Inventor
英一郎 越河
睦聡 田代
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Priority to CN202180074178.9A (published as CN116406478A)
Publication of WO2022102549A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50: Control of the SSIS exposure
    • H04N25/57: Control of the dynamic range
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70: SSIS architectures; Circuits associated therewith
    • H04N25/76: Addressed sensors, e.g. MOS or CMOS sensors
    • H: ELECTRICITY
    • H10: SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10F: INORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F39/00: Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F39/10: Integrated devices
    • H10F39/12: Image sensors

Definitions

  • This disclosure relates to a solid-state image sensor.
  • Patent Document 1 discloses a light receiving element array in which a plurality of light receiving portions having a bandgap energy corresponding to a near infrared wavelength region are arranged.
  • The light receiving portion of this light receiving element array has a pn junction at the tip of a p-type region formed by selective diffusion, and an n-type region is provided between the light receiving portions to divide the light receiving portions for each pixel.
  • Since the light receiving element array disclosed in Patent Document 1 has the same light receiving sensitivity for each pixel, miniaturizing the pixel structure lowers the light receiving sensitivity as a whole, so the dynamic range cannot be widened.
  • the present disclosure provides a solid-state image sensor capable of expanding the dynamic range even if the pixel structure is miniaturized.
  • In order to solve the above problems, according to the present disclosure, there is provided a solid-state image sensor comprising: a first semiconductor layer having a first conductive type first compound semiconductor material; a photoelectric conversion layer arranged on the first semiconductor layer; a second conductive type second semiconductor layer partially arranged on the photoelectric conversion layer and arranged so as to be in contact with the photoelectric conversion layer; and an element separation layer that separates the first semiconductor layer, the photoelectric conversion layer, and the second semiconductor layer on a pixel-by-pixel basis, wherein the pixels have a plurality of sub-pixels having different sensitivities and sizes.
  • the second semiconductor layer has a plurality of well regions separated from each other corresponding to each of the plurality of subpixels.
  • the plurality of well regions may have different sizes.
  • the plurality of well regions may have different areas when viewed in a plan view along the stacking direction of the second semiconductor layer.
  • the plane sizes of the plurality of well regions may be different from each other.
  • the plurality of well regions may be arranged in the corresponding pixels so as to be oriented in the same direction.
  • the plurality of well regions may be arranged in different directions in the corresponding pixels.
  • a light-shielding layer arranged in at least a part of the element separation layer may be provided.
  • the light-shielding layer may be arranged at the boundary position of the pixel.
  • the light-shielding layer may be arranged at the boundary position of each of the plurality of sub-pixels in the pixel.
  • the light-shielding layer may be arranged so as to surround each of the plurality of sub-pixels when viewed in a plan view along the stacking direction of the second semiconductor layer.
  • the areas of the plurality of sub-pixels surrounded by the light-shielding layer in the pixels in a plan view along the stacking direction may be different from each other.
  • a light-shielding layer arranged at the boundary position of each of the plurality of sub-pixels in the pixel is provided.
  • the plurality of well regions may be regions separated by the light-shielding layer.
  • the areas of the plurality of sub-pixels divided by the light-shielding layer in the pixels in a plan view along the stacking direction are different from each other.
  • the areas of the plurality of well regions in the pixel viewed in a plane along the stacking direction may be the same.
  • the photoelectric conversion layer of the plurality of subpixels in the pixel may contain a different material for each subpixel.
  • The material may contain InGaAs or CuInGaSe2.
  • a third semiconductor layer arranged on the photoelectric conversion layer and having the first conductive type second compound semiconductor material is provided.
  • the second semiconductor layer may be arranged on the third semiconductor layer.
  • FIG. 1B is a cross-sectional view in the diagonal direction of FIG. 1A.
  • FIG. 2 is a block diagram showing the schematic configuration of the solid-state image sensor according to this embodiment.
  • FIG. 4B is a cross-sectional view in the diagonal direction of FIG. 4A.
  • FIGS. 7B to 7E are potential diagrams, each showing the state at the time following the preceding figure.
  • Since the bandgap energy of silicon is 1.1 eV, a solid-state image pickup device using silicon as a base substrate cannot detect infrared light having a wavelength longer than about 1.1 μm.
  • In contrast, a group III-V compound semiconductor can have a bandgap energy corresponding to the near infrared wavelength region. Therefore, the solid-state image sensor according to the present embodiment uses a group III-V compound semiconductor as a base substrate to enable image pickup of near-infrared light.
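  • As a numerical cross-check of this wavelength limit, the cutoff wavelength of a photodetector follows from λc = hc/Eg. Below is a minimal sketch (Python); the 0.74 eV InGaAs bandgap is an assumed typical value for lattice-matched In0.53Ga0.47As, not a figure from this disclosure:

        # Cutoff wavelength of a photodiode from its bandgap: lambda_c = h*c / Eg.

        H_C_EV_UM = 1.2398  # h*c expressed in eV*um

        def cutoff_wavelength_um(bandgap_ev: float) -> float:
            """Longest detectable wavelength (um) for a given bandgap (eV)."""
            return H_C_EV_UM / bandgap_ev

        print(f"Si     (1.10 eV): {cutoff_wavelength_um(1.10):.2f} um")  # 1.13 um
        print(f"InGaAs (0.74 eV): {cutoff_wavelength_um(0.74):.2f} um")  # 1.68 um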
  • FIG. 1A is a plan layout view of the solid-state image sensor 1 according to the first embodiment
  • FIG. 1B is a cross-sectional view in the diagonal direction of FIG. 1A.
  • FIGS. 1A and 1B schematically show the characteristic portions of the solid-state image sensor 1 according to the present embodiment, and some members are omitted or simplified. Also, the horizontal and vertical size ratios in FIGS. 1A and 1B may differ from the actual solid-state image sensor 1.
  • the base substrate 2 of the solid-state imaging device 1 is a substrate made of a group III-V semiconductor such as GaAs, InP, GaN, AlN, GaP, GaSb, InSb, and InAs.
  • A first semiconductor layer (hereinafter also referred to as a first compound semiconductor layer) 3 having a first conductive type (for example, n-type) first compound semiconductor material is arranged on the base substrate 2.
  • the first semiconductor layer 3 is made of, for example, InP (specifically, n-InP).
  • the photoelectric conversion layer 4 is arranged on the first semiconductor layer 3.
  • the photoelectric conversion layer 4 is made of, for example, InGaAs (specifically, n-InGaAs).
  • A third semiconductor layer 5 having a first conductive type second compound semiconductor material is arranged on the photoelectric conversion layer 4.
  • the third semiconductor layer 5 is made of, for example, InP (specifically, n-InP).
  • the third semiconductor layer 5 may be omitted.
  • The laminated first semiconductor layer 3, photoelectric conversion layer 4, and third semiconductor layer 5 are collectively referred to as a photoelectric conversion unit 6.
  • a plurality of photoelectric conversion units 6 are arranged in the plane direction of the substrate 2, and an element separation layer 7 is arranged between two adjacent photoelectric conversion units 6. Further, a semiconductor layer made of the same material as the element separation layer 7 is also arranged on the third semiconductor layer 5.
  • the element separation layer 7 is made of, for example, InP (specifically, n-InP).
  • the energy gap of the element separation layer 7 can be made wider than that of the photoelectric conversion layer 4.
  • a second conductive type (for example, p-type) second semiconductor layer 8 is arranged on each photoelectric conversion unit 6.
  • the second semiconductor layer 8 is partially arranged on the corresponding photoelectric conversion unit 6.
  • The second semiconductor layer 8 is, for example, a region in which p-type impurity ions (for example, Zn ions) are selectively diffused into a semiconductor layer made of the same material as the element separation layer 7.
  • the partially arranged second semiconductor layer 8 may be referred to as a well region.
  • a plurality of well regions having different sizes are arranged on the photoelectric conversion layer 4.
  • the areas of the plurality of well regions in the pixel P are different when viewed in a plan view along the stacking direction of the second semiconductor layer 8.
  • the plane sizes of the plurality of well regions in the pixel P are different from each other.
  • the first electrode layer 9 is arranged on the second semiconductor layer 8. Further, the second electrode layer 10 is partially arranged on the element separation layer 7. The periphery of the first electrode layer 9 and the second electrode layer 10 is covered with a coating layer 11 made of an insulating material.
  • the first electrode layer 9 and the second electrode layer 10 are connected to the drive substrate 13 by a joining member 12 such as a bump, a Cu—Cu joint, or a via.
  • A readout circuit, a signal processing circuit, and the like are arranged on the drive substrate 13.
  • a flattening layer 14 is arranged on a surface of the base substrate 2 opposite to the surface on which the photoelectric conversion portion 6 is formed, and a microlens 15 is arranged on the flattening layer 14.
  • the light incident through the microlens 15 is photoelectrically converted by the photoelectric conversion unit 6.
  • Since the microlens 15 is arranged on the back surface side, the solid-state image sensor 1 according to the present embodiment is of the back-illuminated type.
  • the broken line frame shown in FIG. 1B indicates the boundary of the pixel P.
  • An element separation layer 7 is provided at the boundary of the pixel P.
  • the solid-state image sensor 1 according to the present embodiment has a plurality of sub-pixels having different sensitivities and sizes for each pixel P.
  • FIGS. 1A and 1B show an example in which one pixel P has two subpixels (hereinafter referred to as a first subpixel SP1 and a second subpixel SP2).
  • The number of sub-pixels provided in one pixel P is not limited to two and may be three or more, but in the present specification, an example in which one pixel P has two sub-pixels (the first subpixel SP1 and the second subpixel SP2) will be mainly described.
  • FIGS. 1A and 1B show an example in which the size of the first subpixel SP1 is larger than the size of the second subpixel SP2.
  • Here, the size refers to the volume of the photoelectric conversion unit 6 in each of the first subpixel SP1 and the second subpixel SP2. That is, in FIGS. 1A and 1B, the volume of the photoelectric conversion unit 6 in the first subpixel SP1 is larger than the volume of the photoelectric conversion unit 6 in the second subpixel SP2.
  • As the volume of the photoelectric conversion unit 6 is made larger, the sensitivity improves. Therefore, the first subpixel SP1 has a higher sensitivity than the second subpixel SP2.
  • Since each pixel P has, in addition to the second subpixel SP2, a first subpixel SP1 having a higher sensitivity than the second subpixel SP2, the dynamic range of each pixel P can be widened.
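  • As an illustrative calculation (the numbers here are assumed, not specified in this disclosure): if the larger photoelectric conversion volume makes the first subpixel SP1 about 16 times as sensitive as the second subpixel SP2, combining the two responses can extend the usable dynamic range by roughly 20·log10(16) ≈ 24 dB compared with a pixel having only a single sensitivity.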
  • the first subpixel SP1 and the second subpixel SP2 in the pixel P may contain different materials.
  • The material constituting each subpixel may contain, for example, InGaAs or CuInGaSe2.
  • FIG. 2 is a block diagram showing a schematic configuration of the solid-state image sensor 1 according to the present embodiment.
  • The solid-state image sensor 1 according to the present embodiment includes a pixel array unit 21, a vertical drive circuit 22, a column signal processing circuit 23, a horizontal drive circuit 24, an output circuit 25, and a drive control circuit 26.
  • the pixel array unit 21 has a plurality of pixels P arranged in the horizontal direction and the vertical direction.
  • a row selection line L1 is arranged for each pixel row composed of a plurality of pixels P arranged in the horizontal direction.
  • the plurality of row selection lines L1 extending in the horizontal direction are driven by the vertical drive circuit 22.
  • A signal line L2 is arranged for each pixel column composed of a plurality of pixels P arranged in the vertical direction.
  • a plurality of signal lines L2 extending in the vertical direction are connected to the column signal processing circuit 23.
  • the column signal processing circuit 23 performs signal processing such as removal of noise components included in the signal voltage read from each pixel P via the signal line L2 and adjustment of the signal amplitude.
  • the output signal of the column signal processing circuit 23 is output via the output circuit 25.
  • the column signal processing circuit 23 may perform processing for converting the signal voltage of each signal line L2 into a digital signal.
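  • As a schematic illustration of such column processing, the sketch below (Python; the function name, voltage levels, and bit depth are hypothetical, not taken from this patent) subtracts a sampled reset level from the signal level to remove the offset noise component and quantizes the difference to a digital code:

        # Hypothetical sketch of column signal processing: reset-level
        # subtraction (noise removal) followed by A/D conversion.

        def column_process(signal_v: float, reset_v: float,
                           vref: float = 1.0, bits: int = 12) -> int:
            """Return the digital code for one pixel read on a signal line."""
            amplitude = reset_v - signal_v              # net photo-signal
            amplitude = min(max(amplitude, 0.0), vref)  # clamp to ADC range
            return round(amplitude / vref * (2 ** bits - 1))

        # Example: reset level 0.9 V, signal level 0.4 V -> roughly mid-scale
        print(column_process(signal_v=0.4, reset_v=0.9))  # 2048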
  • the drive control circuit 26 generates a clock signal and a control signal that serve as a reference for the operation of the vertical drive circuit 22, the column signal processing circuit 23, and the horizontal drive circuit 24, based on the vertical synchronization signal, the horizontal synchronization signal, and the master clock.
  • the generated clock signal and control signal are input to the vertical drive circuit 22, the column signal processing circuit 23, and the horizontal drive circuit 24.
  • Each pixel P in the pixel array unit 21 has a plurality of subpixels (first subpixel SP1 and second subpixel SP2) as described above.
  • the electrical signals photoelectrically converted by each of the first subpixel SP1 and the second subpixel SP2 in the same pixel P are input to the column signal processing circuit 23 via the same signal line L2 at the same timing.
  • Alternatively, the electrical signals photoelectrically converted by the first subpixel SP1 and the second subpixel SP2 may be input to the column signal processing circuit 23 via the same signal line L2 at different times. Since the first subpixel SP1 and the second subpixel SP2 have different sensitivities, combining the electrical signal photoelectrically converted by the first subpixel SP1 with the electrical signal photoelectrically converted by the second subpixel SP2 can further expand the dynamic range of photoelectric conversion.
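  • To illustrate how two readouts with different sensitivities can be combined, here is a minimal merging sketch (Python). The sensitivity ratio, full-scale code, and merge rule are assumptions for illustration only, not values from this disclosure:

        # Hypothetical merge of a high-sensitivity (SP1) and a low-sensitivity
        # (SP2) subpixel readout into one extended-dynamic-range value.

        SENS_RATIO = 16.0   # assumed SP1/SP2 sensitivity ratio
        FULL_SCALE = 4095   # assumed 12-bit readout full scale

        def merge_subpixels(sp1_code: int, sp2_code: int) -> float:
            """Use SP1 while it is unsaturated; otherwise scale up SP2."""
            if sp1_code < FULL_SCALE:        # SP1 still in its linear range
                return float(sp1_code)
            return sp2_code * SENS_RATIO     # SP2 covers the bright end

        print(merge_subpixels(1200, 75))     # dark scene: 1200.0 (SP1 used)
        print(merge_subpixels(4095, 1000))   # bright scene: 16000.0 (SP2 scaled)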
  • In FIG. 1A, the first subpixel SP1 and the second subpixel SP2 in each pixel P are both rectangular, and an example is shown in which the first subpixel SP1 and the second subpixel SP2 are arranged in each pixel P with their orientations aligned.
  • In FIG. 3, the first subpixel SP1 and the second subpixel SP2 in each pixel P are both rectangular, and an example is shown in which the first subpixel SP1 and the second subpixel SP2 are arranged in each pixel P in different orientations. As shown in FIG. 3, by arranging the first subpixel SP1 and the second subpixel SP2 in different orientations in each pixel P, their orientations can be adjusted so that the size of each subpixel arranged in the pixel P is made as large as possible.
  • the size ratio of the first sub-pixel SP1 and the second sub-pixel SP2 can be further increased, and the dynamic range can be further expanded.
  • the orientation of the first subpixel SP1 and the orientation of the second subpixel SP2 are offset by 45 degrees from each other, but the degree of offset is arbitrary.
  • The area of the second semiconductor layer 8 in the first subpixel SP1, viewed in a plan view from the stacking direction, is made larger than the area of the second semiconductor layer 8 in the second subpixel SP2 viewed in a plan view from the stacking direction. Since the first subpixel SP1 has a larger volume of the photoelectric conversion unit 6 than the second subpixel SP2, making the area of the second semiconductor layer 8 in the first subpixel SP1 larger allows the saturated charge amount to be made larger.
  • Although FIGS. 1A and 3 show an example in which the first subpixel SP1 and the second subpixel SP2 have a rectangular shape, the shapes of the first subpixel SP1 and the second subpixel SP2 are arbitrary. Further, the shape of the first subpixel SP1 may be different from the shape of the second subpixel SP2.
  • In the above description, all the pixels P in the pixel array unit 21 have a plurality of sub-pixels having different sizes, but only a part of the pixel group in the pixel array unit 21 may have pixels with a plurality of sub-pixels having different sizes.
  • In this way, the sensitivity of each subpixel in one pixel P can be made different, and the dynamic range of the solid-state image sensor 1 can be expanded.
  • In the first embodiment, the element separation layer 7 is arranged at the boundaries of the pixels P and the sub-pixels; however, a trench may be formed in the element separation layer 7 and filled with a light-shielding material to form a light-shielding layer, and the light-shielding layer may be arranged along the boundary of the pixel P or the boundary of the sub-pixel.
  • the solid-state image sensor 1 according to the second embodiment has the same planar layout configuration as in FIG. 1A, and has a plurality of sub-pixels in one pixel P.
  • FIG. 4A is a plan layout view of the solid-state image sensor 1 according to the second embodiment
  • FIG. 4B is a cross-sectional view in the diagonal direction of FIG. 4A.
  • The light-shielding layer 16 is arranged in at least a part of the element separation layer 7.
  • the light-shielding layer 16 is provided along the boundary of the pixel P.
  • the light-shielding layer 16 is provided in the trench formed in the element separation layer 7.
  • The material of the light-shielding layer 16 is titanium (Ti), tungsten (W), molybdenum (Mo), manganese (Mn), copper (Cu), or the like, and may be an alloy of two or more kinds of metals.
  • The material of the light-shielding layer 16 does not necessarily have to be a metal, and may be any material capable of absorbing or reflecting the light to be photoelectrically converted by the solid-state image sensor 1.
  • the light-shielding layer 16 is also arranged at the boundary position of each of the plurality of sub-pixels in the pixel P.
  • the light-shielding layer 16 is arranged so as to surround each of the plurality of sub-pixels when viewed in a plane along the stacking direction of the second semiconductor layer 8.
  • the areas of the plurality of sub-pixels surrounded by the light-shielding layer 16 in the pixel P in a plan view along the stacking direction are different from each other.
  • The light-shielding layer 16 shown in FIGS. 4A and 4B can suppress the incidence of light from an adjacent pixel P or sub-pixel, thereby reducing crosstalk.
  • the light-shielding layer 16 serves as the boundary of the pixel P and the boundary of the sub-pixel in the pixel P.
  • a plurality of regions surrounded by the light-shielding layer 16 are provided in one pixel P, and each region constitutes a different sub-pixel.
  • When there are two regions surrounded by the light-shielding layer 16 in one pixel P, one region is the first subpixel SP1 and the other region is the second subpixel SP2.
  • The area of the second semiconductor layer 8 in the first subpixel SP1 viewed in a plan view from the stacking direction and the area of the second semiconductor layer 8 in the second subpixel SP2 viewed in a plan view from the stacking direction are different from each other. Specifically, in the first subpixel SP1, which has a size larger than that of the second subpixel SP2, the area of the second semiconductor layer 8 is made larger. As a result, the saturated charge amount of the first subpixel SP1 can be further increased and the sensitivity can be further improved, so that the dynamic range of the solid-state image sensor 1 can be widened.
  • In this way, the light-shielding layer 16 is arranged along the boundaries of the pixels P and the sub-pixels, and the sizes of the plurality of regions surrounded by the light-shielding layer 16 in the pixel P are different. Not only can crosstalk be reduced by the light-shielding layer 16, but the light-shielding layer 16 also allows a plurality of sub-pixels having different sensitivities to be provided in the pixel P, so that the dynamic range of the solid-state image sensor 1 can be expanded.
  • The second embodiment shows an example in which the sizes of the plurality of subpixels in the pixel P are different and the size of the second semiconductor layer 8 in each subpixel in the pixel P is also different; however, the size of the second semiconductor layer 8 in each subpixel in the pixel P may be the same.
  • FIG. 5 is a plan layout view of the solid-state image sensor 1 according to the third embodiment.
  • the light-shielding layer 16 is arranged along the boundary of the pixel P and the boundary of the sub-pixel, as in FIGS. 4A and 4B. Further, the sizes of the plurality of sub-pixels in one pixel P are different.
  • FIG. 5 shows an example in which two subpixels (first subpixel SP1 and second subpixel SP2) having different sizes are provided in one pixel P.
  • the size of the first subpixel SP1 is larger than the size of the second subpixel SP2.
  • The area of the second semiconductor layer 8 in the first subpixel SP1 viewed in a plan view from the stacking direction is substantially the same as the area of the second semiconductor layer 8 in the second subpixel SP2 viewed in a plan view from the stacking direction. This is a major difference from the second embodiment (FIG. 4A).
  • Since the volume of the photoelectric conversion unit 6 in the first subpixel SP1 is larger than the volume of the photoelectric conversion unit 6 in the second subpixel SP2, the sensitivity of the first subpixel SP1 can be made higher than that of the second subpixel SP2. Therefore, the dynamic range of the solid-state image sensor 1 according to the third embodiment can be expanded.
  • In the second embodiment, the size of the second semiconductor layer 8 is changed according to the size of the subpixel in the pixel P, whereas in the third embodiment, the size of the second semiconductor layer 8 in each subpixel is substantially the same regardless of the size of the subpixel in the pixel P. As a result, the manufacturing process can be simplified as compared with the solid-state image pickup device 1 according to the second embodiment.
  • The signal captured by each pixel P in the solid-state image sensor 1 according to the first to third embodiments described above is sent to the signal line L2 via a readout circuit.
  • At least a part of the readout circuit is formed on, for example, the drive substrate 13 of FIG. 1B.
  • FIG. 6 is a circuit diagram showing an example of a readout circuit
  • FIGS. 7A to 7E are an operation timing diagram and potential diagrams of the readout circuit of FIG. 6.
  • The readout circuit of FIG. 6 has a current source CS1 and a capacitor C1 that equivalently represent the photoelectric conversion operation of the photoelectric conversion unit 6, transistors Q1 to Q5, capacitors C2 and C3, and a current source CS2.
  • Capacitors C1 and C2 are connected in series between the reference voltage Vtop and the ground potential.
  • the transistor Q1 is turned on when the OFG signal input to the gate is at a high level, and the potential of the connection node of the capacitors C1 and C2 is set to the reset voltage VDC.
  • the transistor Q2 is turned on when the TRG signal input to the gate is at a high level, and the charge photoelectrically converted by the photoelectric conversion unit 6 is transferred to the capacitor C3.
  • One end of the capacitor C3 is a floating diffusion FD.
  • When the transistor Q2 is turned on, the potential of the connection node of the capacitors C1 and C2 and the potential of the floating diffusion FD, which is one end of the capacitor C3, become equal.
  • the transistor Q3 is turned on when the RST signal input to the gate is at a high level to set the floating diffusion FD to the reset voltage VDC.
  • a floating diffusion FD is connected to the gate of the transistor Q4.
  • the transistors Q4 and Q5 are cascode-connected between the power supply voltage VDD and one end of the current source CS2.
  • An SEL signal is input to the gate of the transistor Q5.
  • First, with the TRG signal at a low level and the transistor Q2 turned off, the OFG signal is set to a high level to turn on the transistor Q1 and initialize the charge of the capacitor C2, and the RST signal is set to a high level to turn on the transistor Q3 and initialize the floating diffusion FD. After that, the TRG signal is temporarily raised to a high level, and the charge signal obtained by photoelectric conversion is transferred to the floating diffusion FD.
  • the RST signal is temporarily set to a high level and the transistor Q3 is temporarily turned on to set the reset level.
  • the signal is read out via the transistors Q4 and Q5.
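  • The sequence just described can be summarized as a behavioral sketch (Python). This is a simplified model under stated assumptions: the voltage and photo-charge values are arbitrary placeholders, and the reset level and signal level are returned as a pair so that their difference yields the photo-signal:

        # Simplified behavioral model of the readout sequence of FIG. 6.
        # All numeric values are placeholders, not circuit values.

        def read_pixel(photo_signal: float, vdc: float = 1.0):
            """Return (reset_level, signal_level) seen at the floating diffusion FD."""
            # 1) With TRG low (Q2 off), OFG high turns Q1 on and initializes the
            #    C1/C2 node to VDC; RST high turns Q3 on and initializes FD to VDC.
            fd = vdc
            # 2) TRG pulses high: the photoelectrically converted charge signal
            #    is transferred to FD, lowering its level.
            fd -= photo_signal
            signal_level = fd
            # 3) RST pulses high: Q3 briefly turns on and sets the reset level.
            reset_level = vdc
            # 4) With SEL high, Q4 and Q5 buffer these levels onto the signal line.
            return reset_level, signal_level

        print(read_pixel(photo_signal=0.3))  # (1.0, 0.7); the 0.3 difference is the signal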
  • The readout circuit of FIG. 6 performs a rolling operation after performing a global operation, as shown in the operation timing diagrams of FIGS. 7A to 7E.
  • In the global operation, all the pixels P in the pixel array unit 21 operate at the same time.
  • First, the transistor Q1 is turned on, and the connection node of the capacitors C1 and C2 is set to the reset voltage VDC.
  • At this time, the bottoms of the potential wells on the drain side and the source side of the transistor Q1, to whose gate the OFG signal is input, and the bottom of the potential well of the floating diffusion FD become high.
  • the OFG signal becomes low level, so that the transistor Q1 is turned off and the bottom of the potential well of the connection node between the transistor Q1 and the transistor Q2 becomes low.
  • the RST signal becomes low level, so that the transistor Q3 is turned off and the bottom of the potential well of the floating diffusion FD becomes low. Further, the amount of electrons accumulated in the potential well of the connection node between the transistor Q1 and the transistor Q2 decreases according to the amount of charge converted by photoelectric conversion. This decrease in the amount of electrons corresponds to the imaging signal.
  • Next, the TRG signal temporarily becomes a high level and then changes to a low level, so that the transistor Q2 is once turned on and then turned off.
  • Next, the RST signal temporarily becomes a high level and then changes to a low level, so that the transistor Q3 is once turned on and then turned off. As a result, the bottom of the potential well of the floating diffusion FD is once raised and reset, and then lowered.
  • the solid-state image sensor 1 includes a plurality of sub-pixels having different sizes in one pixel P.
  • The signal charges photoelectrically converted by the photoelectric conversion units 6 of these subpixels are combined by, for example, the floating diffusion FD, and are supplied to the corresponding signal line L2 via the transistors Q4 and Q5.
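  • A sketch of this charge-domain combination follows (Python; the electron counts and floating diffusion capacitance are invented for illustration): the charges from the two subpixels add on the shared floating diffusion, and the resulting voltage swing follows V = Q / C:

        # Hypothetical combination of two subpixel signal charges on a shared
        # floating diffusion (FD): charges add, and the swing is V = Q / C_fd.

        E_CHARGE = 1.602e-19  # electron charge in coulombs

        def fd_voltage(electrons_sp1: int, electrons_sp2: int,
                       c_fd_farads: float = 2e-15) -> float:
            """Voltage swing on the FD after summing both subpixel charges."""
            total_charge = (electrons_sp1 + electrons_sp2) * E_CHARGE
            return total_charge / c_fd_farads

        # 5000 e- from SP1 plus 300 e- from SP2 on an assumed 2 fF node:
        print(f"{fd_voltage(5000, 300):.3f} V")  # 0.425 V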
  • the technique according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any kind of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 8 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • the vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010.
  • The vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
  • The communication network 7010 connecting these multiple control units may be an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or the parameters used for various calculations, and a drive circuit that drives the various devices to be controlled.
  • Each control unit is provided with a network I/F for communicating with other control units via the communication network 7010, and with a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication. In FIG. 8, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio image output unit 7670, a vehicle-mounted network I/F 7680, and a storage unit 7690 are illustrated.
  • Other control units also include a microcomputer, a communication I / F, a storage unit, and the like.
  • the drive system control unit 7100 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • The drive system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • the vehicle state detection unit 7110 is connected to the drive system control unit 7100.
  • The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the wheel rotation speed, and the like.
  • the drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110, and controls an internal combustion engine, a drive motor, an electric power steering device, a brake device, and the like.
  • the body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, turn signals or fog lamps.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, may be input to the body system control unit 7200.
  • the body system control unit 7200 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
  • the battery control unit 7300 controls the secondary battery 7310, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining capacity of the battery is input to the battery control unit 7300 from the battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature control of the secondary battery 7310 or the cooling device provided in the battery device.
  • the outside information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000.
  • At least one of an image pickup unit 7410 and a vehicle exterior information detection unit 7420 is connected to the vehicle exterior information detection unit 7400.
  • the image pickup unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • The vehicle exterior information detection unit 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions, and an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environment sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • the image pickup unit 7410 and the vehicle exterior information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 9 shows an example of the installation position of the image pickup unit 7410 and the vehicle exterior information detection unit 7420.
  • The image pickup units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the following positions: the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
  • The image pickup unit 7910 provided on the front nose and the image pickup unit 7918 provided on the upper part of the windshield in the vehicle interior mainly acquire images in front of the vehicle 7900.
  • the image pickup units 7912 and 7914 provided in the side mirrors mainly acquire images of the side of the vehicle 7900.
  • the image pickup unit 7916 provided in the rear bumper or the back door mainly acquires an image of the rear of the vehicle 7900.
  • the image pickup unit 7918 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 9 shows an example of the shooting range of each of the imaging units 7910, 7912, 7914, 7916.
  • The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 can be obtained.
  • The vehicle exterior information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, and corners of the vehicle 7900 and on the upper part of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices.
  • The vehicle exterior information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
  • These out-of-vehicle information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, or the like.
  • the vehicle outside information detection unit 7400 causes the image pickup unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. Further, the vehicle outside information detection unit 7400 receives the detection information from the connected vehicle outside information detection unit 7420.
  • When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves.
  • the out-of-vehicle information detection unit 7400 may perform object detection processing or distance detection processing such as a person, a vehicle, an obstacle, a sign, or a character on a road surface based on the received information.
  • the out-of-vehicle information detection unit 7400 may perform an environment recognition process for recognizing rainfall, fog, road surface conditions, etc. based on the received information.
  • the out-of-vehicle information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
  • the vehicle outside information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a vehicle, an obstacle, a sign, a character on the road surface, or the like based on the received image data.
  • The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may synthesize image data captured by different image pickup units 7410 to generate a bird's-eye view image or a panoramic image.
  • the vehicle exterior information detection unit 7400 may perform a viewpoint conversion process using image data captured by different image pickup units 7410.
  • the in-vehicle information detection unit 7500 detects the in-vehicle information.
  • a driver state detection unit 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500.
  • the driver state detection unit 7510 may include a camera that captures the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like.
  • the biosensor is provided on, for example, a seat surface or a steering wheel, and detects biometric information of a passenger sitting on the seat or a driver holding the steering wheel.
  • The in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether or not the driver is dozing off.
  • the in-vehicle information detection unit 7500 may perform processing such as noise canceling processing on the collected audio signal.
  • the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • The input unit 7800 is realized by a device that a passenger can operate for input, such as a touch panel, a button, a microphone, a switch, or a lever. Data obtained by recognizing voice input through the microphone may be input to the integrated control unit 7600.
  • The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports the operation of the vehicle control system 7000.
  • The input unit 7800 may also be, for example, a camera, in which case the passenger can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by the passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by the passenger or the like using the above input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, the passenger or the like inputs various data to the vehicle control system 7000 and instructs processing operations.
  • The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. Further, the storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the general-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750.
  • The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System for Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as Bluetooth (registered trademark).
  • The general-purpose communication I/F 7620 may connect to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or a business-specific network) via a base station or an access point. Further, the general-purpose communication I/F 7620 may connect to a terminal existing in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • the dedicated communication I / F 7630 is a communication I / F that supports a communication protocol formulated for use in a vehicle.
  • The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
  • The positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
  • the positioning unit 7640 may specify the current position by exchanging signals with the wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smartphone having a positioning function.
  • the beacon receiving unit 7650 receives, for example, a radio wave or an electromagnetic wave transmitted from a radio station or the like installed on a road, and acquires information such as a current position, a traffic jam, a road closure, or a required time.
  • the function of the beacon receiving unit 7650 may be included in the above-mentioned dedicated communication I / F 7630.
  • the in-vehicle device I / F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle.
  • the in-vehicle device I / F7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication) or WUSB (Wireless USB).
  • The in-vehicle device I/F 7660 may establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-Definition Link) via a connection terminal (and a cable if necessary) (not shown).
  • The in-vehicle device 7760 may include, for example, at least one of a mobile device or a wearable device possessed by a passenger, and an information device carried into or attached to the vehicle. Further, the in-vehicle device 7760 may include a navigation device that searches for a route to an arbitrary destination.
  • The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I / F7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the vehicle-mounted network I / F7680 transmits / receives signals and the like according to a predetermined protocol supported by the communication network 7010.
  • The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generating device, the steering mechanism, or the braking device based on the acquired information inside and outside the vehicle, and output a control command to the drive system control unit 7100.
  • For example, the microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including vehicle collision avoidance or impact mitigation, follow-up driving based on the inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • Further, the microcomputer 7610 may perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the acquired information on the surroundings of the vehicle.
  • The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and people based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680, and may create local map information including peripheral information on the current position of the vehicle. Further, the microcomputer 7610 may predict dangers such as a vehicle collision, the approach of a pedestrian or the like, or entry into a closed road based on the acquired information, and generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • The audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passenger of the vehicle or the outside of the vehicle of information.
  • an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are exemplified as output devices.
  • the display unit 7720 may include, for example, at least one of an onboard display and a head-up display.
  • the display unit 7720 may have an AR (Augmented Reality) display function.
  • The output device may be another device such as headphones, a wearable device such as an eyeglass-type display worn by a passenger, a projector, or a lamp.
  • The display device visually displays the results obtained by the various processes performed by the microcomputer 7610 or the information received from other control units in various formats such as text, images, tables, and graphs.
  • The audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and audibly outputs it.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • each control unit may be composed of a plurality of control units.
  • the vehicle control system 7000 may include another control unit (not shown).
  • the other control unit may have a part or all of the functions carried out by any of the control units. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units.
  • A sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may mutually send and receive detection information via the communication network 7010.
  • the present technology can have the following configurations.
  • (1) A solid-state image sensor comprising: a first semiconductor layer having a first conductive type first compound semiconductor material; a photoelectric conversion layer arranged on the first semiconductor layer; a second conductive type second semiconductor layer partially arranged on the photoelectric conversion layer and arranged so as to be in contact with the photoelectric conversion layer; and an element separation layer that separates the first semiconductor layer, the photoelectric conversion layer, and the second semiconductor layer on a pixel-by-pixel basis, wherein the pixel has a plurality of sub-pixels having different sensitivities and sizes.
  • (2) The solid-state image pickup device according to (1), wherein the second semiconductor layer has a plurality of well regions separated from each other corresponding to each of the plurality of subpixels, and the plurality of well regions have different sizes.
  • (3) The solid-state image pickup device according to (2), wherein the plurality of well regions have different areas when viewed in a plan view along the stacking direction of the second semiconductor layer.
  • (4) The solid-state image pickup device according to (3), wherein the plurality of well regions have different planar sizes.
  • (5) The solid-state image pickup device according to any one of (2) to (4), wherein the plurality of well regions are arranged in the corresponding pixel so as to be oriented in the same direction.
  • (6) The solid-state image pickup device according to any one of (2) to (4), wherein the plurality of well regions are arranged in different orientations in the corresponding pixel.
  • (7) The solid-state image pickup device according to any one of (1) to (6), comprising a light-shielding layer arranged in at least a part of the element separation layer.
  • (8) The solid-state image pickup device according to (7), wherein the light-shielding layer is arranged at a boundary position of the pixels.
  • (9) The solid-state image pickup device according to (7) or (8), wherein the light-shielding layer is arranged at the boundary position of each of the plurality of sub-pixels in the pixel.
  • (10) The solid-state image pickup device according to any one of (7) to (9), wherein the light-shielding layer is arranged so as to surround each of the plurality of subpixels when viewed in a plan view along the stacking direction of the second semiconductor layer.
  • (11) The solid-state image pickup device according to (10), wherein the plurality of sub-pixels surrounded by the light-shielding layer in the pixel have different areas in a plan view along the stacking direction.
  • (12) The solid-state image pickup device according to any one of (2) to (6), comprising a light-shielding layer arranged at the boundary position of each of the plurality of sub-pixels in the pixel, wherein the plurality of well regions are regions separated by the light-shielding layer.
  • (13) The solid-state image pickup device according to (12), wherein the areas of the plurality of sub-pixels divided by the light-shielding layer in the pixel in a plan view along the stacking direction are different from each other, and the areas of the plurality of well regions in the pixel viewed in a plan view along the stacking direction are all the same.
  • (14) The solid-state image sensor according to any one of (1) to (13), wherein the photoelectric conversion layer of the plurality of subpixels in the pixel contains a different material for each subpixel.
  • (15) The solid-state image sensor according to (14), wherein the material contains InGaAs or CuInGaSe2.
  • (16) The solid-state image pickup device according to any one of (1) to (15), comprising a third semiconductor layer arranged on the photoelectric conversion layer and having the first conductive type second compound semiconductor material, wherein the second semiconductor layer is arranged on the third semiconductor layer.
  • 1: solid-state image sensor, 2: base substrate, 3: first semiconductor layer, 4: photoelectric conversion layer, 5: third semiconductor layer, 6: photoelectric conversion unit, 7: element separation layer, 8: second semiconductor layer, 9: first electrode layer, 10: second electrode layer, 11: coating layer, 12: joining member, 13: drive substrate, 14: flattening layer, 15: microlens, 16: light-shielding layer, 21: pixel array unit, 22: vertical drive circuit, 23: column signal processing circuit, 24: horizontal drive circuit, 25: output circuit, 26: drive control circuit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

[Problem] To make it possible to expand dynamic range, even if pixel structure is downsized. [Solution] A solid-state imaging device that comprises a first semiconductor layer that includes a first-conduction-type first compound semiconductor material, a photoelectric conversion layer that is arranged above the first semiconductor layer, a second-conduction-type second semiconductor layer that is partially arranged above the photoelectric conversion layer and contacts the photoelectric conversion layer, and an element separation layer that separates the first semiconductor layer, the photoelectric conversion layer, and the second semiconductor layer pixel by pixel. The pixels include subpixels that have different sizes and sensitivities.

Description

Solid-state imaging device

 The present disclosure relates to a solid-state imaging device.

A light receiving element array has been proposed that can receive light up to the long-wavelength near-infrared region and that maintains light receiving sensitivity even at a fine pixel pitch (see Patent Document 1). Patent Document 1 discloses a light receiving element array in which a plurality of light receiving portions having a bandgap energy corresponding to the near-infrared wavelength region are arranged. Each light receiving portion of this array has a pn junction at the tip of a p-type region formed by selective diffusion, and an n-type region is provided between the light receiving portions to partition them into pixels.

Japanese Unexamined Patent Publication No. 2012-244124

Since every pixel in the light receiving element array disclosed in Patent Document 1 has the same light receiving sensitivity, miniaturizing the pixel structure lowers the sensitivity overall, and the dynamic range cannot be widened.

The present disclosure therefore provides a solid-state imaging device whose dynamic range can be widened even when the pixel structure is miniaturized.

In order to solve the above problem, according to the present disclosure there is provided a solid-state imaging device comprising: a first semiconductor layer having a first conductivity type first compound semiconductor material; a photoelectric conversion layer arranged on the first semiconductor layer; a second conductivity type second semiconductor layer partially arranged on, and in contact with, the photoelectric conversion layer; and an element isolation layer separating the first semiconductor layer, the photoelectric conversion layer, and the second semiconductor layer on a pixel-by-pixel basis, wherein each pixel has a plurality of sub-pixels of different sensitivities and sizes.

The second semiconductor layer may have a plurality of well regions, separated from one another, each corresponding to one of the plurality of sub-pixels, and the plurality of well regions may differ in size.

The plurality of well regions may differ in area when viewed in plan along the stacking direction of the second semiconductor layer.

The planar sizes of the plurality of well regions may differ from one another.

The plurality of well regions may be arranged within the corresponding pixel with their orientations aligned.

The plurality of well regions may be arranged in mutually different orientations within the corresponding pixel.

A light-shielding layer arranged in at least a part of the element isolation layer may be provided.

The light-shielding layer may be arranged at the boundary positions of the pixels.

The light-shielding layer may be arranged at the boundary positions of the plurality of sub-pixels within the pixel.

The light-shielding layer may be arranged so as to surround each of the plurality of sub-pixels when viewed in plan along the stacking direction of the second semiconductor layer.

The areas of the plurality of sub-pixels surrounded by the light-shielding layer within the pixel, viewed in plan along the stacking direction, may differ from one another.

A light-shielding layer arranged at the boundary positions of the plurality of sub-pixels within the pixel may be provided, and the plurality of well regions may be regions separated by the light-shielding layer.

The areas of the plurality of sub-pixels partitioned by the light-shielding layer within the pixel, viewed in plan along the stacking direction, may differ from one another, while the areas of the plurality of well regions within the pixel, viewed in plan along the stacking direction, may all be equal.

The photoelectric conversion layers of the plurality of sub-pixels within the pixel may contain a different material for each sub-pixel.

The material may contain InGaAs or CuInGaSe2.

A third semiconductor layer arranged on the photoelectric conversion layer and having a second compound semiconductor material of the first conductivity type may be provided, and the second semiconductor layer may be arranged on the third semiconductor layer.

FIG. 1A is a plan layout view of a solid-state imaging device according to a first embodiment.
FIG. 1B is a cross-sectional view taken along the diagonal of FIG. 1A.
FIG. 2 is a block diagram showing the schematic configuration of the solid-state imaging device according to the embodiment.
FIG. 3 is a plan layout view in which the first sub-pixel and the second sub-pixel are arranged in each pixel P with mutually different orientations.
FIG. 4A is a plan layout view of a solid-state imaging device according to a second embodiment.
FIG. 4B is a cross-sectional view taken along the diagonal of FIG. 4A.
FIG. 5 is a plan layout view of a solid-state imaging device according to a third embodiment.
FIG. 6 is a circuit diagram showing an example of a readout circuit.
FIG. 7A is an operation timing diagram and potential diagram of the readout circuit of FIG. 6.
FIG. 7B is a potential diagram at the time following FIG. 7A.
FIG. 7C is a potential diagram at the time following FIG. 7B.
FIG. 7D is a potential diagram at the time following FIG. 7C.
FIG. 7E is a potential diagram at the time following FIG. 7D.
FIG. 8 is a block diagram showing an example of the schematic configuration of a vehicle control system.
FIG. 9 is an explanatory diagram showing an example of the installation positions of vehicle exterior information detectors and imaging units.

Hereinafter, embodiments of the solid-state imaging device will be described with reference to the drawings. The description focuses on the main components of the solid-state imaging device, but the device may have components and functions that are not shown or described. The following description does not exclude such components or functions.

(First Embodiment)
Since the bandgap energy of silicon is 1.1 eV, a solid-state imaging device using silicon as its base substrate cannot detect infrared light with a wavelength longer than 1.1 μm. The bandgap energies of III-V compound semiconductors, in contrast, correspond to the near-infrared wavelength region. The solid-state imaging device according to the present embodiment therefore uses a III-V compound semiconductor as its base substrate so that near-infrared light can be imaged.
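As a quick sanity check (this is the standard photon-energy relation, not a figure taken from the disclosure), the cutoff wavelength follows directly from the bandgap:

$$\lambda_c = \frac{hc}{E_g} \approx \frac{1240\ \mathrm{nm \cdot eV}}{E_g}, \qquad E_g = 1.1\ \mathrm{eV} \;\Rightarrow\; \lambda_c \approx 1.13\ \mu\mathrm{m}.$$

An InGaAs absorber with, for example, $E_g \approx 0.74\ \mathrm{eV}$ (lattice-matched In0.53Ga0.47As) would by the same relation reach $\lambda_c \approx 1.7\ \mu\mathrm{m}$, well into the near infrared.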

FIG. 1A is a plan layout view of the solid-state imaging device 1 according to the first embodiment, and FIG. 1B is a cross-sectional view taken along the diagonal of FIG. 1A. FIGS. 1A and 1B schematically show the characteristic portions of the solid-state imaging device 1 according to the present embodiment; some members are omitted or simplified. The horizontal-to-vertical size ratios in FIGS. 1A and 1B may also differ from those of the actual solid-state imaging device 1.

The base substrate 2 of the solid-state imaging device 1 according to the present embodiment is made of a III-V semiconductor such as GaAs, InP, GaN, AlN, GaP, GaSb, InSb, or InAs.

As shown in FIG. 1B, a first semiconductor layer (hereinafter also referred to as a first compound semiconductor layer) 3 having a first conductivity type (for example, n-type) first compound semiconductor material is arranged on the substrate 2. The first semiconductor layer 3 is made of, for example, InP (specifically, n-InP).

A photoelectric conversion layer 4 is arranged on the first semiconductor layer 3. The photoelectric conversion layer 4 is made of, for example, InGaAs (specifically, n-InGaAs).

A third semiconductor layer 5 having a second compound semiconductor material is arranged on the photoelectric conversion layer 4. The third semiconductor layer 5 is made of, for example, InP (specifically, n-InP). The third semiconductor layer 5 may be omitted.

In the present specification, the stacked first semiconductor layer 3, photoelectric conversion layer 4, and third semiconductor layer 5 are referred to as a photoelectric conversion unit 6. A plurality of photoelectric conversion units 6 are arranged in the plane direction of the substrate 2, and an element isolation layer 7 is arranged between two adjacent photoelectric conversion units 6. A semiconductor layer of the same material as the element isolation layer 7 is also arranged on the third semiconductor layer 5.

The element isolation layer 7 is made of, for example, InP (specifically, n-InP). For example, by making the impurity concentration of the element isolation layer 7 higher than that of the photoelectric conversion layer 4, the energy gap of the element isolation layer 7 can be made wider than that of the photoelectric conversion layer 4.

Further, a second semiconductor layer 8 of a second conductivity type (for example, p-type) is arranged on each photoelectric conversion unit 6. The second semiconductor layer 8 is partially arranged on the corresponding photoelectric conversion unit 6. The second semiconductor layer 8 is, for example, a region in which p-type impurity ions (for example, Zn ions) have been selectively diffused into a semiconductor layer of the same material as the element isolation layer 7. In the present specification, the partially arranged second semiconductor layer 8 may also be referred to as a well region. As shown in FIG. 1A, a plurality of well regions of different sizes are arranged above the photoelectric conversion layer 4. The plurality of well regions within a pixel P differ in area when viewed in plan along the stacking direction of the second semiconductor layer 8, and differ in planar size from one another.

A first electrode layer 9 is arranged on the second semiconductor layer 8, and a second electrode layer 10 is partially arranged on the element isolation layer 7. The first electrode layer 9 and the second electrode layer 10 are surrounded by a coating layer 11 made of an insulating material.

The first electrode layer 9 and the second electrode layer 10 are connected to a drive substrate 13 by bonding members 12 such as bumps, Cu-Cu junctions, or vias. A readout circuit, a signal processing circuit, and the like are arranged on the drive substrate 13.

A flattening layer 14 is arranged on the surface of the base substrate 2 opposite to the surface on which the photoelectric conversion units 6 are formed, and microlenses 15 are arranged on the flattening layer 14. In the solid-state imaging device 1 according to the present embodiment, light incident through the microlenses 15 is photoelectrically converted by the photoelectric conversion units 6. Since the microlenses 15 are arranged on the back surface side, the solid-state imaging device 1 according to the present embodiment is a back-illuminated type.

The broken-line frame in FIG. 1B indicates the boundary of a pixel P. The element isolation layer 7 is provided at the boundary of the pixel P. The solid-state imaging device 1 according to the present embodiment has, for each pixel P, a plurality of sub-pixels of different sensitivities and sizes. FIGS. 1A and 1B show an example in which one pixel P has two sub-pixels (hereinafter referred to as a first sub-pixel SP1 and a second sub-pixel SP2). The number of sub-pixels provided in one pixel P is not limited to two and may be three or more, but this specification mainly describes the example with two sub-pixels (the first sub-pixel SP1 and the second sub-pixel SP2) per pixel P.

FIGS. 1A and 1B show an example in which the first sub-pixel SP1 is larger than the second sub-pixel SP2. Here, size refers to the volume of the photoelectric conversion unit 6 in each of the first sub-pixel SP1 and the second sub-pixel SP2. That is, in FIGS. 1A and 1B, the volume of the photoelectric conversion unit 6 in the first sub-pixel SP1 is larger than that in the second sub-pixel SP2. The larger the volume of the photoelectric conversion unit 6, the higher the sensitivity; the first sub-pixel SP1 therefore has a higher sensitivity than the second sub-pixel SP2.

As shown in FIG. 1A, every pixel P has the same size, but each pixel P has, in addition to the second sub-pixel SP2, a first sub-pixel SP1 with higher sensitivity than the second sub-pixel SP2. Providing the first sub-pixel SP1 and the second sub-pixel SP2 with different sensitivities widens the dynamic range of each pixel P.
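A rough way to quantify the benefit (an estimate for orientation, not a figure from the disclosure): if the two sub-pixels have sensitivities $S_1 > S_2$ and each spans a dynamic range of $DR_0$ on its own, merging their outputs extends the usable range by roughly the sensitivity ratio:

$$DR_{\mathrm{merged}} \approx DR_0 + 20\log_{10}\frac{S_1}{S_2}\ \mathrm{dB},$$

assuming the two response ranges overlap enough for a seamless handover; a 10:1 sensitivity ratio would then add about 20 dB.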

The first sub-pixel SP1 and the second sub-pixel SP2 within a pixel P may contain different materials. The material of each sub-pixel may contain, for example, InGaAs or CuInGaSe2.

FIG. 2 is a block diagram showing the schematic configuration of the solid-state imaging device 1 according to the present embodiment. As shown in FIG. 2, the solid-state imaging device 1 according to the present embodiment includes a pixel array unit 21, a vertical drive circuit 22, a column signal processing circuit 23, a horizontal drive circuit 24, an output circuit 25, and a drive control circuit 26.

The pixel array unit 21 has a plurality of pixels P arranged in the horizontal and vertical directions. A row selection line L1 is arranged for each pixel row of pixels P aligned in the horizontal direction. The row selection lines L1 extending in the horizontal direction are driven by the vertical drive circuit 22. A signal line L2 is arranged for each pixel column of pixels P aligned in the vertical direction. The signal lines L2 extending in the vertical direction are connected to the column signal processing circuit 23. The column signal processing circuit 23 performs signal processing such as removing noise components contained in the signal voltage read from each pixel P via the signal line L2 and adjusting the signal amplitude. The output signal of the column signal processing circuit 23 is output via the output circuit 25. The column signal processing circuit 23 may also convert the signal voltage on each signal line L2 into a digital signal.
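As a behavioral illustration only (the patent describes hardware; every name and number below is hypothetical), the row-select/column-line organization can be sketched in Python as a loop that selects one row at a time and processes the whole row of column voltages in parallel:

```python
import numpy as np

def read_frame(pixel_voltages: np.ndarray, black_level: float = 0.02) -> np.ndarray:
    """Row-by-row readout sketch: selecting a row via its line L1 puts that
    row's pixel voltages onto the column signal lines L2; the column circuit
    then removes an offset and digitizes (10-bit here, chosen arbitrarily)."""
    n_rows, n_cols = pixel_voltages.shape
    frame = np.zeros((n_rows, n_cols), dtype=np.uint16)
    for row in range(n_rows):                    # vertical drive circuit asserts L1
        column_samples = pixel_voltages[row, :]  # voltages presented on lines L2
        denoised = np.clip(column_samples - black_level, 0.0, 1.0)
        frame[row, :] = (denoised * 1023).astype(np.uint16)  # column-parallel ADC
    return frame
```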

Based on a vertical synchronization signal, a horizontal synchronization signal, and a master clock, the drive control circuit 26 generates clock signals and control signals that serve as the operating reference for the vertical drive circuit 22, the column signal processing circuit 23, and the horizontal drive circuit 24. The generated clock signals and control signals are input to the vertical drive circuit 22, the column signal processing circuit 23, and the horizontal drive circuit 24.

As described above, each pixel P in the pixel array unit 21 according to the present embodiment has a plurality of sub-pixels (the first sub-pixel SP1 and the second sub-pixel SP2). The electric signals photoelectrically converted by the first sub-pixel SP1 and the second sub-pixel SP2 in the same pixel P are input to the column signal processing circuit 23 via the same signal line L2 at the same timing. Alternatively, the electric signals photoelectrically converted by the first sub-pixel SP1 and the second sub-pixel SP2 may be input to the column signal processing circuit 23 via the same signal line L2 at staggered times. Since the first sub-pixel SP1 and the second sub-pixel SP2 differ in sensitivity, combining the electric signal photoelectrically converted by the first sub-pixel SP1 with that of the second sub-pixel SP2 further widens the dynamic range of the photoelectric conversion.
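The disclosure does not specify how the two signals are combined; one common approach, shown here only as an illustrative sketch with invented names and numbers, is to use the high-sensitivity sample until it nears saturation and otherwise fall back to the gain-scaled low-sensitivity sample:

```python
def merge_subpixels(s_high: float, s_low: float,
                    gain_ratio: float = 8.0,
                    full_scale: float = 1.0,
                    knee: float = 0.9) -> float:
    """Combine one pixel's two sub-pixel readouts into a single HDR value.

    s_high: normalized readout of SP1 (high sensitivity, saturates first)
    s_low:  normalized readout of SP2 (low sensitivity, keeps headroom)
    gain_ratio: assumed SP1/SP2 sensitivity ratio
    """
    if s_high < knee * full_scale:
        return s_high              # SP1 still linear: use it directly
    return s_low * gain_ratio      # SP1 saturated: rescale SP2 instead
```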

In the layout of FIGS. 1A and 1B, the first sub-pixel SP1 and the second sub-pixel SP2 in each pixel P are both rectangular and are arranged in each pixel P with their orientations aligned. In contrast, FIG. 3 shows an example in which the first sub-pixel SP1 and the second sub-pixel SP2 in each pixel P are both rectangular but are arranged in each pixel P with mutually different orientations. As shown in FIG. 3, by arranging the first sub-pixel SP1 and the second sub-pixel SP2 in different orientations within each pixel P, their orientations can be adjusted so that each sub-pixel arranged in the pixel P is as large as possible. This makes it possible to increase the size ratio of the first sub-pixel SP1 to the second sub-pixel SP2, further widening the dynamic range. In FIG. 3, the orientations of the first sub-pixel SP1 and the second sub-pixel SP2 are offset from each other by 45 degrees, but the amount of offset is arbitrary.

More specifically, in the present embodiment, as shown in FIG. 1A, the area of the second semiconductor layer 8 in the first sub-pixel SP1, viewed in plan from the stacking direction, is larger than that of the second semiconductor layer 8 in the second sub-pixel SP2. Since the volume of the photoelectric conversion unit 6 in the first sub-pixel SP1 is larger than in the second sub-pixel SP2, enlarging the area of the second semiconductor layer 8 in the first sub-pixel SP1 increases its saturation charge amount.

Although FIGS. 1A and 3 show examples in which the first sub-pixel SP1 and the second sub-pixel SP2 are rectangular, their shapes are arbitrary, and the shape of the first sub-pixel SP1 may differ from that of the second sub-pixel SP2.

In the example described above, every pixel P in the pixel array unit 21 has a plurality of sub-pixels of different sizes, but only some of the pixels in the pixel array unit 21 may have a plurality of sub-pixels of different sizes.

As described above, in the first embodiment, a plurality of sub-pixels of different sizes are provided within one pixel P, so the sub-pixels within one pixel P can have different sensitivities, and the dynamic range of the solid-state imaging device 1 can be widened.

(Second Embodiment)
In the first embodiment, the element isolation layer 7 is arranged at the boundaries between pixels P and between sub-pixels. Alternatively, a trench may be formed in the element isolation layer 7 and filled with a light-shielding material to form a light-shielding layer, and the light-shielding layer may be arranged along the pixel P boundaries and the sub-pixel boundaries.

The solid-state imaging device 1 according to the second embodiment has the same planar layout configuration as FIG. 1A and has a plurality of sub-pixels in one pixel P.

FIG. 4A is a plan layout view of the solid-state imaging device 1 according to the second embodiment, and FIG. 4B is a cross-sectional view taken along the diagonal of FIG. 4A. As a comparison of FIGS. 4A and 4B with FIGS. 1A and 1B shows, in the solid-state imaging device 1 according to the second embodiment a light-shielding layer 16 is arranged in at least a part of the element isolation layer 7. The light-shielding layer 16 is provided along the boundaries of the pixels P, inside a trench formed in the element isolation layer 7. The material of the light-shielding layer 16 is, for example, titanium (Ti), tungsten (W), molybdenum (Mo), manganese (Mn), or copper (Cu), and may be an alloy of two or more metals. The material of the light-shielding layer 16 need not be a metal; it suffices that it can absorb or reflect the light that the solid-state imaging device 1 photoelectrically converts. The light-shielding layer 16 is also arranged at the boundary positions of the plurality of sub-pixels within each pixel P, so as to surround each of the plurality of sub-pixels when viewed in plan along the stacking direction of the second semiconductor layer 8. The areas of the plurality of sub-pixels surrounded by the light-shielding layer 16 within a pixel P, viewed in plan along the stacking direction, differ from one another.

The light-shielding layer 16 shown in FIGS. 4A and 4B can suppress the entry of light from adjacent pixels P or sub-pixels, thereby reducing crosstalk.

In the solid-state imaging device 1 according to the second embodiment, the light-shielding layer 16 forms both the boundaries of the pixels P and the boundaries of the sub-pixels within each pixel P. One pixel P contains a plurality of regions surrounded by the light-shielding layer 16, and each region constitutes a different sub-pixel. In the example of FIG. 4A, one pixel P contains two regions surrounded by the light-shielding layer 16: one is the first sub-pixel SP1 and the other is the second sub-pixel SP2. In this embodiment as well, there is no particular limit on the number of sub-pixels provided in a pixel P.

As shown in FIG. 4A, the area of the second semiconductor layer 8 in the first sub-pixel SP1, viewed in plan from the stacking direction, differs from that in the second sub-pixel SP2. Specifically, in the first sub-pixel SP1, which is larger than the second sub-pixel SP2, the area of the second semiconductor layer 8 is made larger. This increases the saturation charge amount of the first sub-pixel SP1 and further improves its sensitivity, widening the dynamic range of the solid-state imaging device 1.

As described above, in the second embodiment the light-shielding layer 16 is arranged along the boundaries of the pixels P and the sub-pixels, and the plurality of regions surrounded by the light-shielding layer 16 within a pixel P differ in size. The light-shielding layer 16 not only reduces crosstalk but also allows a plurality of sub-pixels of different sensitivities to be provided within a pixel P, widening the dynamic range of the solid-state imaging device 1.

(Third Embodiment)
The second embodiment shows an example in which the plurality of sub-pixels in a pixel P differ in size and the second semiconductor layers 8 in those sub-pixels also differ in size; however, the second semiconductor layers 8 in the sub-pixels of a pixel P may all be the same size.

FIG. 5 is a plan layout view of the solid-state imaging device 1 according to the third embodiment. In the solid-state imaging device 1 of FIG. 5, as in FIGS. 4A and 4B, the light-shielding layer 16 is arranged along the pixel P boundaries and the sub-pixel boundaries, and the plurality of sub-pixels in one pixel P differ in size. FIG. 5 shows an example in which two sub-pixels of different sizes (the first sub-pixel SP1 and the second sub-pixel SP2) are provided in one pixel P.

As shown in FIG. 5, the first sub-pixel SP1 is larger than the second sub-pixel SP2. In FIG. 5, however, the area of the second semiconductor layer 8 in the first sub-pixel SP1, viewed in plan from the stacking direction, is made substantially the same as that of the second semiconductor layer 8 in the second sub-pixel SP2. This is the major difference from FIG. 4A.

Even if the second semiconductor layer 8 in the first sub-pixel SP1 and the second semiconductor layer 8 in the second sub-pixel SP2 are the same size, the volume of the photoelectric conversion unit 6 in the first sub-pixel SP1 is larger than that in the second sub-pixel SP2, so the sensitivity of the first sub-pixel SP1 can be made higher than that of the second sub-pixel SP2. The dynamic range of the solid-state imaging device 1 according to the third embodiment can therefore be widened.

In this way, whereas the second embodiment changes the size of the second semiconductor layer 8 according to the size of each sub-pixel in the pixel P, the third embodiment makes the second semiconductor layers 8 in the sub-pixels substantially the same size regardless of sub-pixel size. This simplifies the manufacturing process compared with the solid-state imaging device 1 according to the second embodiment.

(Fourth Embodiment)
The signal captured by each pixel P in the solid-state imaging device 1 according to the first to third embodiments described above is sent to the signal line L2 via a readout circuit. At least a part of the readout circuit is formed on, for example, the drive substrate 13 of FIG. 1B.

FIG. 6 is a circuit diagram showing an example of the readout circuit, and FIGS. 7A to 7E are an operation timing diagram and potential diagrams of the readout circuit of FIG. 6.

The readout circuit of FIG. 6 has a current source CS1 and a capacitor C1 that equivalently represent the photoelectric conversion operation of the photoelectric conversion unit 6, transistors Q1 to Q5, capacitors C2 and C3, and a current source CS2. The capacitors C1 and C2 are connected in series between a reference voltage Vtop and the ground potential. The transistor Q1 turns on when the OFG signal input to its gate is at high level, setting the potential of the connection node of the capacitors C1 and C2 to a reset voltage VDR. The transistor Q2 turns on when the TRG signal input to its gate is at high level, transferring the charge photoelectrically converted by the photoelectric conversion unit 6 to the capacitor C3. One end of the capacitor C3 is a floating diffusion FD. More specifically, when the transistor Q2 turns on, the potential of the connection node of the capacitors C1 and C2 becomes equal to that of the floating diffusion FD at one end of the capacitor C3. The transistor Q3 turns on when the RST signal input to its gate is at high level, setting the floating diffusion FD to the reset voltage VDR. The floating diffusion FD is connected to the gate of the transistor Q4. The transistors Q4 and Q5 are cascode-connected between a power supply voltage VDD and one end of the current source CS2. A SEL signal is input to the gate of the transistor Q5.

During the global operation period, with the TRG signal at low level and the transistor Q2 off, the OFG signal is set to high level to turn on the transistor Q1 and initialize the charge of the capacitor C2, and the RST signal is set to high level to turn on the transistor Q3 and initialize the floating diffusion FD. After that, the TRG signal is temporarily set to high level, transferring the photoelectrically converted charge signal to the floating diffusion FD.

During the rolling operation period, the charge signal photoelectrically converted by the selected pixel P is read out via the transistors Q4 and Q5; the RST signal is then temporarily set to high level to turn on the transistor Q3, and the reset-level signal is read out via the transistors Q4 and Q5.
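Each pixel thus yields two samples per readout, a signal level followed by a reset level; subtracting them in the column circuit cancels pixel-to-pixel offsets. A minimal sketch of that subtraction (a common double-sampling technique, not spelled out in the disclosure; the names are illustrative):

```python
import numpy as np

def delta_double_sample(signal_levels: np.ndarray,
                        reset_levels: np.ndarray) -> np.ndarray:
    """Subtract each pixel's reset-level sample, read via Q4/Q5 just after
    its signal-level sample, to cancel fixed offsets in the readout chain."""
    return signal_levels - reset_levels
```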

More specifically, as shown in the operation timing diagrams of FIGS. 7A to 7E, the readout circuit of FIG. 6 performs the global operation and then the rolling operation. During the global operation period, all pixels P in the pixel array unit 21 operate simultaneously. As shown in FIG. 7A, at time t1 the OFG signal is at high level, so the transistor Q1 is on and the connection node of the capacitors C1 and C2 is set to the reset voltage VDR. At this time, the bottoms of the potential wells on the drain and source sides of the transistor Q1, whose gate receives the OFG signal, and of the potential well of the floating diffusion FD are raised.

Next, as shown in FIG. 7B, at time t2 the OFG signal goes to low level, so the transistor Q1 turns off and the bottom of the potential well at the connection node between the transistors Q1 and Q2 is lowered.

Next, as shown in FIG. 7C, at time t3 the RST signal goes to low level, so the transistor Q3 turns off and the bottom of the potential well of the floating diffusion FD is lowered. The amount of electrons accumulated in the potential well at the connection node between the transistors Q1 and Q2 decreases according to the photoelectrically converted charge amount; this decrease corresponds to the imaging signal.

Next, as shown in FIG. 7D, at times t4 to t5 the TRG signal goes to high level and then back to low level, so the transistor Q2 turns on and then off. As a result, the electrons are averaged between the potential well at the connection node of the transistors Q1 and Q2 and the potential well of the floating diffusion FD, which causes a gain loss.
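The magnitude of this gain loss can be estimated with a simple charge-conservation model in which the signal charge ends up shared between the sense-node capacitance (C2 in FIG. 6) and the floating-diffusion capacitance (C3); the capacitance values below are invented purely for illustration:

```python
def retained_swing_fraction(c_node: float, c_fd: float) -> float:
    """Fraction of the ideal FD voltage swing that survives charge sharing:
    (Q / (C_node + C_fd)) divided by the full-transfer value (Q / C_fd)."""
    return c_fd / (c_node + c_fd)

# Illustrative values only: 2 fF sense node (C2), 1 fF floating diffusion (C3).
print(retained_swing_fraction(c_node=2e-15, c_fd=1e-15))  # ≈ 0.33, i.e. ~3x gain loss
```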

Next, as shown in FIG. 7E, at times t6 to t7 the RST signal goes to high level and then back to low level, so the transistor Q3 turns on and then off. As a result, the bottom of the potential well of the floating diffusion FD is raised and reset once, and then lowered.

The solid-state imaging device 1 according to the first to third embodiments described above includes a plurality of sub-pixels of different sizes in one pixel P. The signal charges photoelectrically converted by the photoelectric conversion units 6 of these sub-pixels are combined, for example, at the floating diffusion FD and supplied to the corresponding signal line L2 via the transistors Q4 and Q5.

<<Application Examples>>
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, robot, construction machine, or agricultural machine (tractor).

FIG. 8 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010. In the example shown in FIG. 8, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600. The communication network 7010 connecting these control units may be an in-vehicle communication network conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).

Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used for various calculations, and a drive circuit that drives the devices to be controlled. Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for wired or wireless communication with devices or sensors inside and outside the vehicle. FIG. 8 shows, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690. The other control units similarly include a microcomputer, a communication I/F, a storage unit, and the like.

The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 7100 functions as a control device for a driving force generator such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like. The drive system control unit 7100 may also function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).

A vehicle state detection unit 7110 is connected to the drive system control unit 7100. The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the wheel rotation speed, and the like. The drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detection unit 7110 and controls the internal combustion engine, the drive motor, the electric power steering device, the brake device, and the like.

The body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, may be input to the body system control unit 7200. The body system control unit 7200 receives these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.

The battery control unit 7300 controls a secondary battery 7310, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, and the remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals and controls the temperature regulation of the secondary battery 7310 or the cooling device provided in the battery device.

The vehicle exterior information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000. For example, at least one of an imaging unit 7410 and a vehicle exterior information detector 7420 is connected to the vehicle exterior information detection unit 7400. The imaging unit 7410 includes at least one of a ToF (Time of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The vehicle exterior information detector 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.

The environment sensor may be, for example, at least one of a raindrop sensor that detects rain, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall. The surrounding information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device. The imaging unit 7410 and the vehicle exterior information detector 7420 may each be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.

Here, FIG. 9 shows an example of the installation positions of the imaging unit 7410 and the vehicle exterior information detector 7420. Imaging units 7910, 7912, 7914, 7916, and 7918 are provided, for example, at at least one of the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of a vehicle 7900. The imaging unit 7910 provided on the front nose and the imaging unit 7918 provided at the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 7900. The imaging units 7912 and 7914 provided on the side mirrors mainly acquire images of the sides of the vehicle 7900. The imaging unit 7916 provided on the rear bumper or back door mainly acquires images behind the vehicle 7900. The imaging unit 7918 provided at the upper part of the windshield in the vehicle interior is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.

FIG. 9 also shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916. The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or back door. For example, superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916 yields a bird's-eye view image of the vehicle 7900 seen from above.

Vehicle exterior information detectors 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, and corners of the vehicle 7900 and at the upper part of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices. The vehicle exterior information detectors 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices. These vehicle exterior information detectors 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, and the like.

Returning to FIG. 8, the description continues. The vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture an image outside the vehicle and receives the captured image data. The vehicle exterior information detection unit 7400 also receives detection information from the connected vehicle exterior information detector 7420. When the vehicle exterior information detector 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like and receives information on the reflected waves received. Based on the received information, the vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, and the like. Based on the received information, the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like, and may calculate the distance to an object outside the vehicle.

Further, based on the received image data, the vehicle exterior information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing persons, vehicles, obstacles, signs, characters on the road surface, and the like. The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data and combine image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image. The vehicle exterior information detection unit 7400 may also perform viewpoint conversion processing using image data captured by different imaging units 7410.

The in-vehicle information detection unit 7500 detects information inside the vehicle. For example, a driver state detection unit 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500. The driver state detection unit 7510 may include a camera that images the driver, a biometric sensor that detects the driver's biometric information, a microphone that collects sound in the vehicle interior, and the like. The biometric sensor is provided, for example, on a seat surface or the steering wheel and detects the biometric information of a passenger sitting on the seat or the driver gripping the steering wheel. The in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver, or determine whether the driver is dozing, based on the detection information input from the driver state detection unit 7510. The in-vehicle information detection unit 7500 may also perform processing such as noise canceling on the collected audio signal.

 統合制御ユニット7600は、各種プログラムにしたがって車両制御システム7000内の動作全般を制御する。統合制御ユニット7600には、入力部7800が接続されている。入力部7800は、例えば、タッチパネル、ボタン、マイクロフォン、スイッチ又はレバー等、搭乗者によって入力操作され得る装置によって実現される。統合制御ユニット7600には、マイクロフォンにより入力される音声を音声認識することにより得たデータが入力されてもよい。入力部7800は、例えば、赤外線又はその他の電波を利用したリモートコントロール装置であってもよいし、車両制御システム7000の操作に対応した携帯電話又はPDA(Personal Digital Assistant)等の外部接続機器であってもよい。入力部7800は、例えばカメラであってもよく、その場合搭乗者はジェスチャにより情報を入力することができる。あるいは、搭乗者が装着したウェアラブル装置の動きを検出することで得られたデータが入力されてもよい。さらに、入力部7800は、例えば、上記の入力部7800を用いて搭乗者等により入力された情報に基づいて入力信号を生成し、統合制御ユニット7600に出力する入力制御回路などを含んでもよい。搭乗者等は、この入力部7800を操作することにより、車両制御システム7000に対して各種のデータを入力したり処理動作を指示したりする。 The integrated control unit 7600 controls overall operation within the vehicle control system 7000 in accordance with various programs. An input unit 7800 is connected to the integrated control unit 7600. The input unit 7800 is realized by a device that a passenger can operate for input, such as a touch panel, buttons, a microphone, switches, or levers. Data obtained by performing voice recognition on speech input through the microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports operation of the vehicle control system 7000. The input unit 7800 may also be, for example, a camera, in which case the passenger can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by the passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by the passenger or the like using the input unit 7800 and outputs it to the integrated control unit 7600. By operating the input unit 7800, the passenger or the like inputs various data to the vehicle control system 7000 and instructs it to perform processing operations.

 記憶部7690は、マイクロコンピュータにより実行される各種プログラムを記憶するROM(Read Only Memory)、及び各種パラメータ、演算結果又はセンサ値等を記憶するRAM(Random Access Memory)を含んでいてもよい。また、記憶部7690は、HDD(Hard Disc Drive)等の磁気記憶デバイス、半導体記憶デバイス、光記憶デバイス又は光磁気記憶デバイス等によって実現してもよい。 The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. The storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.

 汎用通信I/F7620は、外部環境7750に存在する様々な機器との間の通信を仲介する汎用的な通信I/Fである。汎用通信I/F7620は、GSM(登録商標)(Global System of Mobile communications)、WiMAX(登録商標)、LTE(登録商標)(Long Term Evolution)若しくはLTE-A(LTE-Advanced)などのセルラー通信プロトコル、又は無線LAN(Wi-Fi(登録商標)ともいう)、Bluetooth(登録商標)などのその他の無線通信プロトコルを実装してよい。汎用通信I/F7620は、例えば、基地局又はアクセスポイントを介して、外部ネットワーク(例えば、インターネット、クラウドネットワーク又は事業者固有のネットワーク)上に存在する機器(例えば、アプリケーションサーバ又は制御サーバ)へ接続してもよい。また、汎用通信I/F7620は、例えばP2P(Peer To Peer)技術を用いて、車両の近傍に存在する端末(例えば、運転者、歩行者若しくは店舗の端末、又はMTC(Machine Type Communication)端末)と接続してもよい。 The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices present in the external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System for Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark). The general-purpose communication I/F 7620 may connect, for example via a base station or an access point, to a device (for example, an application server or a control server) on an external network (for example, the Internet, a cloud network, or an operator-specific network). The general-purpose communication I/F 7620 may also connect, for example using P2P (Peer To Peer) technology, to a terminal in the vicinity of the vehicle (for example, a terminal of the driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal).

 専用通信I/F7630は、車両における使用を目的として策定された通信プロトコルをサポートする通信I/Fである。専用通信I/F7630は、例えば、下位レイヤのIEEE802.11pと上位レイヤのIEEE1609との組合せであるWAVE(Wireless Access in Vehicle Environment)、DSRC(Dedicated Short Range Communications)、又はセルラー通信プロトコルといった標準プロトコルを実装してよい。専用通信I/F7630は、典型的には、車車間(Vehicle to Vehicle)通信、路車間(Vehicle to Infrastructure)通信、車両と家との間(Vehicle to Home)の通信及び歩車間(Vehicle to Pedestrian)通信のうちの1つ以上を含む概念であるV2X通信を遂行する。 The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol formulated for use in vehicles. The dedicated communication I/F 7630 may implement, for example, a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which combines IEEE 802.11p in the lower layer with IEEE 1609 in the upper layer, DSRC (Dedicated Short Range Communications), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.

 測位部7640は、例えば、GNSS(Global Navigation Satellite System)衛星からのGNSS信号(例えば、GPS(Global Positioning System)衛星からのGPS信号)を受信して測位を実行し、車両の緯度、経度及び高度を含む位置情報を生成する。なお、測位部7640は、無線アクセスポイントとの信号の交換により現在位置を特定してもよく、又は測位機能を有する携帯電話、PHS若しくはスマートフォンといった端末から位置情報を取得してもよい。 The positioning unit 7640 receives, for example, GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. The positioning unit 7640 may identify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, a PHS, or a smartphone.
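As an illustration of generating position information from a GNSS signal, the sketch below decodes one NMEA 0183 "GGA" sentence into latitude, longitude, and altitude. The NMEA framing is an assumption made for the example; the document itself only states that positioning is performed from received GNSS signals.

```python
# Minimal sketch: latitude/longitude/altitude from an NMEA "GGA" sentence.
def parse_gga(sentence: str) -> tuple[float, float, float]:
    f = sentence.split(",")
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0   # ddmm.mmmm -> degrees
    if f[3] == "S":
        lat = -lat
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0   # dddmm.mmmm -> degrees
    if f[5] == "W":
        lon = -lon
    return lat, lon, float(f[9])                     # altitude in metres

lat, lon, alt = parse_gga(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
print(lat, lon, alt)   # ~48.117, ~11.517, 545.4
```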

 ビーコン受信部7650は、例えば、道路上に設置された無線局等から発信される電波あるいは電磁波を受信し、現在位置、渋滞、通行止め又は所要時間等の情報を取得する。なお、ビーコン受信部7650の機能は、上述した専用通信I/F7630に含まれてもよい。 The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from wireless stations or the like installed on roads, and acquires information such as the current position, traffic congestion, road closures, or required travel time. The function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.

 車内機器I/F7660は、マイクロコンピュータ7610と車内に存在する様々な車内機器7760との間の接続を仲介する通信インタフェースである。車内機器I/F7660は、無線LAN、Bluetooth(登録商標)、NFC(Near Field Communication)又はWUSB(Wireless USB)といった無線通信プロトコルを用いて無線接続を確立してもよい。また、車内機器I/F7660は、図示しない接続端子(及び、必要であればケーブル)を介して、USB(Universal Serial Bus)、HDMI(登録商標)(High-Definition Multimedia Interface)又はMHL(Mobile High-definition Link)等の有線接続を確立してもよい。車内機器7760は、例えば、搭乗者が有するモバイル機器若しくはウェアラブル機器、又は車両に搬入され若しくは取り付けられる情報機器のうちの少なくとも1つを含んでいてもよい。また、車内機器7760は、任意の目的地までの経路探索を行うナビゲーション装置を含んでいてもよい。車内機器I/F7660は、これらの車内機器7760との間で、制御信号又はデータ信号を交換する。 The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle. The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB). The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and, if necessary, a cable) not shown. The in-vehicle devices 7760 may include, for example, at least one of a mobile device or wearable device carried by a passenger, or an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.

 車載ネットワークI/F7680は、マイクロコンピュータ7610と通信ネットワーク7010との間の通信を仲介するインタフェースである。車載ネットワークI/F7680は、通信ネットワーク7010によりサポートされる所定のプロトコルに則して、信号等を送受信する。 The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
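As a sketch of transmitting a signal in accordance with a protocol supported by the communication network, the example below assumes a CAN bus and the third-party python-can package; the channel name, arbitration ID, and payload layout are invented for illustration and do not come from this document.

```python
# Minimal sketch: send one frame on a CAN-style in-vehicle network.
import can

def send_speed_report(speed_kmh: int) -> None:
    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    msg = can.Message(arbitration_id=0x123,        # hypothetical message ID
                      data=[speed_kmh & 0xFF],     # hypothetical 1-byte payload
                      is_extended_id=False)
    bus.send(msg)
    bus.shutdown()
```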

 統合制御ユニット7600のマイクロコンピュータ7610は、汎用通信I/F7620、専用通信I/F7630、測位部7640、ビーコン受信部7650、車内機器I/F7660及び車載ネットワークI/F7680のうちの少なくとも一つを介して取得される情報に基づき、各種プログラムにしたがって、車両制御システム7000を制御する。例えば、マイクロコンピュータ7610は、取得される車内外の情報に基づいて、駆動力発生装置、ステアリング機構又は制動装置の制御目標値を演算し、駆動系制御ユニット7100に対して制御指令を出力してもよい。例えば、マイクロコンピュータ7610は、車両の衝突回避あるいは衝撃緩和、車間距離に基づく追従走行、車速維持走行、車両の衝突警告、又は車両のレーン逸脱警告等を含むADAS(Advanced Driver Assistance System)の機能実現を目的とした協調制御を行ってもよい。また、マイクロコンピュータ7610は、取得される車両の周囲の情報に基づいて駆動力発生装置、ステアリング機構又は制動装置等を制御することにより、運転者の操作に拠らずに自律的に走行する自動運転等を目的とした協調制御を行ってもよい。 The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the acquired information on the inside and outside of the vehicle, and output control commands to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation for the vehicle, follow-up driving based on the inter-vehicle distance, vehicle-speed-maintaining driving, collision warnings for the vehicle, lane departure warnings for the vehicle, and the like. The microcomputer 7610 may also perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, or the like based on the acquired information about the surroundings of the vehicle.
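The follow-up driving based on inter-vehicle distance mentioned above can be illustrated with a simple proportional controller that regulates toward a time-gap target. The gains, the 1.8 s time gap, and the acceleration limits below are illustrative assumptions, not values from this document.

```python
# Minimal sketch: inter-vehicle distance keeping as proportional control.
def follow_control(gap_m: float, ego_speed_m_s: float,
                   closing_speed_m_s: float) -> float:
    """Return a commanded acceleration in m/s^2 (positive = speed up)."""
    target_gap_m = 1.8 * ego_speed_m_s + 2.0   # time gap + standstill margin
    k_gap, k_rel = 0.25, 0.8                   # hypothetical gains
    accel = k_gap * (gap_m - target_gap_m) - k_rel * closing_speed_m_s
    return max(-3.5, min(2.0, accel))          # clamp to comfort limits
```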

 マイクロコンピュータ7610は、汎用通信I/F7620、専用通信I/F7630、測位部7640、ビーコン受信部7650、車内機器I/F7660及び車載ネットワークI/F7680のうちの少なくとも一つを介して取得される情報に基づき、車両と周辺の構造物や人物等の物体との間の3次元距離情報を生成し、車両の現在位置の周辺情報を含むローカル地図情報を作成してもよい。また、マイクロコンピュータ7610は、取得される情報に基づき、車両の衝突、歩行者等の近接又は通行止めの道路への進入等の危険を予測し、警告用信号を生成してもよい。警告用信号は、例えば、警告音を発生させたり、警告ランプを点灯させたりするための信号であってよい。 Based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, the microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and people, and create local map information including information on the surroundings of the vehicle's current position. The microcomputer 7610 may also predict dangers such as a vehicle collision, the approach of a pedestrian or the like, or entry onto a closed road based on the acquired information, and generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
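One minimal form of the warning-signal generation described above is a time-to-collision rule: warn when the gap divided by the closing speed falls below a threshold. The 2.5 s value below is an illustrative assumption, not taken from this document.

```python
# Minimal sketch: time-to-collision based warning decision.
def warning_signal(gap_m: float, closing_speed_m_s: float) -> bool:
    if closing_speed_m_s <= 0.0:    # gap is opening: no collision course
        return False
    ttc_s = gap_m / closing_speed_m_s
    return ttc_s < 2.5              # True -> sound a warning / light a lamp
```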

 音声画像出力部7670は、車両の搭乗者又は車外に対して、視覚的又は聴覚的に情報を通知することが可能な出力装置へ音声及び画像のうちの少なくとも一方の出力信号を送信する。図8の例では、出力装置として、オーディオスピーカ7710、表示部7720及びインストルメントパネル7730が例示されている。表示部7720は、例えば、オンボードディスプレイ及びヘッドアップディスプレイの少なくとも一つを含んでいてもよい。表示部7720は、AR(Augmented Reality)表示機能を有していてもよい。出力装置は、これらの装置以外の、ヘッドホン、搭乗者が装着する眼鏡型ディスプレイ等のウェアラブルデバイス、プロジェクタ又はランプ等の他の装置であってもよい。出力装置が表示装置の場合、表示装置は、マイクロコンピュータ7610が行った各種処理により得られた結果又は他の制御ユニットから受信された情報を、テキスト、イメージ、表、グラフ等、様々な形式で視覚的に表示する。また、出力装置が音声出力装置の場合、音声出力装置は、再生された音声データ又は音響データ等からなるオーディオ信号をアナログ信号に変換して聴覚的に出力する。 The audio/image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information. In the example of FIG. 8, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices. The display unit 7720 may include, for example, at least one of an on-board display and a head-up display. The display unit 7720 may have an AR (Augmented Reality) display function. The output device may also be a device other than these, such as headphones, a wearable device such as an eyeglass-type display worn by a passenger, a projector, or a lamp. When the output device is a display device, the display device visually displays results obtained by the various processes performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.

 なお、図8に示した例において、通信ネットワーク7010を介して接続された少なくとも二つの制御ユニットが一つの制御ユニットとして一体化されてもよい。あるいは、個々の制御ユニットが、複数の制御ユニットにより構成されてもよい。さらに、車両制御システム7000が、図示されていない別の制御ユニットを備えてもよい。また、上記の説明において、いずれかの制御ユニットが担う機能の一部又は全部を、他の制御ユニットに持たせてもよい。つまり、通信ネットワーク7010を介して情報の送受信がされるようになっていれば、所定の演算処理が、いずれかの制御ユニットで行われるようになってもよい。同様に、いずれかの制御ユニットに接続されているセンサ又は装置が、他の制御ユニットに接続されるとともに、複数の制御ユニットが、通信ネットワーク7010を介して相互に検出情報を送受信してもよい。 In the example shown in FIG. 8, at least two control units connected via the communication network 7010 may be integrated into one control unit. Alternatively, an individual control unit may be composed of a plurality of control units. Further, the vehicle control system 7000 may include another control unit not shown. In the above description, some or all of the functions performed by any one of the control units may be given to another control unit. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units. Similarly, a sensor or device connected to one control unit may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.

 なお、本技術は以下のような構成を取ることができる。
 (1)第1導電型の第1化合物半導体材料を有する第1半導体層と、
 前記第1半導体層の上に配置される光電変換層と、
 前記光電変換層の上に部分的に配置され、前記光電変換層に接触するように配置される第2導電型の第2半導体層と、
 前記第1半導体層、前記光電変換層、及び前記第2半導体層を画素単位で分離する素子分離層と、を備え、
 前記画素は、感度及びサイズの異なる複数のサブピクセルを有する、固体撮像装置。
 (2)前記第2半導体層は、前記複数のサブピクセルのそれぞれに対応して、それぞれが分離された複数のウェル領域を有し、
 前記複数のウェル領域はそれぞれサイズが異なる、(1)に記載の固体撮像装置。
 (3)前記複数のウェル領域は、前記第2半導体層の積層方向に沿って平面視した場合の面積がそれぞれ異なる、(2)に記載の固体撮像装置。
 (4)前記複数のウェル領域の平面サイズはそれぞれ異なっている、(3)に記載の固体撮像装置。
 (5)前記複数のウェル領域は、対応する前記画素内において、向きを揃えて配置される、(4)に記載の固体撮像装置。
 (6)前記複数のウェル領域は、対応する前記画素内において、それぞれ異なる向きに配置される、(4)に記載の固体撮像装置。
 (7)前記素子分離層の少なくとも一部に配置される遮光層を備える、(1)乃至(6)のいずれか一項に記載の固体撮像装置。
 (8)前記遮光層は、前記画素の境界位置に配置される、(7)に記載の固体撮像装置。
 (9)前記遮光層は、前記画素内の前記複数のサブピクセルのそれぞれの境界位置に配置される、(7)に記載の固体撮像装置。
 (10)前記遮光層は、前記第2半導体層の積層方向に沿って平面視したときに、前記複数のサブピクセルのそれぞれを取り囲むように配置される、(7)乃至(9)のいずれか一項に記載の固体撮像装置。
 (11)前記画素内の前記遮光層で取り囲まれる前記複数のサブピクセルを積層方向に沿って平面視した面積はそれぞれ異なる、(10)に記載の固体撮像装置。
 (12)前記画素内の前記複数のサブピクセルのそれぞれの境界位置に配置される遮光層を備え、
 前記複数のウェル領域は、前記遮光層で分離された領域である、(2)乃至(6)のいずれか一項に記載の固体撮像装置。
 (13)前記画素内の前記遮光層で区分けされる前記複数のサブピクセルを積層方向に沿って平面視した面積はそれぞれ異なっており、
 前記画素内の前記複数のウェル領域を積層方向に沿って平面視した面積はいずれも等しい、(12)に記載の固体撮像装置。
 (14)前記画素内の前記複数のサブピクセルの前記光電変換層は、サブピクセルごとにそれぞれ異なる材料を含む、(1)乃至(13)のいずれか一項に記載の固体撮像装置。
 (15)前記材料は、InGaAs又はCuInGaSe2を含む、(14)に記載の固体撮像装置。
 (16)前記光電変換層の上に配置され、前記第1導電型の第2化合物半導体材料を有する第3半導体層を備え、
 前記第2半導体層は、前記第3半導体層の上に配置される、(1)乃至(15)のいずれか一項に記載の固体撮像装置。
The present technology can have the following configurations.
(1) A solid-state imaging device comprising: a first semiconductor layer having a first compound semiconductor material of a first conductivity type; a photoelectric conversion layer arranged on the first semiconductor layer; a second semiconductor layer of a second conductivity type partially arranged on the photoelectric conversion layer so as to be in contact with the photoelectric conversion layer; and an element separation layer that separates the first semiconductor layer, the photoelectric conversion layer, and the second semiconductor layer on a pixel-by-pixel basis, wherein the pixel has a plurality of sub-pixels having different sensitivities and sizes.
(2) The solid-state imaging device according to (1), wherein the second semiconductor layer has a plurality of well regions separated from each other, corresponding to the plurality of sub-pixels, and the plurality of well regions have different sizes.
(3) The solid-state imaging device according to (2), wherein the plurality of well regions have different areas when viewed in a plan view along the stacking direction of the second semiconductor layer.
(4) The solid-state imaging device according to (3), wherein the plurality of well regions have different planar sizes.
(5) The solid-state imaging device according to (4), wherein the plurality of well regions are arranged in the same orientation within the corresponding pixel.
(6) The solid-state imaging device according to (4), wherein the plurality of well regions are arranged in different orientations within the corresponding pixel.
(7) The solid-state imaging device according to any one of (1) to (6), further comprising a light-shielding layer arranged in at least a part of the element separation layer.
(8) The solid-state imaging device according to (7), wherein the light-shielding layer is arranged at boundary positions of the pixels.
(9) The solid-state imaging device according to (7), wherein the light-shielding layer is arranged at boundary positions of the plurality of sub-pixels in the pixel.
(10) The solid-state imaging device according to any one of (7) to (9), wherein the light-shielding layer is arranged so as to surround each of the plurality of sub-pixels when viewed in a plan view along the stacking direction of the second semiconductor layer.
(11) The solid-state imaging device according to (10), wherein the sub-pixels surrounded by the light-shielding layer in the pixel have different areas in a plan view along the stacking direction.
(12) The solid-state imaging device according to any one of (2) to (6), further comprising a light-shielding layer arranged at boundary positions of the plurality of sub-pixels in the pixel, wherein the plurality of well regions are regions separated by the light-shielding layer.
(13) The solid-state imaging device according to (12), wherein the sub-pixels divided by the light-shielding layer in the pixel have different areas in a plan view along the stacking direction, and the plurality of well regions in the pixel all have equal areas in a plan view along the stacking direction.
(14) The solid-state imaging device according to any one of (1) to (13), wherein the photoelectric conversion layer of the plurality of sub-pixels in the pixel contains a different material for each sub-pixel.
(15) The solid-state imaging device according to (14), wherein the material includes InGaAs or CuInGaSe2.
(16) The solid-state imaging device according to any one of (1) to (15), further comprising a third semiconductor layer arranged on the photoelectric conversion layer and having a second compound semiconductor material of the first conductivity type, wherein the second semiconductor layer is arranged on the third semiconductor layer.
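Configurations (1) and (2) above pair, within one pixel, sub-pixels of different sensitivity and size. A minimal sketch of why that widens the dynamic range follows: the high-sensitivity reading is used until it nears saturation, after which the low-sensitivity reading is rescaled to take over. The 12-bit full scale and the 16x sensitivity ratio are illustrative assumptions, not values from this document.

```python
# Minimal sketch: fusing high- and low-sensitivity sub-pixel readings.
FULL_SCALE = 4095      # hypothetical 12-bit ADC output ceiling
RATIO = 16.0           # hypothetical high/low sub-pixel sensitivity ratio

def fuse(high: int, low: int) -> float:
    if high < 0.9 * FULL_SCALE:    # high-sensitivity sub-pixel unsaturated
        return float(high)
    return low * RATIO             # extend the range with the small sub-pixel

print(fuse(1000, 62))    # mid-brightness scene -> 1000.0
print(fuse(4095, 3000))  # saturated high sub-pixel -> 48000.0
```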

 本開示の態様は、上述した個々の実施形態に限定されるものではなく、当業者が想到しうる種々の変形も含むものであり、本開示の効果も上述した内容に限定されない。すなわち、特許請求の範囲に規定された内容およびその均等物から導き出される本開示の概念的な思想と趣旨を逸脱しない範囲で種々の追加、変更および部分的削除が可能である。 Aspects of the present disclosure are not limited to the individual embodiments described above and include various modifications that those skilled in the art could conceive of, and the effects of the present disclosure are likewise not limited to the contents described above. That is, various additions, changes, and partial deletions are possible without departing from the conceptual idea and spirit of the present disclosure derived from the contents specified in the claims and their equivalents.

 1 固体撮像装置、2 ベース基板、3 第1半導体層、4 光電変換層、5 第3半導体層、6 光電変換部、7 素子分離層、8 第2半導体層、9 第1電極層、10 第2電極層、11 被覆層、12 接合部材、13 駆動用基板、14 平坦化層、15 マイクロレンズ、16 遮光層、21 画素アレイ部、22 垂直駆動回路、23 カラム信号処理回路、24 水平駆動回路、25 出力回路、26 駆動制御回路 1 solid-state imaging device, 2 base substrate, 3 first semiconductor layer, 4 photoelectric conversion layer, 5 third semiconductor layer, 6 photoelectric conversion unit, 7 element separation layer, 8 second semiconductor layer, 9 first electrode layer, 10 second electrode layer, 11 coating layer, 12 bonding member, 13 drive substrate, 14 flattening layer, 15 microlens, 16 light-shielding layer, 21 pixel array unit, 22 vertical drive circuit, 23 column signal processing circuit, 24 horizontal drive circuit, 25 output circuit, 26 drive control circuit

Claims (16)

 第1導電型の第1化合物半導体材料を有する第1半導体層と、
 前記第1半導体層の上に配置される光電変換層と、
 前記光電変換層の上に部分的に配置され、前記光電変換層に接触するように配置される第2導電型の第2半導体層と、
 前記第1半導体層、前記光電変換層、及び前記第2半導体層を画素単位で分離する素子分離層と、を備え、
 前記画素は、感度及びサイズの異なる複数のサブピクセルを有する、固体撮像装置。
A solid-state imaging device comprising:
a first semiconductor layer having a first compound semiconductor material of a first conductivity type;
a photoelectric conversion layer arranged on the first semiconductor layer;
a second semiconductor layer of a second conductivity type partially arranged on the photoelectric conversion layer so as to be in contact with the photoelectric conversion layer; and
an element separation layer that separates the first semiconductor layer, the photoelectric conversion layer, and the second semiconductor layer on a pixel-by-pixel basis,
wherein the pixel has a plurality of sub-pixels having different sensitivities and sizes.
 前記第2半導体層は、前記複数のサブピクセルのそれぞれに対応して、それぞれが分離された複数のウェル領域を有し、
 前記複数のウェル領域はそれぞれサイズが異なる、請求項1に記載の固体撮像装置。
The solid-state imaging device according to claim 1, wherein the second semiconductor layer has a plurality of well regions separated from each other, corresponding to the plurality of sub-pixels, and
the plurality of well regions have different sizes.
 前記複数のウェル領域は、前記第2半導体層の積層方向に沿って平面視した場合の面積がそれぞれ異なる、請求項2に記載の固体撮像装置。 The solid-state imaging device according to claim 2, wherein the plurality of well regions have different areas when viewed in a plan view along the stacking direction of the second semiconductor layer.
 前記複数のウェル領域の平面サイズはそれぞれ異なっている、請求項3に記載の固体撮像装置。 The solid-state imaging device according to claim 3, wherein the plurality of well regions have different planar sizes.
 前記複数のウェル領域は、対応する前記画素内において、向きを揃えて配置される、請求項4に記載の固体撮像装置。 The solid-state imaging device according to claim 4, wherein the plurality of well regions are arranged in the same orientation within the corresponding pixel.
 前記複数のウェル領域は、対応する前記画素内において、それぞれ異なる向きに配置される、請求項4に記載の固体撮像装置。 The solid-state imaging device according to claim 4, wherein the plurality of well regions are arranged in different orientations within the corresponding pixel.
 前記素子分離層の少なくとも一部に配置される遮光層を備える、請求項1に記載の固体撮像装置。 The solid-state imaging device according to claim 1, further comprising a light-shielding layer arranged in at least a part of the element separation layer.
 前記遮光層は、前記画素の境界位置に配置される、請求項7に記載の固体撮像装置。 The solid-state imaging device according to claim 7, wherein the light-shielding layer is arranged at boundary positions of the pixels.
 前記遮光層は、前記画素内の前記複数のサブピクセルのそれぞれの境界位置に配置される、請求項7に記載の固体撮像装置。 The solid-state imaging device according to claim 7, wherein the light-shielding layer is arranged at boundary positions of the plurality of sub-pixels in the pixel.
 前記遮光層は、前記第2半導体層の積層方向に沿って平面視したときに、前記複数のサブピクセルのそれぞれを取り囲むように配置される、請求項7に記載の固体撮像装置。 The solid-state imaging device according to claim 7, wherein the light-shielding layer is arranged so as to surround each of the plurality of sub-pixels when viewed in a plan view along the stacking direction of the second semiconductor layer.
 前記画素内の前記遮光層で取り囲まれる前記複数のサブピクセルを積層方向に沿って平面視した面積はそれぞれ異なる、請求項10に記載の固体撮像装置。 The solid-state imaging device according to claim 10, wherein the sub-pixels surrounded by the light-shielding layer in the pixel have different areas in a plan view along the stacking direction.
 前記画素内の前記複数のサブピクセルのそれぞれの境界位置に配置される遮光層を備え、
 前記複数のウェル領域は、前記遮光層で分離された領域である、請求項2に記載の固体撮像装置。
The solid-state imaging device according to claim 2, further comprising a light-shielding layer arranged at boundary positions of the plurality of sub-pixels in the pixel,
wherein the plurality of well regions are regions separated by the light-shielding layer.
 前記画素内の前記遮光層で区分けされる前記複数のサブピクセルを積層方向に沿って平面視した面積はそれぞれ異なっており、
 前記画素内の前記複数のウェル領域を積層方向に沿って平面視した面積はいずれも等しい、請求項12に記載の固体撮像装置。
The solid-state imaging device according to claim 12, wherein the sub-pixels divided by the light-shielding layer in the pixel have different areas in a plan view along the stacking direction, and
the plurality of well regions in the pixel all have equal areas in a plan view along the stacking direction.
 前記画素内の前記複数のサブピクセルの前記光電変換層は、サブピクセルごとにそれぞれ異なる材料を含む、請求項1に記載の固体撮像装置。 The solid-state imaging device according to claim 1, wherein the photoelectric conversion layer of the plurality of sub-pixels in the pixel contains a different material for each sub-pixel.
 前記材料は、InGaAs又はCuInGaSe2を含む、請求項14に記載の固体撮像装置。 The solid-state imaging device according to claim 14, wherein the material includes InGaAs or CuInGaSe2.
 前記光電変換層の上に配置され、前記第1導電型の第2化合物半導体材料を有する第3半導体層を備え、
 前記第2半導体層は、前記第3半導体層の上に配置される、請求項1に記載の固体撮像装置。
The solid-state imaging device according to claim 1, further comprising a third semiconductor layer arranged on the photoelectric conversion layer and having a second compound semiconductor material of the first conductivity type,
wherein the second semiconductor layer is arranged on the third semiconductor layer.
PCT/JP2021/040862 2020-11-13 2021-11-05 Solid-state imaging device WO2022102549A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202180074178.9A CN116406478A (en) 2020-11-13 2021-11-05 Solid-state image pickup device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-189768 2020-11-13
JP2020189768A JP2023182874A (en) 2020-11-13 2020-11-13 solid-state imaging device

Publications (1)

Publication Number Publication Date
WO2022102549A1

Family

ID=81602274

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/040862 WO2022102549A1 (en) 2020-11-13 2021-11-05 Solid-state imaging device

Country Status (3)

Country Link
JP (1) JP2023182874A (en)
CN (1) CN116406478A (en)
WO (1) WO2022102549A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012133448A1 (en) * 2011-03-31 2012-10-04 浜松ホトニクス株式会社 Photodiode array module and manufacturing method for same
JP2012244124A (en) * 2011-05-24 2012-12-10 Sumitomo Electric Ind Ltd Light-receiving element array, manufacturing method therefor and detector
JP2017084892A (en) * 2015-10-26 2017-05-18 ソニーセミコンダクタソリューションズ株式会社 Imaging device
WO2020170618A1 (en) * 2019-02-18 2020-08-27 ソニーセミコンダクタソリューションズ株式会社 Imaging element and imaging device

Also Published As

Publication number Publication date
CN116406478A (en) 2023-07-07
JP2023182874A (en) 2023-12-27

Similar Documents

Publication Publication Date Title
CN107851656B (en) Camera and distance measuring system
US20230047180A1 (en) Imaging device and imaging method
JPWO2020158322A1 (en) Light receiving element, solid-state image sensor and ranging device
US20240120356A1 (en) Imaging device and electronic apparatus
WO2022102549A1 (en) Solid-state imaging device
US11101309B2 (en) Imaging element, method for manufacturing imaging element, and electronic device
TWI856507B (en) Light detection device
EP4513565A1 (en) Optical detection device
WO2024075492A1 (en) Solid-state imaging device, and comparison device
WO2024070523A1 (en) Light detection element and electronic device
US20240304646A1 (en) Imaging device and electronic apparatus
US20240080587A1 (en) Solid-state imaging device and electronic instrument
WO2023248346A1 (en) Imaging device
EP4261890A1 (en) Solid-state imaging element, imaging device, and method for controlling solid-state imaging unit
JP2022107201A (en) Imaging devices and electronic devices
WO2024038828A1 (en) Light detection device
WO2025013515A1 (en) Light detection device, imaging device, and electronic apparatus
WO2024057471A1 (en) Photoelectric conversion element, solid-state imaging element, and ranging system
WO2023248855A1 (en) Light detection device and electronic apparatus
JP2022135738A (en) IMAGING DEVICE, DRIVING METHOD THEREOF, AND ELECTRONIC DEVICE
JP2024073899A (en) Image sensor
CN119522646A (en) Light detection device
WO2022014383A1 (en) Solid-state imaging device and method for manufacturing same
WO2022209256A1 (en) Imaging element, imaging device, and method for manufacturing imaging element
WO2024024450A1 (en) Semiconductor device and manufacturing method for same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21891795

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21891795

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP