
WO2018124285A1 - Imaging device and imaging method - Google Patents

Imaging device and imaging method

Info

Publication number
WO2018124285A1
WO2018124285A1 (PCT/JP2017/047256)
Authority
WO
WIPO (PCT)
Prior art keywords
illumination light
phased array
sensor
light
optical phased
Prior art date
Application number
PCT/JP2017/047256
Other languages
French (fr)
Japanese (ja)
Inventor
Takuo Tanemura
Kento Komatsu
Yoshiaki Nakano
Yasuyuki Ozeki
Original Assignee
The University of Tokyo
Priority date
Filing date
Publication date
Application filed by The University of Tokyo
Priority to JP2018559637A (granted as JP6765687B2)
Publication of WO2018124285A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/4861Circuits for detection, sampling, integration or read-out
    • G01S7/4863Detector arrays, e.g. charge-transfer gates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/01Arrangements or apparatus for facilitating the optical investigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/27Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/47Scattering, i.e. diffuse reflection
    • G01N21/49Scattering, i.e. diffuse reflection within a body or fluid
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/59Transmissivity
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4818Constructional features, e.g. arrangements of optical elements using optical fibres
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/487Extracting wanted echo signals, e.g. pulse detection

Definitions

  • the present invention relates to an imaging apparatus and method using an optical phased array, and more particularly to an imaging apparatus and method for obtaining a two-dimensional image or a three-dimensional image of an object.
  • the optical phased array (OPA) imaging devices demonstrated to date arrange a large number (several hundreds to several thousands) of antenna units one- or two-dimensionally, and control the optical phase of the output beam by heating appropriate parts of the antenna units or applying an electric field to them (for example, Non-Patent Document 1).
  • because the state of the emitted beam changes sensitively under the influence of antenna-unit fabrication errors, temperature, and other environmental conditions, either the imaging performance deteriorates or a complicated control circuit that constantly corrects for such influences must be added.
  • in Non-Patent Document 2, a target is irradiated with speckle-pattern illumination; the reflected light from the target is collectively detected by an optical sensor while the speckle pattern is changed, and a two-dimensional or three-dimensional image of the target is derived from the correlation with the speckle patterns.
  • An object of the present invention is to provide an imaging apparatus and method that use an optical phased array without requiring a complicated control circuit, and in which deterioration of imaging performance is suppressed regardless of manufacturing errors and environmental conditions.
  • an imaging apparatus includes an optical phased array that emits a plurality of patterns of illumination light, each having a random phase distribution; a first sensor that detects the irradiation state of the illumination light as a distribution; a second sensor that detects the intensity of the measurement light from the target illuminated by the illumination light; and a processing unit that extracts an image of the state of the target by combining the detection information of the first sensor with that of the second sensor.
  • the image is not limited to a two-dimensional image and may also be a one-dimensional or three-dimensional image.
  • since an image can be extracted with reference to the second sensor in an illumination environment in which the optical phased array emits a plurality of patterns of illumination light, each with a random phase distribution, the effects of the phase shifts and intensity variations of the illumination light caused by manufacturing errors and other imperfections can be reduced. The device is also less susceptible to changes in temperature and other aspects of the usage environment, so the reliability of measurement can easily be increased even with a relatively inexpensive optical phased array.
  • an imaging method detects the irradiation state of the illumination light as a distribution with a first sensor while illumination light having a random phase distribution is emitted from an optical phased array, detects with a second sensor the intensity of the measurement light from the object illuminated by the illumination light, and obtains an image of the state of the object by processing the detection information of the two sensors in combination.
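  • As a rough illustration, and not the patented implementation itself, the measurement loop just described — emit a random pattern, record its distribution with the first sensor and the total returned intensity with the second sensor, then correlate the two records — can be sketched in Python; all names and the toy speckle model are assumptions:

```python
import random

def simulate_imaging(target, n_patterns=3000, seed=1):
    """Toy end-to-end sketch of the claimed loop: emit a random pattern,
    record it (first sensor) plus the total returned intensity (second
    sensor), repeat N times, then correlate the two records."""
    rng = random.Random(seed)
    n_pix = len(target)
    patterns, buckets = [], []
    for _ in range(n_patterns):
        # first sensor: speckle-like intensity distribution of the illumination
        illum = [rng.random() for _ in range(n_pix)]
        patterns.append(illum)
        # second sensor: one bucket value, total light returned by the target
        buckets.append(sum(i * t for i, t in zip(illum, target)))
    mean_s = sum(buckets) / n_patterns
    # correlation reconstruction: sum over r of (S_r - <S>) * I_r(x)
    return [sum((buckets[r] - mean_s) * patterns[r][x]
                for r in range(n_patterns))
            for x in range(n_pix)]
```

  • In this sketch the reconstructed values at reflective pixels come out markedly larger than at dark pixels once N is a few thousand, which is the behavior the method relies on.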
  • FIG. 1 is a conceptual block diagram illustrating an imaging apparatus according to a first embodiment. Another figure is a top view explaining the optical phased array, and another is a conceptual diagram of a structure in which the emission ports are integrated two-dimensionally.
  • FIGS. 6A and 6B are charts showing numerical verification results obtained with the imaging apparatus of the embodiment. A block diagram shows a specific fabrication example of the optical phased array.
  • FIG. 8A is a chart showing measurement results using a conventional method that requires advanced adjustment
  • FIG. 8B is a chart showing measurement results using the method of an embodiment that does not require advanced adjustment.
  • An imaging apparatus 100 includes: a light source unit 20 that generates laser light B1; an optical phased array 30 that forms illumination light B2 having a desired wavefront state from the laser light B1 of the light source unit 20 and emits it; an observation optical system 40 for illumination and measurement; an image sensor 50, which is a first sensor that detects the irradiation state of the illumination light B2 as a distribution; a light receiving element 60, which is a second sensor that detects the intensity of the measurement light B3 from the target OB illuminated by the illumination light B2; and an information processing unit 70 that manages the operating state of the optical phased array 30 and receives detection information from the image sensor 50 and the light receiving element 60 to extract or generate an image of the state of the target OB.
  • the light source unit 20 is composed of a semiconductor laser or other coherent light source, and is accompanied by a light source driving circuit (not shown).
  • the light source unit 20 emits laser light B1 set in various wavelength regions such as an infrared region and a visible region.
  • the optical phased array (OPA) 30 is an optical waveguide type integrated circuit, and includes an optical branching unit 31 and a phase control unit 32 provided on a substrate 38.
  • the optical phased array (OPA) 30 can switch its output in a short time of, for example, several tens of μs or less, enabling high-speed measurement and imaging.
  • the light branching unit 31 branches the laser beam B1 into M channels and guides them to M waveguides 36.
  • as the optical branching unit 31, for example, a star coupler having a slab waveguide, a multistage directional coupler, a combination of these, or the like can be used.
  • the phase control unit 32 includes, for example, M electrodes 32a that apply an electric field to the M-channel waveguide 36, and wirings 32b that enable voltage supply to the electrodes 32a.
  • the supply voltage to the wiring 32b is adjusted by the OPA drive control unit 81 shown in FIG.
  • the illumination light B2, phase-controlled by passing through the phase control unit 32, is emitted from the emission port 34 of the optical phased array 30.
  • the illumination light B2 is light having a random phase distribution.
  • the illumination light B2 changes at high speed in time series: N mutually different patterns of illumination light B2, each having a random phase distribution, are emitted from the emission port 34. That is, all of these N patterns, or N types, of illumination light B2 have random phase distributions and change randomly in time series. In each pattern the random phase spans ±180° without bias.
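  • To make the unbiased uniform random phase described above concrete, here is a small numerical sketch (hypothetical, not from the patent) that draws N such phase patterns and computes the resulting far-field speckle intensity of an emitter row via a plain discrete Fourier sum:

```python
import cmath
import math
import random

def random_phase_patterns(n_patterns, n_emitters, seed=0):
    """Draw N phase patterns, each phase uniform over +/-180 degrees
    (here in radians, [-pi, +pi]), with no bias."""
    rng = random.Random(seed)
    return [[rng.uniform(-math.pi, math.pi) for _ in range(n_emitters)]
            for _ in range(n_patterns)]

def far_field_intensity(phases, n_angles=64):
    """Far-field speckle of one emitter row:
    |sum_m exp(i*(phi_m + 2*pi*m*k/K))|^2 for K sampled angles k."""
    out = []
    for k in range(n_angles):
        field = sum(cmath.exp(1j * (ph + 2.0 * math.pi * m * k / n_angles))
                    for m, ph in enumerate(phases))
        out.append(abs(field) ** 2)
    return out
```

  • Because each emitter contributes unit amplitude, the total far-field energy is conserved (Parseval), while the angular distribution is a new speckle pattern for every random phase draw.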
  • the phase control unit 32 constituting the optical phased array 30 is one-dimensional in the X direction, and illumination light B2 having a one-dimensional random phase distribution or pattern is emitted from the emission port 34.
  • the optical phased array 30 shown in FIG. 2 can be stacked in the Y direction; with such a stacked optical phased array 130, illumination light B2 having a two-dimensional random phase distribution in the X and Y directions can be formed and emitted.
  • by changing the structure of the waveguides and arranging the emission ports two-dimensionally, laser light whose phase is controlled in the direction perpendicular to the main surface of the substrate 38 can be emitted. In this case, an optical phased array whose emission ports are arranged in a two-dimensional array in the X and Y directions is obtained, and illumination light B2 having a two-dimensional phase distribution in the X and Y directions can be formed and emitted in the Z direction.
  • when the illumination light B2 is one-dimensional, a one-dimensional image corresponding to the distribution direction of the illumination light B2 is obtained as the target image; when the illumination light B2 is two-dimensional, a two-dimensional image corresponding to the distribution of the illumination light B2 is obtained.
  • the phase control unit 32 is not limited to electro-optic modulation, in which the phase is adjusted by the electric field applied to the waveguide 36; carrier-effect modulation, in which the phase is adjusted by current injection into the waveguide, or thermo-optic modulation, in which the phase is adjusted by heating the waveguide, can also be used.
  • the observation optical system 40 includes a branch mirror 43 and a plurality of lenses L1 to L3.
  • the branch mirror 43 is a half mirror having a uniform transmittance or reflectance.
  • the branch mirror 43 divides the illumination light B2 from the optical phased array 30 so that part of it is incident on the target OB and the rest on the image sensor (first sensor) 50. The branch mirror 43 also reflects the measurement light B3, the return light scattered from the surface OBa of the target OB, and guides it to the light receiving element (second sensor) 60.
  • the lens L1 enables illumination in a far field state while preventing the divergence of the illumination light B2 emitted from the optical phased array 30.
  • the lens L2 reduces the diameter of the measurement light B3 reflected by the target OB so that it collectively enters the photosensitive portion 61 of the light receiving element 60.
  • the lens L3, in cooperation with the lens L1, forms the pattern of the illumination light B2 as a far-field image on the photosensitive surface 51 of the image sensor 50.
  • the size of the far-field image on the photosensitive surface 51 can be adjusted by adjusting the focal length of the lens L3.
  • the lens L1 or the lens L3 may be omitted.
  • the light receiving element (second sensor) 60 collectively detects the intensity of the measurement light reflected by the target OB, while the image sensor (first sensor) 50 detects the far-field image of the illumination light emitted from the optical phased array 30.
  • the image sensor (first sensor) 50 is a semiconductor image sensor such as a CMOS or CCD.
  • the image sensor 50 is sensitive to the wavelength of the light source unit 20 and can be accompanied by a wavelength selection filter.
  • the image sensor 50 detects the pattern of the illumination light B2 formed on the photosensitive surface 51 and captures it as a detected image. At this time, the intensity value of the illumination light B2 is detected for each pixel position. As described above, since the illumination light B2 is emitted in N patterns by the optical phased array 30, N detection images of the illumination light B2 are also obtained.
  • the light receiving element (second sensor) 60 is a semiconductor optical sensor such as a photodiode.
  • the light receiving element 60 is sensitive to the wavelength of the light source unit 20 and can be accompanied by a wavelength selection filter.
  • the light receiving element 60 operates by being driven by the light receiving element driving unit 82, and outputs a signal corresponding to the light intensity of the entire interference pattern of the measurement light B3 incident on the photosensitive unit 61. That is, the light receiving element 60 collectively detects the measurement light B3 reflected by the entire target OB as the total signal intensity.
  • N detection intensities of the measurement light B3 are also obtained.
  • the information processing unit 70 includes a control unit 71, an interface unit 72, and a storage unit 73.
  • the control unit 71 operates the optical phased array 30 and the like via the interface unit 72 and the OPA drive control unit 81, and emits illumination light B2 having a random phase distribution in a plurality of patterns.
  • the control unit 71 receives the detected image taken by the image sensor 50 through the interface unit 72 together with timing information.
  • the information processing unit 70 receives the intensity of the measurement light B3 detected by the light receiving element 60 through the interface unit 72 together with timing information.
  • the control unit 71 temporarily stores in the storage unit 73 the detected images of the illumination light B2 acquired from the image sensor 50 and the intensity values of the measurement light B3 acquired from the light receiving element 60, and stores the state of the target OB obtained from these detected images and intensity values as a measurement result or reconstructed image.
  • the control unit 71 calculates the reconstructed image from the detection information of the image sensor (first sensor) 50 and of the light receiving element (second sensor) 60, in correspondence with position information on the image sensor 50 (specifically, the coordinate x, which corresponds to the X axis on the illustrated target OB and to the Z axis on the image sensor 50).
  • This reconstructed image represents the state of the target OB such as the reflectance of the target OB.
  • the result of processing by the information processing unit 70 specifically, an image reflecting the state of the target OB is displayed on the input / output unit 91.
  • the input / output unit 91 presents various information regarding the operating state of the imaging apparatus 100 to the operator.
  • An instruction is input from the operator to the information processing unit 70 via the input / output unit 91.
  • control unit 71 of the information processing unit 70 outputs an operation command to the OPA drive control unit 81 via the interface unit 72, and causes the OPA drive control unit 81 to prepare a random virtual irradiation pattern (step S11).
  • a random virtual irradiation pattern can be generated every time, but a random pattern stored in advance can also be read out.
  • the OPA drive control unit 81 operates the optical phased array 30 based on this virtual irradiation pattern, and emits the illumination light B2 having a random pattern having a random phase distribution from the optical phased array 30 (step S12).
  • This random pattern is a speckle-like luminance distribution pattern.
  • the virtual irradiation pattern prepared in step S11 and the random pattern of the illumination light B2 actually emitted from the optical phased array 30 need not correspond exactly. That is, when there are errors or fluctuations in the size or arrangement of the individual electrodes 32a of the optical phased array 30, illumination light B2 whose phase is strictly controlled is not emitted from the optical phased array 30.
  • although the random pattern of the illumination light B2 emitted from the optical phased array 30 is thus not strictly controlled, calculating the reconstructed image O(x) from the correlation-related values Sr described later offsets or mitigates such manufacturing errors and environmental variations of the optical phased array 30.
  • the control unit 71 receives the detected image taken by the image sensor 50 together with the timing information via the interface unit 72 and stores it in the storage unit 73 (step S13). In parallel with this, the detection intensity of the measurement light B3 detected by the light receiving element 60 is received together with the timing information via the light receiving element driving unit 82 and the interface unit 72 and stored in the storage unit 73 (step S14).
  • the control unit 71 determines whether the formation and output of N random patterns by the optical phased array 30 is complete (step S15). If not, the process returns to step S11, and the control unit 71 causes the OPA drive control unit 81 to prepare the next virtual irradiation pattern.
  • the control unit 71 then generates or reconstructs an image based on the detected images of the image sensor 50 and the detected intensities of the light receiving element 60 held in the storage unit 73 in steps S13 and S14 (step S16).
  • the control unit 71 stores the obtained reconstructed image in the storage unit 73.
  • based on position information on the image sensor (first sensor) 50 (values such as the coordinate x), the detection information of the image sensor (first sensor) 50, and the detection information Sr of the light receiving element (second sensor) 60, the control unit 71 calculates a reconstructed image O(x) of the target.
  • the control unit 71 changes the phase distribution of the illumination light B2 from the optical phased array 30 N times, and calculates the reconstructed image O(x) from the total signal intensity detected by the light receiving element 60 and the signal intensity at the target position on the image sensor 50.
  • using the optical phased array 30, which operates at high speed, an image signal can be calculated for a pixel corresponding to an arbitrary position on the image sensor 50, and reconstruction or extraction of the target image can be realized at high speed and with high accuracy.
  • the value x corresponds to the X axis on the object OB in the apparatus configuration of FIG. 1, but corresponds to the Z axis on the image sensor 50.
  • the value x on the image sensor 50 is a discrete value corresponding to a pixel.
  • the value N indicates the number (natural number) of random patterns formed and output by the optical phased array 30.
  • the value Sr indicates the measurement value of the light receiving element 60, that is, the intensity value of the measurement light B3.
  • the value ⟨S⟩ indicates the average of the N values Sr obtained by N measurements with the random pattern changed.
  • Ir (x) represents the relationship between the coordinate value x on the image sensor 50 and the intensity value at the pixel corresponding to the coordinate value x, that is, the detected luminance.
  • equation (1), O(x) = Σ_{r=1}^{N} (Sr − ⟨S⟩) · Ir(x), means adding (Sr − ⟨S⟩) · Ir(x) while the index r of the values Sr and Ir(x) changes from 1 to N.
  • the reconstructed image O (x) gives the luminance value of the reconstructed image for each coordinate value x.
  • the above equation (1) determines the reconstructed image O(x) for a one-dimensional pixel column of the image sensor 50; for a two-dimensional pixel array, analogous processing determines the reconstructed image O(x, y).
  • the value y corresponds to the Y axis on the target OB and also corresponds to the Y axis on the image sensor 50.
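  • Equation (1) maps directly onto a few lines of code. The sketch below is an illustration, with variable names chosen to mirror Sr, ⟨S⟩, and Ir(x); everything else is hypothetical:

```python
def reconstruct(intensity_maps, bucket_signals):
    """Eq. (1): O(x) = sum over r = 1..N of (Sr - <S>) * Ir(x),
    where intensity_maps[r][x] is Ir(x) from the first sensor and
    bucket_signals[r] is Sr from the second sensor."""
    n = len(bucket_signals)
    mean_s = sum(bucket_signals) / n        # <S>: average of the N values Sr
    n_pix = len(intensity_maps[0])
    return [sum((bucket_signals[r] - mean_s) * intensity_maps[r][x]
                for r in range(n))
            for x in range(n_pix)]
```

  • As a sanity check, with idealized one-hot patterns Ir(x) = δ(r, x), equation (1) reduces to O(x) = S(x) − ⟨S⟩, which makes the mean-subtraction explicit.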
  • even if the number N of generated random patterns or the value M corresponding to the number of divisions and array elements is made relatively small, equivalent results can be obtained. For example, by imposing appropriate mathematical constraints on the reconstructed image O(x), such that the object does not take a physically unnatural shape, the reconstructed image O(x) can still be calculated. Conversely, if the value N is allowed to be relatively large, a similar spatial resolution can be obtained even with a reduced value M by estimating the reconstructed image O(x) with the least squares method, the inverse matrix method, or the like instead of the simple summation of equation (1).
  • the imaging apparatus 100 can be changed to a device that performs measurement in the depth direction of an object by using a pulse light source.
  • a two-dimensional image or a three-dimensional image including information regarding the traveling direction of the illumination light can be obtained as the target image.
  • the imaging apparatus 100 includes a pulsed light source as the light source unit 20 that supplies light to the optical phased array 30, and the information processing unit 70 processes the measurement light so that the target OB is measured in the depth direction.
  • a pulsed laser beam B1 is emitted from the light source unit 20.
  • the operation of the optical phased array 30 is the same, but for the measurement value or detection signal of the light receiving element 60, a time gate is provided in the light receiving element driving unit 82 to extract from the measurement light B3 only the signal corresponding to a specific distance. As a result, multi-stage measurement light B3 is obtained so as to slice the space stepwise in the depth direction.
  • the information processing unit 70 obtains, by the same method as in the two-dimensional case, a two-dimensional horizontal image sliced at each depth, and reproduces a three-dimensional image by synthesizing a number of two-dimensional horizontal images of different depths.
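  • The time gate described above can be illustrated numerically: with a pulsed source, a detector sample at delay t corresponds to round-trip depth d = c·t/2, so summing only the samples inside a delay window yields the bucket value for one depth slice. A toy sketch, where the sampling interval and names are assumptions:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def time_gate(samples, dt, depth_min, depth_max):
    """Bucket value for one depth slice: sum the second-sensor samples
    (taken every dt seconds) whose round-trip delay 2*d/c falls inside
    [depth_min, depth_max)."""
    t_lo = 2.0 * depth_min / C
    t_hi = 2.0 * depth_max / C
    return sum(s for k, s in enumerate(samples) if t_lo <= k * dt < t_hi)
```

  • Repeating the N-pattern measurement once per gate then gives one image slice per depth, and stacking the slices yields the three-dimensional image described above.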
  • the random pattern of the illumination light B2 emitted from the optical phased array 30 does not have to be strictly controlled.
  • by combining the detection information of the image sensor 50 with that of the light receiving element 60, the manufacturing errors and environmental fluctuations of the optical phased array 30 are offset or alleviated as if averaged out, and the reconstructed image O(x) can be obtained with high accuracy.
  • FIG. 5 is a view showing a modification of the imaging apparatus 100 of the first embodiment shown in FIG.
  • in the observation optical system 140, the measurement light B3 transmitted through the target OB is observed by the light receiving element 60. In this case, the transmittance distribution and the like of the target OB can be determined as the reconstructed image O(x) described above.
  • 6A and 6B are charts showing numerical verification results using the imaging apparatus 100 of the embodiment.
  • a one-dimensional scanning result is shown, and a one-dimensional image or distribution is obtained.
  • the horizontal axis indicates the pixel position, and the vertical axis indicates the transmittance.
  • numerical values arranged below the horizontal axis indicate actual transmittance.
  • the dotted line shows a conventional technique in which the phase state of the illumination light B2 emitted from the optical phased array 30 is strictly controlled by tuning, and the transmittance of the target OB is measured from the distribution of the measurement light B3.
  • the other lines are results from the apparatus of the embodiment (specifically, the system of FIG. 5), obtained while changing the number of divisions M of the laser light B1 by the optical phased array 30 and the number N of generated random patterns; the generation number N of random patterns is 10, 100, and 1000 in one chart and 2000 in the other.
  • FIG. 7 shows a specific example of manufacturing the optical phased array 30, and corresponds to FIG.
  • on a semiconductor substrate of indium phosphide (InP), rectangular in plan view, a waveguide made of InGaAsP, a phase shift portion with a pin-type InP/InGaAsP/InP double heterostructure, and a phase control unit 32 consisting of Ti/Au electrodes were formed.
  • the optical phased array 30 is a one-dimensional modulator.
  • FIG. 8A and 8B are charts showing verification results using the imaging apparatus 100 in which the optical phased array 30 of FIG. 7 is incorporated.
  • the measurement target is a slit pattern.
  • the horizontal axis indicates the pixel position, and the vertical axis indicates the transmittance.
  • the numerical values on the chart indicate the transmittance.
  • in FIG. 8A, the drive conditions were extracted 18,000 times, and the measurement target is measured relatively accurately; the slit pattern can be considered to be reproduced.
  • in FIG. 8B, the optical phased array 30 was driven N = 100 times without extracting the drive conditions, and the measurement target image was obtained from the reconstructed image O(x) of equation (1). The image or distribution pattern shown in FIG. 8B approximates that shown in FIG. 8A and can be considered to reproduce the slit pattern of the measurement target relatively accurately.
  • since an image can be extracted with reference to the light receiving element (second sensor) 60 in an illumination environment in which the optical phased array 30 emits a plurality of patterns of illumination light B2, each having a random phase distribution, the influence of phase errors, intensity variations, and the like of the illumination light B2 caused by manufacturing errors and other imperfections of the optical phased array 30 can be reduced. Moreover, the apparatus is less susceptible to temperature and other changes in the usage environment, so the reliability of measurement can easily be increased even with an optical phased array 30 manufactured at relatively low cost.
  • when the number of branches or divisions M is sufficiently large, it is not necessary to switch all the phase control units 32. For example, about half (M/2) of the M phase controllers can be fixed and only the remaining half (M/2) switched, with similar characteristics obtained. This simplifies the OPA drive control unit 81 and saves power, and at the same time makes the system less sensitive to manufacturing errors and other imperfections of the optical phased array 30.
  • An imaging apparatus and the like according to the second embodiment will be described.
  • the imaging apparatus according to the second embodiment is a modification of the first embodiment, and parts that are not particularly described are the same as those in the first embodiment.
  • the electrodes for operating the optical phased array are simplified.
The optical phased array 230 used in the imaging apparatus of the second embodiment operates M waveguides 36, which form a plurality of optical paths, with a smaller number of electrodes 32e to 32h. In the illustrated example, seven waveguides 36 are operated by four electrodes 32e to 32h. By randomly changing the voltages V1 to V4 applied to the electrodes 32e to 32h, illumination light B2 having a random phase distribution can be emitted from the emission ports 34.
That is, in the optical phased array 230 of the second embodiment, a plurality of phase-adjustment electrodes 32e to 32h, each having a random shape pattern different from the others, are provided over the plurality of waveguides 36, and illumination light B2 having a random phase distribution is emitted according to how the electrodes 32e to 32h are combined. Since the number of electrodes can be significantly reduced without lowering the spatial resolution, the optical phased array 230 can be downsized and its driving method simplified.
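As a sketch of why a few shared electrodes suffice, the phase imparted to each waveguide can be modeled as a weighted sum of the electrode voltages, with weights fixed by the random electrode shapes. The overlap matrix, the linear electro-optic response, and all numerical values below are illustrative assumptions, not values from the patent:

```python
import numpy as np

rng = np.random.default_rng(1)

M, E = 7, 4  # seven waveguides 36 driven by four electrodes 32e-32h
# Hypothetical overlap matrix: fraction of electrode e lying over
# waveguide m, fixed once by the random electrode shape patterns.
overlap = rng.random((M, E))

def port_phases(voltages, gain=np.pi):
    """Phase shift at each emission port for one electrode drive state
    (a linear electro-optic response is assumed for illustration)."""
    return ((overlap @ voltages) * gain) % (2 * np.pi)

# Randomly varying only the four voltages V1-V4 already produces
# mutually distinct seven-element phase patterns at the exit ports.
p1 = port_phases(rng.random(E))
p2 = port_phases(rng.random(E))
assert p1.shape == (M,) and not np.allclose(p1, p2)
```

Because the four voltages combine differently on every waveguide, a small number of drive signals spans a large family of phase patterns, which is all the random-illumination scheme requires.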
The imaging apparatus according to the third embodiment is a modification of the first embodiment, and parts that are not particularly described are the same as those in the first embodiment. In the third embodiment, illumination light B2 having a two-dimensional phase distribution, or illumination light B2 having a phase distribution expanded beyond the one-dimensional direction, is formed using a one-dimensional optical phased array (OPA).
The optical phased array 330 used in the imaging apparatus of the third embodiment includes a main body portion 330a having the same structure as the optical phased array 30 shown in FIG. 2, and a prism 330b, a branching portion that is arranged on the emission port 34 side of the main body portion 330a and extends along the arrangement direction of the emission ports 34.
In the third embodiment, broadband light source light B12 is incident on the optical phased array 330. The optical phased array 330 imparts to the light source light B12 a random phase distribution for each wavelength, and the illumination light B2 emitted from the emission ports 34 via the waveguides 36 therefore has a random phase distribution with respect to the arrangement direction of the emission ports 34.
Passing through the prism 330b, the illumination light B2 is deflected in the direction orthogonal to the arrangement direction of the emission ports 34. Because the deflection angle differs depending on the wavelength component of the illumination light B2, the light is divided along the direction orthogonal to the arrangement direction of the emission ports 34. As a result, the illumination light B2 that has passed through the prism 330b has a two-dimensional spread and has a random phase distribution at least with respect to the arrangement direction of the emission ports 34.
In the direction orthogonal to the arrangement direction of the emission ports 34, the patterns may be correlated, but one-dimensional image data along the arrangement direction of the emission ports 34 can be processed individually.
In this way, the illumination light B2 in a plurality of wavelength regions is modulated so as to have a random phase distribution for each wavelength region, and the optical phased array 330 has a branching portion that divides the illumination light B2 from the array 330 by wavelength region. Two-dimensional illumination light can thus be emitted using the one-dimensional optical phased array 330. Alternatively, the one-dimensional irradiation range can be divided by wavelength, so that the dimension of detection, or the scan range, is shared among the wavelength regions.
A diffraction grating can also be used instead of the prism 330b. The same effect can be obtained by integrating a diffraction-grating-type coupler at the position of the emission ports 34 and extracting light in the direction perpendicular to the substrate 38. Since the emission angle of a diffraction-grating coupler generally differs with wavelength, two-dimensional irradiation light can be emitted directly, which allows further downsizing. The detection dimension or scan range can also be shared by sweeping the wavelength using a wavelength-tunable light source.
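The wavelength-to-angle mapping that lets the orthogonal direction be addressed by wavelength can be illustrated with the textbook first-order transmission-grating equation d sin θ = λ. The grating period and the swept wavelength band below are hypothetical values chosen for illustration, not parameters from the patent:

```python
import numpy as np

d = 2.0e-6                                      # hypothetical grating period: 2.0 um
wavelengths = np.linspace(1.50e-6, 1.60e-6, 5)  # hypothetical swept band

# First diffraction order: d * sin(theta) = wavelength
theta_deg = np.degrees(np.arcsin(wavelengths / d))

# Each wavelength component is deflected to a distinct, monotonically
# increasing angle, so the scan range can be shared among wavelengths.
assert np.all(np.diff(theta_deg) > 0)
```

Sweeping the source wavelength therefore scans the emission angle in the dispersive direction, while the phased array randomizes the pattern along the port-array direction.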
The imaging apparatus according to the fourth embodiment is a modification of the first embodiment, and parts that are not particularly described are the same as those in the first embodiment.
The optical phased array 430 used in the imaging apparatus of the fourth embodiment includes a main body portion 430a having the same structure as the optical phased array 30 shown in FIG. 2, an optical coupling portion 430b arranged close to the emission ports 34 of the main body portion 430a, and a multimode optical fiber 430c disposed near the light-emitting part of the optical coupling portion 430b. The optical coupling portion 430b is a three-dimensional optical circuit; for example, a photonic lantern can be used. The optical coupling portion 430b receives, at its light-incident part, the illumination light B21, a one-dimensional optical signal, emitted from the emission ports 34 of the main body portion 430a, and emits it from its light-emitting part as illumination light B22, a two-dimensional optical signal. The multimode optical fiber 430c receives the illumination light B22 emitted from the light-emitting part of the optical coupling portion 430b at its incident end 3a, and emits from its exit end 3c the illumination light B2, a two-dimensional optical signal propagated through the core 3b.
Although the original illumination light B21 formed by the main body portion 430a has a random pattern, the converted illumination light B2 that has passed through the multimode optical fiber 430c is output with a finer luminance pattern owing to mode coupling and mode dispersion. That is, by passing through the multimode optical fiber 430c, the illumination light B2 is converted into a random and fine luminance pattern.
The multimode optical fiber 430c has a length of, for example, several meters, and can be bent freely at curvatures below a certain value. When it is bent, the light propagation conditions in the multimode optical fiber 430c change, and the state of the random pattern changes with them. However, since the phase control unit 32 switches its output state in a time short compared with the fluctuations of the optical fiber, the illumination light B2 that has passed through the multimode optical fiber 430c becomes a rapidly changing random pattern that reflects the instantaneous state of the multimode optical fiber 430c.
The system coupled to the exit end 3c (hereinafter referred to as the tip observation unit), at the tip of a cable containing the multimode optical fiber 430c and the signal lines, is placed at a desired position. The tip observation unit at the tip of the cable can thus be brought close to the target OB, and the state of the target OB can be measured remotely. Such an imaging apparatus can be used as a fiberscope or an endoscope.
The image sensor 50 can also be arranged on the incident-end 3a side of the multimode optical fiber 430c via a branching section. In that case, a technique for acquiring the image on the exit-end 3c side regardless of the bending state of the multimode optical fiber 430c can be used (Ruo Yu Gu, et al., "Design of flexible multi-mode fiber endoscope", 19 Oct 2015). Alternatively, a reflection mirror that partially reflects light can be provided at the exit end 3c of the multimode optical fiber 430c and calibration performed on the base side of the fiber, so that, as with the image sensor 50, the image projected on the exit-end 3c side, that is, the random pattern, can be captured remotely from the base side.
A multi-core optical fiber, or a bundle fiber in which a large number of fibers are bundled, can also be used instead of the multimode optical fiber 430c.
The phase distribution set for the illumination light B2 can be quantized into appropriate units or steps, for example 10 steps within the ±180° range, or, if the branching number M is sufficiently large, two steps, preferably three steps or more. A digital circuit can then be used, so the drive control unit 81 can be simplified. Moreover, the phase distribution set for each electrode need not span the full ±180° range; for example, the same resolution can be obtained even if it is set within a range of ±45°. This allows the drive control unit 81 to be simplified and its power consumption reduced.
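A quick numerical check of the coarse-quantization claim (an illustrative sketch with assumed sizes, not a computation from the patent): even when each port can take only three phase levels, the far field of a many-port array remains a high-contrast, speckle-like random pattern.

```python
import numpy as np

rng = np.random.default_rng(4)

M = 64                                                   # number of emission ports (assumed)
levels = np.array([-2 * np.pi / 3, 0.0, 2 * np.pi / 3])  # three coarse phase steps
phi = rng.choice(levels, size=M)                         # digitally selectable pattern

# Far-field intensity ~ |FFT of the unit-amplitude aperture field|^2
far = np.fft.fftshift(np.fft.fft(np.exp(1j * phi), 1024))
intensity = np.abs(far) ** 2

# Fully developed speckle has contrast (std/mean) near 1; three-level
# drive still gives a strongly fluctuating random pattern.
contrast = intensity.std() / intensity.mean()
assert contrast > 0.5
```

Since three levels can be selected with simple digital logic, this is consistent with the text's point that a digital drive circuit suffices.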
The imaging apparatus 100, which acquires a three-dimensional image, can be used, for example, in the field of LIDAR (Light Detection and Ranging), for instance to discriminate objects present ahead. Furthermore, the imaging apparatus 100 of the embodiments can also be used in fields such as barcode readers, biological imaging, and microscopy.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Remote Sensing (AREA)
  • Immunology (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Pathology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Electromagnetism (AREA)
  • Optical Modulation, Optical Deflection, Nonlinear Optics, Optical Demodulation, Optical Logic Elements (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Provided are an imaging device and an imaging method in which degradation of imaging performance is suppressed, irrespective of production errors or environmental conditions, while using an optical phased array that requires no complex control circuit. An imaging device 100 includes: an optical phased array 30 that emits illumination light B2 in multiple patterns, each having a random phase distribution; an image sensor 50, a first sensor that detects the irradiation state of the illumination light B2 as a distribution; a light-receiving element 60, a second sensor that detects the intensity of measurement light B3 from an object OB illuminated by the illumination light B2; and an information processing unit 70 that extracts an image relating to the state of the object by combining the detection information of the image sensor 50 with that of the light-receiving element 60.

Description

Imaging apparatus and imaging method
The present invention relates to an imaging apparatus and method using an optical phased array, and more particularly to an imaging apparatus and method for obtaining a two-dimensional or three-dimensional image of an object.
Optical phased array (OPA) imaging devices demonstrated to date arrange a large number (several hundred to several thousand) of antenna units in a one- or two-dimensional array and control the optical phase of the output beam by heating appropriate antenna units or applying an electric field to them (for example, Non-Patent Document 1). To sweep the emission direction with this type of imaging apparatus, the optical phase of the output beam must be controlled precisely. Because the state of the emitted beam changes sensitively under the influence of antenna-unit fabrication errors and environmental conditions such as temperature, either the imaging performance degrades or a complicated control circuit must be added to continuously correct for such influences.
Although not an imaging device using an optical phased array, there is a technique called ghost imaging in which a random speckle pattern is formed by a diffuser, photographed with a CCD, and simultaneously projected onto a target (Non-Patent Document 2). In the ghost imaging method of Non-Patent Document 2, while the speckle pattern is varied, the light reflected from the target is detected in aggregate by a single optical sensor, and a two-dimensional or three-dimensional image of the target is obtained from the correlation with the speckle pattern.
An object of the present invention is to provide an imaging apparatus and method that use an optical phased array requiring no complicated control circuit while suppressing degradation of imaging performance regardless of fabrication errors and environmental conditions.
To achieve the above object, an imaging apparatus according to the present invention includes: an optical phased array that emits a plurality of patterns of illumination light, each having a random phase distribution; a first sensor that detects the irradiation state of the illumination light as a distribution; a second sensor that detects the intensity of measurement light from a target illuminated by the illumination light; and a processing unit that extracts an image relating to the state of the target by combining the detection information of the first sensor with that of the second sensor. Here, the image is not limited to a two-dimensional image; it may be one-dimensional or three-dimensional.
In the above imaging apparatus, an image can be extracted with reference to the second sensor under an illumination environment in which the optical phased array emits a plurality of patterns of illumination light, each having a random phase distribution. The effects of phase errors and intensity variations in the illumination light caused by fabrication errors and other imperfections of the optical phased array can therefore be reduced so as to cancel out. Moreover, the apparatus becomes less susceptible to temperature and other variations in the usage environment, so the reliability of measurement can easily be increased even with an optical phased array fabricated at relatively low cost.
To achieve the above object, an imaging method according to the present invention includes: a step of detecting, with a first sensor, the irradiation state of illumination light as a distribution while emitting illumination light having a random phase distribution from an optical phased array, and detecting, with a second sensor, the intensity of measurement light from a target illuminated by the illumination light; and a step of obtaining an image relating to the state of the target by processing the detection information of the first sensor in combination with that of the second sensor.
FIG. 1 is a conceptual block diagram illustrating an imaging apparatus according to a first embodiment. FIG. 2 is a plan view explaining an optical phased array. FIG. 3 is a conceptual diagram of a structure in which emission units are integrated in a two-dimensional array. FIG. 4 is a conceptual diagram explaining the operation of the imaging apparatus. FIG. 5 is a conceptual block diagram explaining an imaging apparatus of a modification. FIGS. 6A and 6B are charts showing numerical verification results using the imaging apparatus of the embodiment. FIG. 7 is a block diagram showing a specific fabrication example of the optical phased array. FIG. 8A is a chart showing measurement results using a conventional method that requires advanced adjustment, and FIG. 8B is a chart showing measurement results using the method of an embodiment that does not require advanced adjustment. FIG. 9 is a plan view explaining the main part of the imaging apparatus of a second embodiment. FIG. 10 is a perspective view explaining the main part of the imaging apparatus of a third embodiment. FIG. 11 is a perspective view explaining the main part of the imaging apparatus of a fourth embodiment.
[First Embodiment]
The imaging apparatus according to the first embodiment of the present invention will be described in detail below with reference to FIG. 1 and the other drawings.
An imaging apparatus 100 according to the first embodiment, shown in FIG. 1, includes: a light source unit 20 that generates laser light B1; an optical phased array 30 that forms illumination light B2 with a desired wavefront state from the laser light B1 of the light source unit 20 and emits it; an observation optical system 40 for illumination and measurement; an image sensor 50, a first sensor that detects the irradiation state of the illumination light B2 as a distribution; a light receiving element 60, a second sensor that detects the intensity of measurement light B3 from a target OB illuminated by the illumination light B2; and an information processing unit 70 that manages the operating state of the optical phased array 30 and receives detection information from the image sensor 50 and the light receiving element 60 to extract or generate an image relating to the state of the target OB.
The light source unit 20 is composed of a semiconductor laser or another coherent light source and is accompanied by a light source driving circuit (not shown). The light source unit 20 emits laser light B1 set to one of various wavelength regions, such as the infrared or visible region.
As shown in FIG. 2, the optical phased array (OPA) 30 is an optical-waveguide-type integrated circuit in which an optical branching unit 31 and a phase control unit 32 are provided on a substrate 38. The optical phased array 30 can switch its output in a short time, for example several tens of microseconds or less, enabling high-speed measurement and imaging.
The optical branching unit 31 splits the laser light B1 into M channels and guides it into M waveguides 36. As the optical branching unit 31, for example, a star coupler having a slab waveguide or a combination of multistage directional couplers can be used.
The phase control unit 32 consists of M electrodes 32a that apply, for example, an electric field to the M-channel waveguides 36, and wiring 32b that supplies a voltage to each electrode 32a. The supply voltage to the wiring 32b is adjusted by the OPA drive control unit 81 shown in FIG. 1. By adjusting the voltage applied to each wiring 32b, the phase state of the laser light B1 passing through each waveguide 36 can be controlled individually. From the emission ports 34 of the optical phased array 30, illumination light B2 whose phase state has been controlled by the phase control unit 32 is emitted. The illumination light B2 is light having a random phase distribution. Furthermore, the illumination light B2 changes rapidly in time series, and the emission ports 34 emit N mutually different patterns of illumination light B2, each itself having a random phase distribution. That is, each of these N patterns, or N kinds of illumination light B2, has a random phase distribution, and the patterns also vary randomly in time series. In each pattern, the random phases are distributed over a range of ±180° without bias.
The above description assumed that the phase control unit 32 of the optical phased array 30 is one-dimensional along the X direction and that illumination light B2 having a one-dimensional random phase distribution or pattern is emitted from the emission ports 34. However, optical phased arrays 30 such as that shown in FIG. 2 can, for example, be stacked in the Y direction. With such a stacked optical phased array 130, illumination light B2 having a two-dimensional random phase distribution in the X and Y directions can be formed and emitted. Alternatively, by modifying the waveguide structure and arranging the emission ports two-dimensionally, laser light whose phase is controlled can be emitted perpendicular to the main surface of the substrate 38. In this case, as shown in FIG. 3, an optical phased array 130 having an emission section 434, in which emission units 237 are arranged in a two-dimensional array in the X and Y directions instead of the one-dimensional array of emission ports 34, can form illumination light B2 with a two-dimensional phase distribution in the X and Y directions and emit it in the Z direction. When the illumination light B2 is one-dimensional, a one-dimensional image corresponding to the distribution direction of the illumination light B2 can be obtained as the target image; when the illumination light B2 is two-dimensional, a two-dimensional image corresponding to the distribution of the illumination light B2 can be obtained.
The phase control unit 32 is not limited to electro-optic modulation, in which the phase is adjusted by the electric field strength applied to the waveguides 36; carrier-effect modulation, in which the phase is adjusted by current injection into the waveguide, or thermo-optic modulation, in which the phase is adjusted by heating the waveguide, can also be used. By exploiting the electro-optic, carrier, or thermo-optic effect, the phase control unit 32 can be made small and inexpensive, its reliability can be improved, and phase switching can be made faster.
Returning to FIG. 1, the observation optical system 40 includes a branch mirror 43 and a plurality of lenses L1 to L3. The branch mirror 43 is a half mirror with a uniform transmittance or reflectance. It splits the illumination light B2 from the optical phased array 30, directing part of it onto the target OB and the remainder onto the image sensor (first sensor) 50. The branch mirror 43 also reflects the measurement light B3, the return light scattered from the surface OBa of the target OB, and guides it to the light receiving element (second sensor) 60. The lens L1 prevents divergence of the illumination light B2 emitted from the optical phased array 30 while enabling far-field illumination. The lens L2 narrows the beam diameter of the measurement light B3 reflected by the target OB so that it enters the photosensitive portion 61 of the light receiving element 60 in aggregate. The lens L3, in cooperation with the lens L1, forms the pattern of the illumination light B2 as a far-field image on the photosensitive surface 51 of the image sensor 50; adjusting the focal length of the lens L3 also adjusts the size of the far-field image on the photosensitive surface 51. Depending on the configuration of the observation optical system 40, the lens L1 or the lens L3 may be omitted. In the above arrangement, the light receiving element (second sensor) 60 detects in aggregate the intensity of the measurement light reflected by the target OB, and the image sensor (first sensor) 50 detects the far-field image of the illumination light emitted from the optical phased array 30. This simplifies the detection information of the light receiving element 60, makes processing of the detection information straightforward, and speeds up extraction of the target image.
The image sensor (first sensor) 50 is a semiconductor image sensor such as a CMOS or CCD sensor. The image sensor 50 is sensitive to the wavelength of the light source unit 20 and can be fitted with a wavelength-selective filter. The image sensor 50 detects the pattern of the illumination light B2 formed on the photosensitive surface 51 and captures it as a detected image, recording the intensity value of the illumination light B2 at each pixel position. As described above, since the optical phased array 30 emits the illumination light B2 in N patterns, N detected images of the illumination light B2 are obtained.
The light receiving element (second sensor) 60 is a semiconductor photosensor such as a photodiode. The light receiving element 60 is sensitive to the wavelength of the light source unit 20 and can be fitted with a wavelength-selective filter. Driven by the light-receiving-element driving unit 82, the light receiving element 60 outputs a signal corresponding to the light intensity of the entire interference pattern of the measurement light B3 incident on the photosensitive portion 61. That is, the light receiving element 60 detects in aggregate, as a total signal intensity, the measurement light B3 reflected from the whole target OB. As described above, since the optical phased array 30 emits the illumination light B2 in N patterns, N detected intensities of the measurement light B3 are obtained.
The information processing unit 70 has a control unit 71, an interface unit 72, and a storage unit 73. The control unit 71 operates the optical phased array 30 and associated components via the interface unit 72 and the OPA drive control unit 81 to emit illumination light B2 having random phase distributions in a plurality of patterns. The control unit 71 receives the detected images captured by the image sensor 50, together with timing information, via the interface unit 72, and likewise receives the intensity of the measurement light B3 detected by the light receiving element 60, together with timing information, via the interface unit 72. The control unit 71 temporarily stores in the storage unit 73 the detected images of the illumination light B2 acquired from the image sensor 50 and the intensity values of the measurement light B3 acquired from the light receiving element 60, and stores the state of the target OB obtained from these detected images and intensity values as a measurement result or reconstructed image. In doing so, the control unit 71 calculates the reconstructed image from the detection information of the image sensor (first sensor) 50 and the detection information of the light receiving element (second sensor) 60, on the basis of position information on the image sensor 50 (specifically, values such as the coordinate x, which corresponds to the X axis on the illustrated target OB and to the Z axis on the image sensor 50). The reconstructed image represents the state of the target OB, such as its reflectance.
The processing result of the information processing unit 70, specifically an image reflecting the state of the target OB, is displayed on the input/output unit 91. Under the control of the information processing unit 70, the input/output unit 91 presents the operator with various information on the operating state of the imaging apparatus 100, and the operator inputs instructions to the information processing unit 70 via the input/output unit 91.
 An operation example of the imaging apparatus 100 according to the first embodiment will now be described with reference to FIG. 4.
 First, the control unit 71 of the information processing unit 70 outputs an operation command to the OPA drive control unit 81 via the interface unit 72, causing the OPA drive control unit 81 to prepare a random virtual irradiation pattern (step S11). The random virtual irradiation pattern may be newly generated each time, or a random pattern stored in advance may be read out.
 The OPA drive control unit 81 operates the optical phased array 30 on the basis of this virtual irradiation pattern, causing the optical phased array 30 to emit illumination light B2 in a random pattern having a random phase distribution (step S12). This random pattern is a speckle-like luminance distribution pattern. Here, the virtual irradiation pattern prepared in step S11 and the random pattern of the illumination light B2 actually emitted from the optical phased array 30 need not correspond exactly. That is, when there are errors or fluctuations in the size or arrangement of the individual electrodes 32a constituting the optical phased array 30, the optical phased array 30 does not emit illumination light B2 whose phase is strictly controlled. Likewise, when the characteristics of the optical phased array 30 change with fluctuations in the temperature environment, the optical phased array 30 does not emit illumination light B2 whose phase is strictly controlled. In such cases, the random pattern of the illumination light B2 emitted from the optical phased array 30 is no longer strictly controlled, but by calculating the reconstructed image O(x) from the correlation-based extraction value Sr described later, such manufacturing errors and environmental fluctuations of the optical phased array 30 are canceled out or mitigated.
 The control unit 71 receives the detection image captured by the image sensor 50, together with timing information, via the interface unit 72 and stores it in the storage unit 73 (step S13). In parallel, the control unit 71 receives the detected intensity of the measurement light B3 detected by the light receiving element 60, together with timing information, via the light receiving element drive unit 82 and the interface unit 72 and stores it in the storage unit 73 (step S14).
 The control unit 71 determines whether the process of forming and outputting N random patterns by the optical phased array 30 has been completed (step S15). If the output of the N random patterns has not been completed, the process returns to step S11, and the control unit 71 causes the OPA drive control unit 81 to prepare the next virtual irradiation pattern.
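 Steps S11 to S15 form a simple acquisition loop. The following Python sketch illustrates that loop under the assumption of three hypothetical hardware callbacks, `set_phases`, `read_image`, and `read_intensity`, which stand in for the OPA drive control unit 81, the image sensor 50, and the light receiving element 60 and are not part of this disclosure:

```python
import numpy as np

rng = np.random.default_rng(0)

def acquire(n_patterns, n_elements, set_phases, read_image, read_intensity):
    """Acquisition loop of steps S11-S15: for each of N random virtual
    irradiation patterns, drive the OPA and record both sensors."""
    frames = []  # I_r(x): speckle frames from the image sensor (first sensor)
    totals = []  # S_r: intensities from the light receiving element (second sensor)
    for _ in range(n_patterns):
        phases = rng.uniform(-np.pi, np.pi, n_elements)  # step S11: random pattern
        set_phases(phases)                               # step S12: emit B2
        frames.append(read_image())                      # step S13: detection image
        totals.append(read_intensity())                  # step S14: measured intensity
    return np.asarray(frames), np.asarray(totals)
```

The returned arrays correspond to the detection images and intensity values stored in the storage unit 73 and consumed by the reconstruction in step S16.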
 On the other hand, if the output of the N random patterns has been completed, the process proceeds to step S16, in which the control unit 71 generates or reconstructs an image on the basis of the detection images of the image sensor 50 and the detected intensities of the light receiving element 60 stored in the storage unit 73 in steps S13 and S14 (step S16). The control unit 71 stores the resulting reconstructed image in the storage unit 73. Here, the control unit 71 calculates the reconstructed image O(x) of the object by using the detection information of the image sensor (first sensor) 50 and the detection information Sr of the light receiving element (second sensor) 60 on the basis of position information on the image sensor (first sensor) 50 (values such as the coordinate x). More specifically, while changing the phase distribution of the illumination light B2 from the optical phased array 30 N times, the control unit 71 calculates the reconstructed image O(x) from the total signal intensity detected by the light receiving element 60 and the signal intensity at the target position on the image sensor 50. In this way, an image signal can be calculated for a pixel corresponding to an arbitrary position on the image sensor 50 using the optical phased array 30, which operates at high speed, so that reconstruction or extraction of the target image is realized quickly and with high accuracy. In its simplest form, this reconstructed image O(x) is given by
O(x) = (1/N) × Σ{(Sr − <S>) · Ir(x)}  …  (1)
Here, the value x corresponds to the X axis on the target OB in the apparatus configuration of FIG. 1, but to the Z axis on the image sensor 50. On the image sensor 50, the value x takes discrete values corresponding to pixels. The value N denotes the number (a natural number) of random patterns formed and output by the optical phased array 30. The value Sr denotes the measurement value of the light receiving element 60, that is, the intensity value of the measurement light B3. The value <S> denotes the average of the N values Sr obtained in N measurements with the random pattern changed. Ir(x) expresses the relationship between the coordinate value x on the image sensor 50 and the intensity value, that is, the detected luminance, at the pixel corresponding to that coordinate value x. Further, Σ means summing (Sr − <S>) · Ir(x) while varying the index r of the values Sr and Ir(x) from 1 to N. This reconstructed image O(x) gives the luminance value of the reconstructed image for each coordinate value x.
 Equation (1) above determines the reconstructed image O(x) for a one-dimensional pixel row of the image sensor 50. When the image sensor 50 is two-dimensional, processing is performed to determine a reconstructed image O(x, y) for the two-dimensional pixel array. In this case, the reconstructed image O(x, y) is given by
O(x, y) = (1/N) × Σ{(Sr − <S>) · Ir(x, y)}  …  (2)
Here, the value y corresponds to the Y axis on the target OB and likewise to the Y axis on the image sensor 50.
 Note that if a more advanced algorithm is used in place of Equation (1) or (2), equivalent results can be obtained while relatively reducing the number N of generated random patterns or the value M corresponding to the number of divisions or array elements. For example, by imposing appropriate mathematical constraints on the reconstructed image O(x) so that the object does not take a physically unnatural shape, the reconstructed image O(x) can be calculated with a small value of N by a compressed sensing technique or the like. Conversely, if the value N may be made relatively large, a comparable spatial resolution can be obtained with a reduced value M by estimating the reconstructed image O(x) with the least squares method, an inverse matrix method, or the like instead of the simple summation of Equation (1).
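 The least-squares alternative mentioned above amounts to a linear solve: since Sr = Σx Ir(x) · O(x), stacking the N frames into a sensing matrix gives an overdetermined system. A minimal sketch using `numpy.linalg.lstsq`, with an illustrative noiseless simulation (N and X are assumptions, not values from the disclosure):

```python
import numpy as np

def reconstruct_lstsq(frames, totals):
    """Estimate O(x) by least squares from S_r = sum_x I_r(x) O(x),
    instead of the simple summation of Equation (1)."""
    a = np.asarray(frames, dtype=float)   # N x X sensing matrix I_r(x)
    b = np.asarray(totals, dtype=float)   # length-N measurements S_r
    o, *_ = np.linalg.lstsq(a, b, rcond=None)
    return o

# Noiseless check: with N > X, the object is recovered essentially exactly.
rng = np.random.default_rng(1)
N, X = 100, 16
frames = rng.random((N, X))
obj = rng.random(X)
o_hat = reconstruct_lstsq(frames, frames @ obj)
```

Here exact recovery is possible with far fewer patterns than the correlation sum of Equation (1) needs, at the cost of solving a linear system.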
 By using a pulsed light source, the imaging apparatus 100 of the first embodiment can be modified to measure the target in the depth direction as well. In this case, a two-dimensional or three-dimensional image including information on the traveling direction of the illumination light can be obtained as the target image. When the imaging apparatus 100 is used for three-dimensional imaging including the depth direction of the target, the imaging apparatus 100 includes a pulsed light source as the light source unit 20 that supplies light to the optical phased array 30, and the information processing unit 70 measures the target OB in the depth direction on the basis of the response time of the measurement light B3. Specifically, pulsed laser light B1 is emitted from the light source unit 20. The operation of the optical phased array 30 is unchanged, but for the measurement value or detection signal of the light receiving element 60, a time gate is provided in the light receiving element drive unit 82 to extract only the signal corresponding to a specific distance from the measurement light B3. In this way, multi-stage measurement light B3 can be obtained so as to slice the space step by step in the depth direction, and the information processing unit 70 obtains two-dimensional lateral images sliced in the depth direction by the same method as in the two-dimensional case and reproduces a three-dimensional image by combining many two-dimensional lateral images of different depths.
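 The time gate just described can be sketched as a windowed sum over the photodetector time trace, with the round-trip delay 2z/c selecting the depth slice. This is an illustrative model only (the actual gate is implemented in the light receiving element drive unit 82, and the sample spacing below is an assumption):

```python
C = 299_792_458.0  # speed of light [m/s]

def time_gate(trace, dt, depth, gate_width):
    """Sum the photodetector samples whose round-trip delay 2*depth/c
    falls inside a gate of duration gate_width starting at that depth.
    trace: sampled detector signal; dt: sample spacing in seconds."""
    start = int(round(2.0 * depth / C / dt))  # first sample of the slice
    n = max(1, int(round(gate_width / dt)))   # number of samples in the gate
    return sum(trace[start:start + n])
```

Sweeping `depth` over successive values yields the stack of gated intensities Sr(z) from which each two-dimensional lateral slice is reconstructed.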
 Note that the random pattern of the illumination light B2 emitted from the optical phased array 30 need not be strictly controlled. By using the detection information of the image sensor 50 together with the detection information of the light receiving element 60, manufacturing errors and environmental fluctuations of the optical phased array 30 are canceled out or mitigated as if averaged, so that the reconstructed image O(x) of the target OB can be obtained with high accuracy.
 FIG. 5 shows a modification of the imaging apparatus 100 of the first embodiment shown in FIG. 1. In this case, in the observation optical system 140, the measurement light B3 transmitted through the target OB is observed by the light receiving element 60. In this case as well, the transmittance distribution and the like of the target OB can be determined as the reconstructed image O(x) described above.
 FIGS. 6A and 6B are charts showing numerical verification results obtained with the imaging apparatus 100 of the embodiment. They show one-dimensional scanning results, that is, a one-dimensional image or distribution. The horizontal axis indicates the pixel position, and the vertical axis indicates the transmittance. In FIG. 6A, the numerical values below the horizontal axis indicate the actual transmittance. The dotted line corresponds to the conventional technique, in which the phase state of the illumination light B2 emitted from the optical phased array 30 is strictly controlled by tuning and the transmittance of the target OB is measured from the distribution of the measurement light B3. The other lines (the solid line, broken line, and dash-dot line) were obtained with the apparatus of the embodiment (specifically, the apparatus system of FIG. 5) while varying the number of divisions M of the laser light B1 by the optical phased array 30 and the number N of generated random patterns. In FIG. 6A, the number N of generated random patterns is 10, 100, and 1000; in FIG. 6B, it is 2000. As is clear from both charts, by increasing the number N of generated random patterns, and in particular by making it larger than the number of divisions M of the laser light B1 corresponding to the positional resolution (N > M), results comparable to those of a high-precision optical phased array 30 can be obtained without using one.
 FIG. 7 shows a specific fabrication example of the optical phased array 30 and corresponds to FIG. 2. In this case, the substrate, rectangular in plan view, is an indium phosphide (InP) semiconductor substrate, on which were formed waveguides made of InGaAsP, phase shift sections with a p-i-n type InP/InGaAsP/InP double heterostructure, and phase control units 32 with Ti/Au electrodes. In this case, the optical phased array 30 is a one-dimensional modulator.
 FIGS. 8A and 8B are charts showing verification results obtained with an imaging apparatus 100 incorporating the optical phased array 30 of FIG. 7. The measurement target is a slit pattern. The horizontal axis indicates the pixel position, and the vertical axis indicates the transmittance. The numerical values above the charts indicate the transmittance. In the conventional technique shown in FIG. 8A, 18,000 measurements were performed to extract the drive conditions in order to control the phase state of the illumination light B2 emitted from the optical phased array 30, and the measured slit pattern can be considered to be reproduced relatively accurately. In the technique of the embodiment shown in FIG. 8B, on the other hand, random patterns were generated N = 100 times without extracting the drive conditions of the optical phased array 30, and the image of the measurement target was obtained from the reconstructed image O(x) of Equation (1). The image or distribution pattern shown in FIG. 8B is close to that shown in FIG. 8A, and can likewise be considered to reproduce the measured slit pattern relatively accurately.
 As described above, in the imaging apparatus 100 of the first embodiment, an image can be extracted with reference to the light receiving element (second sensor) 60 under an illumination environment in which the optical phased array 30 irradiates a plurality of patterns of illumination light B2 each having a random phase distribution. The effects of phase deviations, intensity variations, and the like of the illumination light B2 caused by fabrication errors and other imperfections of the optical phased array 30 can therefore be reduced so as to cancel out, the apparatus becomes less susceptible to fluctuations in temperature and other aspects of the operating environment, and the reliability of measurement can easily be increased even with an optical phased array 30 fabricated at relatively low cost.
 Furthermore, if the number of branches or divisions M is sufficiently large, it is not necessary to switch all of the phase control units 32. For example, similar characteristics can be obtained by fixing about half (M/2) of the M phase controllers and switching only the remaining half (M/2). This allows the drive control unit 81 to be simplified and its power consumption reduced, while at the same time making the apparatus less susceptible to fabrication errors and other imperfections of the optical phased array 30.
[Second Embodiment]
 An imaging apparatus and the like according to a second embodiment will now be described. The imaging apparatus according to the second embodiment is a modification of the first embodiment, and parts not specifically described are the same as in the first embodiment.
 In the imaging apparatus of the second embodiment, the electrodes for operating the optical phased array (OPA) are simplified.
 As shown in FIG. 9, the optical phased array 230 used in the imaging apparatus of the second embodiment operates M waveguides 36, which form a plurality of optical paths, with a smaller number of electrodes 32e to 32h. In the illustrated example, seven waveguides 36 are operated by four electrodes 32e to 32h. In this case, the values of the voltages V1 to V4 applied to the electrodes 32e to 32h are varied randomly. In this way, illumination light B2 having a random phase distribution can be emitted from the emission ports 34.
 In the imaging apparatus of the second embodiment, the optical phased array 230 is provided with a plurality of phase-adjusting electrodes 32e to 32h that extend across the plurality of waveguides 36 and have mutually different random shape patterns, and illumination light B2 with a random phase distribution is emitted depending on how these electrodes 32e to 32h are combined. As a result, the number of electrodes can be greatly reduced without lowering the spatial resolution, the optical phased array 230 can be made smaller, and its driving method can be simplified.
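 The effect of the shared, randomly shaped electrodes can be modeled as a fixed coupling matrix A (rows: waveguides, columns: electrodes), so that the phase of each waveguide is a linear mix of the few electrode voltages. The following is a hypothetical sketch; the matrix values are illustrative and do not reproduce the actual electrode shapes of FIG. 9:

```python
import numpy as np

rng = np.random.default_rng(2)

M_WAVEGUIDES, K_ELECTRODES = 7, 4  # seven waveguides 36, four electrodes 32e-32h
# A[m, k]: overlap of randomly shaped electrode k with waveguide m (illustrative).
A = rng.uniform(0.0, 1.0, size=(M_WAVEGUIDES, K_ELECTRODES))

def waveguide_phases(voltages):
    """Phase shift of each waveguide produced by the K electrode voltages V1..V4."""
    return A @ np.asarray(voltages, dtype=float)
```

Because the rows of A differ from one another, random draws of only the four voltages already produce mutually distinct phase profiles across all seven waveguides, which is why the electrode count can be cut without losing spatial resolution.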
[Third Embodiment]
 An imaging apparatus and the like according to a third embodiment will now be described. The imaging apparatus according to the third embodiment is a modification of the first embodiment, and parts not specifically described are the same as in the first embodiment.
 In the imaging apparatus of the third embodiment, a one-dimensional optical phased array (OPA) is used to form illumination light B2 having a two-dimensional phase distribution, or illumination light B2 having a phase distribution expanded in one dimension.
 As shown in FIG. 10, the optical phased array 330 used in the imaging apparatus of the third embodiment includes a main body portion 330a having the same structure as the optical phased array 30 shown in FIG. 2, and a prism 330b, a branching section arranged on the emission port 34 side of the main body portion 330a and extending along the arrangement direction of the emission ports 34. Broadband light source light B12 is made incident on the optical phased array 330. The optical phased array 330 gives the light source light B12 a random phase distribution for each wavelength. The illumination light B2 emitted from the emission ports 34 via the waveguides 36 has a phase distribution along the arrangement direction of the emission ports 34. Passing through the prism 330b, the illumination light B2 is deflected in the direction orthogonal to the arrangement direction of the emission ports 34; the deflection angle differs depending on the wavelength component of the illumination light B2, so the light is split in the direction orthogonal to the arrangement direction of the emission ports 34. The illumination light B2 that has passed through the prism 330b thus has a two-dimensional spread. This illumination light B2 has a random phase distribution at least along the arrangement direction of the emission ports 34. In the direction orthogonal to the arrangement direction of the emission ports 34 there may be some correlation, but it suffices to process the one-dimensional image data along the arrangement direction of the emission ports 34 individually.
 In the imaging apparatus of the third embodiment described above, the optical phased array 330 modulates illumination light B2 in a plurality of wavelength bands so as to give each wavelength band a random phase distribution, and the prism 330b serving as a branching section splits the illumination light B2 from the optical phased array 330 by wavelength band. In this way, two-dimensional illumination light can be emitted using the one-dimensional optical phased array 330.
 Similarly, if the prism is used tilted by 90° about the direction in which the waveguides 36 extend, a one-dimensional irradiation range can instead be divided by wavelength.
 As described above, by using a prism, the detection dimension or the scan range can be shared out among wavelength bands.
 A diffraction grating can also be used instead of the prism 330b. Furthermore, the same effect is obtained by integrating a grating coupler at the position of the emission ports 34 and extracting the light in the direction perpendicular to the substrate 38. In general, a grating coupler emits light at an angle that varies with wavelength, so two-dimensional irradiation light can be emitted directly. This allows further miniaturization.
 Instead of using a broadband light source as the light source unit 20 as described above, the detection dimension or scan range can likewise be shared out by sweeping the wavelength with a wavelength-tunable light source.
[Fourth Embodiment]
 An imaging apparatus and the like according to a fourth embodiment will now be described. The imaging apparatus according to the fourth embodiment is a modification of the first embodiment, and parts not specifically described are the same as in the first embodiment.
 As shown in FIG. 11, the optical phased array 430 used in the imaging apparatus of the fourth embodiment includes a main body portion 430a having the same structure as the optical phased array 30 shown in FIG. 2, an optical coupling section 430b arranged close to the emission ports 34 of the main body portion 430a, and a multimode optical fiber 430c arranged close to the light emission part of the optical coupling section 430b. The optical coupling section 430b is a three-dimensional optical circuit; for example, a photonic lantern can be used. The optical coupling section 430b receives, at its light incidence part, the illumination light B21, a one-dimensional optical signal emitted from the emission ports 34 of the main body portion 430a, converts the one-dimensional optical signal into illumination light B22, a two-dimensional optical signal, and emits it from its light emission part. The multimode optical fiber 430c receives the illumination light B22, the two-dimensional optical signal emitted from the emission part of the optical coupling section 430b, at its incidence end 3a and emits the illumination light B2, a two-dimensional optical signal that has propagated through the core 3b, from its emission end 3c. The original illumination light B21 formed by the main body portion 430a has a random pattern, and the converted illumination light B2 that has passed through the multimode optical fiber 430c and the like is output as speckle, a fine luminance pattern, under the influence of mode coupling and mode dispersion. That is, by passing through the multimode optical fiber 430c and the like, the illumination light B2 is converted into an even more random and finer luminance pattern. The multimode optical fiber 430c has a length of, for example, several meters and can be bent freely at curvatures below a certain limit. When it is bent, the light propagation conditions in the multimode optical fiber 430c change and the state of the random pattern changes. However, since the phase control units 32 switch the output state in a time short compared with the fluctuations of the optical fiber, the illumination light B2 that has passed through the multimode optical fiber 430c becomes a rapidly changing random pattern reflecting the instantaneous state of the multimode optical fiber 430c.
 In the imaging apparatus according to the fourth embodiment, the emission end 3c of the multimode optical fiber 430c shown in FIG. 11 is coupled to the incidence part of a system comprising the observation optical system 40, the image sensor 50, and the light receiving element 60 shown in FIG. 1. If this system of the observation optical system 40, the image sensor 50, and the light receiving element 60 (hereinafter called the tip observation unit) is made small, the tip observation unit at the end of a cable housing the multimode optical fiber 430c and signal lines can be brought close to the target OB while the cable tip is moved to a desired position, so that the state of the target OB can be measured remotely. Such an imaging apparatus can be used as a fiberscope or an endoscope.
 In the imaging apparatus of the fourth embodiment, the image sensor 50 can, for example, be arranged on the incidence end 3a side of the multimode optical fiber 430c via a branching section. When the image sensor 50 is arranged at the base of the multimode optical fiber 430c, a technique for acquiring the image on the emission end 3c side regardless of the bending state of the multimode optical fiber 430c can be used (Ruo Yu Gu, et al., "Design of flexible multi-mode fiber endoscope", 19 Oct 2015, Vol. 23, No. 21, OPTICS EXPRESS 26905). In this case, by providing a partially reflecting mirror at the emission end 3c of the multimode optical fiber 430c and performing calibration at the base of the multimode optical fiber 430c, the image projected on the emission end 3c side, that is, the random pattern, can be captured remotely at the base, just as with the image sensor 50.
 In the imaging apparatus of the fourth embodiment, a multi-core optical fiber or a bundle fiber obtained by bundling many fibers can also be used instead of the multimode optical fiber 430c.
[Others]
 Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the above embodiments.
 For example, the phase distribution set for the illumination light B2 can be divided into an appropriate number of levels within the ±180° range, for example ten levels, or, if the number of branches M is sufficiently large, two levels, or preferably three or more levels. In particular, when the phase is switched between two levels, a digital circuit can be used, so simplification of the drive controller 81 can be expected.
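The level quantization described above can be sketched numerically. This is a minimal illustration; the branch count M = 128 and the specific level counts are hypothetical values, not taken from the embodiments:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 128  # number of phased-array branches (hypothetical value)

def quantize_phase(phi, levels):
    """Snap each phase to the nearest of `levels` equally spaced values
    within the +/-180 degree (here: +/-pi radian) range."""
    step = 2.0 * np.pi / levels
    return np.round(phi / step) * step

# Continuous random phases over +/-180 degrees, in radians.
phi = rng.uniform(-np.pi, np.pi, size=M)

phi_10 = quantize_phase(phi, 10)  # 10-level drive
phi_2 = quantize_phase(phi, 2)    # binary drive: 0 or +/-180 degrees only
```

With two levels, each branch needs just one control bit, which is why a binary drive lets the drive controller be implemented as a simple digital circuit.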
 Further, if the number of branches M is sufficiently large, or if the number of electrodes is sufficiently large in the configuration of the second embodiment (see FIG. 9), the phase distribution set at each electrode need not span the full ±180° range; an equivalent resolution can be obtained even with phases set within, for example, a ±45° range. This allows the drive controller 81 to be simplified and its power consumption reduced.
 Among the imaging devices 100 of the embodiments, those that acquire three-dimensional images can be used, for example, in the field of LIDAR (Light Detection and Ranging) to discriminate objects located ahead. The imaging device 100 of the embodiments can also be used in fields such as barcode readers, biological imaging, and microscopy.

Claims (9)

  1.  An imaging device comprising:
     an optical phased array that emits a plurality of patterns of illumination light, each pattern having a random phase distribution;
     a first sensor that detects an irradiation state of the illumination light as a distribution;
     a second sensor that detects an intensity of measurement light from an object illuminated by the illumination light; and
     a processing unit that performs processing of extracting an image relating to a state of the object by combining detection information of the first sensor and detection information of the second sensor.
  2.  The imaging device according to claim 1, wherein the first sensor detects a far-field image of the illumination light emitted from the optical phased array, and the second sensor collectively detects the intensity of the measurement light reflected by or transmitted through the object.
  3.  The imaging device according to claim 2, wherein the processing unit calculates, as the state of the object, a reconstructed image from the total signal intensity detected by the second sensor and the signal intensity at a target position of the first sensor while changing the phase distribution of the illumination light from the optical phased array.
  4.  The imaging device according to any one of claims 1 to 3, wherein the optical phased array emits the illumination light in a one-dimensional or two-dimensional distribution.
  5.  The imaging device according to any one of claims 1 to 4, further comprising a pulse light source that supplies light to the optical phased array,
     wherein the processing unit measures the object in the depth direction based on the response time of the measurement light.
  6.  The imaging device according to any one of claims 1 to 5, wherein the optical phased array has a plurality of phase-adjusting electrodes that are arranged over a plurality of optical paths and have mutually different random shape patterns, and emits illumination light with a random phase distribution according to how the plurality of electrodes are combined.
  7.  The imaging device according to any one of claims 1 to 6, wherein the optical phased array uses illumination light in a plurality of wavelength ranges and gives each wavelength range a random phase distribution,
     the imaging device further comprising a branching unit that splits the illumination light from the optical phased array by wavelength range.
  8.  The imaging device according to any one of claims 1 to 6, further comprising a multimode optical fiber, a multi-core optical fiber, or a bundle fiber that guides the illumination light from the optical phased array.
  9.  An imaging method comprising:
     a step of, while emitting illumination light with a random phase distribution from an optical phased array, detecting an irradiation state of the illumination light as a distribution with a first sensor and detecting an intensity of measurement light from an object illuminated by the illumination light with a second sensor; and
     a step of obtaining an image relating to a state of the object by processing detection information of the first sensor and detection information of the second sensor in combination.
PCT/JP2017/047256 2016-12-29 2017-12-28 Imaging device and imaging method WO2018124285A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018559637A JP6765687B2 (en) 2016-12-29 2017-12-28 Imaging equipment and methods

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016257455 2016-12-29
JP2016-257455 2016-12-29

Publications (1)

Publication Number Publication Date
WO2018124285A1 true WO2018124285A1 (en) 2018-07-05

Family

ID=62709616

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/047256 WO2018124285A1 (en) 2016-12-29 2017-12-28 Imaging device and imaging method

Country Status (2)

Country Link
JP (1) JP6765687B2 (en)
WO (1) WO2018124285A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011247868A (en) * 2010-05-26 2011-12-08 Korea Institute Of Science And Technology Beam scanning system for detecting bio-substance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUN, J. ET AL.: "Large-scale nanophotonic phased array", NATURE, vol. 493, 9 January 2013 (2013-01-09), pages 195 - 199, XP055124083 *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022512037A (en) * 2018-10-19 2022-02-01 シュティッヒティング・フェーウー Multimode waveguide imaging
CN111142270A (en) * 2018-11-05 2020-05-12 青岛海信激光显示股份有限公司 Laser speckle eliminating device and laser display equipment thereof
JP2020076991A (en) * 2018-11-09 2020-05-21 株式会社東芝 Optical device
CN113227838A (en) * 2018-12-27 2021-08-06 株式会社小糸制作所 Vehicle lamp and vehicle
CN113227838B (en) * 2018-12-27 2024-07-12 株式会社小糸制作所 Vehicle lamp and vehicle
JPWO2020137908A1 (en) * 2018-12-27 2021-11-11 株式会社小糸製作所 Vehicle lighting and vehicles
WO2020137908A1 (en) * 2018-12-27 2020-07-02 株式会社小糸製作所 Lighting fixture for vehicle, and vehicle
JP7408572B2 (en) 2018-12-27 2024-01-05 株式会社小糸製作所 Vehicle lights and vehicles
JP2020112582A (en) * 2019-01-07 2020-07-27 国立大学法人 東京大学 Light irradiation device, imaging device and laser processing device
JP7281064B2 (en) 2019-01-07 2023-05-25 国立大学法人 東京大学 Light irradiation device, imaging device, and laser processing device
WO2020145266A1 (en) * 2019-01-07 2020-07-16 国立大学法人東京大学 Light irradiation device, imaging device, and laser processing device
CN113302520A (en) * 2019-01-16 2021-08-24 株式会社小糸制作所 Imaging device, arithmetic processing device therefor, vehicle lamp, vehicle, and sensing method
JP2022520815A (en) * 2019-02-21 2022-04-01 エレクトロ サイエンティフィック インダストリーズ インコーポレーテッド Phased array beam steering for material processing
JP7470702B2 (en) 2019-02-21 2024-04-18 エレクトロ サイエンティフィック インダストリーズ インコーポレーテッド Phased array beam steering for materials processing.
CN114128245A (en) * 2019-07-12 2022-03-01 株式会社小糸制作所 Imaging device, lighting device for imaging device, vehicle, and vehicle lamp
CN114128245B (en) * 2019-07-12 2024-08-20 株式会社小糸制作所 Imaging device, illuminating device thereof, vehicle and vehicle lamp
WO2021079811A1 (en) * 2019-10-23 2021-04-29 株式会社小糸製作所 Imaging device, vehicular lamp, vehicle, and imaging method
JP7524207B2 (en) 2019-10-23 2024-07-29 株式会社小糸製作所 Imaging device, vehicle lighting device, vehicle, and imaging method
WO2022091972A1 (en) * 2020-10-28 2022-05-05 株式会社小糸製作所 Imaging device, vehicle lighting fixture, and vehicle
US20220146903A1 (en) * 2020-11-11 2022-05-12 Analog Photonics LLC Optical Phased Array Light Steering
US12085833B2 (en) * 2020-11-11 2024-09-10 Analog Photonics LLC Optical phased array light steering
US11960117B2 (en) 2021-10-18 2024-04-16 Analog Photonics LLC Optical phased array light shaping

Also Published As

Publication number Publication date
JP6765687B2 (en) 2020-10-07
JPWO2018124285A1 (en) 2020-01-16

Similar Documents

Publication Publication Date Title
WO2018124285A1 (en) Imaging device and imaging method
JP6956964B2 (en) Light deflection device and rider device
US7787132B2 (en) Method and arrangement for a rapid and robust chromatic confocal 3D measurement technique
US8213022B1 (en) Spatially smart optical sensing and scanning
US11002601B2 (en) Spectroscopic microscope and spectroscopic observation method
JP2013545113A (en) Image map optical coherence tomography
US20190117077A1 (en) Fast parallel optical coherence tomographic image generating apparatus and method
US11579299B2 (en) 3D range imaging method using optical phased array and photo sensor array
CN110914634B (en) Holographic interferometry method and system
US20050274913A1 (en) Object data input apparatus and object reconstruction apparatus
JP2020190557A (en) Time resolution hyperspectral single pixel imaging
US8896833B2 (en) Device and method for determining a piece of polarization information and polarimetric imaging device
WO2020145266A1 (en) Light irradiation device, imaging device, and laser processing device
KR102125483B1 (en) Confocal measuring apparatus
JP7021061B2 (en) Exit pupil dilator that distributes light to the LCD variable retarder
JP6887350B2 (en) Optical image measuring device
JP6273109B2 (en) Optical interference measurement device
KR101078190B1 (en) Wavelength detector and optical coherence topography having the same
RU2528109C1 (en) Pulsed laser location system
US20210116692A1 (en) Sample observation device
JP5740701B2 (en) Interferometer
JP2011203156A (en) Distance measuring device
US11391426B2 (en) Light source device and light-amount adjusting method
TWI755690B (en) Optical measurement device, optical measurement method, and optical measurement program
JP2022125206A (en) Scanning device and light detection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17886856

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2018559637

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17886856

Country of ref document: EP

Kind code of ref document: A1