WO2018124285A1 - Dispositif d'imagerie et procédé d'imagerie (Imaging device and imaging method) - Google Patents


Info

Publication number
WO2018124285A1
Authority
WO
WIPO (PCT)
Prior art keywords
illumination light
phased array
sensor
light
optical phased
Application number
PCT/JP2017/047256
Other languages
English (en)
Japanese (ja)
Inventor
拓夫 種村
憲人 小松
義昭 中野
泰之 小関
Original Assignee
国立大学法人東京大学 (The University of Tokyo)
Application filed by 国立大学法人東京大学 (The University of Tokyo)
Priority to JP2018559637A (patent JP6765687B2)
Publication of WO2018124285A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/01 Arrangements or apparatus for facilitating the optical investigation
    • G01N21/27 Colour; spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands, using photo-electric detection; circuits for computing concentration
    • G01N21/49 Scattering, i.e. diffuse reflection within a body or fluid
    • G01N21/59 Transmissivity
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/4818 Constructional features, e.g. arrangements of optical elements, using optical fibres
    • G01S7/4863 Circuits for detection, sampling, integration or read-out: detector arrays, e.g. charge-transfer gates
    • G01S7/487 Extracting wanted echo signals, e.g. pulse detection
    • G01S17/89 Lidar systems specially adapted for mapping or imaging

Definitions

  • the present invention relates to an imaging apparatus and method using an optical phased array, and more particularly to an imaging apparatus and method for obtaining a two-dimensional image or a three-dimensional image of an object.
  • The optical phased array (OPA) imaging devices demonstrated to date arrange a large number (hundreds to thousands) of antenna units one-dimensionally or two-dimensionally, and control the optical phase of the output beam by heating appropriate antenna units or by applying an electric field to them (for example, Non-Patent Document 1).
  • Because the state of the exit beam changes sensitively with antenna-unit fabrication errors, temperature, and other environmental conditions, either the imaging performance deteriorates or a complicated control circuit that continuously corrects such influences must be added.
  • In Non-Patent Document 2, a target is irradiated with illumination light having a speckle pattern. The reflected light from the target is collectively detected by an optical sensor while the speckle pattern is changed, and a two-dimensional or three-dimensional image of the target is extracted from the correlation with the speckle pattern.
  • An object of the present invention is to provide an imaging apparatus and method that use an optical phased array without requiring a complicated control circuit, and in which deterioration of imaging performance is suppressed despite manufacturing errors and environmental conditions.
  • An imaging apparatus includes: an optical phased array that emits a plurality of patterns of illumination light, each having a random phase distribution; a first sensor that detects the irradiation state of the illumination light as a distribution; a second sensor that detects the intensity of the measurement light from the target illuminated by the illumination light; and a processing unit that extracts an image relating to the state of the target by combining the detection information of the first sensor with that of the second sensor.
  • the image is not limited to a two-dimensional image, but includes a one-dimensional image and a three-dimensional image.
  • In the above imaging apparatus, an image can be extracted with reference to the second sensor in an illumination environment in which the optical phased array emits a plurality of patterns of illumination light, each having a random phase distribution. The effects of phase shift and intensity variation of the illumination light caused by manufacturing errors and other imperfections are thereby offset, and the apparatus is less susceptible to changes in temperature and other aspects of the usage environment, so the reliability of measurement can easily be increased even with a relatively inexpensive optical phased array.
  • An imaging method detects the irradiation state of the illumination light as a distribution with a first sensor while emitting illumination light having a random phase distribution from an optical phased array, detects with a second sensor the intensity of the measurement light from the object illuminated by the illumination light, and obtains an image relating to the state of the object by processing the detection information of the first sensor and that of the second sensor in combination.
  • FIG. 1 is a conceptual block diagram illustrating an imaging apparatus according to the first embodiment. Also provided are a plan view illustrating the optical phased array and a conceptual diagram of a structure with an integrated emission portion.
  • FIGS. 6A and 6B are charts showing numerical verification results obtained with the imaging apparatus of the embodiment. FIG. 7 is a block diagram showing a specific fabrication example of the optical phased array.
  • FIG. 8A is a chart showing measurement results using a conventional method that requires advanced adjustment.
  • FIG. 8B is a chart showing measurement results using the method of an embodiment that does not require advanced adjustment.
  • An imaging apparatus 100 includes: a light source unit 20 that generates laser light B1; an optical phased array 30 that forms illumination light B2 having a desired wavefront state from the laser light B1 of the light source unit 20 and emits it; an observation optical system 40 for illumination and measurement; an image sensor 50, which is a first sensor that detects the irradiation state of the illumination light B2 as a distribution; a light receiving element 60, which is a second sensor that detects the intensity of the measurement light B3 from the target OB illuminated by the illumination light B2; and an information processing unit 70 that manages the operating state of the optical phased array 30 and receives the detection information from the image sensor 50 and the light receiving element 60 to extract or generate an image relating to the state of the target OB.
  • the light source unit 20 is composed of a semiconductor laser or other coherent light source, and is accompanied by a light source driving circuit (not shown).
  • the light source unit 20 emits laser light B1 set in various wavelength regions such as an infrared region and a visible region.
  • the optical phased array (OPA) 30 is an optical waveguide type integrated circuit, and includes an optical branching unit 31 and a phase control unit 32 provided on a substrate 38.
  • the optical phased array (OPA) 30 can switch the output in a short time of, for example, several tens of μs or less, and enables high-speed measurement and imaging.
  • the light branching unit 31 branches the laser beam B1 into M channels and guides them to M waveguides 36.
  • As the optical branching unit 31, for example, a star coupler having a slab waveguide, a multistage directional coupler, or a combination thereof can be used.
  • the phase control unit 32 includes, for example, M electrodes 32a that apply an electric field to the M-channel waveguide 36, and wirings 32b that enable voltage supply to the electrodes 32a.
  • the supply voltage to the wiring 32b is adjusted by the OPA drive control unit 81 shown in FIG.
  • The phase-controlled illumination light B2, having passed through the phase control unit 32, is emitted from the emission port 34 of the optical phased array 30.
  • the illumination light B2 is light having a random phase distribution.
  • The illumination light B2 changes at high speed in time series: N mutually different patterns of illumination light B2, each having a random phase distribution, are emitted from the emission port 34. These N patterns, or N types, of illumination light B2 all have random phase distributions and change randomly in time series. In each pattern, the phase distribution spans a ±180° range without bias.
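The N random-phase patterns described above can be sketched numerically. The following simulation is an illustration only (the channel count M, pattern count N, and far-field sampling are invented, not values from the embodiment); it shows that uniform ±180° phases produce speckle-like far-field intensity:

```python
import numpy as np

rng = np.random.default_rng(0)

M = 64     # number of OPA channels (the division number M; illustrative)
N = 1000   # number of random patterns (illustrative)

# One random phase per waveguide, uniform over +/-180 degrees with no bias.
phases = rng.uniform(-np.pi, np.pi, size=(N, M))

# The far field of a uniform-amplitude emitter array is, up to scaling,
# the Fourier transform of the complex aperture field.
aperture = np.exp(1j * phases)
far_field = np.fft.fftshift(np.fft.fft(aperture, n=256, axis=1), axes=1)
speckle = np.abs(far_field) ** 2       # intensity patterns I_r(x)

# Fully developed speckle has intensity contrast (std/mean) near 1.
contrast = speckle.std() / speckle.mean()
print(f"contrast = {contrast:.2f}")
```

The contrast close to 1 confirms that each pattern is a speckle-like luminance distribution, as the description states.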
  • Since the phase control unit 32 constituting the optical phased array 30 is one-dimensional in the X direction, illumination light B2 having a one-dimensional random phase distribution or pattern is emitted from the emission port 34.
  • The optical phased array 30 shown in FIG. 2 can be stacked in the Y direction. With such a stacked optical phased array 130, illumination light B2 having a two-dimensional random phase distribution in the X and Y directions can be formed and emitted.
  • By changing the structure of the waveguides and arranging the emission ports two-dimensionally, it is possible to emit laser light whose phase is controlled in the direction perpendicular to the main surface of the substrate 38. In this case, as shown in FIG. 3, the emission portion 237 has emission ports arranged in a two-dimensional array in the X and Y directions, and illumination light B2 having a two-dimensional phase distribution in the X and Y directions can be formed and emitted in the Z direction.
  • When the illumination light B2 is one-dimensional, a one-dimensional image corresponding to the distribution direction of the illumination light B2 can be obtained as the target image; when the illumination light B2 is two-dimensional, a two-dimensional image corresponding to the distribution of the illumination light B2 can be obtained.
  • The phase control unit 32 is not limited to electro-optic modulation, in which the phase is adjusted by the electric field applied to the waveguide 36; carrier-effect modulation, in which the phase is adjusted by current injection into the waveguide, or thermo-optic modulation, in which the phase is adjusted by heating the waveguide, can also be used.
  • the observation optical system 40 includes a branch mirror 43 and a plurality of lenses L1 to L3.
  • the branch mirror 43 is a half mirror having a uniform transmittance or reflectance.
  • The branch mirror 43 splits the illumination light B2 from the optical phased array 30 so that part of it is incident on the target OB and the rest on the image sensor (first sensor) 50. Further, the branch mirror 43 reflects the measurement light B3, which is the return light scattered by the surface OBa of the target OB, and guides it to the light receiving element (second sensor) 60.
  • the lens L1 enables illumination in a far field state while preventing the divergence of the illumination light B2 emitted from the optical phased array 30.
  • The lens L2 serves to reduce the diameter of the measurement light B3 reflected by the target OB so that it collectively enters the photosensitive portion 61 of the light receiving element 60.
  • The lens L3, in cooperation with the lens L1, forms the pattern of the illumination light B2 as a far-field image on the photosensitive surface 51 of the image sensor 50.
  • the size of the far-field image on the photosensitive surface 51 can be adjusted by adjusting the focal length of the lens L3.
  • the lens L1 or the lens L3 may be omitted.
  • The light receiving element (second sensor) 60 collectively detects the intensity of the measurement light reflected by the target OB, while the image sensor (first sensor) 50 detects the far-field image of the illumination light emitted from the optical phased array 30.
  • the image sensor (first sensor) 50 is a semiconductor image sensor such as a CMOS or CCD.
  • the image sensor 50 is sensitive to the wavelength of the light source unit 20 and can be accompanied by a wavelength selection filter.
  • the image sensor 50 detects the pattern of the illumination light B2 formed on the photosensitive surface 51 and captures it as a detected image. At this time, the intensity value of the illumination light B2 is detected for each pixel position. As described above, since the illumination light B2 is emitted in N patterns by the optical phased array 30, N detection images of the illumination light B2 are also obtained.
  • the light receiving element (second sensor) 60 is a semiconductor optical sensor such as a photodiode.
  • the light receiving element 60 is sensitive to the wavelength of the light source unit 20 and can be accompanied by a wavelength selection filter.
  • the light receiving element 60 operates by being driven by the light receiving element driving unit 82, and outputs a signal corresponding to the light intensity of the entire interference pattern of the measurement light B3 incident on the photosensitive unit 61. That is, the light receiving element 60 collectively detects the measurement light B3 reflected by the entire target OB as the total signal intensity.
  • Since the illumination light B2 is emitted in N patterns, N detected intensities of the measurement light B3 are likewise obtained.
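The pairing of the N detected images with the N detected intensities can be modeled in a few lines. In this sketch the exponential intensities merely stand in for speckle statistics, and the slit object is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

Npix = 64    # pixels along x on the image sensor (illustrative)
N = 500      # number of random illumination patterns (illustrative)

# Stand-in speckle images I_r(x); real speckle intensity is exponentially
# distributed, which this mimics without modeling the OPA itself.
I = rng.exponential(scale=1.0, size=(N, Npix))

# A hypothetical 1-D object transmittance O(x): a simple slit.
obj = np.zeros(Npix)
obj[24:40] = 1.0

# The light receiving element integrates the light passed by the object:
#   S_r = sum over x of O(x) * I_r(x)
S = I @ obj

print(S.shape)   # one total detected intensity per illumination pattern
```

Each row of `I` plays the role of one detected image, and each entry of `S` the corresponding total signal intensity from the light receiving element 60.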
  • the information processing unit 70 includes a control unit 71, an interface unit 72, and a storage unit 73.
  • the control unit 71 operates the optical phased array 30 and the like via the interface unit 72 and the OPA drive control unit 81, and emits illumination light B2 having a random phase distribution in a plurality of patterns.
  • the control unit 71 receives the detected image taken by the image sensor 50 through the interface unit 72 together with timing information.
  • the information processing unit 70 receives the intensity of the measurement light B3 detected by the light receiving element 60 through the interface unit 72 together with timing information.
  • The control unit 71 temporarily stores the detected images of the illumination light B2 acquired from the image sensor 50 and the intensity values of the measurement light B3 acquired from the light receiving element 60 in the storage unit 73, and stores the state of the target OB obtained from these detected images and intensity values as a measurement result or a reconstructed image.
  • Using the position information on the image sensor (first sensor) 50 (specifically, the coordinate x, which corresponds to the X axis on the illustrated target OB and to the Z axis on the image sensor 50), the control unit 71 calculates the reconstructed image from the detection information of the image sensor (first sensor) 50 and the detection information of the light receiving element (second sensor) 60.
  • This reconstructed image represents the state of the target OB, such as its reflectance.
  • The result of processing by the information processing unit 70 (specifically, an image reflecting the state of the target OB) is displayed on the input/output unit 91.
  • the input / output unit 91 presents various information regarding the operating state of the imaging apparatus 100 to the operator.
  • An instruction is input from the operator to the information processing unit 70 via the input / output unit 91.
  • The control unit 71 of the information processing unit 70 outputs an operation command to the OPA drive control unit 81 via the interface unit 72, and causes the OPA drive control unit 81 to prepare a random virtual irradiation pattern (step S11).
  • A random virtual irradiation pattern can be generated each time, or a random pattern stored in advance can be read out.
  • the OPA drive control unit 81 operates the optical phased array 30 based on this virtual irradiation pattern, and emits the illumination light B2 having a random pattern having a random phase distribution from the optical phased array 30 (step S12).
  • This random pattern is a speckle-like luminance distribution pattern.
  • The virtual irradiation pattern prepared in step S11 and the random pattern of the illumination light B2 actually emitted from the optical phased array 30 need not correspond exactly. That is, when there are errors or fluctuations in the size or arrangement of the individual electrodes 32a constituting the optical phased array 30, illumination light B2 whose phase is strictly controlled is not emitted from the optical phased array 30.
  • Although the random pattern of the illumination light B2 emitted from the optical phased array 30 is not strictly controlled, calculating the reconstructed image O(x) from the correlation-related extracted value Sr described later offsets or mitigates such manufacturing errors and environmental variations of the optical phased array 30.
  • The control unit 71 receives the detected image taken by the image sensor 50 together with the timing information via the interface unit 72 and stores it in the storage unit 73 (step S13). In parallel with this, the control unit 71 receives the detected intensity of the measurement light B3 from the light receiving element 60, together with timing information, via the light receiving element driving unit 82 and the interface unit 72, and stores it in the storage unit 73 (step S14).
  • The control unit 71 determines whether the process of forming and outputting N random patterns by the optical phased array 30 is complete (step S15). If the output of the N random patterns is not complete, the process returns to step S11, and the control unit 71 causes the OPA drive control unit 81 to prepare the next virtual irradiation pattern.
  • In step S16, the control unit 71 generates or reconstructs an image based on the detected images of the image sensor 50 and the detected intensities of the light receiving element 60 held in the storage unit 73 in steps S13 and S14.
  • the control unit 71 stores the obtained reconstructed image in the storage unit 73.
  • Based on the position information on the image sensor (first sensor) 50 (values such as the coordinate x), the detection information of the image sensor (first sensor) 50, and the detection information Sr of the light receiving element (second sensor) 60, the control unit 71 calculates a reconstructed image O(x) of the target.
  • The control unit 71 changes the phase distribution of the illumination light B2 from the optical phased array 30 N times, and calculates the reconstructed image O(x) from the total signal intensity detected by the light receiving element 60 and the signal intensity at the target position on the image sensor 50.
  • Using the optical phased array 30, which operates at high speed, an image signal can be calculated for the pixel corresponding to an arbitrary position on the image sensor 50, and reconstruction or extraction of the target image can be realized at high speed and with high accuracy.
  • the value x corresponds to the X axis on the object OB in the apparatus configuration of FIG. 1, but corresponds to the Z axis on the image sensor 50.
  • the value x on the image sensor 50 is a discrete value corresponding to a pixel.
  • the value N indicates the number (natural number) of random patterns formed and output by the optical phased array 30.
  • the value Sr indicates the measurement value of the light receiving element 60, that is, the intensity value of the measurement light B3.
  • The value <S> indicates the average of the N values Sr obtained by the N measurements performed while changing the random pattern.
  • Ir (x) represents the relationship between the coordinate value x on the image sensor 50 and the intensity value at the pixel corresponding to the coordinate value x, that is, the detected luminance.
  • Equation (1) sums (Sr − <S>) × Ir(x) as the index r of the values Sr and Ir(x) runs from 1 to N:

    O(x) = Σ_{r=1..N} (Sr − <S>) × Ir(x)   (1)

  • The reconstructed image O(x) gives the luminance value of the reconstructed image for each coordinate value x.
  • The above equation (1) determines the reconstructed image O(x) for a one-dimensional pixel column of the image sensor 50; for a two-dimensional image, corresponding processing determines the reconstructed image O(x, y).
  • the value y corresponds to the Y axis on the target OB and also corresponds to the Y axis on the image sensor 50.
  • Even when the number N of generated random patterns or the value M corresponding to the number of divisions and array elements is made relatively small, equivalent results can be obtained. For example, by imposing appropriate mathematical constraints on the reconstructed image O(x), so that the object does not take an unnaturally implausible shape, the reconstructed image O(x) can be calculated with a reduced N. Conversely, if a relatively large N is allowed, a similar spatial resolution can be obtained even with a reduced M by estimating the reconstructed image O(x) with the least-squares method, an inverse-matrix method, or the like, instead of the simple summation of equation (1).
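The least-squares alternative mentioned here can be sketched as follows. This is a noiseless toy case with invented sizes; measured data would generally need regularization:

```python
import numpy as np

rng = np.random.default_rng(3)

M, Npix, N = 64, 64, 512    # channels, pixels, patterns (illustrative)

# Random-phase OPA patterns -> far-field speckle intensities I_r(x).
phases = rng.uniform(-np.pi, np.pi, size=(N, M))
I = np.abs(np.fft.fft(np.exp(1j * phases), n=Npix, axis=1)) ** 2

# Hypothetical object and the corresponding bucket signals.
obj = np.zeros(Npix)
obj[20:28] = 1.0
S = I @ obj

# Estimate O(x) by solving I @ O = S in the least-squares sense,
# instead of the simple summation of equation (1).
O_ls, *_ = np.linalg.lstsq(I, S, rcond=None)

err = np.linalg.norm(O_ls - obj) / np.linalg.norm(obj)
print(f"relative error: {err:.3f}")
```

With enough patterns the linear system is well determined and the least-squares solution recovers the object far more sharply than the simple correlation sum would with the same N.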
  • the imaging apparatus 100 can be changed to a device that performs measurement in the depth direction of an object by using a pulse light source.
  • a two-dimensional image or a three-dimensional image including information regarding the traveling direction of the illumination light can be obtained as the target image.
  • The imaging apparatus 100 includes a pulsed light source as the light source unit 20 that supplies light to the optical phased array 30, and the information processing unit 70 processes the measurement light so that the target OB is measured in the depth direction.
  • a pulsed laser beam B1 is emitted from the light source unit 20.
  • The operation of the optical phased array 30 is the same, but for the measurement value or detection signal of the light receiving element 60, a time gate is provided in the light receiving element driving unit 82 to extract only the signal corresponding to a specific distance from the measurement light B3. As a result, multi-stage measurement light B3 can be obtained, slicing the space step by step in the depth direction.
  • The information processing unit 70 obtains two-dimensional cross-sectional images sliced in the depth direction by the same method as in the two-dimensional case, and reproduces a three-dimensional image by synthesizing a number of two-dimensional cross-sectional images of different depths.
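The time-gate operation can be illustrated with a toy pulse-echo model. The pulse shape, depths, and sampling rate here are all invented; the point is only that a gate isolates the return from one depth slice:

```python
import numpy as np

c = 3e8                       # speed of light, m/s
fs = 10e9                     # sampling rate of the gated detector (illustrative)
t = np.arange(0, 100e-9, 1 / fs)

def echo(depth_m, amp, width=0.5e-9):
    """Gaussian pulse echo delayed by the round trip to depth_m (hypothetical)."""
    delay = 2 * depth_m / c
    return amp * np.exp(-((t - delay) ** 2) / (2 * width ** 2))

# Returns from two surfaces at different depths, summed on one detector.
signal = echo(1.0, 1.0) + echo(3.0, 0.7)

def gated_sum(d_lo, d_hi):
    """Integrate the signal only inside the round-trip window for [d_lo, d_hi] m."""
    gate = (t > 2 * d_lo / c) & (t < 2 * d_hi / c)
    return signal[gate].sum() / fs

near, far = gated_sum(0.5, 1.5), gated_sum(2.5, 3.5)
print(f"far/near slice ratio: {far / near:.2f}")
```

The ratio of the two gated values reproduces the 0.7 amplitude ratio of the echoes, showing that each gate sees only its own depth slice.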
  • the random pattern of the illumination light B2 emitted from the optical phased array 30 does not have to be strictly controlled.
  • By combining the detection information of the image sensor 50 with that of the light receiving element 60, the manufacturing errors and environmental fluctuations of the optical phased array 30 are offset or alleviated as if averaged out, and the reconstructed image O(x) can be obtained with high accuracy.
  • FIG. 5 is a view showing a modification of the imaging apparatus 100 of the first embodiment shown in FIG.
  • In the observation optical system 140, the measurement light B3 transmitted through the target OB is observed by the light receiving element 60.
  • the transmittance distribution and the like of the target OB can be determined as the reconstructed image O (x) described above.
  • FIGS. 6A and 6B are charts showing numerical verification results obtained with the imaging apparatus 100 of the embodiment.
  • a one-dimensional scanning result is shown, and a one-dimensional image or distribution is obtained.
  • the horizontal axis indicates the pixel position, and the vertical axis indicates the transmittance.
  • numerical values arranged below the horizontal axis indicate actual transmittance.
  • The dotted line uses a conventional technique: the phase state of the illumination light B2 emitted from the optical phased array 30 is strictly controlled by tuning, and the transmittance of the target OB is measured from the distribution of the measurement light B3.
  • The other lines were obtained with the apparatus of the embodiment (specifically, the apparatus system of FIG. 5) while varying the division number M of the laser light B1 by the optical phased array 30 and the number N of generated random patterns.
  • The number N of generated random patterns is 10, 100, or 1000.
  • The number N of generated random patterns is 2000.
  • FIG. 7 shows a specific example of manufacturing the optical phased array 30, and corresponds to FIG.
  • On a substrate that is rectangular in plan view and made of indium phosphide (InP) semiconductor, a waveguide made of InGaAsP, a phase shift portion with a pin-type InP/InGaAsP/InP double heterostructure, and the phase control unit 32 consisting of Ti/Au electrodes were formed.
  • the optical phased array 30 is a one-dimensional modulator.
  • FIGS. 8A and 8B are charts showing verification results obtained with the imaging apparatus 100 incorporating the optical phased array 30 of FIG. 7.
  • the measurement target is a slit pattern.
  • the horizontal axis indicates the pixel position, and the vertical axis indicates the transmittance.
  • the numerical values on the chart indicate the transmittance.
  • In FIG. 8A, the drive conditions were extracted over 18,000 iterations, and the measurement target is measured relatively accurately; the slit pattern can be considered to be reproduced.
  • In FIG. 8B, the reconstructed image O(x) of equation (1) was generated with N = 100, without extracting the drive conditions of the optical phased array 30, and the image of the measurement target was obtained from it. The image or distribution pattern shown in FIG. 8B approximates that shown in FIG. 8A and can be considered to reproduce the slit pattern to be measured relatively accurately.
  • In the imaging apparatus 100 described above, an image can be extracted with reference to the light receiving element (second sensor) 60 in an illumination environment in which the optical phased array 30 emits a plurality of patterns of illumination light B2, each having a random phase distribution. The influence of phase errors, intensity variations, and the like of the illumination light B2 caused by manufacturing errors and other imperfections of the optical phased array 30 can therefore be reduced; furthermore, the apparatus is less susceptible to temperature and other changes in the usage environment, so the reliability of measurement can easily be increased even with an optical phased array 30 manufactured at relatively low cost.
  • If the number of branches or the division number M is sufficiently large, it is not necessary to switch all the phase control units 32. For example, about half (M/2) of the M phase controllers can be fixed and only the remaining half (M/2) switched, and similar characteristics are obtained. This simplifies the drive control unit 81 and saves power, and at the same time makes the system less susceptible to manufacturing errors and other imperfections of the optical phased array 30.
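Whether speckle patterns still decorrelate when only half of the phase controllers are switched can be checked numerically. This is a sketch with invented sizes, not a measurement of the device:

```python
import numpy as np

rng = np.random.default_rng(4)

M, Npix = 64, 256     # channels and far-field pixels (illustrative)

# Phase settings, of which the second half will never be switched.
base = rng.uniform(-np.pi, np.pi, M)

def pattern():
    """Far-field intensity when only the first M/2 phases are re-randomized."""
    ph = base.copy()
    ph[: M // 2] = rng.uniform(-np.pi, np.pi, M // 2)
    return np.abs(np.fft.fft(np.exp(1j * ph), n=Npix)) ** 2

# Average correlation between successive patterns: if it stays well below 1,
# switching only half the controllers still yields usable random patterns.
corrs = [np.corrcoef(pattern(), pattern())[0, 1] for _ in range(200)]
mean_corr = float(np.mean(corrs))
print(f"mean inter-pattern correlation: {mean_corr:.2f}")
```

The residual correlation is modest rather than zero, which is consistent with the text's claim that similar (not identical) characteristics are obtained with half the controllers fixed.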
  • An imaging apparatus and the like according to the second embodiment will be described.
  • the imaging apparatus according to the second embodiment is a modification of the first embodiment, and parts that are not particularly described are the same as those in the first embodiment.
  • the electrodes for operating the optical phased array are simplified.
  • The optical phased array 230 used in the imaging apparatus of the second embodiment operates M waveguides 36, which form a plurality of optical paths, with a smaller number of electrodes 32e to 32h.
  • In the illustrated example, seven waveguides 36 are operated by four electrodes 32e to 32h.
  • The values of the voltages V1 to V4 applied to the electrodes 32e to 32h are changed randomly.
  • As a result, illumination light B2 having a random phase distribution can be emitted from the emission ports 34.
  • In the optical phased array 230 of the second embodiment, a plurality of phase-adjustment electrodes 32e to 32h, each having a random shape pattern different from the others, are provided across the plurality of waveguides 36, and illumination light B2 with a random phase distribution is emitted according to how the electrodes 32e to 32h are driven in combination.
  • Consequently, the number of electrodes can be significantly reduced without reducing spatial resolution, the optical phased array 230 can be downsized, and its driving method can be simplified.
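A rough numerical sketch of this electrode-sharing scheme follows. It is illustrative only: the `overlap` matrix, which stands in for how much each randomly shaped electrode covers each waveguide, and the voltage ranges are assumptions, not values from the disclosure.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 7   # waveguides, as in the illustrated example
E = 4   # electrodes, fewer than M

# Hypothetical overlap of each electrode with each waveguide, fixed at
# fabrication time by the electrode's random shape pattern.
overlap = rng.random((E, M))

def phases(voltages):
    """Phase shift of each waveguide (radians) for a given electrode drive."""
    return (np.asarray(voltages) @ overlap) % (2 * np.pi)

# Switching only E voltages yields M quasi-random waveguide phases.
p1 = phases(rng.uniform(0, 2 * np.pi, E))
p2 = phases(rng.uniform(0, 2 * np.pi, E))
```

Because the shape patterns differ per electrode, distinct drive vectors map to distinct, effectively random phase distributions over all M waveguides.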
  • The imaging apparatus according to the third embodiment is a modification of the first embodiment; parts not specifically described are the same as in the first embodiment.
  • In the third embodiment, illumination light B2 having a two-dimensional phase distribution, or illumination light B2 whose phase distribution is expanded in the one-dimensional direction, is formed using a one-dimensional optical phased array (OPA).
  • The optical phased array 330 used in the imaging apparatus of the third embodiment includes a main body portion 330a, having the same structure as the optical phased array 30 shown in FIG. 2, and a prism 330b, a branching portion arranged on the emission port 34 side of the main body portion 330a and extending along the arrangement direction of the emission ports 34.
  • Broadband light source light B12 is incident on the optical phased array 330.
  • The optical phased array 330 imparts a random phase distribution, for each wavelength, to the light source light B12.
  • The illumination light B2 emitted from the emission ports 34 via the waveguides 36 has a phase distribution along the arrangement direction of the emission ports 34.
  • The illumination light B2 is deflected by the prism 330b in a direction orthogonal to the arrangement direction of the emission ports 34. The deflection angle differs depending on the wavelength component of the illumination light B2, so the light is divided along the direction orthogonal to the arrangement direction of the emission ports 34.
  • As a result, the illumination light B2 that has passed through the prism 330b has a two-dimensional spread.
  • The illumination light B2 has a random phase distribution at least with respect to the arrangement direction of the emission ports 34.
  • In the direction orthogonal to the arrangement direction of the emission ports 34 there may be correlation, but one-dimensional image data along the arrangement direction of the emission ports 34 can be processed individually.
  • That is, the illumination light B2 in a plurality of wavelength regions is modulated so as to have a random phase distribution for each wavelength region, and since the optical phased array 330 has a branching portion, the illumination light B2 from the array 330 is divided by wavelength region.
  • In this way, two-dimensional illumination light can be emitted using the one-dimensional optical phased array 330.
  • The one-dimensional irradiation range is divided by wavelength.
  • Accordingly, one dimension of detection, or the scan range, can be covered by the wavelength range.
  • Instead of the prism 330b, a diffraction grating can also be used. The same effect can also be obtained by integrating a diffraction-grating-type coupler at the position of the emission ports 34 and extracting light in a direction perpendicular to the substrate 38. In general, since the emission angle of a diffraction-grating coupler differs depending on the wavelength, two-dimensional irradiation light can be emitted directly, which allows further downsizing.
  • Alternatively, the detection dimension or scan range can be covered by sweeping the wavelength using a wavelength-tunable light source.
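The wavelength-dependent fan-out described above can be sketched with the first-order grating equation, sin(θ) = m·λ/d. The grating period and wavelength range below are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

# First-order grating equation: each wavelength component is deflected to a
# different angle in the direction orthogonal to the emission-port array.
d = 2.0e-6                                       # assumed grating period [m]
wavelengths = np.linspace(1.50e-6, 1.60e-6, 5)   # assumed broadband samples [m]
m = 1                                            # diffraction order

theta = np.degrees(np.arcsin(m * wavelengths / d))  # deflection angle [deg]
```

Because θ increases monotonically with λ here, each wavelength region addresses a distinct angular row, which is how one detection dimension can be shared across the wavelength range.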
  • The imaging apparatus according to the fourth embodiment is a modification of the first embodiment; parts not specifically described are the same as in the first embodiment.
  • The optical phased array 430 used in the imaging apparatus of the fourth embodiment includes a main body portion 430a, having the same structure as the optical phased array 30 shown in FIG. 2, an optical coupling portion 430b arranged close to the emission ports 34 of the main body portion 430a, and a multimode optical fiber 430c disposed in the vicinity of the light-emitting portion of the optical coupling portion 430b.
  • The optical coupling unit 430b is a three-dimensional optical circuit; for example, a photonic lantern can be used.
  • The optical coupling unit 430b receives, at its light-incident part, the illumination light B21, a one-dimensional optical signal emitted from the emission ports 34 of the main body portion 430a, and emits it from its light-emitting part as illumination light B22, a two-dimensional optical signal.
  • The multimode optical fiber 430c receives, at its incident end 3a, the illumination light B22 emitted from the light-emitting part of the optical coupling unit 430b, and emits, from its exit end 3c, the illumination light B2 that has propagated through the core 3b as a two-dimensional optical signal.
  • Although the original illumination light B21 formed by the main body portion 430a has a random pattern, the converted illumination light B2 that has passed through the multimode optical fiber 430c is output with a fine luminance pattern owing to mode coupling and modal dispersion. That is, by passing through the multimode optical fiber 430c, the illumination light B2 is converted into a random and fine luminance pattern.
  • The multimode optical fiber 430c has a length of, for example, several meters, and can be bent freely at curvatures below a certain value. As the fiber bends, the light propagation conditions within it change, and the state of the random pattern changes accordingly.
  • Since the phase control units 32 switch the output state in a time that is short compared with the fluctuations of the optical fiber, the illumination light B2 that has passed through the multimode optical fiber 430c becomes a rapidly changing random pattern that reflects the instantaneous state of the multimode optical fiber 430c.
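A common way to sketch this behavior is to model the multimode fiber as a fixed complex transmission matrix acting on the OPA output field: bending changes the matrix, while fast phase switching changes the input field. The matrix statistics and sizes below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_out = 16, 256   # assumed OPA outputs and output speckle grains

# Assumed random complex transmission matrix standing in for the fiber's
# mode coupling and modal dispersion.
T = (rng.normal(size=(n_out, n_in))
     + 1j * rng.normal(size=(n_out, n_in))) / np.sqrt(2 * n_in)

def speckle(phases):
    """Fine intensity (speckle) pattern at the exit end for given OPA phases."""
    field_in = np.exp(1j * phases)   # unit-amplitude phased-array outputs
    return np.abs(T @ field_in) ** 2

# Two phase settings of the OPA produce two distinct fine luminance patterns.
s1 = speckle(rng.uniform(0, 2 * np.pi, n_in))
s2 = speckle(rng.uniform(0, 2 * np.pi, n_in))
```

Switching the input phases rapidly, while `T` stays effectively constant, yields the fast-changing random patterns described above.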
  • A system (hereinafter referred to as the tip observation unit) is coupled to the exit end 3c, and the tip of a cable containing the multimode optical fiber 430c and signal lines is placed at a desired position.
  • The tip observation unit at the end of the cable can thus be brought close to the target OB, and the state of the target OB can be measured remotely.
  • Such an imaging apparatus can be used as a fiberscope or an endoscope.
  • The image sensor 50 can be arranged on the incident-end 3a side of the multimode optical fiber 430c via a branching section.
  • In this case, a technique for acquiring the image on the exit end 3c side regardless of the bending state of the multimode optical fiber 430c can be used (Ruo Yu Gu, et al., "Design of flexible multi-mode fiber endoscope", 19 Oct 2015).
  • Alternatively, a reflection mirror that partially reflects light can be provided at the exit end 3c of the multimode optical fiber 430c, and calibration performed on the base side of the fiber, so that, as with the image sensor 50, the image projected on the exit end 3c side, that is, the random pattern, can be captured remotely from the base side.
  • A multi-core optical fiber, or a bundle fiber in which many fibers are bundled, can be used instead of the multimode optical fiber 430c.
  • The phase distribution set for the illumination light B2 can be divided into an appropriate number of units or stages, such as 10 steps within a range of ±180°, or, if the branching number M is sufficiently large, two steps, or preferably three or more steps.
  • In that case a digital circuit can be used, so the drive controller 81 can be simplified.
  • The phase distribution set for each electrode does not need to span the full ±180° range; for example, the same resolution can be obtained even if it is set within a range of ±45°. This allows the drive control unit 81 to be simplified and its power consumption reduced.
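The step-wise phase setting described above amounts to rounding each target phase onto a uniform grid of levels, which is what makes a simple digital drive circuit sufficient. The helper below is an illustrative sketch; its name and the sample values are assumptions.

```python
import numpy as np

def quantize_phase(phase, n_steps, full_range=360.0):
    """Round phase values (degrees, centered on 0) onto n_steps uniform levels."""
    step = full_range / n_steps
    return np.round(np.asarray(phase) / step) * step

rng = np.random.default_rng(3)
raw = rng.uniform(-180.0, 180.0, 8)   # assumed target phases for 8 elements

q10 = quantize_phase(raw, 10)   # 10 steps over +/-180 deg (36 deg per step)
q2 = quantize_phase(raw, 2)     # binary phase, usable when M is large
```

With 10 steps the quantization error stays within half a step (18°), while the two-step case reduces each drive signal to a single bit.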
  • The imaging apparatus 100, which acquires a three-dimensional image, can be used, for example, in the field of LIDAR (Light Detection and Ranging) to discriminate objects present ahead. The imaging apparatus 100 of the embodiments can also be used in fields such as barcode readers, biological imaging, and microscopy.


Abstract

An imaging device and imaging method are provided with which degradation of imaging performance is suppressed when using an optical phased array that does not require a complex drive circuit, regardless of production errors or environmental conditions. An imaging device (100) comprises: an optical phased array (30) that emits illumination light B2 in multiple patterns, each having a random phase distribution; an image sensor (50), constituting a first sensor, for detecting, as a distribution, the radiation state of the illumination light B2; a light receiving element (60), constituting a second sensor, for detecting the intensity of measurement light B3 from an object OB illuminated by the illumination light B2; and an information processing unit (70) that performs processing to extract an image relating to the state of the object by combining the detection information of the image sensor (50) with that of the light receiving element (60).
PCT/JP2017/047256 2016-12-29 2017-12-28 Dispositif d'imagerie et procédé d'imagerie WO2018124285A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018559637A JP6765687B2 (ja) 2016-12-29 2017-12-28 イメージング装置及び方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016257455 2016-12-29
JP2016-257455 2016-12-29

Publications (1)

Publication Number Publication Date
WO2018124285A1 true WO2018124285A1 (fr) 2018-07-05

Family

ID=62709616

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/047256 WO2018124285A1 (fr) 2016-12-29 2017-12-28 Dispositif d'imagerie et procédé d'imagerie

Country Status (2)

Country Link
JP (1) JP6765687B2 (fr)
WO (1) WO2018124285A1 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111142270A (zh) * 2018-11-05 2020-05-12 青岛海信激光显示股份有限公司 一种激光散斑消除装置及其激光显示设备
JP2020076991A (ja) * 2018-11-09 2020-05-21 株式会社東芝 光デバイス
WO2020137908A1 (fr) * 2018-12-27 2020-07-02 株式会社小糸製作所 Appareil d'éclairage pour véhicule et véhicule
WO2020145266A1 (fr) * 2019-01-07 2020-07-16 国立大学法人東京大学 Dispositif d'exposition à la lumière, dispositif d'imagerie et dispositif de traitement laser
WO2021079811A1 (fr) * 2019-10-23 2021-04-29 株式会社小糸製作所 Dispositif d'imagerie, phare de véhicule, automobile et procédé d'imagerie
CN113302520A (zh) * 2019-01-16 2021-08-24 株式会社小糸制作所 成像装置、其运算处理装置、车辆用灯具、车辆、感测方法
CN114128245A (zh) * 2019-07-12 2022-03-01 株式会社小糸制作所 成像装置及其照明装置、车辆、车辆用灯具
WO2022091972A1 (fr) * 2020-10-28 2022-05-05 株式会社小糸製作所 Dispositif d'imagerie, appareil d'éclairage de véhicule et véhicule
US20220146903A1 (en) * 2020-11-11 2022-05-12 Analog Photonics LLC Optical Phased Array Light Steering
US11960117B2 (en) 2021-10-18 2024-04-16 Analog Photonics LLC Optical phased array light shaping
JP7470702B2 (ja) 2019-02-21 2024-04-18 エレクトロ サイエンティフィック インダストリーズ インコーポレーテッド 材料加工用フェイズドアレイビームステアリング

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011247868A (ja) * 2010-05-26 2011-12-08 Korea Institute Of Science And Technology バイオ物質を検出するビームスキャニングシステム

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011247868A (ja) * 2010-05-26 2011-12-08 Korea Institute Of Science And Technology バイオ物質を検出するビームスキャニングシステム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUN, J. ET AL.: "Large-scale nanophotonic phased array", NATURE, vol. 493, 9 January 2013 (2013-01-09), pages 195 - 199, XP055124083 *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111142270A (zh) * 2018-11-05 2020-05-12 青岛海信激光显示股份有限公司 一种激光散斑消除装置及其激光显示设备
JP2020076991A (ja) * 2018-11-09 2020-05-21 株式会社東芝 光デバイス
JPWO2020137908A1 (ja) * 2018-12-27 2021-11-11 株式会社小糸製作所 車両用灯具および車両
WO2020137908A1 (fr) * 2018-12-27 2020-07-02 株式会社小糸製作所 Appareil d'éclairage pour véhicule et véhicule
CN113227838B (zh) * 2018-12-27 2024-07-12 株式会社小糸制作所 车辆用灯具及车辆
JP7408572B2 (ja) 2018-12-27 2024-01-05 株式会社小糸製作所 車両用灯具および車両
CN113227838A (zh) * 2018-12-27 2021-08-06 株式会社小糸制作所 车辆用灯具及车辆
JP2020112582A (ja) * 2019-01-07 2020-07-27 国立大学法人 東京大学 光照射装置、イメージング装置、及びレーザー加工装置
JP7281064B2 (ja) 2019-01-07 2023-05-25 国立大学法人 東京大学 光照射装置、イメージング装置、及びレーザー加工装置
WO2020145266A1 (fr) * 2019-01-07 2020-07-16 国立大学法人東京大学 Dispositif d'exposition à la lumière, dispositif d'imagerie et dispositif de traitement laser
CN113302520A (zh) * 2019-01-16 2021-08-24 株式会社小糸制作所 成像装置、其运算处理装置、车辆用灯具、车辆、感测方法
JP7470702B2 (ja) 2019-02-21 2024-04-18 エレクトロ サイエンティフィック インダストリーズ インコーポレーテッド 材料加工用フェイズドアレイビームステアリング
CN114128245A (zh) * 2019-07-12 2022-03-01 株式会社小糸制作所 成像装置及其照明装置、车辆、车辆用灯具
WO2021079811A1 (fr) * 2019-10-23 2021-04-29 株式会社小糸製作所 Dispositif d'imagerie, phare de véhicule, automobile et procédé d'imagerie
WO2022091972A1 (fr) * 2020-10-28 2022-05-05 株式会社小糸製作所 Dispositif d'imagerie, appareil d'éclairage de véhicule et véhicule
US20220146903A1 (en) * 2020-11-11 2022-05-12 Analog Photonics LLC Optical Phased Array Light Steering
US11960117B2 (en) 2021-10-18 2024-04-16 Analog Photonics LLC Optical phased array light shaping

Also Published As

Publication number Publication date
JP6765687B2 (ja) 2020-10-07
JPWO2018124285A1 (ja) 2020-01-16

Similar Documents

Publication Publication Date Title
WO2018124285A1 (fr) Dispositif d'imagerie et procédé d'imagerie
US7189984B2 (en) Object data input apparatus and object reconstruction apparatus
JP6956964B2 (ja) 光偏向デバイスおよびライダー装置
US7787132B2 (en) Method and arrangement for a rapid and robust chromatic confocal 3D measurement technique
US8213022B1 (en) Spatially smart optical sensing and scanning
US11002601B2 (en) Spectroscopic microscope and spectroscopic observation method
CN103250036A (zh) 基于影像测绘的光学相干层析成像技术
US20190117077A1 (en) Fast parallel optical coherence tomographic image generating apparatus and method
WO2020145266A1 (fr) Dispositif d'exposition à la lumière, dispositif d'imagerie et dispositif de traitement laser
US11579299B2 (en) 3D range imaging method using optical phased array and photo sensor array
JP2020190557A (ja) 時間分解ハイパースペクトル単一画素撮像
KR102125483B1 (ko) 공초점 계측 장치
US20110292389A1 (en) Device and Method for Determining a Piece of Polarisation Information and Polarimetric Imaging Device
JP6887350B2 (ja) 光画像計測装置
JP6273109B2 (ja) 光干渉測定装置
KR101078190B1 (ko) 파장 검출기 및 이를 갖는 광 간섭 단층 촬영 장치
RU2528109C1 (ru) Система импульсной лазерной локации
JP5740701B2 (ja) 干渉計
US11391426B2 (en) Light source device and light-amount adjusting method
JP2022125206A (ja) 走査装置及び光検出装置
JP2011203156A (ja) 距離測定装置
TWI755690B (zh) 光學測量裝置、光學測量方法以及光學測量程式
JP7021061B2 (ja) 液晶可変リターダに光を分配する射出瞳拡張器
JP5506266B2 (ja) 画像形成装置
La Torre et al. Smart optical shape sensor using electronically controlled lens and laser line illumination scanning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17886856

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2018559637

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17886856

Country of ref document: EP

Kind code of ref document: A1