WO2023085328A1 - Imaging device, imaging method, vehicular lamp, and vehicle - Google Patents

Imaging device, imaging method, vehicular lamp, and vehicle Download PDF

Info

Publication number
WO2023085328A1
WO2023085328A1 (PCT/JP2022/041755)
Authority
WO
WIPO (PCT)
Prior art keywords
frequency
light
beat
detection signal
continuous wave
Application number
PCT/JP2022/041755
Other languages
French (fr)
Japanese (ja)
Inventor
輝明 鳥居
祐太 春瀬
Original Assignee
株式会社小糸製作所
Application filed by 株式会社小糸製作所 (Koito Manufacturing Co., Ltd.)
Publication of WO2023085328A1 publication Critical patent/WO2023085328A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/32 Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/34 Systems determining position data of a target for measuring distance only using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present disclosure relates to imaging apparatus and methods.
  • An object identification system that senses the position and type of objects around the vehicle is used for automated driving and automatic control of headlamp light distribution.
  • An object identification system includes a sensor and a processor that analyzes the output of the sensor. Sensors are selected from cameras, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), millimeter-wave radar, ultrasonic sonar, etc., taking into consideration the application, required accuracy, and cost.
  • Ghost imaging illuminates an object while randomly switching the intensity distribution (pattern) of illumination light, and measures the light detection intensity of the reflected light for each pattern.
  • the light detection intensity is the integral of energy or intensity over a plane, not the intensity distribution. Correlation calculation between the corresponding pattern and the light detection intensity is then performed to reconstruct a restored image of the object.
  • FMCW (Frequency Modulated Continuous Wave)-LiDAR is known as a sensor capable of generating distance information.
  • FMCW-LiDAR is a ranging technology conventionally used for radar, and has excellent characteristics such as high resolution and high detection sensitivity. Relative velocity can also be measured using the Doppler effect.
  • the present disclosure has been made in such a situation, and one exemplary purpose of certain aspects thereof is to provide an imaging device capable of generating distance information in a short period of time.
  • An imaging apparatus includes: an illumination device including a light source that generates a frequency-modulated continuous wave and a spatial light modulator that spatially modulates the frequency-modulated continuous wave with a different pattern for each time slot, the illumination device irradiating a field of view with the spatially modulated frequency-modulated continuous wave as illumination light; a beat detector that generates a detection signal based on a signal obtained by combining the reflected light from an object and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs correlation calculation between the intensity of the beat and the intensity distribution of the illumination light and reconstructs a restored image of the object.
  • An imaging device includes: an illumination device including a light source that generates a frequency-modulated continuous wave, the illumination device irradiating a field of view with the frequency-modulated continuous wave as illumination light; a spatial light modulator that spatially and randomly patterns the light reflected from an object for each time slot; a beat detector that generates a detection signal based on the combined light of the reflected light modulated by the spatial light modulator and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs correlation calculation between the intensity of the beat and the intensity distribution of the illumination light and reconstructs a restored image of the object.
  • Another aspect of the present disclosure is an imaging method.
  • This method comprises the steps of: generating a frequency-modulated continuous wave; modulating the frequency-modulated continuous wave with a different pattern for each time slot to generate illumination light; generating a detection signal based on the combined light of the reflected light of the illumination light reflected by an object and the frequency-modulated continuous wave; and, for each beat of the detection signal, performing correlation calculation between the intensity of the beat and the intensity distribution of the illumination light, and reconstructing a restored image of the object.
  • Another aspect of the present disclosure is also an imaging method.
  • This method comprises the steps of: irradiating a field of view with a frequency-modulated continuous wave as illumination light; modulating the reflected light of the illumination light reflected by an object with a different pattern for each time slot; generating a detection signal based on the combined light of the modulated reflected light and the frequency-modulated continuous wave; and, for each beat of the detection signal, performing correlation calculation between the intensity of the beat and the intensity distribution of the illumination light, and reconstructing a restored image of the object.
  • According to one aspect of the present disclosure, distance information can be generated in a short time.
  • FIG. 1 is a diagram showing an imaging device according to Embodiment 1.
  • FIG. 2 is a diagram for explaining beats included in the detection signal Dr.
  • FIGS. 3(a) and 3(b) are diagrams for explaining beat intensity.
  • FIG. 4 is a diagram for explaining the operation of the imaging apparatus of FIG. 1.
  • FIG. 5 is a diagram for explaining the operation of the imaging apparatus of FIG. 1.
  • FIG. 6 is a diagram for explaining the synthesis of a plurality of restored images.
  • FIG. 7 is a diagram showing spectra obtained in a plurality of time slots.
  • FIG. 8 is a diagram showing spectra obtained in a plurality of time slots.
  • FIG. 9 is a time chart for explaining the operation of the imaging apparatus in TDGI mode.
  • FIG. 10 is a diagram illustrating another sequence in TDGI mode.
  • FIGS. 11(a) and 11(b) are diagrams showing the influence of noise in batch restoration and in division restoration, respectively.
  • FIGS. 12(a) and 12(b) are diagrams showing a target image and the images obtained by batch restoration and division restoration when no noise exists, respectively.
  • FIG. 13 is a diagram showing the restored images G(x, y) obtained by batch restoration and division restoration when linear noise exists.
  • FIG. 14 is a block diagram of an imaging device according to Embodiment 2.
  • FIG. 15 is a block diagram of an object identification system.
  • FIG. 16 is a block diagram of a vehicle with an object identification system.
  • FIG. 17 is a block diagram showing a vehicle lamp equipped with an object detection system.
  • An imaging apparatus according to one embodiment includes: an illumination device including a light source that generates a frequency-modulated continuous wave and a spatial light modulator that spatially modulates the frequency-modulated continuous wave with a different pattern for each time slot, the illumination device irradiating a field of view with the spatially modulated frequency-modulated continuous wave as illumination light; a beat detector that generates a detection signal based on a signal obtained by combining the reflected light from an object and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs correlation calculation between the intensity of the beat and the intensity distribution of the illumination light and reconstructs a restored image of the object.
  • the arithmetic processing unit may Fourier transform the detection signal and detect the strength of the beat.
  • the arithmetic processing unit may apply different colors to a plurality of restored images and synthesize them. This makes it possible to generate an image that represents the distance and brightness (reflectance) of an object.
  • the processor may calculate the velocity of the object based on the Doppler shift, and estimate the distance to the object in the next time slot based on the velocity.
  • the processor may estimate the beat frequency of the next time slot based on the estimated distance to the object. As a result, when there are a plurality of beats with different frequencies, the beat of the previous time slot can be appropriately associated with the beat of the current time slot.
  • the imaging device may notify the outside when the number of beats of the detection signal changes.
  • when the number of beats of the detection signal changes, measurement of a new frame may be started from that point. As a result, beats and objects can be associated on a one-to-one basis, and image accuracy can be improved.
  • the light source and spatial light modulator may be an array of light emitting elements.
  • the statement that "the intensity distribution is random" in this specification does not mean that it is completely random; it suffices that it is random enough to reconstruct an image in ghost imaging. Therefore, "random" in this specification can include a certain degree of regularity. Also, "random" does not require unpredictability; it may be predictable and reproducible.
  • FIG. 1 is a diagram showing an imaging apparatus 100 according to Embodiment 1. The imaging apparatus 100 is a correlation function image sensor that uses the principle of ghost imaging (also called single-pixel imaging), and includes an illumination device 110, a beat detection unit 120, and an arithmetic processing device 130.
  • the illumination device 110 is a pseudo-thermal light source; it generates illumination light S1 having a spatial intensity distribution I(x, y) that can be regarded as substantially random, and irradiates the object OBJ with it. The illumination light S1 is irradiated sequentially M times while its intensity distribution is changed randomly. The number of irradiations M is large enough that the original image can be restored.
  • The illumination device 110 includes a light source 112, a patterning device 114, and a pattern generator 132.
  • the light source 112 generates a frequency-modulated continuous wave S0 that has a uniform intensity distribution and whose frequency changes over time. For example, the frequency of the frequency-modulated continuous wave S0 changes (increases or decreases) with a constant slope over time.
  • a semiconductor laser with a frequency modulation function can be used for the light source 112 .
  • the wavelength and spectrum of the frequency-modulated continuous wave S0 are not particularly limited, and may be white light having multiple or continuous spectra, or monochromatic light containing a predetermined wavelength.
  • the wavelength of the illumination light S1 may be infrared or ultraviolet.
  • the patterning device 114 has a plurality of pixels arranged in a matrix and is configured to spatially modulate the light intensity distribution I based on a combination of ON and OFF states of the plurality of pixels. In this specification, a pixel in an ON state is called an ON pixel, and a pixel in an OFF state is called an OFF pixel. In the following description, for ease of understanding, each pixel takes only the two values (1, 0) of ON and OFF, but this is not a limitation, and intermediate gradations may be used.
  • a reflective DMD (Digital Micromirror Device) or a transmissive liquid crystal device can be used as the patterning device 114. The patterning device 114 is given a pattern signal PTN (image data) generated by the pattern generator 132.
  • the pattern generator 132 generates a pattern signal PTNr that specifies the intensity distribution Ir of the illumination light S1 and switches the pattern signal for each time slot (r = 1, 2, ..., M). Thereby, the frequency-modulated continuous wave S0 is spatially modulated with a different pattern for each time slot to generate the illumination light S1. Within one time slot TS, the frequency of the frequency-modulated continuous wave S0 is preferably swept at least once, more preferably twice or more.
  • Sensing by the imaging apparatus 100 is performed with M patterned irradiations as one set, and one restored image is generated corresponding to the M patterned irradiations.
  • One sensing based on M pattern irradiations is called one frame. That is, one frame includes M time slots TS 1 to TS M .
  • Imaging apparatus 100 receives reflected light S2 reflected by object OBJ when illumination light S1 hits object OBJ.
  • the beat detector 120 detects the reflected light S2 by heterodyne detection and outputs a detection signal Dr.
  • the detection signal Dr is a spatially integrated value of light energy (or intensity) incident on the beat detection section 120 when the object OBJ is irradiated with illumination light having an intensity distribution Ir.
  • the beat detector 120 includes a multiplexer 122 and a photodetector 124 .
  • the combining unit 122 combines the frequency-modulated continuous wave S0 and the reflected light S2 to generate combined light S3.
  • the combined light S3 will contain a beat according to the frequency difference between the two lights S0 and S2.
  • the photodetector 124 detects the combined light S3 and generates a detection signal D according to the beat contained in the combined light S3.
  • a single-pixel photodetector can be used as the photodetector 124. Alternatively, an image sensor may be used as the photodetector 124 and the values of a plurality of pixels may be combined.
  • the beat detector 120 outputs a plurality of detection signals D 1 to D M respectively corresponding to a plurality of M intensity distributions I 1 to I M .
  • the arithmetic processing device 130 receives the plurality of detection signals D1 to DM. For ease of understanding, consider a scene in which only a single object exists in the field of view. While the illumination light S1 of the r-th pattern Ir is emitted, the detection signal Dr includes a beat whose frequency corresponds to the distance to the object OBJ, as in FMCW-LiDAR.
  • FIG. 2 is a diagram for explaining the beats included in the detection signal Dr.
  • the horizontal axis represents time, and the vertical axis represents the frequency of the optical continuous wave.
  • FIG. 2 shows the frequencies of the frequency-modulated continuous wave S0, the illumination light S1, and the reflected light S2.
  • the reflected light S2 is shifted to the right by the time τ that the illumination light S1 takes to hit the object OBJ and return. When the distance to the object OBJ is d, τ = 2 × d / c, where c is the speed of light.
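  • As a concrete illustration of this relation, the short sketch below computes the round-trip delay τ and the resulting beat frequency Δf = a × τ = 2 × a × d / c for an assumed chirp slope a (the relation is derived in the description later in this document); all parameter values are hypothetical, not taken from the patent.

```python
# Minimal sketch of the FMCW beat-frequency relation; parameter values are assumed.
a = 1e14        # chirp slope of the frequency sweep, in Hz/s (hypothetical)
c = 3.0e8       # speed of light, in m/s
d = 15.0        # distance to the object OBJ, in m (hypothetical)

tau = 2 * d / c          # round-trip delay: tau = 2 x d / c
delta_f = a * tau        # beat frequency: delta_f = a x tau = 2 x a x d / c

print(f"tau = {tau:.2e} s, beat frequency = {delta_f / 1e6:.1f} MHz")  # 1.00e-07 s, 10.0 MHz
```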
  • the arithmetic processing device 130 includes a reconstruction processing unit 134. The reconstruction processing unit 134 detects the intensities of the beats of the detection signals D1 to DM. The intensity of the k-th beat corresponding to the r-th pattern irradiation is expressed as b_k^r. Fourier transforming the detection signal Dr yields spectral data indicating each beat frequency fk and the corresponding beat intensity b_k^r.
  • FIGS. 3(a) and 3(b) are diagrams for explaining beat intensity. In FIG. 3(a), the detection signal (also referred to as a beat signal) Dr contains a single frequency f1. By Fourier transforming this detection signal Dr, the intensity b_1^r at f1 is obtained. In FIG. 3(b), the beat signal Dr includes multiple frequencies f1 and f2. By Fourier transforming this detection signal Dr, the intensity b_1^r of the beat with frequency f1 and the intensity b_2^r of the beat with frequency f2 are obtained.
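  • The following sketch illustrates, under assumptions, how the beat intensities b_1^r and b_2^r of FIG. 3(b) could be read off the amplitude spectrum of a simulated detection signal; the sample rate, beat frequencies, and intensities are hypothetical.

```python
import numpy as np

fs = 1.0e6                      # sample rate of the detection signal, Hz (assumed)
N = 10000                       # number of samples per time slot (assumed)
t = np.arange(N) / fs

# Simulated beat signal D_r containing two beat frequencies f1, f2 (FIG. 3(b) case).
f1, f2 = 50e3, 120e3            # hypothetical beat frequencies, Hz
b1_r, b2_r = 0.8, 0.3           # hypothetical beat intensities
D_r = b1_r * np.cos(2 * np.pi * f1 * t) + b2_r * np.cos(2 * np.pi * f2 * t)

# Fourier transform the detection signal and read off the beat intensities.
spec = np.abs(np.fft.rfft(D_r)) / (N / 2)      # amplitude spectrum
freqs = np.fft.rfftfreq(N, d=1 / fs)

for f in (f1, f2):
    k = int(np.argmin(np.abs(freqs - f)))      # nearest FFT bin
    print(f"beat at {freqs[k] / 1e3:.0f} kHz, intensity ~ {spec[k]:.2f}")
```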
  • for each beat frequency f1, f2, ..., the reconstruction processing unit 134 performs correlation calculation between the intensity distributions I1 to IM of the illumination light S1 and the beat intensities b_k^1 to b_k^M, and reconstructs a restored image G_k(x, y) for each frequency fk. Equation (1) can be used for the correlation calculation; consistent with Equation (2) below, it corresponds to G_k(x, y) = Σ_{r=1..M} I_r(x, y) × (b_k^r − ⟨b_k⟩), where ⟨b_k⟩ is the average of b_k^1 to b_k^M.
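  • The sketch below shows one way this correlation calculation could look in code, assuming Equation (1) has the standard ghost-imaging form given above; the target, pattern count, and image size are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, M = 32, 32, 4000             # image size and number of pattern irradiations (assumed)

# Hypothetical target T(x, y): reflectance of the object at one beat frequency fk.
T = np.zeros((H, W))
T[8:24, 12:20] = 1.0

I = rng.integers(0, 2, size=(M, H, W)).astype(float)   # random ON/OFF patterns I_r
b = np.einsum("rxy,xy->r", I, T)                       # beat intensities b_k^r (ideal model)

# Correlation calculation: G_k(x, y) = sum_r (b_k^r - <b_k>) * I_r(x, y)
G = np.einsum("r,rxy->xy", b - b.mean(), I)

print("correlation with the target:", np.corrcoef(G.ravel(), T.ravel())[0, 1])
```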
  • the arithmetic processing unit 130 can be implemented by combining a processor (hardware) such as a CPU (Central Processing Unit), MPU (Micro Processing Unit), or microcomputer with a software program executed by the processor. The arithmetic processing unit 130 may be a combination of multiple processors.
  • the arithmetic processing unit 130 may be composed only of hardware.
  • the functions of the arithmetic processing unit 130 may be realized by software processing, hardware processing, or a combination of software processing and hardware processing.
  • software processing is implemented by combining a processor (hardware) such as a CPU (Central Processing Unit), MPU (Micro Processing Unit), or microcomputer with software programs executed by the processor.
  • hardware processing is implemented by hardware such as ASIC (Application Specific Integrated Circuit), controller IC, and FPGA (Field Programmable Gate Array).
  • the configuration of the imaging apparatus 100 is as described above. Next, the operation will be explained.
  • FIG. 4 is a diagram for explaining the operation of the imaging apparatus 100 of FIG. 1. Here, the detection signal Dr contains a beat with a single frequency f1. The pattern of the illumination light S1 is switched in order, I1, I2, ..., IM, for each of the time slots TS1, TS2, ..., TSM. Detection signals D1, D2, ..., DM are obtained according to the irradiation patterns I1, I2, ..., IM. The detection signals D1 to DM have amplitudes that differ with the pattern, but share the same frequency f1 determined by the distance to the object. The amplitudes of the beats of frequency f1 contained in the detection signals D1, D2, ..., DM are detected as intensities b_1^1, b_1^2, ..., b_1^M. The restored image G1(x, y) is calculated based on Equation (1).
  • FIG. 5 is a diagram for explaining the operation of the imaging apparatus 100 of FIG. 1.
  • the detection signal Dr includes a beat of a first frequency f1 corresponding to the distance to the object OBJ1 and a beat of a second frequency f2 corresponding to the distance to the object OBJ2 .
  • the pattern of the illumination light S1 is switched in order, I1, I2, ..., IM, for each time slot. Detection signals D1, D2, ..., DM are obtained according to the irradiation patterns I1, I2, ..., IM; each contains the two frequency components f1 and f2. The amplitude of the frequency-f1 component of each detection signal changes based on the shape of the object OBJ1 and the patterns I1 to IM, while the amplitude of the frequency-f2 component changes based on the shape of the object OBJ2 and the patterns I1 to IM. By Fourier transforming each detection signal Dr, the intensity b_1^r at the frequency f1 and the intensity b_2^r at the frequency f2 are obtained. Then, for the frequency f1, the restored image G1(x, y) of the object OBJ1 can be generated by performing correlation calculation using I1 to IM and b_1^1 to b_1^M. Similarly, for the frequency f2, the restored image G2(x, y) of the object OBJ2 can be generated by performing correlation calculation using I1 to IM and b_2^1 to b_2^M.
  • according to the imaging apparatus 100, by combining FMCW-LiDAR and ghost imaging, a three-dimensional image can be generated without scanning the illumination light S1. Since the imaging apparatus 100 does not require scanning, unlike conventional FMCW-LiDAR, it can generate a two-dimensional range image in a short period of time. Also, since no mechanical or electronic control is required for scanning, costs can be reduced.
  • the imaging apparatus 100 can generate an independent restored image G_k(x, y) for each object. When each restored image G_k is input to a discriminator, the recognition rate can be increased compared to the case where one image contains multiple objects.
  • the imaging apparatus 100 may synthesize a plurality of restored images G 1 (x, y), G 2 (x, y), . . . obtained for a plurality of frequencies to generate one synthesized image IMG.
  • FIG. 6 is a diagram for explaining combining of a plurality of restored images.
  • Each image G k (x, y) is a monochrome multi-tone image.
  • the reconstruction processing unit 134 synthesizes the restored images G 1 (x, y), G 2 (x, y), . . . with different colors.
  • For example, image G1 is assigned a first color C1 (e.g., red), image G2 is assigned a second color C2 (e.g., green), and image G3 is assigned a third color C3 (e.g., blue).
  • the pixel value of each pixel of the original restored images is 8 bits (256 gradations), from 0 to 255. The pixel values of the restored image G1(x, y) may be used as the R component of the color composite image, the pixel values of the restored image G2(x, y) as the G component, and the pixel values of the restored image G3(x, y) as the B component.
  • the technique for coloring and synthesizing multiple monochrome images is not limited to this. In the composite image IMG, the colors represent the distance to the object, and the brightness of each color represents the reflectance of the object.
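  • A minimal sketch of this color synthesis, assuming exactly three 8-bit restored images mapped to the R, G, and B channels as described above:

```python
import numpy as np

def compose_color(G1, G2, G3):
    """Stack three monochrome restored images (0-255) into one color image:
    G1 -> R, G2 -> G, G3 -> B. Color encodes distance; brightness encodes
    the reflectance of the object at that distance."""
    return np.stack([G1, G2, G3], axis=-1).astype(np.uint8)   # shape (H, W, 3)

# Usage with hypothetical restored images: an object present only in G1 appears red.
H, W = 64, 64
G1 = np.full((H, W), 200, np.uint8)
G2 = np.zeros((H, W), np.uint8)
G3 = np.zeros((H, W), np.uint8)
IMG = compose_color(G1, G2, G3)
```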
  • in the description so far, the distance to the object was assumed unchanged during one frame of sensing, so the frequency of each beat was constant during sensing. In practice, however, the distance to an object is not always constant during one frame. For example, if the vehicle is moving and the object is stationary, the distance to the object decreases over time. As the distance to the object changes, the frequency of the beat changes. That is, the beat frequency can change from time slot to time slot.
  • FIG. 7 is a diagram showing spectra obtained in a plurality of time slots.
  • the frequency of the beat increases as the distance to the object increases with each time slot. In other words, the position of the peak included in the spectrum shifts to the high frequency side.
  • as long as the spectral data contain only a single peak, correlation calculation can be performed by treating the peaks in successive time slots as the same beat, even if the frequency shifts. When a plurality of beats with different frequencies exist, however, the problem is how to associate them across time slots.
  • the arithmetic processing unit 130 calculates the velocity of the object based on the Doppler shift. Velocity estimation based on Doppler shift is a technique used in FMCW-LiDAR, so detailed description of processing is omitted. Processing unit 130 then estimates the distance to the object in the next time slot based on this velocity.
  • the arithmetic processing unit 130 estimates the beat frequency based on the estimated distance d^ to the object.
  • beats in the previous time slot and beats in the current time slot can be appropriately associated when there are multiple beats with different frequencies.
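  • A sketch of this prediction-and-association step, under the sign convention that a positive relative velocity means the distance decreases by v × T_slot per slot; the chirp slope, slot duration, and measured values are all hypothetical, not taken from the patent.

```python
import numpy as np

a, c = 1e14, 3.0e8          # chirp slope (Hz/s) and speed of light (m/s), assumed
T_slot = 1e-3               # duration of one time slot, s (assumed)

def predict_beat_freq(d, v):
    """Predict the beat frequency in the next time slot from the current
    distance d and relative velocity v (v > 0 means the object approaches)."""
    d_hat = d - v * T_slot              # estimated distance in the next slot
    return 2 * a * d_hat / c            # predicted beat frequency f^

# Distance and velocity of each tracked object, measured in time slot TS(i-1):
objects = [(15.0, 0.0), (30.0, 20.0)]   # OBJ1 at rest, OBJ2 approaching at 20 m/s
f_hat = [predict_beat_freq(d, v) for d, v in objects]

# Peaks observed in the spectrum of the current time slot TSi (hypothetical):
observed = np.array([9.99e6, 19.85e6])

# Associate each object with the observed peak nearest to its prediction.
for k, fh in enumerate(f_hat):
    j = int(np.argmin(np.abs(observed - fh)))
    print(f"OBJ{k + 1}: predicted {fh / 1e6:.2f} MHz -> observed {observed[j] / 1e6:.2f} MHz")
```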
  • FIG. 8 is a diagram showing spectra obtained in a plurality of time slots.
  • assume that the relative velocity of one object OBJ1 is zero and the relative velocity of the other object OBJ2 is v. The beat frequency of the reflected light from the object OBJ1 is constant, whereas the beat frequency of the reflected light from the object OBJ2 changes with time. As the time slots elapse, the beat frequency f2 of the object OBJ2 approaches the beat frequency f1 of the object OBJ1.
  • suppose that the velocity of the object OBJ1 was measured to be 0 in the time slot TSi−1 based on past measurement results. Then the frequency f1^ of the beat of the object OBJ1 in the next time slot TSi is estimated to be the same as f1. Suppose also that the velocity v of the object OBJ2 can be measured. Then the frequency f2^ of the beat of the object OBJ2 in the next time slot TSi can be estimated.
  • in the next time slot TSi, the frequencies f1^ and f2^ predicted in the previous time slot TSi−1 are used. That is, of the two beats included in the spectrum of the current time slot TSi, the one close to f1^ is associated with the beat of the object OBJ1, and the one close to f2^ is associated with the beat of the object OBJ2. Also in this time slot TSi, the beat frequencies f1^ and f2^ of the next time slot TSi+1 are estimated.
  • in the next time slot TSi+1, the frequencies f1^ and f2^ predicted in the previous time slot TSi are used in the same way: of the two beats included in the spectrum of the current time slot TSi+1, the one close to f1^ is associated with the beat of the object OBJ1, and the one close to f2^ is associated with the beat of the object OBJ2.
  • when the number of beats of the detection signal changes, the processing unit 130 may notify the outside. This notification can be used as an alert indicating the appearance of a new object in the field of view or the disappearance of an object from the field of view, or as an indication that the reliability of the restored image obtained in that frame may be reduced.
  • the imaging apparatus 100 may start measuring a new frame when the number of beats of the detection signal, that is, the number of peak frequencies, changes.
  • the restored-image generation processing in the reconstruction processing unit 134 of the arithmetic processing unit 130 may be performed by dividing the M irradiations into k (k ≥ 2) units. Specifically, the reconstruction processing unit 134 performs correlation calculation for each divided unit to generate an intermediate image Mj(x, y). Then, the k intermediate images M1(x, y) to Mk(x, y) obtained for the k units are combined to generate the final restored image G_TDGI(x, y). This mode of operation is referred to as the TDGI mode.
  • the restored image G_TDGI(x, y) in this case is represented by Equation (2), which, consistent with the unit-by-unit description below, corresponds to G_TDGI(x, y) = Σ_{j=1..k} Σ_{r=(j−1)n+1..jn} I_r(x, y) × (b_r − ⟨b_r^[j]⟩), where I_r is the r-th intensity distribution, b_r is the r-th detected intensity, ⟨b_r^[j]⟩ is the average value of the detected intensities b_r measured in the j-th unit, and, assuming equal units, n = M/k is the number of irradiations per unit.
  • the j-th term on the right side of equation (2) represents the intermediate image M j (x, y) of the j-th unit. Therefore, the restored image G TDGI (x, y) is synthesized by simply adding the corresponding pixels of the plurality of intermediate images M j (x, y). Note that the combining method is not limited to simple addition, and weighted addition or other processing may be performed.
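  • A sketch of this division restoration, assuming Equation (2) has the form reconstructed above and reusing the patterns I and intensities b from the correlation sketch earlier; the unit count is hypothetical.

```python
import numpy as np

def tdgi_reconstruct(I, b, k):
    """Division restoration per Equation (2): split the M irradiations into k
    units, correlate each unit against its own mean <b^[j]>, and sum the
    resulting intermediate images M_j(x, y)."""
    M = len(b)
    n = M // k                                   # irradiations per unit
    G = np.zeros(I.shape[1:])
    for j in range(k):
        s = slice(j * n, (j + 1) * n)
        b_u, I_u = b[s], I[s]
        G += np.einsum("r,rxy->xy", b_u - b_u.mean(), I_u)   # intermediate image M_j
    return G

# Usage with the earlier patterns and intensities: G_tdgi = tdgi_reconstruct(I, b, k=8)
```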
  • FIG. 9 is a time chart for explaining the operation of the imaging apparatus 100 in TDGI mode.
  • in the TDGI mode, the restored image is generated by dividing the M irradiations into k units, each containing n irradiations (that is, n time slots).
  • the first unit includes the 1st to n-th irradiations. The detected intensities b1 to bn corresponding to these n irradiations are generated, and their average value ⟨b_r^[1]⟩ is computed. Then, correlation calculation is performed using the detected intensities b1 to bn, their average value ⟨b_r^[1]⟩, and the intensity distributions I1 to In, and the intermediate image M1(x, y) is generated.
  • the second unit includes the (n+1)-th to 2n-th irradiations. The detected intensities bn+1 to b2n corresponding to these n irradiations are generated, and their average value ⟨b_r^[2]⟩ is computed. Then, correlation calculation is performed using the detected intensities bn+1 to b2n, their average value ⟨b_r^[2]⟩, and the intensity distributions In+1 to I2n, and the intermediate image M2(x, y) is generated.
  • more generally, the j-th unit includes the ((j−1)n+1)-th to jn-th irradiations. The detected intensities b(j−1)n+1 to bjn corresponding to these n irradiations are generated, and their average value ⟨b_r^[j]⟩ is computed. Then, correlation calculation is performed using the detected intensities b(j−1)n+1 to bjn, their average value ⟨b_r^[j]⟩, and the intensity distributions I(j−1)n+1 to Ijn, and the intermediate image Mj(x, y) is generated.
  • the last, k-th unit includes the ((k−1)n+1)-th to kn-th irradiations. The detected intensities b(k−1)n+1 to bkn corresponding to these n irradiations are generated, and their average value ⟨b_r^[k]⟩ is computed. Then, correlation calculation is performed using the detected intensities b(k−1)n+1 to bkn, their average value ⟨b_r^[k]⟩, and the intensity distributions I(k−1)n+1 to Ikn, and the intermediate image Mk(x, y) is generated.
  • a final restored image G(x, y) is generated by synthesizing the k intermediate images M 1 (x, y) to M k (x, y).
  • the above is the operation of the TDGI mode.
  • the TDGI mode by dividing the correlation calculation into units and performing it, the amount of noise change per correlation calculation can be reduced, the restoration accuracy can be increased, and the image quality can be improved.
  • correlation calculation can be started without waiting for the completion of M irradiations, the time required to generate a restored image can be shortened.
  • imaging using the unit-by-unit correlation calculation based on Equation (2) is hereinafter referred to as division restoration, and conventional imaging using the correlation calculation based on Equation (1) is referred to as batch restoration.
  • for example, consider noise that increases monotonically over time (referred to as linear noise). Such linear noise can occur when the distance between a noise light source and the beat detector decreases over time.
  • FIG. 10 is a diagram explaining another sequence in TDGI mode.
  • in this sequence, illumination light continues to be emitted, correlation calculation is performed every n irradiations to generate an intermediate image M(x, y), and a restored image G(x, y) is generated by synthesizing the latest k intermediate images M(x, y).
  • the restored image G(x, y) can be updated every n times of irradiation, so the update rate can be increased.
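  • One way to implement this rolling update is a sliding window that keeps only the latest k intermediate images, as sketched below; the unit size, unit count, and callback structure are assumptions, not taken from the patent.

```python
import numpy as np
from collections import deque

n, k = 500, 8                        # irradiations per unit and window size (assumed)
window = deque(maxlen=k)             # holds the latest k intermediate images

def on_unit_complete(I_u, b_u):
    """Called after every n irradiations with that unit's patterns I_u and
    detected intensities b_u; returns the restored image G(x, y) synthesized
    from the latest k intermediate images M(x, y)."""
    M_j = np.einsum("r,rxy->xy", b_u - b_u.mean(), I_u)   # new intermediate image
    window.append(M_j)                                    # oldest image drops out
    return sum(window)                                    # current restored image
```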
  • FIG. 11(a) is a diagram showing the influence of noise in batch restoration
  • FIG. 11(b) is a diagram showing the influence of noise in division restoration.
  • the horizontal axis represents the irradiation pattern number, that is, time. The integrated value over all irradiations can be obtained by multiplying the value of Equation (5) by k, which gives kn²Δ/4.
  • FIG. 12(a) is a diagram showing a target image
  • FIG. 12(b) is a diagram showing an image obtained by batch restoration and division restoration when noise does not exist.
  • M = 100000 irradiations are required in order to restore the original target image to a recognizable degree, and the same is true for division restoration.
  • FIG. 13 is a diagram showing a restored image G(x, y) obtained by collective restoration and divisional restoration when linear noise exists.
  • FIG. 14 is a block diagram of an imaging device 100A according to the second embodiment.
  • Embodiment 2 differs from Embodiment 1 in the arrangement of the spatial light modulator 140. In Embodiment 2, the illumination light S1 is the frequency-modulated continuous wave S0 itself and has a spatially uniform intensity distribution.
  • the spatial light modulator 140 spatially modulates the reflected light S2 from the object OBJ.
  • the multiplexer 122 multiplexes the light S4 spatially modulated by the spatial light modulator 140 and the frequency-modulated continuous wave S0. Subsequent processing is the same as in the first embodiment.
  • (Modification 1) In the TDGI mode, the number of irradiations was assumed to be equal for every unit, but this is not a requirement; the number of irradiations may differ from unit to unit.
  • (Modification 2) In the TDGI mode, the number of units k was fixed, but the number of units k may be dynamically controlled. Image quality can be further improved by selecting the optimum number of units k according to the noise fluctuation speed and the noise waveform.
  • (Modification 3) In the embodiments, the illumination device 110 is configured by a combination of the light source 112 and the patterning device 114, but this is not the only option. For example, the illumination device 110 may be composed of an array of a plurality of semiconductor light sources (LEDs (light emitting diodes) or LDs (laser diodes)) arranged in a matrix, configured so that the brightness of each semiconductor light source can be controlled.
  • FIG. 15 is a block diagram of the object identification system 10. This object identification system 10 is mounted on a vehicle such as an automobile or a motorcycle, and determines the types (categories) of objects OBJ existing around the vehicle.
  • the object identification system 10 includes an imaging device 100 and an arithmetic processing device 40 . As described above, the imaging apparatus 100 generates the restored image G of the object OBJ by irradiating the object OBJ with the illumination light S1 and measuring the reflected light S2.
  • the arithmetic processing device 40 processes the output image G of the imaging device 100 and determines the position and type (category) of the object OBJ.
  • the classifier 42 of the arithmetic processing unit 40 receives the image G as an input and determines the position and type of the object OBJ contained therein.
  • Classifier 42 is implemented based on a model generated by machine learning.
  • the algorithm of the classifier 42 is not particularly limited; YOLO (You Only Look Once), SSD (Single Shot MultiBox Detector), R-CNN (Region-based Convolutional Neural Network), SPPnet (Spatial Pyramid Pooling), Faster R-CNN, DSSD (Deconvolution-SSD), Mask R-CNN, etc. can be used, as can algorithms developed in the future.
  • the above is the configuration of the object identification system 10.
  • by using the imaging device 100 (quantum radar camera) as a sensor for the object identification system 10, the following advantages can be obtained.
  • noise immunity is greatly improved. For example, when it is raining or snowing, or when driving in fog, it is difficult to recognize the object OBJ with the naked eye; by using the imaging device 100, a restored image G of the object OBJ can be obtained without being affected by rain, snow, or fog.
  • the calculation delay can be reduced, providing low-latency sensing. This is advantageous particularly in in-vehicle applications, where the object OBJ may move at high speed.
  • FIG. 16 is a block diagram of an automobile equipped with the object identification system 10. The automobile 300 includes headlights 302L and 302R. The imaging device 100 is built into at least one of the headlights 302L and 302R. The headlamp 302 is positioned at the extreme end of the vehicle body and is the most advantageous location for installing the imaging device 100 in terms of detecting surrounding objects.
  • FIG. 17 is a block diagram showing a vehicle lamp 200 including an object detection system 210.
  • the vehicle lamp 200 constitutes a lamp system 310 together with a vehicle-side ECU 304 .
  • the vehicle lamp 200 includes a light source 202, a lighting circuit 204, and an optical system 206.
  • the vehicle lamp 200 is provided with an object detection system 210 .
  • Object detection system 210 corresponds to object identification system 10 described above and includes imaging device 100 and arithmetic processing device 40 .
  • Information on the object OBJ detected by the processing unit 40 may be used for light distribution control of the vehicle lamp 200 .
  • the lamp-side ECU 208 generates an appropriate light distribution pattern based on the information about the type and position of the object OBJ generated by the arithmetic processing unit 40 .
  • the lighting circuit 204 and the optical system 206 operate so as to obtain the light distribution pattern generated by the lamp-side ECU 208 .
  • Information regarding the object OBJ detected by the arithmetic processing unit 40 may be transmitted to the vehicle-side ECU 304 .
  • the vehicle-side ECU may perform automatic driving based on this information.
  • the present disclosure relates to imaging apparatus and methods.
  • OBJ: object; S0: frequency-modulated continuous wave; S1: illumination light; S2: reflected light; S3: combined light; 10: object identification system; 40: arithmetic processing device; 42: classifier; 100: imaging device; 110: illumination device; 112: light source; 114: patterning device; 120: beat detection unit; 122: combining unit; 124: photodetector; 130: arithmetic processing device; 132: pattern generator; 134: reconstruction processing unit; 200: vehicle lamp; 202: light source; 204: lighting circuit; 206: optical system; 300: automobile; 302: headlamp; 304: vehicle-side ECU; 310: lamp system.

Abstract

A light source 112 in an illumination device 110 generates a frequency-modulated continuous wave S0. A patterning device 114 modulates the frequency-modulated continuous wave S0 with a pattern that differs for each time slot to generate illumination light S1. A beat detection unit 120 generates a detection signal based on multiplexed light S3 obtained by combining reflected light S2 received from an object with the frequency-modulated continuous wave S0. A calculation processing device 130 calculates, for each beat in the detection signal, the correlation between the intensity of the beat and the intensity distribution of the illumination light S1 to reconstruct a restored image of the object.

Description

IMAGING APPARATUS AND IMAGING METHOD, VEHICLE LAMP, VEHICLE
 The present disclosure relates to imaging apparatus and methods.
 An object identification system that senses the position and type of objects around the vehicle is used for automated driving and automatic control of headlamp light distribution. An object identification system includes a sensor and a processor that analyzes the output of the sensor. Sensors are selected from cameras, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), millimeter-wave radar, ultrasonic sonar, etc., taking into consideration the application, required accuracy, and cost.
 As one type of imaging device (sensor), one that uses the principle of ghost imaging is known. Ghost imaging illuminates an object while randomly switching the intensity distribution (pattern) of the illumination light, and measures the light detection intensity of the reflected light for each pattern. The light detection intensity is the integral of energy or intensity over a plane, not an intensity distribution. Correlation calculation between the corresponding patterns and light detection intensities is then performed to reconstruct a restored image of the object.
 Patent documents: Japanese Patent No. 6412673; International Publication WO2020/218282; International Publication WO2021/079810
 FMCW (Frequency Modulated Continuous Wave)-LiDAR is known as a sensor capable of generating distance information. FMCW-LiDAR is a ranging technology conventionally used for radar, and has excellent characteristics such as high resolution and high detection sensitivity. Relative velocity can also be measured using the Doppler effect.
 In order to acquire 3D information with FMCW-LiDAR, it is necessary to scan the light in the horizontal and vertical directions, which increases the cost. Moreover, since scanning is required, there is the problem that the measurement time becomes long.
 The present disclosure has been made in this situation, and one exemplary purpose of certain aspects thereof is to provide an imaging device capable of generating distance information in a short period of time.
 An imaging apparatus according to an aspect of the present disclosure includes: an illumination device including a light source that generates a frequency-modulated continuous wave and a spatial light modulator that spatially modulates the frequency-modulated continuous wave with a different pattern for each time slot, the illumination device irradiating a field of view with the spatially modulated frequency-modulated continuous wave as illumination light; a beat detector that generates a detection signal based on a signal obtained by combining the reflected light from an object and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs correlation calculation between the intensity of the beat and the intensity distribution of the illumination light and reconstructs a restored image of the object.
 An imaging device according to an aspect of the present disclosure includes: an illumination device including a light source that generates a frequency-modulated continuous wave, the illumination device irradiating a field of view with the frequency-modulated continuous wave as illumination light; a spatial light modulator that spatially and randomly patterns the light reflected from an object for each time slot; a beat detector that generates a detection signal based on the combined light of the reflected light modulated by the spatial light modulator and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs correlation calculation between the intensity of the beat and the intensity distribution of the illumination light and reconstructs a restored image of the object.
 Another aspect of the present disclosure is an imaging method. This method comprises the steps of: generating a frequency-modulated continuous wave; modulating the frequency-modulated continuous wave with a different pattern for each time slot to generate illumination light; generating a detection signal based on the combined light of the reflected light of the illumination light reflected by an object and the frequency-modulated continuous wave; and, for each beat of the detection signal, performing correlation calculation between the intensity of the beat and the intensity distribution of the illumination light, and reconstructing a restored image of the object.
 Another aspect of the present disclosure is also an imaging method. This method comprises the steps of: irradiating a field of view with a frequency-modulated continuous wave as illumination light; modulating the reflected light of the illumination light reflected by an object with a different pattern for each time slot; generating a detection signal based on the combined light of the modulated reflected light and the frequency-modulated continuous wave; and, for each beat of the detection signal, performing correlation calculation between the intensity of the beat and the intensity distribution of the illumination light, and reconstructing a restored image of the object.
 According to one aspect of the present disclosure, distance information can be generated in a short time.
FIG. 1 is a diagram showing an imaging device according to Embodiment 1.
FIG. 2 is a diagram for explaining beats included in the detection signal Dr.
FIGS. 3(a) and 3(b) are diagrams for explaining beat intensity.
FIG. 4 is a diagram for explaining the operation of the imaging apparatus of FIG. 1.
FIG. 5 is a diagram for explaining the operation of the imaging apparatus of FIG. 1.
FIG. 6 is a diagram for explaining the synthesis of a plurality of restored images.
FIG. 7 is a diagram showing spectra obtained in a plurality of time slots.
FIG. 8 is a diagram showing spectra obtained in a plurality of time slots.
FIG. 9 is a time chart for explaining the operation of the imaging apparatus in TDGI mode.
FIG. 10 is a diagram illustrating another sequence in TDGI mode.
FIGS. 11(a) and 11(b) are diagrams showing the influence of noise in batch restoration and in division restoration, respectively.
FIGS. 12(a) and 12(b) are diagrams showing a target image and the images obtained by batch restoration and division restoration when no noise exists, respectively.
FIG. 13 is a diagram showing the restored images G(x, y) obtained by batch restoration and division restoration when linear noise exists.
FIG. 14 is a block diagram of an imaging device according to Embodiment 2.
FIG. 15 is a block diagram of an object identification system.
FIG. 16 is a block diagram of a vehicle with an object identification system.
FIG. 17 is a block diagram showing a vehicle lamp equipped with an object detection system.
 An overview of some exemplary embodiments of the present disclosure is provided. This summary presents, in simplified form, some concepts of one or more embodiments, as a prelude to the more detailed description presented later and for the purpose of a basic understanding of the embodiments; it does not limit the breadth of the invention or disclosure. This summary is not a comprehensive overview of all possible embodiments, and is intended neither to identify key elements of all embodiments nor to delineate the scope of any or all aspects. For convenience, "one embodiment" may be used to refer to a single embodiment (example or modification) or to multiple embodiments (examples or modifications) disclosed herein.
 An imaging apparatus according to one embodiment includes: an illumination device including a light source that generates a frequency-modulated continuous wave and a spatial light modulator that spatially modulates the frequency-modulated continuous wave with a different pattern for each time slot, the illumination device irradiating a field of view with the spatially modulated frequency-modulated continuous wave as illumination light; a beat detector that generates a detection signal based on a signal obtained by combining the reflected light from an object and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs correlation calculation between the intensity of the beat and the intensity distribution of the illumination light and reconstructs a restored image of the object.
 According to this configuration, three-dimensional distance information can be generated by single-pixel imaging using correlation calculation, without scanning the illumination light.
 In one embodiment, the arithmetic processing unit may Fourier transform the detection signal and detect the intensity of the beat.
 In one embodiment, the arithmetic processing unit may apply different colors to a plurality of restored images and synthesize them. This makes it possible to generate an image that represents the distance and the brightness (reflectance) of an object.
 In one embodiment, the arithmetic processing unit may calculate the velocity of the object based on the Doppler shift, and estimate the distance to the object in the next time slot based on the velocity.
 In one embodiment, the arithmetic processing unit may estimate the beat frequency of the next time slot based on the estimated distance to the object. As a result, when there are a plurality of beats with different frequencies, the beat of the previous time slot can be appropriately associated with the beat of the current time slot.
 In one embodiment, the imaging device may notify the outside when the number of beats of the detection signal changes.
 In one embodiment, when the number of beats of the detection signal changes, measurement of a new frame may be started from that point. As a result, beats and objects can be associated on a one-to-one basis, and image accuracy can be improved.
 In one embodiment, the light source and the spatial light modulator may be an array of light-emitting elements.
 The statement that "the intensity distribution is random" in this specification does not mean that it is completely random; it suffices that it is random enough to reconstruct an image in ghost imaging. Therefore, "random" in this specification can include a certain degree of regularity. Also, "random" does not require unpredictability; it may be predictable and reproducible.
(Embodiment)
 The present invention will be described below based on preferred embodiments with reference to the drawings. The same or equivalent constituent elements, members, and processes shown in the drawings are denoted by the same reference numerals, and redundant description is omitted as appropriate. The embodiments are illustrative rather than limiting; not all features described in the embodiments, nor their combinations, are necessarily essential to the invention.
(Embodiment 1)
 FIG. 1 is a diagram showing an imaging apparatus 100 according to Embodiment 1. The imaging apparatus 100 is a correlation function image sensor that uses the principle of ghost imaging (also called single-pixel imaging), and includes an illumination device 110, a beat detection unit 120, and an arithmetic processing device 130.
 照明装置110は、疑似熱光源であり、実質的にランダムとみなしうる空間強度分布I(x,y)を有する照明光S1を生成し、物体OBJに照射する。照明光S1は、その強度分布を複数のM回、ランダムに変化させながらシーケンシャルに照射される。照射回数Mは、元の画像を復元しうる程度の回数である。 The illumination device 110 is a pseudo thermal light source, generates illumination light S1 having a spatial intensity distribution I(x, y) that can be regarded as substantially random, and illuminates the object OBJ. The illumination light S1 is sequentially irradiated a plurality of M times while randomly changing its intensity distribution. The number of times of irradiation M is the number of times that can restore the original image.
 照明装置110は、光源112、パターニングデバイス114およびパターン発生器132を含む。 Illumination device 110 includes light source 112 , patterning device 114 and pattern generator 132 .
 光源112は、均一な強度分布を有し、時間的に周波数が変化する周波数変調連続波S0を生成する。たとえば、周波数変調連続波S0の周波数は、時間的に一定の傾きで変化(増加または低下)する。 The light source 112 generates a frequency-modulated continuous wave S0 that has a uniform intensity distribution and whose frequency changes over time. For example, the frequency of the frequency-modulated continuous wave S0 changes (increases or decreases) with a constant slope over time.
 光源112は、周波数変調機能つきの半導体レーザを用いることができる。周波数変調連続波S0の波長やスペクトルは特に限定されず、複数のあるいは連続スペクトルを有する白色光であってもよいし、所定の波長を含む単色光であってもよい。照明光S1の波長は、赤外あるいは紫外であってもよい。 A semiconductor laser with a frequency modulation function can be used for the light source 112 . The wavelength and spectrum of the frequency-modulated continuous wave S0 are not particularly limited, and may be white light having multiple or continuous spectra, or monochromatic light containing a predetermined wavelength. The wavelength of the illumination light S1 may be infrared or ultraviolet.
 パターニングデバイス114は、マトリクス状に配置される複数の画素を有し、複数の画素のオン、オフの組み合わせにもとづいて、光の強度分布Iを空間的に変調可能に構成される。本明細書においてオン状態の画素をオン画素、オフ状態の画素をオフ画素という。なお、以下の説明では理解の容易化のために、各画素は、オンとオフの2値(1,0)のみをとるものとするがその限りでなく、中間的な階調をとってもよい。 The patterning device 114 has a plurality of pixels arranged in a matrix, and is configured to be able to spatially modulate the light intensity distribution I based on a combination of ON and OFF of the plurality of pixels. In this specification, a pixel in an ON state is called an ON pixel, and a pixel in an OFF state is called an OFF pixel. In the following description, for ease of understanding, it is assumed that each pixel takes only two values (1, 0) of ON and OFF, but it is not limited to this and may take intermediate gradations.
 パターニングデバイス114としては、反射型のDMD(Digital Micromirror Device)や透過型の液晶デバイスを用いることができる。パターニングデバイス114には、パターン発生器132が発生するパターン信号PTN(画像データ)が与えられている。 A reflective DMD (Digital Micromirror Device) or a transmissive liquid crystal device can be used as the patterning device 114 . A pattern signal PTN (image data) generated by a pattern generator 132 is applied to the patterning device 114 .
 パターン発生器132は、照明光S1の強度分布Iを指定するパターン信号PTNを発生し、タイムスロットTSごとに、パターン信号PTNを切り替える(r=1,2,…M)。これにより、周波数変調連続波S0がタイムスロットごとに異なるパターンで空間的に変調され、照明光S1が生成される。1つのタイムスロットTS内において、周波数変調連続波S0の周波数は、少なくとも1回、好ましくは2回以上、スイープされることが望ましい。イメージング装置100によるセンシングは、M個のパターン照射を1セットとして行われ、M個のパターン照射に対応して1個の復元画像が生成される。M個のパターン照射にもとづく1回のセンシングを1フレームと称する。つまり1フレームは、M個のタイムスロットTS~TSを含む。 The pattern generator 132 generates a pattern signal PTN r that specifies the intensity distribution Ir of the illumination light S1, and switches the pattern signal PTN r for each time slot TS (r=1, 2, . . . M). Thereby, the frequency-modulated continuous wave S0 is spatially modulated with a different pattern for each time slot to generate the illumination light S1. Within one time slot TS, the frequency of the frequency-modulated continuous wave S0 is preferably swept at least once, preferably twice or more. Sensing by the imaging apparatus 100 is performed with M patterned irradiations as one set, and one restored image is generated corresponding to the M patterned irradiations. One sensing based on M pattern irradiations is called one frame. That is, one frame includes M time slots TS 1 to TS M .
When the illumination light S1 strikes the object OBJ, the imaging apparatus 100 receives the reflected light S2. The beat detector 120 detects the reflected light S2 by heterodyne detection and outputs a detection signal D_r. The detection signal D_r is the spatial integral of the light energy (or intensity) incident on the beat detector 120 while the object OBJ is irradiated with illumination light having the intensity distribution I_r.
The beat detector 120 includes a combiner 122 and a photodetector 124. The combiner 122 combines the frequency-modulated continuous wave S0 with the reflected light S2 to generate combined light S3. The combined light S3 contains a beat corresponding to the frequency difference between the two light waves S0 and S2. The photodetector 124 detects the combined light S3 and generates a detection signal D_r corresponding to the beat contained in the combined light S3. A single-pixel photodetector can be used as the photodetector 124; alternatively, an image sensor may be used as the photodetector 124 and the values of a plurality of pixels combined.
The beat detector 120 outputs a plurality of detection signals D_1 to D_M corresponding to the M intensity distributions I_1 to I_M, respectively.
The arithmetic processing device 130 receives the detection signals D_1 to D_M. For ease of understanding, consider a scene in which only a single object exists in the field of view. While the illumination light S1 of the r-th pattern I_r is being emitted, the detection signal D_r contains, as in FMCW-LiDAR, a beat whose frequency corresponds to the distance to the object OBJ.
FIG. 2 is a diagram explaining the beat contained in the detection signal D_r. The horizontal axis represents time, and the vertical axis represents the frequency of the optical continuous wave. FIG. 2 shows the frequencies of the frequency-modulated continuous wave S0, the illumination light S1, and the reflected light S2. The reflected light S2 is shifted to the right by the round-trip time τ taken for the illumination light S1 to reach the object OBJ and return. When the distance to the object OBJ is d,
 τ = 2 × d / c,
where c is the speed of light.
Assume that the frequency f_0(t) of the frequency-modulated continuous wave S0 and the illumination light S1 increases linearly (chirps) according to the relation
 f_0(t) = f_0 + a × t.
The frequency f_2(t) of the reflected light S2 is then
 f_2(t) = f_0(t − τ) = f_0 + a × (t − τ).
When two light waves with a small frequency difference, here of frequencies f_0 and f_2, are combined (interfere), a beat arises at the difference frequency Δf = f_0 − f_2.
In this example, the beat frequency Δf is
 Δf = a × τ = a × 2 × d / c,
and therefore depends on the distance d to the object. This is why a beat appears in the detection signal D_r. When a plurality of objects exist at different distances, a plurality of beats with different frequencies appear, one for each distance.
Returning to FIG. 1: the arithmetic processing device 130 includes a reconstruction processing unit 134. For each beat contained in the detection signals D_1 to D_M, the reconstruction processing unit 134 detects the intensities b^1 to b^M of that beat. When a plurality of beats with different frequencies exist, the intensity of the k-th beat obtained with the r-th pattern irradiation is written b_k^r.
For example, the reconstruction processing unit 134 captures the waveform of the detection signal D_r obtained while the illumination light S1 corresponding to the r-th intensity distribution I_r (r = 1, 2, …, M) is being emitted, and converts the captured waveform into frequency-domain spectral data by a fast Fourier transform. The spectral data give the beat frequencies f_k and the beat intensities b_k^r.
FIGS. 3(a) and 3(b) are diagrams explaining the beat intensity. In FIG. 3(a), the detection signal (also called the beat signal) D_r contains a single frequency f_1. Fourier-transforming this detection signal D_r yields the intensity b_1^r at f_1.
In FIG. 3(b), the beat signal D_r contains two frequencies, f_1 and f_2. Fourier-transforming this detection signal D_r yields the intensity b_1^r of the beat at frequency f_1 and the intensity b_2^r of the beat at frequency f_2.
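A minimal sketch of this step (using NumPy; the sampling rate fs and the way the beat frequencies are supplied are assumptions for illustration):

```python
import numpy as np

def beat_spectrum(d_r: np.ndarray, fs: float):
    """Return (frequencies, magnitudes) of one detection waveform D_r sampled at fs [Hz]."""
    spec = np.fft.rfft(d_r * np.hanning(len(d_r)))   # windowed FFT
    freqs = np.fft.rfftfreq(len(d_r), d=1.0 / fs)
    return freqs, np.abs(spec)

def beat_intensities(d_r: np.ndarray, fs: float, beat_freqs):
    """Read off the intensities b_k^r at the known beat frequencies f_k."""
    freqs, mag = beat_spectrum(d_r, fs)
    return [float(mag[np.argmin(np.abs(freqs - f_k))]) for f_k in beat_freqs]
```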
Returning to FIG. 1: for each beat of distinct frequency f_1, …, f_k, the reconstruction processing unit 134 computes the correlation between the intensities b_k^1 to b_k^M of that beat and the intensity distributions I_1 to I_M of the illumination light S1, and reconstructs a restored image G_k(x, y) of the object OBJ. Equation (1) can be used for the correlation calculation.
 G_k(x, y) = (1/M) × Σ_{r=1…M} (b_k^r − <b_k>) × I_r(x, y)   …(1)
where <b_k> denotes the average of b_k^r over the M irradiations.
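A minimal sketch of the correlation reconstruction of equation (1) (the array shapes are illustrative assumptions):

```python
import numpy as np

def reconstruct(I: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Ghost-imaging correlation of eq. (1).
    I: (M, H, W) stack of illumination patterns I_r(x, y).
    b: (M,) intensities b_k^r of one beat frequency f_k.
    Returns the restored image G_k(x, y)."""
    fluct = b - b.mean()                        # b_k^r - <b_k>
    return np.tensordot(fluct, I, axes=1) / len(b)
```

For a scene containing several beats, this function is simply called once per beat frequency f_k with the corresponding intensity series b_k^1 to b_k^M.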
The functions of the arithmetic processing device 130 may be realized by software processing, by hardware processing, or by a combination of the two. Software processing is implemented as a combination of a processor (hardware) such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a microcomputer and a software program executed by that processor; the arithmetic processing device 130 may also be a combination of a plurality of processors. Hardware processing is implemented with hardware such as an ASIC (Application Specific Integrated Circuit), a controller IC, or an FPGA (Field Programmable Gate Array).
The configuration of the imaging apparatus 100 has been described above. Its operation is explained next.
FIG. 4 is a diagram explaining the operation of the imaging apparatus 100 of FIG. 1. Here, for ease of understanding, consider a case in which only a single object exists and the distance between the imaging apparatus 100 and the object does not change. In this case, the detection signal D_r contains a beat with a single frequency f_1. The pattern of the illumination light S1 is switched in order, I_1, I_2, …, I_M, over the time slots TS_1, TS_2, …, TS_M, and detection signals D_1, D_2, …, D_M are obtained for the respective irradiation patterns. The beats in the detection signals D_1, D_2, …, D_M have amplitudes that differ with the pattern but share the same frequency f_1, determined by the distance to the object. As described above, the amplitude of the frequency-f_1 beat contained in each of the detection signals D_1, D_2, …, D_M is detected as the intensities b_1^1, b_1^2, …, b_1^M.
From the intensities b_1^1, b_1^2, …, b_1^M obtained in the time slots TS_1, TS_2, …, TS_M and the intensity distributions I_1 to I_M, the reconstruction processing unit 134 calculates the restored image G_1(x, y) based on equation (1).
FIG. 5 is a diagram explaining the operation of the imaging apparatus 100 of FIG. 1. Here, consider a case in which two objects OBJ_1 and OBJ_2 exist at different distances and the distances from the imaging apparatus 100 to the objects OBJ_1 and OBJ_2 do not change. In this case, the detection signal D_r contains a beat of a first frequency f_1 corresponding to the distance to the object OBJ_1 and a beat of a second frequency f_2 corresponding to the distance to the object OBJ_2.
The pattern of the illumination light S1 is switched in order, I_1, I_2, …, I_M, for each time slot, and detection signals D_1, D_2, …, D_M are obtained for the respective irradiation patterns. Each of the detection signals D_1, D_2, …, D_M contains the two frequency components f_1 and f_2. The amplitude of the f_1 component of each detection signal varies according to the shape of the object OBJ_1 and the patterns I_1 to I_M, and the amplitude of the f_2 component varies according to the shape of the object OBJ_2 and the patterns I_1 to I_M.
Fourier-transforming the detection signal D_r yields the intensity b_1^r at the frequency f_1 and the intensity b_2^r at the frequency f_2. For the frequency f_1, performing the correlation calculation using I_1 to I_M and b_1^1 to b_1^M then yields the restored image G_1(x, y) of the object OBJ_1. Similarly, for the frequency f_2, performing the correlation calculation using I_1 to I_M and b_2^1 to b_2^M yields the restored image G_2(x, y) of the object OBJ_2.
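Tying the two earlier sketches together, this two-object case might be processed as follows (D, I, fs, f1, and f2 are assumed inputs; beat_intensities and reconstruct are the sketch functions defined above):

```python
import numpy as np

# D: (M, T) raw detection waveforms, I: (M, H, W) patterns,
# fs: sample rate, f1/f2: the two beat frequencies read from the spectra.
b = np.array([beat_intensities(d_r, fs, [f1, f2]) for d_r in D])  # (M, 2)
G1 = reconstruct(I, b[:, 0])   # image of OBJ1 (beat at f1)
G2 = reconstruct(I, b[:, 1])   # image of OBJ2 (beat at f2)
```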
The operation of the imaging apparatus 100 has been described above. By combining FMCW-LiDAR with ghost imaging, this imaging apparatus 100 can generate a three-dimensional image without scanning the illumination light S1. Since the imaging apparatus 100, unlike conventional FMCW-LiDAR, requires no scanning, it can generate a two-dimensional range image in a short time. Moreover, since no mechanical or electronic control for scanning is required, the cost can be reduced.
When a plurality of objects exist at different distances, the imaging apparatus 100 can generate an independent restored image G_k(x, y) for each object. Consequently, when object detection is later performed by a classifier (discriminator), the objects are separated from the start, and the recognition rate can be raised compared with the case in which a single image contains multiple objects.
(Synthesis of images)
 The imaging apparatus 100 may synthesize the plurality of restored images G_1(x, y), G_2(x, y), … obtained for a plurality of frequencies to generate one composite image IMG. FIG. 6 is a diagram explaining the synthesis of a plurality of restored images. Each image G_k(x, y) is a monochrome multi-tone image.
The reconstruction processing unit 134 assigns different colors to the restored images G_1(x, y), G_2(x, y), … and synthesizes them: a first color C_1 (for example, red) is assigned to the image G_1, a second color C_2 (for example, green) to the image G_2, and a third color C_3 (for example, blue) to the image G_3.
Assume that each pixel of an original restored image has an 8-bit pixel value (256 gradations, 0 to 255). In this case, the pixel values of the restored image G_1(x, y) may be used as the R component of the color composite image, the pixel values of the restored image G_2(x, y) as its G component, and the pixel values of the restored image G_3(x, y) as its B component.
The technique for coloring and combining a plurality of monochrome images is not limited to this. In the composite image, the color represents the distance to the object, and the brightness of each color represents the reflectance of the object.
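A minimal sketch of this RGB composition (three restored images are assumed, each already scaled to 0–255):

```python
import numpy as np

def compose_rgb(G1: np.ndarray, G2: np.ndarray, G3: np.ndarray) -> np.ndarray:
    """Stack three monochrome restored images into one color image:
    the hue encodes which beat (i.e., which distance) an image came from,
    and the brightness encodes the reflectance."""
    return np.stack([G1, G2, G3], axis=-1).astype(np.uint8)   # (H, W, 3)
```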
(Change in the distance to the object)
 In the description so far, the distance to the object was unchanged during one frame of sensing, so the frequency of a given beat was constant during the sensing. In automotive applications, however, the distance to an object is not necessarily constant during the one frame in which that object is sensed. For example, if the host vehicle is moving and the object is stationary, the distance to the object shortens with time. When the distance to the object changes, the beat frequency changes; that is, the beat frequency may change from one time slot to the next.
FIG. 7 is a diagram showing the spectra obtained in a plurality of time slots. As the distance to the object grows from one time slot to the next, the beat frequency rises; that is, the position of the peak in the spectrum shifts toward the high-frequency side.
When the spectral data contain only a single peak in each time slot, the correlation calculation can be performed on the assumption that all the peaks correspond to the same beat frequency.
On the other hand, when the spectral data contain a plurality of peaks, that is, a plurality of beats, the question arises of how to associate them across time slots.
The arithmetic processing device 130 calculates the velocity of the object from the Doppler shift. Since velocity estimation based on the Doppler shift is a technique used in FMCW-LiDAR, a detailed description of the processing is omitted. The arithmetic processing device 130 then uses this velocity to estimate the distance to the object in the next time slot.
Specifically, suppose that the beat frequency in a certain time slot is f. The distance to the object is then obtained as
 d = f × c / (2 × a),
where a is the chirp rate of the frequency modulation.
If the length of a time slot is Ts and the relative velocity of the object calculated from the Doppler shift is v, the estimated distance d̂ to the object in the next time slot is
 d̂ = d − v × Ts.
The arithmetic processing device 130 estimates the beat frequency from the estimated distance d̂ to the object. For example, the beat frequency f̂ can be calculated as
 f̂ = a × 2 × d̂ / c.
By using this beat-frequency estimate, the beats of the previous time slot and the beats of the current time slot can be correctly associated even when a plurality of beats with different frequencies exist.
FIG. 8 is a diagram showing the spectra obtained in a plurality of time slots. Suppose two objects OBJ_1 and OBJ_2 exist, the relative velocity of the object OBJ_1 being zero and that of the object OBJ_2 being v. In this case the distance to the object OBJ_1 does not change, so the beat frequency of its reflected light is constant, whereas the beat frequency of the light reflected from the object OBJ_2 changes with time. As the time slots elapse, the beat frequency f_2 of the object OBJ_2 approaches the beat frequency f_1 of the object OBJ_1.
Suppose that, in time slot TS_{i−1}, the velocity of the object OBJ_1 has been determined to be zero from the past measurement results. The frequency f̂_1 of the beat of the object OBJ_1 in the next time slot TS_i is then estimated to be the same as f_1. Suppose likewise that the velocity v of the object OBJ_2 has been measured; the frequency f̂_2 of the beat of the object OBJ_2 in the next time slot TS_i can then also be estimated.
In the next time slot TS_i, the frequencies f̂_1 and f̂_2 predicted in the previous time slot TS_{i−1} are used: of the two beats contained in the spectrum of the current time slot TS_i, the one closer to f̂_1 is associated with the beat of the object OBJ_1, and the one closer to f̂_2 with the beat of the object OBJ_2. In this time slot TS_i as well, the beat frequencies f̂_1 and f̂_2 for the next time slot TS_{i+1} are estimated.
In the next time slot TS_{i+1}, the frequencies f̂_1 and f̂_2 predicted in the previous time slot TS_i are used in the same way: of the two beats contained in the spectrum of the current time slot TS_{i+1}, the one closer to f̂_1 is associated with the beat of the object OBJ_1, and the one closer to f̂_2 with the beat of the object OBJ_2.
By estimating the beat frequencies of a future time slot from the information of past time slots, beats originating from the same object can be correctly associated across a plurality of time slots.
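A minimal sketch of this slot-to-slot association (the chirp rate a and the slot length Ts are assumed parameters; each tracked beat is simply matched to the nearest measured peak):

```python
C = 3.0e8  # speed of light [m/s]

def predict_next_freq(f: float, v: float, a: float, ts: float) -> float:
    """Predict a beat frequency one time slot ahead from its Doppler velocity v:
    d = f*c/(2a), d_hat = d - v*Ts, f_hat = 2*a*d_hat/c."""
    d = f * C / (2.0 * a)
    d_hat = d - v * ts
    return 2.0 * a * d_hat / C

def associate(predicted: list[float], peaks: list[float]) -> dict[int, float]:
    """Map each tracked object (index into `predicted`) to the nearest peak
    found in the current spectrum."""
    return {k: min(peaks, key=lambda f: abs(f - f_hat))
            for k, f_hat in enumerate(predicted)}
```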
Note that if the number of beats changes among the time slots within one frame, objects and beats may be associated incorrectly. The arithmetic processing device 130 may therefore notify the outside when it detects a change in the number of beats. This notification can be used as an alert indicating that a new object has appeared in the field of view or that an object has disappeared from it, or as an indication that the reliability of the restored image obtained in that frame may have decreased.
When the number of beats in the detection signal, that is, the number of peak frequencies, changes, the imaging apparatus 100 may start the measurement of a new frame from that point. When the number of beats changes, the correspondence between beats and objects is lost and the reliability of the restored image decreases. By starting a new frame of measurement when the number of peak frequencies changes, beats and objects can again be associated one-to-one, and the accuracy of the images can be improved.
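A minimal sketch of this frame-restart rule (the peak threshold is an assumed parameter; a peak is taken to be a local maximum above the threshold):

```python
import numpy as np

def count_beats(mag: np.ndarray, thresh: float) -> int:
    """Count spectral peaks: local maxima of the magnitude spectrum above thresh."""
    inner = mag[1:-1]
    peaks = (inner > thresh) & (inner > mag[:-2]) & (inner > mag[2:])
    return int(np.count_nonzero(peaks))

# Per time slot: if the peak count differs from the previous slot, discard the
# accumulated (I_r, b_r) pairs and start a new frame, so that beats and objects
# stay in one-to-one correspondence.
```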
(TDGI mode)
 The generation of the restored image in the reconstruction processing unit 134 of the arithmetic processing device 130 may be performed by dividing the M irradiations into k units (k ≥ 2). Specifically, the reconstruction processing unit 134 performs the correlation calculation for each unit to generate an intermediate image M_j(x, y), and then combines the k intermediate images M_1(x, y) to M_k(x, y) obtained for the k units to generate the final restored image G_TDGI(x, y).
For simplicity, assume that the number of irradiations n in each unit is the same, n = M/k. The restored image G_TDGI(x, y) in this case is given by equation (2), where I_r is the r-th intensity distribution, b_r is the r-th detected intensity, and <b_r^[j]> is the average of the detected intensities b_r measured in the j-th unit. The detected intensity b_r is generated individually for each beat, and a restored image is likewise generated for each beat.
 G_TDGI(x, y) = Σ_{j=1…k} (1/n) × Σ_{r=(j−1)n+1…jn} (b_r − <b_r^[j]>) × I_r(x, y)   …(2)
The j-th term on the right-hand side of equation (2) represents the intermediate image M_j(x, y) of the j-th unit. The restored image G_TDGI(x, y) is therefore synthesized by simply adding the corresponding pixels of the intermediate images M_j(x, y). The synthesis method is not limited to simple addition; weighted addition or other processing may also be used.
In other words, the arithmetic processing device 130 performs the correlation calculation every n = M/k irradiations and generates the restored image G_TDGI(x, y) by combining the k intermediate images M_j(x, y).
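A minimal sketch of this unit-wise reconstruction of equation (2), reusing the array shapes assumed earlier:

```python
import numpy as np

def reconstruct_tdgi(I: np.ndarray, b: np.ndarray, k: int) -> np.ndarray:
    """TDGI reconstruction of eq. (2).
    I: (M, H, W) patterns, b: (M,) beat intensities, k: number of units.
    Each unit of n = M/k irradiations is correlated against its own mean,
    and the k intermediate images M_j(x, y) are summed."""
    n = len(b) // k
    g = np.zeros(I.shape[1:])
    for j in range(k):
        sl = slice(j * n, (j + 1) * n)
        fluct = b[sl] - b[sl].mean()                      # b_r - <b_r^[j]>
        g += np.tensordot(fluct, I[sl], axes=1) / n       # intermediate image M_j
    return g
```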
FIG. 9 is a time chart explaining the operation of the imaging apparatus 100 in TDGI mode. In FIG. 9, the patterning device 114 has a total of p = 4 × 4 = 16 pixels, and M random patterns I_1 to I_M are generated to produce one restored image G(x, y) (to capture one frame).
The restored image is generated in k units, each comprising n irradiations (that is, n time slots).
The first unit comprises the 1st to n-th irradiations. The detected intensities b_1 to b_n corresponding to these n irradiations are generated, together with their average <b_r^[1]>. The correlation calculation is then performed using the detected intensities b_1 to b_n, their average <b_r^[1]>, and the intensity distributions I_1 to I_n, generating the intermediate image M_1(x, y).
The second unit comprises the (n+1)-th to 2n-th irradiations. The detected intensities b_{n+1} to b_{2n} corresponding to these n irradiations are generated, together with their average <b_r^[2]>. The correlation calculation is then performed using the detected intensities b_{n+1} to b_{2n}, their average <b_r^[2]>, and the intensity distributions I_{n+1} to I_{2n}, generating the intermediate image M_2(x, y).
Similarly, the j-th unit comprises the ((j−1)n+1)-th to jn-th irradiations. The detected intensities b_{(j−1)n+1} to b_{jn} corresponding to these n irradiations are generated, together with their average <b_r^[j]>. The correlation calculation is then performed using the detected intensities b_{(j−1)n+1} to b_{jn}, their average <b_r^[j]>, and the intensity distributions I_{(j−1)n+1} to I_{jn}, generating the intermediate image M_j(x, y).
The last, k-th unit comprises the ((k−1)n+1)-th to kn-th irradiations. The detected intensities b_{(k−1)n+1} to b_{kn} corresponding to these n irradiations are generated, together with their average <b_r^[k]>. The correlation calculation is then performed using the detected intensities b_{(k−1)n+1} to b_{kn}, their average <b_r^[k]>, and the intensity distributions I_{(k−1)n+1} to I_{kn}, generating the intermediate image M_k(x, y).
Finally, the k intermediate images M_1(x, y) to M_k(x, y) are combined to generate the final restored image G(x, y).
The operation in TDGI mode has been described above. In TDGI mode, dividing the correlation calculation into units reduces the amount of noise variation per correlation calculation, which raises the restoration accuracy and improves the image quality. In addition, since the correlation calculation can start without waiting for all M irradiations to complete, the time required to generate a restored image is shortened.
The improvement in noise tolerance afforded by the TDGI mode is explained below. In the following, imaging that uses the unit-wise correlation calculation based on equation (2) is called divided reconstruction, and conventional imaging that uses the correlation calculation based on equation (1) is called batch reconstruction.
As an example, consider noise that increases monotonically with time (referred to as linear noise). Such linear noise can arise when the distance between a noise light source and the beat detector decreases with time.
FIG. 10 is a diagram explaining another TDGI-mode sequence. In the example of FIG. 10, the illumination light continues to be emitted, the correlation calculation is performed every n irradiations, and an intermediate image M(x, y) is generated each time. Whenever an intermediate image M(x, y) is generated, the latest k intermediate images M(x, y) are combined to generate the restored image G(x, y). With this sequence, the restored image G(x, y) can be updated every n irradiations, so the update rate can be increased.
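A minimal sketch of this rolling update (the unit stream and image shape are assumed inputs):

```python
from collections import deque
import numpy as np

def tdgi_stream(units, k: int, shape: tuple[int, int]):
    """Rolling TDGI: `units` yields (I_u, b_u) for each unit of n irradiations,
    with I_u of shape (n, H, W) and b_u of shape (n,). After every unit, the
    latest k intermediate images are summed into an updated restored image."""
    window: deque = deque(maxlen=k)
    for I_u, b_u in units:
        window.append(np.tensordot(b_u - b_u.mean(), I_u, axes=1) / len(b_u))
        yield sum(window, np.zeros(shape))   # current G(x, y)
```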
Next, the noise behavior in TDGI mode is examined. FIG. 11(a) is a diagram showing the influence of noise in batch reconstruction, and FIG. 11(b) is a diagram showing its influence in divided reconstruction. The horizontal axis represents the irradiation-pattern number, that is, time. σ_r denotes the noise intensity during irradiation of the r-th pattern; here it is assumed to increase according to σ_r = α × r.
Referring to FIG. 11(a): in batch reconstruction, the integral over all irradiations of the absolute value |Δσ_r| of the difference Δσ_r (= σ_r − <σ>) between the noise intensity detected during each pattern irradiation and the mean noise intensity <σ> corresponds to the hatched area and is given by equation (4):
 Σ|Δσ_r| = 2 × 1/2 × (kn/2) × (knα/2) = n²αk²/4   …(4)
Referring to FIG. 11(b), where k = 5: in divided reconstruction, the integral Σ|Δσ_r| of the absolute value |Δσ_r| of the difference Δσ_r (= σ_r − <σ_r^[j]>) within the j-th unit corresponds to the hatched area and is given by equation (5):
 Σ|Δσ_r| = 2 × 1/2 × (n/2) × (nα/2) = n²α/4   …(5)
The integral over all irradiations is k times the value of equation (5), that is, kn²α/4.
Thus, divided reconstruction reduces the integrated absolute noise-intensity difference |Δσ_r| by a factor of k, which improves the noise tolerance.
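A quick numeric check of equations (4) and (5) under the linear-noise assumption σ_r = α × r (n, k, and α are arbitrary illustrative values):

```python
import numpy as np

n, k, alpha = 100, 5, 0.01
sigma = alpha * np.arange(1, k * n + 1)      # linear noise over M = kn irradiations

batch = np.abs(sigma - sigma.mean()).sum()   # eq. (4): n^2 * alpha * k^2 / 4
units = sum(np.abs(s - s.mean()).sum()       # eq. (5) times k: k * n^2 * alpha / 4
            for s in sigma.reshape(k, n))

print(batch, n**2 * alpha * k**2 / 4)        # 625.0 625.0
print(units, k * n**2 * alpha / 4)           # 125.0 125.0
```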
Next, simulation results comparing batch reconstruction and divided reconstruction are described. FIG. 12(a) shows the target image, and FIG. 12(b) shows the images obtained by batch reconstruction and divided reconstruction when no noise is present. The simulation was performed for M = 1000, M = 10000, and M = 100000, with the number of units k set to 10.
In batch reconstruction, M = 100000 is required to restore the original target image to a recognizable degree, and the same holds for divided reconstruction. At M = 100000, the accuracy of the final restored image can be said to be equivalent for batch reconstruction and divided reconstruction.
FIG. 13 shows the restored images G(x, y) obtained by batch reconstruction and divided reconstruction in the presence of linear noise. The total number of irradiations is M = 10000, and the average detected intensity is <b_r> = 1000. The noise coefficient α takes the four values 0, 0.01, 0.1, and 1, where α = 0 corresponds to the noise-free case. For divided reconstruction, the cases k = 10, 100, and 1000 were calculated.
As the simulation results in FIG. 13 show, the larger the number of divisions k, the higher the noise tolerance.
(Embodiment 2)
 FIG. 14 is a block diagram of an imaging apparatus 100A according to Embodiment 2. Embodiment 2 differs from Embodiment 1 in the placement of the spatial light modulator 140. In FIG. 14, the illumination light S1 is the frequency-modulated continuous wave S0 itself and has a spatially uniform intensity distribution.
The spatial light modulator 140 spatially modulates the reflected light S2 from the object OBJ. The combiner 122 combines the light S4 spatially modulated by the spatial light modulator 140 with the frequency-modulated continuous wave S0. The subsequent processing is the same as in Embodiment 1.
Embodiment 2 provides the same effects as Embodiment 1.
The embodiments are illustrative. It will be understood by those skilled in the art that various modifications of the combinations of their components and processes are possible and that such modifications also fall within the scope of the present invention. Some of these modifications are described below.
(Modification 1)
 In TDGI, the number of irradiations was assumed to be the same for every unit, but this is not a limitation; the number of irradiations may differ from unit to unit.
(Modification 2)
 In TDGI, the number of units k was assumed to be fixed, but k may be controlled dynamically. Selecting the optimum number of units k according to the speed of the noise fluctuation and the noise waveform can further improve the image quality.
(Modification 3)
 In Embodiment 1, the illumination device 110 was configured as a combination of the light source 112 and the patterning device 114, but this is not a limitation. For example, the illumination device 110 may be configured as an array of semiconductor light sources (LEDs (light-emitting diodes) or LDs (laser diodes)) arranged in a matrix, with the luminance of each semiconductor light source individually controllable.
(Applications)
 Applications of the imaging apparatus 100 are described next. FIG. 15 is a block diagram of an object identification system 10. The object identification system 10 is mounted on a vehicle such as an automobile or a motorcycle and determines the type (category) of an object OBJ present around the vehicle.
The object identification system 10 includes the imaging apparatus 100 and an arithmetic processing device 40. As described above, the imaging apparatus 100 generates the restored image G of the object OBJ by irradiating the object OBJ with the illumination light S1 and measuring the reflected light S2.
The arithmetic processing device 40 processes the output image G of the imaging apparatus 100 and determines the position and type (category) of the object OBJ.
The classifier 42 of the arithmetic processing device 40 receives the image G as its input and determines the position and type of the object OBJ contained in it. The classifier 42 is implemented on the basis of a model generated by machine learning. The algorithm of the classifier 42 is not particularly limited; YOLO (You Only Look Once), SSD (Single Shot MultiBox Detector), R-CNN (Region-based Convolutional Neural Network), SPPnet (Spatial Pyramid Pooling), Faster R-CNN, DSSD (Deconvolution-SSD), Mask R-CNN, or an algorithm developed in the future may be adopted.
The configuration of the object identification system 10 has been described above. Using the imaging apparatus 100 as the sensor of the object identification system 10 provides the following advantages.
Using the imaging apparatus 100, that is, a quantum-radar camera, markedly increases the noise tolerance. For example, when driving in rain, snow, or fog, the object OBJ is difficult to recognize with the naked eye; with the imaging apparatus 100, a restored image G of the object OBJ can be obtained without being affected by the rain, snow, or fog.
Using TDGI also reduces the computation delay, enabling low-latency sensing. Since the object OBJ may move at high speed in automotive applications in particular, the benefit of low-latency sensing is very large.
FIG. 16 is a block diagram of an automobile equipped with the object identification system 10. The automobile 300 includes headlamps 302L and 302R. The imaging apparatus 100 is built into at least one of the headlamps 302L and 302R. The headlamp 302 is located at the foremost end of the vehicle body, which makes it the most advantageous installation location for the imaging apparatus 100 for detecting surrounding objects.
FIG. 17 is a block diagram showing a vehicular lamp 200 including an object detection system 210. The vehicular lamp 200, together with a vehicle-side ECU 304, constitutes a lamp system 310. The vehicular lamp 200 includes a light source 202, a lighting circuit 204, and an optical system 206. The vehicular lamp 200 is further provided with the object detection system 210, which corresponds to the object identification system 10 described above and includes the imaging apparatus 100 and the arithmetic processing device 40.
The information about the object OBJ detected by the arithmetic processing device 40 may be used for the light-distribution control of the vehicular lamp 200. Specifically, a lamp-side ECU 208 generates an appropriate light-distribution pattern based on the information about the type and position of the object OBJ generated by the arithmetic processing device 40, and the lighting circuit 204 and the optical system 206 operate so as to produce the light-distribution pattern generated by the lamp-side ECU 208.
The information about the object OBJ detected by the arithmetic processing device 40 may also be transmitted to the vehicle-side ECU 304, which may perform automated driving based on this information.
Although the present invention has been described using specific terms based on the embodiments, the embodiments merely illustrate one aspect of the principles and applications of the present invention. Many modifications and changes of arrangement are permitted without departing from the spirit of the present invention as defined in the claims.
 本開示は、イメージング装置および方法に関する。 The present disclosure relates to imaging apparatus and methods.
OBJ … object, S0 … frequency-modulated continuous wave, S1 … illumination light, S2 … reflected light, S3 … combined light, 10 … object identification system, 40 … arithmetic processing device, 42 … classifier, 100 … imaging apparatus, 110 … illumination device, 112 … light source, 114 … patterning device, 120 … beat detector, 122 … combiner, 124 … photodetector, 130 … arithmetic processing device, 132 … pattern generator, 134 … reconstruction processing unit, 200 … vehicular lamp, 202 … light source, 204 … lighting circuit, 206 … optical system, 300 … automobile, 302 … headlamp, 310 … lamp system, 304 … vehicle-side ECU.

Claims (13)

1.  An imaging apparatus comprising:
     an illumination device that includes a light source that generates a frequency-modulated continuous wave and a spatial light modulator that spatially modulates the frequency-modulated continuous wave with a different pattern for each time slot, and that irradiates a field of view with the spatially modulated frequency-modulated continuous wave as illumination light;
     a beat detector that generates a detection signal based on combined light of reflected light from an object and the frequency-modulated continuous wave; and
     an arithmetic processing device that, for each beat of the detection signal, performs a correlation calculation between the intensity of that beat and the intensity distribution of the illumination light, and reconstructs a restored image of the object.
2.  An imaging apparatus comprising:
     an illumination device that includes a light source that generates a frequency-modulated continuous wave, and that irradiates a field of view with the frequency-modulated continuous wave as illumination light;
     a spatial light modulator that modulates reflected light from an object with a different pattern for each time slot;
     a beat detector that generates a detection signal based on combined light of the reflected light modulated by the spatial light modulator and the frequency-modulated continuous wave; and
     an arithmetic processing device that, for each beat of the detection signal, performs a correlation calculation between the intensity of that beat and the intensity distribution of the illumination light, and reconstructs a restored image of the object.
3.  The imaging apparatus according to claim 1, wherein the arithmetic processing device detects the intensity of the beat by Fourier-transforming the detection signal.
4.  The imaging apparatus according to claim 1 or 2, wherein the arithmetic processing device assigns different colors to a plurality of the restored images and synthesizes them.
5.  The imaging apparatus according to any one of claims 1 to 3, wherein the arithmetic processing device calculates a velocity of the object based on a Doppler shift and, based on the velocity, estimates a distance to the object in a next time slot.
6.  The imaging apparatus according to claim 4, wherein the arithmetic processing device estimates a beat frequency for the next time slot based on the estimated distance to the object.
7.  The imaging apparatus according to any one of claims 1 to 5, wherein, when the number of beats of the detection signal changes, a notification is issued to the outside.
8.  The imaging apparatus according to any one of claims 1 to 5, wherein, when the number of beats of the detection signal changes, measurement of a new frame is started from that point.
9.  The imaging apparatus according to any one of claims 1 to 7, wherein the light source and the spatial light modulator are an array of light-emitting elements.
10.  A vehicular lamp comprising the imaging apparatus according to any one of claims 1 to 8.
11.  A vehicle comprising the imaging apparatus according to any one of claims 1 to 8.
12.  An imaging method comprising:
     generating a frequency-modulated continuous wave;
     modulating the frequency-modulated continuous wave with a different pattern for each time slot to generate illumination light;
     generating a detection signal based on combined light of reflected light of the illumination light reflected by an object and the frequency-modulated continuous wave; and
     performing, for each beat of the detection signal, a correlation calculation between the intensity of that beat and the intensity distribution of the illumination light, and reconstructing a restored image of the object.
13.  An imaging method comprising:
     irradiating a field of view with a frequency-modulated continuous wave as illumination light;
     modulating reflected light of the illumination light reflected by an object with a different pattern for each time slot;
     generating a detection signal based on combined light of the modulated reflected light and the frequency-modulated continuous wave; and
     performing, for each beat of the detection signal, a correlation calculation between the intensity of that beat and the intensity distribution of the illumination light, and reconstructing a restored image of the object.
PCT/JP2022/041755 2021-11-12 2022-11-09 Imaging device, imaging method, vehicular lamp, and vehicle WO2023085328A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-185115 2021-11-12
JP2021185115 2021-11-12



