WO2023085328A1 - Imaging device, imaging method, vehicle lamp, and vehicle - Google Patents

Imaging device, imaging method, vehicle lamp, and vehicle

Info

Publication number
WO2023085328A1
Authority
WO
WIPO (PCT)
Prior art keywords
frequency
light
beat
detection signal
continuous wave
Prior art date
2021-11-12
Application number
PCT/JP2022/041755
Other languages
English (en)
Japanese (ja)
Inventor
輝明 鳥居
祐太 春瀬
Original Assignee
株式会社小糸製作所 (Koito Manufacturing Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2022-11-09
Publication date
2023-05-19
Application filed by 株式会社小糸製作所 (Koito Manufacturing Co., Ltd.)
Publication of WO2023085328A1


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/32Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/34Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • The present disclosure relates to an imaging apparatus and an imaging method.
  • An object identification system that senses the position and type of objects around a vehicle is used for automated driving and for automatic control of headlamp light distribution.
  • An object identification system includes a sensor and a processor that analyzes the output of the sensor. The sensor is selected from cameras, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), millimeter-wave radar, ultrasonic sonar, and the like, taking into consideration the application, the required accuracy, and cost.
  • Ghost imaging illuminates an object while randomly switching the intensity distribution (pattern) of the illumination light, and measures the light detection intensity of the reflected light for each pattern.
  • The light detection intensity is the energy or intensity integrated over a plane, not an intensity distribution. A correlation calculation between each pattern and the corresponding light detection intensity is then performed to reconstruct a restored image of the object.
  • FMCW-LiDAR (Frequency Modulated Continuous Wave LiDAR) is known as a sensor capable of generating distance information.
  • FMCW is a ranging technique conventionally used in radar, and has excellent characteristics such as high resolution and high detection sensitivity. It can also measure relative velocity using the Doppler effect.
  • The present disclosure has been made in this context, and one exemplary object of certain aspects thereof is to provide an imaging device capable of generating distance information in a short period of time.
  • An imaging apparatus according to one aspect of the present disclosure includes: an illumination device including a light source that generates a frequency-modulated continuous wave and a spatial light modulator that spatially modulates the frequency-modulated continuous wave with a different pattern for each time slot, the illumination device irradiating a field of view with the modulated continuous wave as illumination light; a beat detector that generates a detection signal based on combined light of the reflected light from an object and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs a correlation calculation between the intensity of the beat and the intensity distributions of the illumination light, and reconstructs a restored image of the object.
  • An imaging device according to another aspect of the present disclosure includes: a light source that generates a frequency-modulated continuous wave; an illumination device that irradiates a field of view with the frequency-modulated continuous wave as illumination light; a spatial light modulator that spatially modulates the reflected light from an object with a different random pattern for each time slot; a beat detector that generates a detection signal based on the combined light of the reflected light modulated by the spatial light modulator and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs a correlation calculation between the intensity of the beat and the intensity distributions of the patterns, and reconstructs a restored image of the object.
  • Another aspect of the present disclosure is an imaging method.
  • This method comprises the steps of: generating a frequency-modulated continuous wave; spatially modulating the frequency-modulated continuous wave with a different pattern for each time slot to generate illumination light; generating a detection signal based on the combined light of the reflected light from an object and the frequency-modulated continuous wave; and performing, for each beat of the detection signal, a correlation calculation between the intensity of the beat and the intensity distribution of the illumination light, and reconstructing a restored image of the object.
  • Another aspect of the present disclosure is also an imaging method.
  • This method comprises the steps of: irradiating a field of view with a frequency-modulated continuous wave as illumination light; spatially modulating the reflected light of the illumination light returning from an object with a different pattern for each time slot; generating a detection signal based on the combined light of the modulated reflected light and the frequency-modulated continuous wave; and performing, for each beat of the detection signal, a correlation calculation between the intensity of the beat and the intensity distribution of the patterns, and reconstructing a restored image of the object.
  • According to these aspects, distance information can be generated in a short time.
  • FIG. 1 is a diagram showing an imaging device according to Embodiment 1.
  • FIG. 2 is a diagram for explaining the beats included in the detection signal D_r.
  • FIGS. 3A and 3B are diagrams for explaining beat intensity.
  • FIGS. 4 and 5 are diagrams for explaining the operation of the imaging apparatus of FIG. 1.
  • FIG. 6 is a diagram for explaining synthesis of a plurality of restored images.
  • FIGS. 7 and 8 are diagrams showing spectra obtained in a plurality of time slots.
  • FIG. 9 is a time chart for explaining the operation of the imaging apparatus in TDGI mode.
  • FIG. 10 is a diagram illustrating another sequence in TDGI mode.
  • FIG. 11(a) is a diagram showing the influence of noise in batch restoration, and FIG. 11(b) is a diagram showing the influence of noise in division restoration.
  • FIG. 12(a) is a diagram showing a target image, and FIG. 12(b) is a diagram showing the images obtained by batch restoration and division restoration when no noise is present.
  • FIG. 13 is a diagram showing the restored images G(x, y) obtained by batch restoration and division restoration when linear noise is present.
  • FIG. 14 is a block diagram of an imaging device according to Embodiment 2.
  • FIG. 15 is a block diagram of an object identification system.
  • FIG. 16 is a block diagram of a vehicle equipped with an object identification system.
  • FIG. 17 is a block diagram showing a vehicle lamp equipped with an object detection system.
  • An imaging apparatus according to one embodiment includes: an illumination device having a light source that generates a frequency-modulated continuous wave and a spatial light modulator that spatially modulates the frequency-modulated continuous wave with a different pattern for each time slot, the illumination device irradiating a field of view with the modulated continuous wave as illumination light; a beat detector that generates a detection signal based on a signal obtained by combining the reflected light from an object and the frequency-modulated continuous wave; and an arithmetic processing unit that, for each beat of the detection signal, performs a correlation calculation between the intensity of the beat and the intensity distributions of the illumination light, and reconstructs a restored image of the object.
  • The arithmetic processing unit may Fourier transform the detection signal to detect the intensity of each beat.
  • The arithmetic processing unit may apply different colors to the plurality of restored images and synthesize them. This makes it possible to generate an image that represents both the distance to and the brightness (reflectance) of each object.
  • The arithmetic processing unit may calculate the velocity of an object based on the Doppler shift, and estimate the distance to the object in the next time slot based on that velocity.
  • The arithmetic processing unit may estimate the beat frequency of the next time slot based on the estimated distance to the object. As a result, when there are a plurality of beats with different frequencies, the beats of the previous time slot can be correctly associated with the beats of the current time slot.
  • The imaging device may notify the outside when the number of beats in the detection signal changes, and a new frame measurement may be started from that point. In this way, beats and objects can be associated on a one-to-one basis, and image accuracy can be improved.
  • The light source and the spatial light modulator may be an array of light-emitting elements.
  • Saying that the intensity distribution is "random" in this specification does not mean that it is completely random; it is sufficient that it is random enough for an image to be reconstructed by ghost imaging. Therefore, "random" in this specification can include a certain degree of regularity. Moreover, "random" does not require unpredictability; the patterns may be predictable and reproducible.
  • FIG. 1 is a diagram showing an imaging apparatus 100 according to Embodiment 1. The imaging device 100 is a correlation-function image sensor that uses the principle of ghost imaging (also called single-pixel imaging), and includes an illumination device 110, a beat detection section 120, and an arithmetic processing device 130.
  • The illumination device 110 is a pseudo-thermal light source: it generates illumination light S1 having a spatial intensity distribution I(x, y) that can be regarded as substantially random, and irradiates the object OBJ with it. The illumination light S1 is irradiated M times in sequence while its intensity distribution is changed randomly, where the number of irradiations M is chosen so that the original image can be restored.
  • The illumination device 110 includes a light source 112, a patterning device 114, and a pattern generator 132.
  • The light source 112 generates a frequency-modulated continuous wave S0 that has a uniform intensity distribution and whose frequency changes over time; for example, the frequency changes (increases or decreases) with a constant slope. A semiconductor laser with a frequency-modulation function can be used as the light source 112.
  • The wavelength and spectrum of the frequency-modulated continuous wave S0 are not particularly limited; it may be white light having multiple or continuous spectral components, or monochromatic light of a predetermined wavelength. The wavelength of the illumination light S1 may be infrared or ultraviolet.
  • The patterning device 114 has a plurality of pixels arranged in a matrix, and can spatially modulate the light intensity distribution I based on a combination of the ON/OFF states of the pixels. A pixel in the ON state is called an ON pixel, and a pixel in the OFF state an OFF pixel. In the following description each pixel takes only the two values ON and OFF (1, 0), but this is not a limitation, and intermediate gradations may be used.
  • A reflective DMD (Digital Micromirror Device) or a transmissive liquid-crystal device can be used as the patterning device 114. A pattern signal PTN (image data) generated by the pattern generator 132 is applied to the patterning device 114.
  • The patterning device 114 spatially modulates the frequency-modulated continuous wave S0 with a different pattern for each time slot to generate the illumination light S1.
  • The frequency of the frequency-modulated continuous wave S0 is preferably swept at least once, more preferably twice or more, in each time slot.
  • Sensing by the imaging apparatus 100 treats M patterned irradiations as one set, and one restored image is generated for each set of M patterned irradiations. One sensing operation based on M pattern irradiations is called one frame; that is, one frame includes M time slots TS_1 to TS_M.
  • The imaging apparatus 100 receives reflected light S2 that returns when the illumination light S1 strikes the object OBJ.
  • The beat detector 120 detects the reflected light S2 by heterodyne detection and outputs a detection signal D_r.
  • The detection signal D_r is a spatially integrated value of the light energy (or intensity) incident on the beat detection section 120 when the object OBJ is irradiated with illumination light having the intensity distribution I_r.
  • The beat detector 120 includes a multiplexer 122 and a photodetector 124.
  • The multiplexer (combining unit) 122 combines the frequency-modulated continuous wave S0 and the reflected light S2 to generate combined light S3. The combined light S3 contains a beat corresponding to the frequency difference between the two lights S0 and S2.
  • The photodetector 124 detects the combined light S3 and generates a detection signal D according to the beats contained in the combined light S3. A single-pixel photodetector can be used as the photodetector 124; alternatively, an image sensor may be used as the photodetector 124 and the values of a plurality of its pixels combined.
  • The beat detector 120 outputs a plurality of detection signals D_1 to D_M corresponding respectively to the M intensity distributions I_1 to I_M.
  • The arithmetic processing device 130 receives the plurality of detection signals D_1 to D_M. As in FMCW-LiDAR, each detection signal D_r contains a beat whose frequency corresponds to the distance to the object OBJ.
  • FIG. 2 is a diagram for explaining the beats included in the detection signal D_r. The horizontal axis represents time, and the vertical axis represents the frequency of the optical continuous wave. FIG. 2 shows the frequencies of the frequency-modulated continuous wave S0, the illumination light S1, and the reflected light S2. The trace of the reflected light S2 is shifted to the right by the round-trip time τ = 2d/c that the illumination light S1 takes to hit the object OBJ and return, where d is the distance to the object OBJ and c is the speed of light. During a sweep, this delay appears as a constant frequency difference (beat) between S0 and S2 that is proportional to the distance.
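  • As a numerical check of this relation, the sketch below converts distance to beat frequency using the standard FMCW identity f_beat = τ · (B/T) = (2d/c) · (B/T). The sweep bandwidth B and sweep period T are illustrative assumptions, not parameters taken from this publication.

```python
# Hypothetical FMCW parameters (illustrative assumptions, not from this publication).
C = 299_792_458.0   # speed of light c [m/s]
B = 1.0e9           # frequency sweep bandwidth [Hz]
T = 100e-6          # sweep period [s]

def beat_frequency(distance_m: float) -> float:
    """Beat frequency of a stationary object: f_beat = tau * (B / T), tau = 2d / c."""
    tau = 2.0 * distance_m / C   # round-trip delay of the reflected light S2
    return tau * (B / T)         # constant frequency offset between S0 and S2

def distance_from_beat(f_beat_hz: float) -> float:
    """Inverse relation used on the reconstruction side: d = f_beat * c * T / (2 * B)."""
    return f_beat_hz * C * T / (2.0 * B)

print(beat_frequency(50.0))        # ~3.3 MHz for an object 50 m away
print(distance_from_beat(3.3e6))   # ~49.5 m, inverting the relation above
```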
  • The arithmetic processing unit 130 includes a reconstruction processing unit 134.
  • For each of the detection signals D_1 to D_M, the reconstruction processing unit 134 detects the intensity of every beat the signal contains. The intensity of the k-th beat obtained for the r-th pattern irradiation is written b_k^r. The spectral data obtained from D_r thus indicate the beat frequencies f_k and the beat intensities b_k^r.
  • FIGS. 3A and 3B are diagrams for explaining beat intensity.
  • In FIG. 3A, the detection signal (also referred to as the beat signal) D_r contains a single frequency f_1. By Fourier transforming the detection signal D_r, the intensity b_1^r at f_1 is obtained.
  • In FIG. 3B, the beat signal D_r contains two frequencies f_1 and f_2. By Fourier transforming this detection signal D_r, the intensity b_1^r of the beat at frequency f_1 and the intensity b_2^r of the beat at frequency f_2 are obtained.
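  • A minimal sketch of this Fourier-transform step, assuming one detection signal D_r sampled at a rate fs; the peak threshold and the test frequencies below are illustrative assumptions:

```python
import numpy as np

def beat_intensities(d_r: np.ndarray, fs: float, threshold: float = 0.1):
    """Fourier-transform one detection signal D_r and return (f_k, b_k^r) pairs,
    i.e. the frequency and intensity of each beat contained in D_r."""
    n = len(d_r)
    spectrum = np.abs(np.fft.rfft(d_r)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    peaks = [i for i in range(1, len(spectrum) - 1)
             if spectrum[i] > spectrum[i - 1]
             and spectrum[i] >= spectrum[i + 1]
             and spectrum[i] > threshold * spectrum.max()]
    return [(freqs[i], 2.0 * spectrum[i]) for i in peaks]

# Example corresponding to FIG. 3B: a beat signal with two frequencies f1 and f2.
fs = 100e3
t = np.arange(4000) / fs
d_r = 0.8 * np.sin(2 * np.pi * 5e3 * t) + 0.3 * np.sin(2 * np.pi * 12e3 * t)
print(beat_intensities(d_r, fs))   # ~[(5000.0, 0.8), (12000.0, 0.3)]
```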
  • For each beat frequency f_k (f_1, f_2, ...), the reconstruction processing unit 134 performs a correlation calculation between the beat intensities b_k^1 to b_k^M and the intensity distributions I_1 to I_M of the illumination light S1, and reconstructs the restored image G_k(x, y) of the object at the corresponding distance. Equation (1) can be used for this correlation calculation.
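  • The body of Equation (1) does not survive in this text; the following is a reconstruction of the standard differential ghost-imaging correlation that the surrounding description is consistent with, rather than a verbatim quotation:

$$G_k(x, y) = \frac{1}{M} \sum_{r=1}^{M} \left( b_k^r - \langle b_k \rangle \right) I_r(x, y), \qquad \langle b_k \rangle = \frac{1}{M} \sum_{r=1}^{M} b_k^r$$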
  • The arithmetic processing unit 130 can be implemented by combining a processor (hardware) such as a CPU (Central Processing Unit), MPU (Micro Processing Unit), or microcomputer with a software program executed by that processor. The arithmetic processing unit 130 may also be a combination of multiple processors, or may be composed of hardware alone.
  • The functions of the arithmetic processing unit 130 may be realized by software processing, by hardware processing, or by a combination of the two. Software processing is implemented by combining processors (hardware) such as CPUs, MPUs, and microcomputers with the software programs they execute. Hardware processing is implemented by hardware such as an ASIC (Application Specific Integrated Circuit), a controller IC, or an FPGA (Field Programmable Gate Array).
  • The configuration of the imaging apparatus 100 is as described above. Next, its operation will be explained.
  • FIG. 4 is a diagram for explaining the operation of the imaging apparatus 100 of FIG. 1 when a single object is present. In this case the detection signal D_r contains a beat with the single frequency f_1.
  • The pattern of the illumination light S1 is switched in order, I_1, I_2, ..., I_M, for each of the time slots TS_1, TS_2, ..., TS_M. Detection signals D_1, D_2, ..., D_M are obtained according to the irradiation patterns I_1, I_2, ..., I_M. The detection signals D_1 to D_M have amplitudes that differ depending on the pattern, but share the same frequency f_1, which is determined by the distance to the object. The amplitudes of the beats of frequency f_1 contained in the detection signals D_1, D_2, ..., D_M are detected as the intensities b_1^1, b_1^2, ..., b_1^M, and the restored image G_1(x, y) is calculated based on Equation (1).
  • FIG. 5 is a diagram explaining the operation of the imaging apparatus 100 of FIG. 1 when two objects are present at different distances. In this case the detection signal D_r includes a beat at a first frequency f_1 corresponding to the distance to the object OBJ_1 and a beat at a second frequency f_2 corresponding to the distance to the object OBJ_2.
  • The pattern of the illumination light S1 is switched in order, I_1, I_2, ..., I_M, for each time slot, and detection signals D_1, D_2, ..., D_M are obtained according to the irradiation patterns I_1, I_2, ..., I_M. Each of the detection signals D_1, D_2, ..., D_M contains the two frequency components f_1 and f_2. The amplitude at frequency f_1 in each detection signal changes according to the shape of the object OBJ_1 and the patterns I_1 to I_M, while the amplitude at frequency f_2 changes according to the shape of the object OBJ_2 and the patterns I_1 to I_M.
  • By Fourier transforming each detection signal D_r, the intensity b_1^r at the frequency f_1 and the intensity b_2^r at the frequency f_2 are obtained. Then, for the frequency f_1, a correlation calculation using I_1 to I_M and b_1^1 to b_1^M generates the restored image G_1(x, y) of the object OBJ_1. Similarly, for the frequency f_2, a correlation calculation using I_1 to I_M and b_2^1 to b_2^M generates the restored image G_2(x, y) of the object OBJ_2.
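  • The following self-contained sketch simulates this two-object scenario end to end: each random pattern weights two hypothetical reflectivity maps, the two resulting beats are superposed in a simulated detection signal, and a per-frequency correlation recovers one image per object. All scene parameters (resolution, M, beat frequencies, object shapes) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 16            # restored-image resolution
M = 4000              # number of pattern irradiations (time slots) per frame
fs, N = 100e3, 4000   # sampling rate and samples per time slot
t = np.arange(N) / fs
f1, f2 = 5e3, 12e3    # beat frequencies of OBJ1 and OBJ2 (set by their distances)

# Hypothetical reflectivity maps of two objects at different distances.
obj1 = np.zeros((H, W)); obj1[4:12, 2:7] = 1.0    # OBJ1 -> beat at f1
obj2 = np.zeros((H, W)); obj2[6:14, 9:14] = 0.6   # OBJ2 -> beat at f2

patterns = rng.integers(0, 2, size=(M, H, W)).astype(float)  # I_1..I_M

# Beat amplitude in slot r = spatial overlap of pattern I_r and each object.
b1 = np.einsum('rxy,xy->r', patterns, obj1)
b2 = np.einsum('rxy,xy->r', patterns, obj2)

# Per-slot beat intensities recovered by Fourier transform of D_r.
k1, k2 = int(f1 * N / fs), int(f2 * N / fs)   # FFT bins of f1 and f2
b1_hat, b2_hat = np.empty(M), np.empty(M)
for r in range(M):
    d_r = b1[r] * np.sin(2 * np.pi * f1 * t) + b2[r] * np.sin(2 * np.pi * f2 * t)
    spec = np.abs(np.fft.rfft(d_r)) * 2 / N
    b1_hat[r], b2_hat[r] = spec[k1], spec[k2]

# Correlation calculation (Equation (1)) for each beat frequency separately.
def reconstruct(b: np.ndarray) -> np.ndarray:
    return np.einsum('r,rxy->xy', b - b.mean(), patterns) / M

G1, G2 = reconstruct(b1_hat), reconstruct(b2_hat)
print(np.corrcoef(G1.ravel(), obj1.ravel())[0, 1])   # close to 1: G1 shows OBJ1 only
print(np.corrcoef(G2.ravel(), obj2.ravel())[0, 1])   # close to 1: G2 shows OBJ2 only
```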
  • In the imaging apparatus 100, by combining FMCW-LiDAR and ghost imaging, a three-dimensional image can be generated without scanning the illumination light S1. Since the imaging apparatus 100 does not require scanning, unlike conventional FMCW-LiDAR, it can generate a two-dimensional range image in a short time. Also, since no mechanical or electronic control is required for scanning, cost can be reduced.
  • Furthermore, the imaging apparatus 100 can generate an independent restored image G_k(x, y) for each object. When each restored image G_k is supplied to a discriminator (classifier), the recognition rate can be increased compared with the case where one image contains multiple objects.
  • The imaging apparatus 100 may synthesize the plurality of restored images G_1(x, y), G_2(x, y), ... obtained for the plurality of frequencies to generate one composite image IMG.
  • FIG. 6 is a diagram for explaining the combining of a plurality of restored images. Each image G_k(x, y) is a monochrome multi-tone image. The reconstruction processing unit 134 assigns different colors to the restored images G_1(x, y), G_2(x, y), ... and synthesizes them: image G_1 is assigned a first color C_1 (e.g. red), image G_2 a second color C_2 (e.g. green), and image G_3 a third color C_3 (e.g. blue).
  • For example, if the pixel value of each pixel of the original restored images is 8 bits (256 gradations, 0 to 255), the pixel values of the restored image G_1(x, y) can be used as the R component of the color composite image, the pixel values of the restored image G_2(x, y) as the G component, and the pixel values of the image G_3(x, y) as the B component. The technique for coloring and synthesizing multiple monochrome images is not limited to this.
  • In the composite image, the colors represent the distances to the objects, and the brightness of each color represents the reflectance of the corresponding object.
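  • A minimal sketch of one such composition, assuming up to three restored images and the channel assignment of the example above (G_1 → R, G_2 → G, G_3 → B); the per-image normalization is an illustrative choice:

```python
import numpy as np

def colorize(g_list):
    """Stack up to three restored images G_k(x, y) into one RGB composite.
    Each G_k is normalized to 0..255 and assigned to one channel, so that hue
    encodes distance (beat index k) and brightness encodes reflectance."""
    h, w = g_list[0].shape
    img = np.zeros((h, w, 3), dtype=np.uint8)
    for ch, g in enumerate(g_list[:3]):   # channel 0=R, 1=G, 2=B
        g = np.clip(g, 0.0, None)         # correlation images can dip below zero
        if g.max() > 0:
            g = g / g.max() * 255.0
        img[..., ch] = g.astype(np.uint8)
    return img

# Usage with the reconstructions from the previous sketch:
# composite = colorize([G1, G2])   # OBJ1 appears red, OBJ2 appears green
```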
  • In the description so far, the distance to the object was assumed to be unchanged during one frame of sensing, so the frequency of each beat was constant during sensing. In practice, however, the distance to an object is not always constant during one frame. For example, if the vehicle is moving and the object is stationary, the distance to the object decreases over time, and as the distance changes, the beat frequency changes. That is, the beat frequency can change from time slot to time slot.
  • FIG. 7 is a diagram showing spectra obtained in a plurality of time slots. In this example, the distance to the object increases with each time slot, so the frequency of the beat increases; that is, the position of the peak in the spectrum shifts toward the higher-frequency side.
  • As long as the spectral data contain only a single peak, the correlation calculation can proceed on the assumption that the peaks in successive time slots belong to the same beat. When several peaks are present, the problem is how to associate them across time slots.
  • To address this, the arithmetic processing unit 130 calculates the velocity of each object based on the Doppler shift. Velocity estimation based on the Doppler shift is a technique already used in FMCW-LiDAR, so a detailed description of the processing is omitted. The arithmetic processing unit 130 then estimates the distance to the object in the next time slot based on this velocity, and estimates the beat frequency of the next time slot based on the estimated distance d̂. In this way, beats in the previous time slot and beats in the current time slot can be correctly associated even when there are multiple beats with different frequencies.
  • FIG. 8 is a diagram showing spectra obtained in a plurality of time slots when two objects are present. Assume the relative velocity of one object OBJ_1 is zero and the relative velocity of the other object OBJ_2 is v. Since OBJ_1 does not move, the beat frequency of its reflected light is constant, while the beat frequency of the reflected light from OBJ_2 changes with time. As the time slots elapse, the beat frequency f_2 of the object OBJ_2 approaches the beat frequency f_1 of the object OBJ_1.
  • Suppose that, in time slot TS_{i-1}, the velocity of the object OBJ_1 is measured to be 0 based on past measurement results. Then the frequency f̂_1 of the beat of object OBJ_1 in the next time slot TS_i is estimated to be the same as f_1. Likewise, assume the velocity v of the object OBJ_2 has been measured; then the frequency f̂_2 of the beat of object OBJ_2 in the next time slot TS_i can be estimated.
  • In the next time slot TS_i, the frequencies f̂_1 and f̂_2 predicted in the previous time slot TS_{i-1} are used: of the two beats included in the spectrum of the current time slot TS_i, the one close to f̂_1 is associated with the beat of object OBJ_1, and the one close to f̂_2 with the beat of object OBJ_2. In the same time slot TS_i, the beat frequencies f̂_1 and f̂_2 for the next time slot TS_{i+1} are estimated, and in TS_{i+1} the association is made in the same way.
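  • A sketch of this prediction-and-association step under the assumptions above. The helper predict_next and the constant hz_per_meter = 2B/(cT), which converts a change in distance into a change in beat frequency, are illustrative constructions rather than definitions from this publication:

```python
def predict_next(freq_hz: float, velocity_mps: float, slot_s: float,
                 hz_per_meter: float) -> float:
    """Predict a beat frequency one time slot ahead from the Doppler-derived
    relative velocity: the distance changes by velocity * slot duration, and
    the beat frequency changes proportionally."""
    return freq_hz + velocity_mps * slot_s * hz_per_meter

def associate(predicted: list, observed: list) -> dict:
    """Greedily match each predicted beat (one per tracked object) to the
    closest observed spectral peak of the current time slot."""
    remaining = list(observed)
    matches = {}
    for k, f_hat in enumerate(predicted):
        f = min(remaining, key=lambda x: abs(x - f_hat))
        matches[k] = f
        remaining.remove(f)
    return matches

# Example mirroring FIG. 8: OBJ1 static, OBJ2 approaching at 20 m/s.
hz_per_m = 2 * 1.0e9 / (3e8 * 100e-6)   # 2B/(cT) with the earlier illustrative B, T
pred = [predict_next(3.3e6, 0.0, 1e-3, hz_per_m),     # predicted f1 for OBJ1
        predict_next(5.0e6, -20.0, 1e-3, hz_per_m)]   # predicted f2 for OBJ2
print(associate(pred, [3.31e6, 4.99e6]))  # {0: 3310000.0, 1: 4990000.0}
```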
  • When the number of beats in the detection signal, that is, the number of peak frequencies, changes, the arithmetic processing unit 130 may notify the outside. This notification can be used as an alert indicating that a new object has appeared in the field of view or that an object has disappeared from it, or as an indication that the reliability of the restored image obtained in that frame may be degraded.
  • The imaging apparatus 100 may also start measuring a new frame when the number of beats in the detection signal changes. In this way, beats and objects can be associated one-to-one, and image accuracy can be improved.
  • The restored-image generation processing in the reconstruction processing unit 134 of the arithmetic processing unit 130 may be performed by dividing the M irradiations into k (k ≥ 2) units. Specifically, the reconstruction processing unit 134 performs a correlation calculation for each unit to generate an intermediate image M_j(x, y), and the k intermediate images M_1(x, y) to M_k(x, y) obtained for the k units are combined to generate the final restored image G_TDGI(x, y). This operating mode is referred to below as the TDGI mode.
  • The restored image G_TDGI(x, y) in this case is represented by Equation (2).
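  • The body of Equation (2) does not survive in this text; the following is a reconstruction consistent with the symbol definitions that follow and with the unit-by-unit description of FIG. 9, up to the overall normalization, rather than a verbatim quotation:

$$G_{\mathrm{TDGI}}(x, y) = \frac{1}{M} \sum_{j=1}^{k} \sum_{r=(j-1)n+1}^{jn} \left( b_r - \langle b_r^{[j]} \rangle \right) I_r(x, y)$$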
  • In Equation (2), I_r is the r-th intensity distribution, b_r is the r-th detected intensity, and ⟨b_r^[j]⟩ is the average value of the detected intensities b_r measured in the j-th unit.
  • The j-th term on the right-hand side of Equation (2) represents the intermediate image M_j(x, y) of the j-th unit. The restored image G_TDGI(x, y) is therefore synthesized by simply adding the corresponding pixels of the plurality of intermediate images M_j(x, y). Note that the combining method is not limited to simple addition; weighted addition or other processing may also be used.
  • FIG. 9 is a time chart for explaining the operation of the imaging apparatus 100 in TDGI mode. The restored image is generated by dividing the M irradiations into k units, each containing n irradiations (that is, n time slots).
  • The first unit includes the 1st to n-th irradiations; the detected intensities b_1 to b_n corresponding to these n irradiations are obtained, along with their average value ⟨b_r^[1]⟩. A correlation calculation is then performed using the detected intensities b_1 to b_n, their average ⟨b_r^[1]⟩, and the intensity distributions I_1 to I_n, generating the intermediate image M_1(x, y).
  • The second unit includes the (n+1)-th to 2n-th irradiations; the detected intensities b_{n+1} to b_{2n} are obtained along with their average ⟨b_r^[2]⟩, and a correlation calculation using b_{n+1} to b_{2n}, ⟨b_r^[2]⟩, and the intensity distributions I_{n+1} to I_{2n} generates the intermediate image M_2(x, y).
  • In general, the j-th unit includes the ((j-1)n+1)-th to jn-th irradiations; the detected intensities b_{(j-1)n+1} to b_{jn} and their average ⟨b_r^[j]⟩ are obtained, and a correlation calculation with the intensity distributions I_{(j-1)n+1} to I_{jn} generates the intermediate image M_j(x, y). The last, k-th unit includes the ((k-1)n+1)-th to kn-th irradiations and yields the intermediate image M_k(x, y) in the same way.
  • The final restored image G(x, y) is generated by synthesizing the k intermediate images M_1(x, y) to M_k(x, y).
  • The above is the operation in TDGI mode.
  • In TDGI mode, by dividing the correlation calculation into units, the amount of noise variation per correlation calculation can be reduced, increasing restoration accuracy and improving image quality. In addition, since the correlation calculation can be started without waiting for the completion of all M irradiations, the time required to generate a restored image can be shortened.
  • Imaging using the unit-by-unit correlation calculation based on Equation (2) is hereinafter referred to as division restoration, and conventional imaging using the correlation calculation of Equation (1) over all M irradiations at once as batch restoration.
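  • A toy comparison of batch restoration and division restoration, reusing the setup of the earlier reconstruction sketch and adding a drifting noise term of the kind discussed next; the noise magnitude and the unit count k are illustrative assumptions:

```python
import numpy as np

def reconstruct_tdgi(b: np.ndarray, patterns: np.ndarray, k: int) -> np.ndarray:
    """Division restoration (Equation (2)): correlate unit by unit with the
    local mean <b_r^[j]>, then sum the k intermediate images M_j(x, y)."""
    M = len(b)
    n = M // k                             # irradiations per unit
    G = np.zeros(patterns.shape[1:])
    for j in range(k):
        sl = slice(j * n, (j + 1) * n)
        G += np.einsum('r,rxy->xy', b[sl] - b[sl].mean(), patterns[sl])
    return G / M

rng = np.random.default_rng(1)
H = W = 16; M = 4000; k = 10
obj = np.zeros((H, W)); obj[4:12, 4:12] = 1.0
patterns = rng.integers(0, 2, (M, H, W)).astype(float)
b = np.einsum('rxy,xy->r', patterns, obj)
b_noisy = b + np.linspace(0.0, 50.0, M)    # monotonically increasing (linear) noise

G_batch = np.einsum('r,rxy->xy', b_noisy - b_noisy.mean(), patterns) / M
G_tdgi = reconstruct_tdgi(b_noisy, patterns, k)
for G in (G_batch, G_tdgi):
    print(np.corrcoef(G.ravel(), obj.ravel())[0, 1])  # division restoration scores higher
```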
  • As an example, consider noise that increases monotonically over time (referred to here as linear noise). Such linear noise can occur, for instance, when the distance between a noise light source and the beat detector decreases over time.
  • FIG. 10 is a diagram explaining another sequence in TDGI mode. Here, the illumination light continues to be emitted, a correlation calculation is performed every n irradiations, and an intermediate image M(x, y) is generated each time. A restored image G(x, y) is generated by synthesizing the latest k intermediate images M(x, y). In this sequence the restored image G(x, y) can be updated every n irradiations, so the update rate can be increased.
  • FIG. 11(a) is a diagram showing the influence of noise in batch restoration, and FIG. 11(b) is a diagram showing the influence of noise in division restoration. In both figures, the horizontal axis represents the irradiation pattern number, that is, time.
  • The integrated value over all irradiations can be obtained by multiplying the value of Equation (5) by k, which gives kn²Δ/4.
  • FIG. 12(a) is a diagram showing a target image, and FIG. 12(b) is a diagram showing the images obtained by batch restoration and by division restoration when no noise is present. In batch restoration, M = 100000 is required in order to restore the original target image to a recognizable degree, and the same is true for division restoration.
  • FIG. 13 is a diagram showing the restored images G(x, y) obtained by batch restoration and division restoration when linear noise is present.
  • FIG. 14 is a block diagram of an imaging device 100A according to Embodiment 2. Embodiment 2 differs from Embodiment 1 in the placement of the spatial light modulator 140.
  • In Embodiment 2, the illumination light S1 is the frequency-modulated continuous wave S0 itself and has a spatially uniform intensity distribution. The spatial light modulator 140 instead spatially modulates the reflected light S2 from the object OBJ, and the multiplexer 122 combines the light S4 spatially modulated by the spatial light modulator 140 with the frequency-modulated continuous wave S0. Subsequent processing is the same as in Embodiment 1.
  • (Modification 1) In TDGI mode, the number of irradiations was assumed to be equal for every unit, but this is not required; the number of irradiations may differ from unit to unit.
  • (Modification 2) Likewise, the number of units k was fixed above, but k may be controlled dynamically. Image quality can be further improved by selecting the optimum number of units k according to the noise fluctuation speed and the noise waveform.
  • In the embodiments, the illumination device 110 is configured as a combination of the light source 112 and the patterning device 114, but this is not the only option. For example, the illumination device 110 may be composed of an array of semiconductor light sources (LEDs (light-emitting diodes) or LDs (laser diodes)) arranged in a matrix, configured so that the brightness of each semiconductor light source can be controlled individually.
  • FIG. 15 is a block diagram of the object identification system 10. This object identification system 10 is mounted on a vehicle such as an automobile or motorcycle, and determines the types (categories) of objects OBJ existing around the vehicle.
  • The object identification system 10 includes the imaging device 100 and an arithmetic processing device 40. As described above, the imaging device 100 generates the restored image G of the object OBJ by irradiating the object OBJ with the illumination light S1 and measuring the reflected light S2.
  • The arithmetic processing device 40 processes the output image G of the imaging device 100 and determines the position and type (category) of the object OBJ. The classifier 42 of the arithmetic processing device 40 receives the image G as input and determines the position and type of the object OBJ contained in it. The classifier 42 is implemented based on a model generated by machine learning. The algorithm of the classifier 42 is not particularly limited; YOLO (You Only Look Once), SSD (Single Shot MultiBox Detector), R-CNN (Region-based Convolutional Neural Network), SPPnet (Spatial Pyramid Pooling), Faster R-CNN, DSSD (Deconvolution-SSD), Mask R-CNN, or algorithms developed in the future can be adopted.
  • The above is the configuration of the object identification system 10. Using the imaging device 100 (that is, a quantum radar camera) as the sensor of the object identification system 10 provides the following advantages.
  • Noise immunity is greatly improved. For example, in rain, snow, or fog, it is difficult to recognize the object OBJ with the naked eye, whereas the imaging device 100 can still obtain a restored image G of the object OBJ.
  • The computation delay can be reduced, providing low-latency sensing. This is valuable particularly in automotive applications, where the object OBJ may move at high speed.
  • FIG. 16 is a block diagram of an automobile equipped with the object identification system 10. The automobile 300 includes headlamps 302L and 302R, and the imaging device 100 is built into at least one of them. A headlamp is positioned at the extreme front end of the vehicle body, which makes it the most advantageous location for installing the imaging device 100 for detecting surrounding objects.
  • FIG. 17 is a block diagram showing a vehicle lamp 200 including an object detection system 210. The vehicle lamp 200 constitutes a lamp system 310 together with a vehicle-side ECU 304. The vehicle lamp 200 includes a light source 202, a lighting circuit 204, and an optical system 206, and is further provided with the object detection system 210. The object detection system 210 corresponds to the object identification system 10 described above and includes the imaging device 100 and the arithmetic processing device 40.
  • Information on the object OBJ detected by the arithmetic processing device 40 may be used for light distribution control of the vehicle lamp 200. Specifically, a lamp-side ECU 208 generates an appropriate light distribution pattern based on the information about the type and position of the object OBJ generated by the arithmetic processing device 40, and the lighting circuit 204 and the optical system 206 operate so as to obtain the light distribution pattern generated by the lamp-side ECU 208.
  • Information regarding the object OBJ detected by the arithmetic processing device 40 may also be transmitted to the vehicle-side ECU 304, and the vehicle-side ECU may perform automated driving based on this information.
  • The present disclosure relates to imaging apparatuses and imaging methods.
  • OBJ: object; S0: frequency-modulated continuous wave; S1: illumination light; S2: reflected light; S3: combined light; 10: object identification system; 40: arithmetic processing device; 42: classifier; 100: imaging device; 110: illumination device; 112: light source; 114: patterning device; 120: beat detector; 122: multiplexer (combiner); 124: photodetector; 130: arithmetic processing unit; 132: pattern generator; 134: reconstruction processing unit; 200: vehicle lamp; 202: light source; 204: lighting circuit; 206: optical system; 300: automobile; 302: headlamp; 304: vehicle-side ECU; 310: lamp system.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A light source 112 of an illumination device 110 generates a frequency-modulated continuous wave S0. A patterning device 114 modulates the frequency-modulated continuous wave S0 with patterns that differ for each time slot to generate illumination light S1. A beat detection unit 120 generates a detection signal based on combined light S3 of reflected light S2 received from an object and the frequency-modulated continuous wave S0. An arithmetic processing device 130 performs, for each beat in the detection signal, a correlation calculation between the intensity of the beat and the intensity distribution of the illumination light S1 to reconstruct a restored image of the object.
PCT/JP2022/041755 2021-11-12 2022-11-09 Imaging device, imaging method, vehicle lamp, and vehicle WO2023085328A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021185115 2021-11-12
JP2021-185115 2021-11-12

Publications (1)

Publication Number Publication Date
WO2023085328A1 true WO2023085328A1 (fr) 2023-05-19

Family

ID=86335800

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/041755 WO2023085328A1 (fr) 2021-11-12 2022-11-09 Imaging device, imaging method, vehicle lamp, and vehicle

Country Status (1)

Country Link
WO (1) WO2023085328A1 (fr)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005112130A2 * 2004-01-16 2005-11-24 New Jersey Institute Of Technology Terahertz imaging for near-field objects
JP6412673B1 * 2017-07-21 2018-10-24 学校法人玉川学園 Image processing apparatus and method, and program
JP2021513653A * 2018-02-12 2021-05-27 Teknologian Tutkimuskeskus VTT Oy Monitoring of living facilities by multi-channel radar
WO2020218282A1 * 2019-04-22 2020-10-29 株式会社小糸製作所 Imaging device, vehicle headlamp, automobile, and imaging method
WO2021079810A1 * 2019-10-23 2021-04-29 株式会社小糸製作所 Imaging device, vehicle headlamp, vehicle, and imaging method
CN111239747A * 2020-02-08 2020-06-05 西北工业大学 Deconvolution-based high-resolution low-sidelobe two-dimensional sonar imaging method

Similar Documents

Publication Publication Date Title
JP7463297B2 Vehicle-mounted imaging device, vehicle lamp, automobile
US10632899B2 Illumination device for a motor vehicle for increasing the perceptibility of an obstacle
US11604345B2 Apparatuses and methods for backscattering elimination via spatial and temporal modulations
WO2020218282A1 Imaging device, vehicle headlamp, automobile, and imaging method
CN113227838B Vehicle lamp and vehicle
US20220132022A1 Imaging device
WO2023085328A1 Imaging device, imaging method, vehicle lamp, and vehicle
US20230009034A1 Imaging apparatus
US20190310349A1 Light modulating lidar apparatus
WO2022091972A1 Imaging device, vehicle lighting apparatus, and vehicle
WO2021079810A1 Imaging device, vehicle headlamp, vehicle, and imaging method
WO2022270476A1 Imaging device, vehicle headlamp, and vehicle
WO2023074759A1 Imaging apparatus, vehicle lamp fitting, and vehicle
WO2023085329A1 Imaging system, detection unit, vehicle lamp fitting, and vehicle
WO2022044961A1 Imaging device, imaging method, vehicle lamp, and vehicle
WO2021079811A1 Imaging device, vehicle headlamp, automobile, and imaging method
WO2021060397A1 Gating camera, automobile, vehicle lamp, image processing device, and image processing method
JP7395511B2 Imaging device, arithmetic processing device therefor, vehicle lamp, vehicle, and sensing method
CN115023943A Surveillance system
CN116457700A Sensing device, vehicle lamp, vehicle
WO2022102775A1 Sensing device, vehicle headlamp, and vehicle
EP4382969A1 Synchronized image-capture device, sensing system for vehicles, and vehicle lamp
RU2746614C1 Method for suppressing oncoming glare when forming images of the road environment ahead of a vehicle, and device for implementing the method
JP2023055317A Imaging system, imaging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22892826

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023559871

Country of ref document: JP

Kind code of ref document: A