WO2023013777A1 - ゲーティングカメラ、車両用センシングシステム、車両用灯具 - Google Patents
ゲーティングカメラ、車両用センシングシステム、車両用灯具 (Gated camera, vehicular sensing system, and vehicular lamp)
- Publication number
- WO2023013777A1 (PCT/JP2022/030166)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- exposure
- image
- image sensor
- continuous
- gating camera
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/18—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4861—Circuits for detection, sampling, integration or read-out
- G01S7/4863—Detector arrays, e.g. charge-transfer gates
Definitions
- the present invention relates to gating cameras.
- An object identification system that senses the position and type of objects around the vehicle is used for automated driving and automatic control of headlamp light distribution.
- An object identification system includes a sensor and a processor that analyzes the output of the sensor. Sensors are selected from cameras, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), millimeter-wave radar, ultrasonic sonar, etc., taking into consideration the application, required accuracy, and cost.
- Depth information cannot be obtained from a general monocular camera. Therefore, it is difficult to separate multiple overlapping objects located at different distances.
- A TOF camera is known as a camera that can obtain depth information.
- A TOF (Time Of Flight) camera emits infrared light from a light-emitting device, measures the time of flight until the reflected light returns to the image sensor, and obtains a TOF image by converting the time of flight into distance information.
- A gating camera, or gated camera, has been proposed as an active sensor to replace the TOF camera (Patent Documents 1 and 2).
- A gating camera divides the imaging range into a plurality of ranges and captures images while changing the exposure timing and exposure time for each range. This yields a slice image for each range of interest, each slice image containing only the objects in the corresponding range.
- An aspect of the present disclosure has been made in view of this situation, and one of its exemplary objects is to provide a gating camera capable of shortening the generation time of a normal image.
- An aspect of the present disclosure relates to a gating camera that divides a field of view into multiple ranges in the depth direction and generates multiple slice images corresponding to the multiple ranges.
- The gating camera includes an illumination device that irradiates the field of view with pulsed illumination light, a multi-tap image sensor having multiple FD (Floating Diffusion) regions per pixel, and a camera controller that controls the light emission timing of the illumination device and the exposure timing of the image sensor.
- One of the multiple FD areas is assigned as a pulse exposure area for slice image generation, and another one of the multiple FD areas is assigned as a continuous exposure area for normal image generation.
- The image sensor generates a slice image by performing multiple exposure of the reflected light of the pulsed illumination light from the field of view using the pulse exposure area, and generates a normal image by performing exposure using the continuous exposure area in the sections where the pulse exposure area is not used.
- According to this aspect, the generation time of the normal image can be shortened.
- FIG. 1 is a block diagram of a sensing system according to Embodiment 1.
- FIG. 2 is a diagram explaining the basic operation of the gating camera.
- FIGS. 3(a) and 3(b) are diagrams explaining slice images obtained by the gating camera.
- FIG. 4 is a time chart explaining the generation of the slice images SIMG and the normal image NIMG by the gating camera.
- FIG. 5 is a sensing time chart according to Modification 1.
- FIG. 6 is a sensing time chart according to Modification 2.
- FIG. 7 is a block diagram of a sensing system including a gating camera according to Embodiment 2.
- FIG. 8 is a time chart explaining the operation of the gating camera of FIG. 7.
- FIG. 9 is a diagram explaining the operation of the gating camera according to Embodiment 3.
- FIG. 10 is a circuit diagram of an image sensor used in the gating camera according to Embodiment 4.
- FIG. 11 is a block diagram of the sensing system.
- FIGS. 12(a) and 12(b) are diagrams showing a vehicle equipped with the gating camera.
- FIG. 13 is a block diagram showing a vehicle lamp equipped with the sensing system.
- A gating camera according to one embodiment divides the field of view into multiple ranges in the depth direction and generates multiple slice images corresponding to the multiple ranges.
- The gating camera includes an illumination device that irradiates the field of view with pulsed illumination light, a multi-tap image sensor having multiple FD (Floating Diffusion) regions per pixel, and a camera controller that controls the light emission timing of the illumination device and the exposure timing of the image sensor.
- One of the plurality of FD areas is assigned as a pulse exposure area for slice image generation, and another one is assigned as a continuous exposure area for normal image generation.
- The image sensor generates a slice image by performing multiple exposure of the reflected light of the pulsed illumination light from the field of view using the pulse exposure area, and generates the normal image by performing exposure using the continuous exposure area in the sections where the pulse exposure area is not used.
- the image sensor may be configured so that each FD area can be read at independent timing. As a result, the normal image capturing time can be further shortened.
- the illumination device may be capable of irradiating the field of view with continuous illumination light in addition to pulsed illumination light. By exposing continuous illumination light in the continuous exposure area, it is possible to shoot a normal image over a plurality of ranges in a shorter time.
- the illumination device may irradiate the field of view with continuous illumination light during nighttime photography. Power consumption can be reduced by turning off the continuous illumination light during the daytime when sunlight is present.
- the image sensor may generate a normal image by performing exposure using a plurality of continuous exposure areas in a time division manner during a continuous exposure period in which no pulse exposure area is used. Multiple consecutive exposure areas may have different exposure times. As a result, when there is a large difference in brightness in the field of view, an image with a wide dynamic range can be captured by using captured images in a plurality of continuous exposure areas.
- the image sensor may generate slice images by pixel binning and normal images by dot-by-dot.
- the generation rate of slice images can be increased at the cost of lower resolution.
- high-resolution images can be obtained by dot-by-dot readout.
- The image sensor may be capable of binning pixels of 2 rows and 2 columns as a virtual pixel. Each pixel may include m (m ≥ 1) pulse exposure areas and n (n ≥ 1) continuous exposure areas.
- the image sensor may comprise m first readout circuits and n second readout circuits. The m first readout circuits may be associated with the m pulsed exposure areas, and the n second readout circuits may be associated with the n continuous exposure areas.
- The i-th (1 ≤ i ≤ m) first readout circuit may read out the sum of the signals of the i-th pulse exposure regions of the four pixels included in the virtual pixel, and the j-th (1 ≤ j ≤ n) second readout circuit may read out the signal of the j-th continuous exposure area of the corresponding pixel.
- slice images can be generated by pixel binning, and normal images can be generated by dot-by-dot.
- FIG. 1 is a block diagram of a sensing system 10 according to Embodiment 1. This sensing system 10 is mounted on a vehicle such as an automobile or a motorcycle, and detects an object OBJ existing around the vehicle (within the field of view of the sensor).
- the sensing system 10 mainly includes a gating camera 100.
- the gating camera 100 includes an illumination device 110 , an image sensor 120 , a camera controller 130 and an arithmetic processing device 140 .
- Imaging by the gating camera 100 is performed by dividing the field of view into N (N ≥ 2) ranges RNG1 to RNGN in the depth direction. Adjacent ranges may overlap in the depth direction at their boundaries.
- the illumination device 110 emits pulsed illumination light L1 forward of the vehicle in synchronization with the light emission timing signal S1 given from the camera controller 130 .
- the pulsed illumination light L1 is preferably infrared light, but is not limited to this, and may be visible light or ultraviolet light having a predetermined wavelength.
- the image sensor 120 includes a plurality of pixels px, is capable of exposure control synchronized with the exposure timing signal S2 given from the camera controller 130, and generates an image composed of a plurality of pixels.
- the image sensor 120 has sensitivity to the same wavelength as the pulsed illumination light L1, and captures reflected light (return light) L2 reflected by the object OBJ.
- the camera controller 130 controls the irradiation timing (light emission timing) of the pulsed illumination light L1 by the lighting device 110 and the exposure timing by the image sensor 120 .
- the functions of the camera controller 130 may be realized by software processing, hardware processing, or a combination of software processing and hardware processing.
- software processing is implemented by combining processors (hardware) such as CPUs (Central Processing Units), MPUs (Micro Processing Units), microcomputers, and software programs executed by the processors (hardware).
- hardware processing is implemented by hardware such as ASIC (Application Specific Integrated Circuit), controller IC, and FPGA (Field Programmable Gate Array).
- An image (slice image) SIMG i generated by the image sensor 120 is input to the arithmetic processing unit 140 .
- Arithmetic processing unit 140 processes a plurality of slice images SIMG 1 -SIMG N obtained for a plurality of ranges RNG 1 -RNG N to generate final output data CAMERAOUT.
- output data CAMERAOUT includes a set of multiple slice images SIMG 1 -SIMG N .
- the output data CAMERAOUT includes the normal image NIMG.
- the arithmetic processing unit 140 may be implemented in the same hardware as the camera controller 130, or may be configured with separate hardware. Alternatively, part or all of the functions of arithmetic processing unit 140 may be implemented as a processor or digital circuit built into the same module as image sensor 120 .
- the above is the basic configuration of the gating camera 100. Next, the operation will be explained.
- FIG. 2 is a diagram for explaining the basic operation of the gating camera 100, and shows how the i-th range RNGi is sensed.
- The illumination device 110 emits light during a light emission period τ1 between times t0 and t1 in synchronization with the light emission timing signal S1.
- Let dMINi be the distance from the gating camera 100 to the front boundary of the range RNGi, and let dMAXi be the distance to its rear boundary.
- The round-trip time TMINi taken for light leaving the illumination device 110 at a certain time to reach the distance dMINi and return to the image sensor 120 as reflected light is TMINi = 2 × dMINi / c, where c is the speed of light.
- Similarly, the round-trip time TMAXi for light to reach the distance dMAXi and return is TMAXi = 2 × dMAXi / c.
- To photograph only the objects included in the range RNGi, the camera controller 130 generates the exposure timing signal S2 so that the image sensor is exposed only to reflected light arriving between t0 + TMINi and t1 + TMAXi. This is one sensing operation.
- The sensing of the i-th range RNGi includes multiple sets of light emissions and exposures; the camera controller 130 repeats the sensing operation described above multiple times at a predetermined period τ2.
- The image sensor 120 is capable of multiple exposure: the FD region (charge storage region) of each pixel px is exposed multiple times to the reflected light obtained from the multiple pulse emissions, and a slice image SIMG is generated.
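- As a rough illustration of the timing relationships above, the following sketch computes when the exposure gate for one range would open and close relative to the emission start; the distances, pulse width, and start time are assumed example values, not values taken from the patent.

```python
# Illustrative sketch only: deriving the exposure gate for the i-th range from
# TMINi = 2 * dMINi / c and TMAXi = 2 * dMAXi / c.
C = 299_792_458.0  # speed of light [m/s]

def exposure_gate(d_min_m: float, d_max_m: float, t0: float, tau1: float):
    """Return (t_open, t_close) of the exposure gate on the same time axis as
    the emission start t0, so that only reflections from objects between
    d_min_m and d_max_m are integrated."""
    t_min = 2.0 * d_min_m / C   # round trip to the front boundary
    t_max = 2.0 * d_max_m / C   # round trip to the rear boundary
    return t0 + t_min, t0 + tau1 + t_max

# Example: a range from 25 m to 50 m, 10 ns emission period starting at t0 = 0.
print(exposure_gate(25.0, 50.0, 0.0, 10e-9))  # ≈ (1.67e-07, 3.44e-07) seconds
```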
- FIGS. 3(a) and 3(b) are diagrams for explaining slice images obtained by the gating camera 100.
- In FIG. 3(a), an object (pedestrian) OBJ2 exists in the range RNG2, and an object (vehicle) OBJ3 exists in the range RNG3.
- FIG. 3(b) shows a plurality of slice images SIMG1 to SIMG3 obtained in the situation of FIG. 3(a).
- When the slice image SIMG1 is captured, the image sensor is exposed only by reflected light from the range RNG1, so no object image appears in the slice image SIMG1.
- When the slice image SIMG2 is captured, the image sensor is exposed only by reflected light from the range RNG2, so only the object image OBJ2 appears in the slice image SIMG2.
- Likewise, when the slice image SIMG3 is captured, the image sensor is exposed only by reflected light from the range RNG3, so only the object image OBJ3 appears in the slice image SIMG3.
- In this manner, the gating camera 100 can photograph objects separately for each range.
- The multiple slice images SIMG1 to SIMGN can be combined to generate an image (normal image) similar to one taken by an ordinary camera. In that case, however, it takes a very long time to generate one normal image.
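- The patent does not prescribe how such a combination would be done; a pixel-wise maximum over the N slice images is one naive possibility, sketched below purely for illustration.

```python
import numpy as np

def compose_normal_like(slice_images):
    """Naively combine N slice images into one full-field image by taking the
    pixel-wise maximum. Hypothetical illustration only; it still requires all
    N slices, which is the drawback noted above."""
    return np.max(np.stack(slice_images, axis=0), axis=0)
```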
- the gating camera 100 is configured to be able to generate a normal image NIMG that is not divided into a plurality of ranges in parallel with the generation of the slice images SIMG 1 to SIMG N. Generation of a normal image will be described below.
- the image sensor 120 is of a multi-tap type, and one pixel has a plurality of FD regions fd.
- Each pixel px of the image sensor 120 includes a plurality of FD areas; at least one of them is assigned as a pulse exposure area fdp for generating the slice image SIMG, and at least one other is assigned as a continuous exposure area fdc for generating the normal image NIMG.
- Under the control of the camera controller 130, the illumination device 110 repeatedly irradiates the field of view with the pulsed illumination light L1, and the image sensor 120 performs multiple exposure of the reflected light L2 from the field of view using the pulse exposure area fdp to generate the slice image SIMG. The exposure period of the pulse exposure area fdp is called the pulse exposure period Tp.
- The reflected light L2x from an object OBJx existing in the i-th range RNGi is detected by the pulse exposure area fdp, whereas the reflected light L2y from an object OBJy existing in another range RNG does not enter the image sensor 120 during the pulse exposure period Tp and is therefore not detected by the pulse exposure area fdp.
- The image sensor 120 performs exposure using the continuous exposure area fdc in the sections in which the pulse exposure area fdp is not used, and generates the normal image NIMG.
- The exposure period of the continuous exposure area fdc is called the continuous exposure period Tc.
- Reflected light L3x and L3y from the objects OBJx and OBJy over the entire range is detected by the continuous exposure area fdc.
- The reflected light L3 may include reflected light of the pulsed illumination light L1 and reflected sunlight.
- Note that the reflected light of the pulsed illumination light L1 from the detection target range RNGi does not enter the image sensor 120 during the continuous exposure period Tc and is therefore not detected by the continuous exposure area fdc.
- In photographing the normal image NIMG, therefore, an object OBJ within the detection target range RNGi is captured using reflected sunlight rather than the reflected light of the pulsed illumination light L1.
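- The following sketch models, in simplified form, how incoming photocharge could be steered between the pulse exposure area fdp and the continuous exposure area fdc within one emission period; the gate times, emission period, and arrival times are assumed example values, not taken from the patent.

```python
def route_charge(arrival_time_s: float, gate_open_s: float, gate_close_s: float,
                 period_s: float, wells: dict) -> None:
    """Route one unit of photocharge: to fdp while the pulse exposure gate is
    open, otherwise to fdc (continuous exposure). Simplified functional model."""
    t = arrival_time_s % period_s
    if gate_open_s <= t < gate_close_s:
        wells["fdp"] += 1   # contributes to the slice image
    else:
        wells["fdc"] += 1   # contributes to the normal image

wells = {"fdp": 0, "fdc": 0}
for t in (0.17e-6, 0.40e-6, 0.90e-6):   # hypothetical photon arrival times
    route_charge(t, gate_open_s=0.167e-6, gate_close_s=0.344e-6,
                 period_s=1e-6, wells=wells)
print(wells)   # {'fdp': 1, 'fdc': 2}
```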
- the above is the configuration of the gating camera 100. Next, the operation will be explained.
- FIG. 4 is a time chart for explaining the generation of the slice image SIMG and normal image NIMG by the gating camera 100.
- In this example, each pixel contains three FD areas fd. Two of them are pulse exposure areas fdp1 and fdp2, which are assigned to generate the slice images SIMG of two adjacent ranges RNGi and RNGi+1. A set of two ranges RNGi and RNGi+1 is called a zone.
- In FIG. 4, t0 represents the exposure start time of one zone, and t1 represents its exposure end time. The high level of L1 represents the emission of the pulsed illumination light, and the high levels of fdp1, fdp2, and fdc represent the exposure of the respective FD areas.
- The exposure periods of the pulse exposure areas fdp1 and fdp2 are called the pulse exposure periods Tp1 and Tp2, and the exposure period of the continuous exposure area fdc is called the continuous exposure period Tc.
- a pixel includes a light-receiving element such as a photodiode in addition to a plurality of FD regions. Each FD area is exclusively connected to the light receiving element during its exposure period.
- Qp1, Qp2 and Qc represent the charge amounts of the FD regions fdp1, fdp2 and fdc, respectively.
- the exposure timing of the pulse exposure regions fdp1 and fdp2 is determined according to the position of the imaging range.
- the arithmetic processing unit 140 may acquire detailed distance information within the range by the indirect ToF method using the pixel values of the two pulse exposure regions fdp1 and fdp2.
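- A minimal sketch of the indirect ToF idea mentioned above is shown below. It assumes a rectangular pulse, two adjacent gates of equal width aligned with the front of the range, and background charge already subtracted; these are illustrative assumptions, and the patent itself does not give a formula.

```python
def indirect_tof_distance(qp1: float, qp2: float, d_min_m: float,
                          pulse_width_s: float,
                          c: float = 299_792_458.0) -> float:
    """Estimate the distance of an object inside the range from the charges
    Qp1 and Qp2 of the two pulse exposure areas (textbook 2-tap estimate)."""
    ratio = qp2 / (qp1 + qp2)          # fraction of the pulse caught by gate 2
    return d_min_m + ratio * c * pulse_width_s / 2.0

print(indirect_tof_distance(300.0, 100.0, 25.0, 10e-9))  # ≈ 25.4 m
```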
- For the sensing of one zone, the emission of the pulsed illumination light L1 and the exposures of the pulse exposure regions fdp1 and fdp2 must be repeated on the order of hundreds to hundreds of thousands of times.
- the remaining one of the three FD areas fd is the continuous exposure area fdc and is assigned to generate the normal image NIMG.
- the continuous exposure area fdc is used during a period (continuous exposure period) during which both the pulse exposure areas fdp1 and fdp2 are not used.
- the reflected light L2 of the pulsed illumination light L1 from objects in the entire range of the field of view is incident on the pixels.
- sunlight reflected by objects in the entire range of the field of view is incident on the pixels.
- the charge amount accumulated in the continuous exposure area fdc during the continuous exposure period constitutes the normal image NIMG photographed over the entire range of the field of view.
- It is assumed here that the image sensor 120 can read out the multiple FD areas only at the same timing; in other words, the charges of the plurality of FD regions are reset whenever a readout is performed. In this case, when the sensing of one zone is completed at time t1, the pulse exposure areas fdp1 and fdp2 and the continuous exposure area fdc are read out, and two slice images SIMG and one normal image NIMG are obtained.
- the above is the operation of the gating camera 100.
- According to this gating camera 100, by continuously exposing an FD area that is not used for slice image generation to stationary light independent of the pulsed illumination light L1, a normal image NIMG can be generated in parallel with the generation of the slice images SIMG. Since the normal image NIMG is obtained without waiting for the photographing of all the ranges RNG1 to RNGN to be completed, the generation time of the normal image NIMG can be shortened.
- When the gating camera 100 is used in the daytime, sunlight becomes noise in generating the slice images. Therefore, the wavelength of the pulsed illumination light L1 and the sensitivity wavelength of the image sensor 120 must be selected from a region where the spectral intensity of sunlight is weak. In other words, the image sensor 120 has low sensitivity to the stationary light (mainly sunlight) that is used to capture the normal image NIMG.
- On the other hand, the pulse exposure periods Tp1 and Tp2 are on the order of about 1/100 to 1/1000 of the light emission interval of the pulsed illumination light L1, so the continuous exposure period Tc is several tens to several hundreds of times longer than the pulse exposure periods Tp1 and Tp2. Therefore, even though the image sensor 120 has low sensitivity to sunlight, a sufficiently bright normal image can be generated.
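- A back-of-the-envelope check of this ratio, with an assumed emission interval (the actual value is not stated in this passage):

```python
emission_interval = 1e-6              # assumed emission interval of L1 [s]
tp = emission_interval / 100          # pulse exposure gate, taking the 1/100 end
tc = emission_interval - 2 * tp       # time left per cycle for the continuous exposure area
print(tc / tp)                        # -> 98.0, i.e. roughly a hundred times longer
```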
- FIG. 5 is a sensing time chart according to Modification 1.
- When the continuous exposure area fdc has accumulated enough charge to obtain a normal image of the required brightness, the exposure of the continuous exposure area fdc may be stopped at that point, without waiting for the completion of the sensing of one zone. As a result, overexposure of the normal image due to pixel saturation can be prevented.
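- A hypothetical controller-side check corresponding to Modification 1 is sketched below; the full-well capacity and target threshold are assumptions for illustration only.

```python
FULL_WELL_ELECTRONS = 10_000                 # assumed full-well capacity of fdc
TARGET_CHARGE = 0.7 * FULL_WELL_ELECTRONS    # assumed "required brightness" level

def keep_exposing_fdc(accumulated_electrons: float) -> bool:
    """Continue the continuous exposure only while the target has not been
    reached; otherwise stop early to avoid saturating the pixel."""
    return accumulated_electrons < TARGET_CHARGE
```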
- FIG. 6 is a sensing time chart according to Modification 2.
- In FIG. 4, it was assumed that the multiple FD areas of the image sensor 120 can be read out only at the same timing, but this is not the only option.
- In Modification 2, the image sensor 120 is configured so that each FD area can be read out at an independent timing.
- In this case, each time the exposure of the continuous exposure area fdc is completed, the continuous exposure area fdc is read out to generate the normal image NIMG without waiting for the completion of the exposure of the pulse exposure areas fdp, and a new continuous exposure for the next normal image NIMG is started.
- As a result, the frame rate of the normal image NIMG can be increased compared to the case of FIG. 4 or FIG. 5.
- As described above, the reflected light of the pulsed illumination light L1 from the detection target range RNGi does not enter the image sensor 120 during the continuous exposure period Tc and is therefore not detected by the continuous exposure area fdc.
- Consequently, in Embodiment 1 an object OBJ within the detection target range RNGi is photographed in the normal image NIMG using reflected sunlight rather than the reflected pulsed illumination light L1; at night, when there is no sunlight, a normal image cannot be obtained in this way.
- Embodiment 2 describes a technique for solving this problem.
- FIG. 7 is a block diagram of a sensing system 10A including a gating camera 100A according to the second embodiment. Regarding the gating camera 100A, differences from the gating camera 100 according to the first embodiment will be described.
- the illumination device 110A irradiates the field of view with the continuous illumination light L4 in addition to the pulsed illumination light L1. Others are the same as those of the first embodiment. Next, the operation of the gating camera 100A will be explained.
- FIG. 8 is a time chart explaining the operation of the gating camera 100A of FIG.
- Under the control of the camera controller 130, the illumination device 110A repeatedly irradiates the field of view with the pulsed illumination light L1, and the image sensor 120 performs multiple exposure of the reflected light L2 from the field of view using the pulse exposure area fdp to generate the slice image SIMG. The generation of the slice image SIMG is the same as in Embodiment 1.
- the illumination device 110A irradiates the field of view with the continuous illumination light L4.
- the intensity of the continuous illumination light L4 is lower than the peak intensity of the pulsed illumination light L1.
- Under the control of the camera controller 130, the image sensor 120 performs exposure using the continuous exposure area fdc during the continuous exposure period Tc in which the pulse exposure area fdp is not used, and generates the normal image NIMG. Reflected light L3x and L3y from the objects OBJx and OBJy over the entire range is detected by the continuous exposure area fdc.
- the reflected light L3 can include reflected light of the pulsed illumination light L1 and reflected light of the continuous illumination light L4.
- The reflected light of the pulsed illumination light L1 from the detection target range RNGi does not enter the image sensor 120 during the continuous exposure period Tc and is therefore not detected by the continuous exposure area fdc. In other words, in photographing the normal image NIMG, an object OBJ within the detection target range RNGi is captured using the reflected light of the continuous illumination light L4 rather than the reflected light of the pulsed illumination light L1.
- According to the gating camera 100A of Embodiment 2, the normal image NIMG can thus be generated in a short time even at night, when there is no sunlight.
- each pixel px has one continuous exposure region fdc, but it is not limited to this and may have two or more continuous exposure regions fdc.
- FIG. 9 is a diagram explaining the operation of the gating camera according to the third embodiment.
- Under the control of the camera controller 130, the image sensor 120 performs exposure using the two continuous exposure areas fdc1 and fdc2 in a time-division manner during the continuous exposure period Tc in which the pulse exposure area fdp is not used, and generates the normal images NIMG.
- the two continuous exposure areas fdc1 and fdc2 have different exposure times, and two normal images NIMG are shot with different exposures.
- By using the images captured in the plurality of continuous exposure areas fdc1 and fdc2, both bright objects (objects with high reflectance) and dark objects (objects with low reflectance) become easier to detect.
- Alternatively, by synthesizing the images captured in the two continuous exposure areas fdc1 and fdc2, an HDR (High Dynamic Range) image with a wide dynamic range that suppresses blown-out highlights and blocked-up shadows can be obtained.
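- The patent only states that the two differently exposed images are combined; the sketch below shows one simple fusion rule for reference, where the saturation threshold and the replace-on-saturation strategy are assumptions rather than a prescribed method.

```python
import numpy as np

def fuse_hdr(img_long: np.ndarray, img_short: np.ndarray,
             exposure_ratio: float, sat_level: float = 0.95) -> np.ndarray:
    """Fuse the long-exposure image (fdc1) and the short-exposure image (fdc2),
    both normalized to [0, 1]. exposure_ratio = long exposure time / short
    exposure time. Where the long exposure is near saturation, use the
    rescaled short exposure instead."""
    long_f = np.asarray(img_long, dtype=np.float32)
    short_f = np.asarray(img_short, dtype=np.float32) * exposure_ratio
    return np.where(long_f >= sat_level, short_f, long_f)
```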
- FIG. 10 is a circuit diagram of an image sensor 120C used in the gating camera according to the fourth embodiment.
- In this example, each pixel px includes m (m ≥ 1) pulse exposure areas fdp and n (n ≥ 1) continuous exposure areas fdc, for a total of m + n FD areas; here m = 4 and n = 2.
- Each FD area is provided with a tap so that its signal can be read out.
- Four of the FD regions are pulse exposure regions fdp1 to fdp4, for which four taps TP1 to TP4 are provided.
- The remaining two FD regions are continuous exposure regions fdc1 and fdc2, for which two taps TP5 and TP6 are provided.
- Since each pixel includes four pulse exposure areas fdp1 to fdp4, four slice images corresponding to four ranges are generated at the same time. Also, as described in Embodiment 3, two normal images NIMG are shot with different exposures using the two continuous exposure areas fdc1 and fdc2.
- slice images are generated by pixel binning. Specifically, a plurality of adjacent pixels (four pixels in this example) are combined to generate a virtual pixel pxbin. That is, the resolution of slice images is lower than that of normal images.
- FIG. 10 shows the i-th and (i+1)-th columns and the j-th and (j+1)-th rows; the four pixels pxi,j, pxi+1,j, pxi,j+1, and pxi+1,j+1 spanning 2 rows and 2 columns are integrated by pixel binning.
- The four first readout circuits RO_BIN1 to RO_BIN4 are associated with the four pulse exposure areas fdp1 to fdp4 of the pixels in the corresponding two columns.
- The two second readout circuits RO_DBD1 and RO_DBD2 are associated with the two continuous exposure areas fdc1 and fdc2 of the pixels in the corresponding columns.
- Adjacent readout circuits RO_DBD and RO_BIN may be shared and used by switching between them.
- The i-th (1 ≤ i ≤ m) first readout circuit RO_BINi can read out the sum of the signals of the i-th pulse exposure regions fdpi of the four pixels included in the virtual pixel pxbin.
- The j-th (1 ≤ j ≤ n) second readout circuit RO_DBDj can read out the signal of the j-th continuous exposure area fdcj of the pixels in the corresponding column.
- a high-resolution normal image NIMG can be generated by dot-by-dot readout.
- By generating the slice image SIMG through pixel binning in this way, the image sensor 120C can reduce the generation time in exchange for a reduction in resolution.
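- A plain NumPy stand-in for the two readout modes is shown below; the on-chip readout circuits themselves operate on charge signals, so this is only a functional illustration of binned versus dot-by-dot readout.

```python
import numpy as np

def bin_2x2(frame: np.ndarray) -> np.ndarray:
    """Sum each 2x2 block of pixels into one virtual pixel, as the first
    readout circuits do for the pulse exposure areas (quarter resolution)."""
    h, w = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

def dot_by_dot(frame: np.ndarray) -> np.ndarray:
    """Read every pixel individually, as the second readout circuits do for
    the continuous exposure areas (full resolution)."""
    return frame.copy()

frame = np.arange(16).reshape(4, 4)
print(bin_2x2(frame))     # 2x2 array of block sums
print(dot_by_dot(frame))  # original 4x4 array
```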
- FIG. 11 is a block diagram of the sensing system 10.
- the sensing system 10 includes an arithmetic processing unit 40 in addition to the gating camera 100 described above.
- the sensing system 10 is an object detection system that is mounted on a vehicle such as an automobile or a motorcycle and determines the type (category or class) of an object OBJ existing around the vehicle.
- The gating camera 100 generates a plurality of slice images SIMG1 to SIMGN corresponding to the plurality of ranges RNG1 to RNGN.
- Output data CAMERAOUT of the gating camera 100 includes a plurality of slice images SIMG 1 to SIMG N and a normal image NIMG.
- the arithmetic processing unit 40 is configured to be able to identify the type of object based on the output data CAMERAOUT of the gating camera 100 .
- the processing unit 40 has a classifier 42 implemented based on a trained model generated by machine learning.
- the processing unit 40 may include multiple classifiers 42 optimized for each range.
- The algorithm of the classifier 42 is not particularly limited; YOLO (You Only Look Once), SSD (Single Shot MultiBox Detector), R-CNN (Region-based Convolutional Neural Network), SPPnet (Spatial Pyramid Pooling), Faster R-CNN, DSSD (Deconvolution-SSD), Mask R-CNN, or other algorithms, including ones developed in the future, may be employed.
- the functions of the arithmetic processing unit 40 may be realized by software processing, hardware processing, or a combination of software processing and hardware processing.
- software processing is implemented by combining processors (hardware) such as CPUs (Central Processing Units), MPUs (Micro Processing Units), microcomputers, and software programs executed by the processors (hardware).
- the arithmetic processing unit 40 may be a combination of multiple processors and software programs.
- hardware processing is implemented by hardware such as ASIC (Application Specific Integrated Circuit), controller IC, and FPGA (Field Programmable Gate Array).
- FIGS. 12(a) and 12(b) are diagrams showing an automobile 300 equipped with the gating camera 100. Refer first to FIG. 12(a).
- the automobile 300 includes headlamps (lamps) 302L and 302R.
- the illumination device 110 of the gating camera 100 may be built into at least one of the left and right headlamps 302L and 302R.
- the image sensor 120 can be mounted on a part of the vehicle, for example behind the rearview mirror. Alternatively, the image sensor 120 may be provided on the front grille or front bumper.
- the camera controller 130 may be provided inside the vehicle, in the engine room, or built in the headlamps 302L and 302R.
- the image sensor 120 may be incorporated in either of the left and right headlamps 302L, 302R together with the lighting device 110.
- the lighting device 110 may be provided on a part of the vehicle, for example, behind the room mirror, front grille, or front bumper.
- FIG. 13 is a block diagram showing a vehicle lamp 200 including the sensing system 10.
- the vehicle lamp 200 constitutes a lamp system 304 together with the vehicle-side ECU 310 .
- the vehicle lamp 200 includes a lamp side ECU 210 and a lamp unit 220 .
- The lamp unit 220 is a low-beam or high-beam unit and includes a light source 222, a lighting circuit 224, and an optical system 226. The vehicle lamp 200 is further provided with the sensing system 10.
- Information about the object OBJ detected by the sensing system 10 may be used for light distribution control of the vehicle lamp 200 .
- the lamp-side ECU 210 generates an appropriate light distribution pattern based on the information about the type and position of the object OBJ generated by the sensing system 10 .
- the lighting circuit 224 and the optical system 226 operate so as to obtain the light distribution pattern generated by the lamp-side ECU 210 .
- the arithmetic processing unit 40 of the sensing system 10 may be provided outside the vehicle lamp 200, that is, on the vehicle side.
- Information regarding the object OBJ detected by the sensing system 10 may also be transmitted to the vehicle-side ECU 310 .
- the vehicle-side ECU 310 may use this information for automatic driving and driving assistance.
- the present invention relates to gating cameras.
Description
An outline of several exemplary embodiments of the present disclosure will be described. This outline, as a prelude to the detailed description given later, presents some concepts of one or more embodiments in simplified form for the purpose of a basic understanding of the embodiments, and does not limit the breadth of the invention or the disclosure. This outline is not a comprehensive overview of all conceivable embodiments, and does not limit the indispensable constituent elements of the embodiments. For convenience, "one embodiment" may be used to refer to a single embodiment (example or modification) or a plurality of embodiments (examples and modifications) disclosed in this specification.
Preferred embodiments will now be described with reference to the drawings. Identical or equivalent components, members, and processes shown in the drawings are denoted by the same reference numerals, and redundant description is omitted as appropriate. The embodiments are illustrative rather than limiting; not all features described in the embodiments, nor their combinations, are necessarily essential to the invention.
TMINi = 2 × dMINi / c, where c is the speed of light. Similarly, TMAXi = 2 × dMAXi / c.
As described in Embodiment 1, the reflected light of the pulsed illumination light L1 from the detection target range RNGi does not enter the image sensor 120 during the continuous exposure period Tc and is therefore not detected by the continuous exposure area fdc. For photographing the normal image NIMG, an object OBJ within the detection target range RNGi is therefore captured using reflected sunlight, not the reflected light of the pulsed illumination light L1.
In the description so far, each pixel px has had one continuous exposure area fdc; however, the pixel is not limited to this and may have two or more continuous exposure areas fdc.
FIG. 10 is a circuit diagram of the image sensor 120C used in the gating camera according to Embodiment 4. In this example, each pixel px includes m (m ≥ 1) pulse exposure areas fdp and n (n ≥ 1) continuous exposure areas fdc, for a total of m + n FD areas. In this example, m = 4 and n = 2. Each FD area is provided with a tap so that its signal can be read out.
FIG. 11 is a block diagram of the sensing system 10. The sensing system 10 includes an arithmetic processing device 40 in addition to the gating camera 100 described above. The sensing system 10 is an object detection system that is mounted on a vehicle such as an automobile or a motorcycle and determines the type (also called the category or class) of an object OBJ existing around the vehicle.
Claims (11)
1. A gating camera that divides a field of view into a plurality of ranges in a depth direction and generates a plurality of slice images corresponding to the plurality of ranges, the gating camera comprising: an illumination device that irradiates the field of view with pulsed illumination light; a multi-tap image sensor in which each pixel has a plurality of FD (Floating Diffusion) areas; and a camera controller that controls a light emission timing of the illumination device and an exposure timing of the image sensor, wherein one of the plurality of FD areas is assigned as a pulse exposure area for generating the slice images, and another one of the plurality of FD areas is assigned as a continuous exposure area for generating a normal image, and wherein the image sensor generates the slice images by performing multiple exposure of reflected light of the pulsed illumination light from the field of view using the pulse exposure area, and generates the normal image by performing exposure using the continuous exposure area in sections in which the pulse exposure area is not used.
2. The gating camera according to claim 1, wherein the image sensor is configured such that each FD area can be read out at an independent timing.
3. The gating camera according to claim 1 or 2, wherein the illumination device is capable of irradiating the field of view with continuous illumination light in addition to the pulsed illumination light.
4. The gating camera according to claim 3, wherein the illumination device irradiates the field of view with the continuous illumination light during nighttime photography.
5. The gating camera according to claim 1 or 2, wherein the continuous exposure area is one of a plurality of continuous exposure areas, the image sensor generates the normal image by performing exposure using the plurality of continuous exposure areas in a time-division manner in sections in which the pulse exposure area is not used, and the plurality of continuous exposure areas have different exposure times.
6. The gating camera according to claim 1 or 2, wherein the image sensor generates the slice images by pixel binning and generates the normal image dot by dot.
7. The gating camera according to claim 6, wherein the image sensor is capable of binning pixels of 2 rows and 2 columns as a virtual pixel, each pixel includes m (m ≥ 1) pulse exposure areas and n (n ≥ 1) continuous exposure areas, the image sensor comprises m first readout circuits and n second readout circuits, the m first readout circuits are associated with the m pulse exposure areas, the n second readout circuits are associated with the n continuous exposure areas, an i-th (1 ≤ i ≤ m) first readout circuit is capable of reading out a sum of signals of the i-th pulse exposure areas of the four pixels included in the virtual pixel, and a j-th (1 ≤ j ≤ n) second readout circuit is capable of reading out a signal of the j-th continuous exposure area of a corresponding pixel.
8. The gating camera according to claim 7, wherein m = 4 and n = 2.
9. The gating camera according to claim 1 or 2, wherein the gating camera is mounted on a vehicle.
10. A vehicle sensing system comprising: the gating camera according to claim 1 or 2; and an arithmetic processing device that processes the plurality of slice images captured by the gating camera.
11. A vehicle lamp comprising the gating camera according to claim 1 or 2.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280054150.3A CN117795377A (zh) | 2021-08-05 | 2022-08-05 | 门控照相机、车辆用传感系统、车辆用灯具 |
EP22853195.0A EP4382969A1 (en) | 2021-08-05 | 2022-08-05 | Gated camera, vehicular sensing system, and vehicular lamp |
JP2023540435A JPWO2023013777A1 (ja) | 2021-08-05 | 2022-08-05 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-129014 | 2021-08-05 | ||
JP2021129014 | 2021-08-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023013777A1 true WO2023013777A1 (ja) | 2023-02-09 |
Family
ID=85154581
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/030166 WO2023013777A1 (ja) | 2021-08-05 | 2022-08-05 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4382969A1 (ja) |
JP (1) | JPWO2023013777A1 (ja) |
CN (1) | CN117795377A (ja) |
WO (1) | WO2023013777A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009257981A (ja) | 2008-04-18 | 2009-11-05 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
JP2010164440A (ja) * | 2009-01-16 | 2010-07-29 | Stanley Electric Co Ltd | 距離画像処理装置および撮影装置 |
JP2011243862A (ja) * | 2010-05-20 | 2011-12-01 | Sony Corp | 撮像デバイス及び撮像装置 |
JP2015121430A (ja) * | 2013-12-20 | 2015-07-02 | スタンレー電気株式会社 | 距離画像生成装置および距離画像生成方法 |
JP2015198361A (ja) * | 2014-04-01 | 2015-11-09 | キヤノン株式会社 | 撮像装置及びその制御方法、プログラム、記憶媒体 |
WO2017110417A1 (ja) | 2015-12-21 | 2017-06-29 | 株式会社小糸製作所 | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 |
-
2022
- 2022-08-05 WO PCT/JP2022/030166 patent/WO2023013777A1/ja active Application Filing
- 2022-08-05 EP EP22853195.0A patent/EP4382969A1/en active Pending
- 2022-08-05 CN CN202280054150.3A patent/CN117795377A/zh active Pending
- 2022-08-05 JP JP2023540435A patent/JPWO2023013777A1/ja active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009257981A (ja) | 2008-04-18 | 2009-11-05 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
JP2010164440A (ja) * | 2009-01-16 | 2010-07-29 | Stanley Electric Co Ltd | 距離画像処理装置および撮影装置 |
JP2011243862A (ja) * | 2010-05-20 | 2011-12-01 | Sony Corp | 撮像デバイス及び撮像装置 |
JP2015121430A (ja) * | 2013-12-20 | 2015-07-02 | スタンレー電気株式会社 | 距離画像生成装置および距離画像生成方法 |
JP2015198361A (ja) * | 2014-04-01 | 2015-11-09 | キヤノン株式会社 | 撮像装置及びその制御方法、プログラム、記憶媒体 |
WO2017110417A1 (ja) | 2015-12-21 | 2017-06-29 | 株式会社小糸製作所 | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 |
Also Published As
Publication number | Publication date |
---|---|
CN117795377A (zh) | 2024-03-29 |
JPWO2023013777A1 (ja) | 2023-02-09 |
EP4382969A1 (en) | 2024-06-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22853195 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023540435 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280054150.3 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022853195 Country of ref document: EP Effective date: 20240305 |