WO2023013776A1 - ゲーティングカメラ、車両用センシングシステム、車両用灯具 - Google Patents
ゲーティングカメラ、車両用センシングシステム、車両用灯具 Download PDFInfo
- Publication number
- WO2023013776A1 (PCT/JP2022/030165)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- range
- slice images
- camera
- processing device
- Prior art date
Links
- 238000005286 illumination Methods 0.000 claims abstract description 33
- 239000002131 composite material Substances 0.000 claims abstract description 30
- 238000000034 method Methods 0.000 claims description 12
- 238000010586 diagram Methods 0.000 description 22
- 230000006870 function Effects 0.000 description 7
- 230000004048 modification Effects 0.000 description 7
- 238000012986 modification Methods 0.000 description 7
- 230000002194 synthesizing effect Effects 0.000 description 5
- 238000001514 detection method Methods 0.000 description 4
- 238000003384 imaging method Methods 0.000 description 3
- 230000003287 optical effect Effects 0.000 description 3
- 238000009825 accumulation Methods 0.000 description 1
- 239000000470 constituent Substances 0.000 description 1
- 238000013527 convolutional neural network Methods 0.000 description 1
- 230000006866 deterioration Effects 0.000 description 1
- 238000010801 machine learning Methods 0.000 description 1
- 238000011176 pooling Methods 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Images
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/18—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
Definitions
- the present disclosure relates to gating cameras.
- An object identification system that senses the position and type of objects around the vehicle is used for automated driving and automatic control of headlamp light distribution.
- An object identification system includes a sensor and a processor that analyzes the output of the sensor. Sensors are selected from cameras, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), millimeter-wave radar, ultrasonic sonar, etc., taking into consideration the application, required accuracy, and cost.
- Depth information cannot be obtained from a general monocular camera. Therefore, it is difficult to separate multiple overlapping objects located at different distances.
- A TOF camera is known as a camera that can obtain depth information.
- A TOF (Time Of Flight) camera emits infrared light from a light-emitting device, measures the time of flight until the reflected light returns to the image sensor, and obtains a TOF image by converting the flight time into distance information.
- a gating camera (Gating Camera or Gated Camera) has been proposed as an active sensor to replace the TOF camera (Patent Documents 1 and 2).
- a gating camera divides an imaging range into a plurality of ranges, changes exposure timing and exposure time for each range, and captures images. This yields a slice image for each range of interest, each slice image containing only the objects contained in the corresponding range.
- JP 2009-257981 A; WO 2017/110417 A1; JP 2019-159503 A; JP 2017-126979 A; JP 2012-113622 A
- A certain aspect of the present disclosure has been made in this situation, and one of its exemplary purposes is to improve the image quality of the composite image.
- An exemplary purpose of another aspect is to improve the efficiency of image processing.
- A gating camera according to one aspect of the present disclosure divides the field of view into a plurality of ranges in the depth direction and generates a plurality of slice images corresponding to the plurality of ranges.
- This gating camera includes an illumination device that irradiates the field of view with pulsed illumination light, an image sensor, a camera controller that controls the light emission timing of the illumination device and the exposure timing of the image sensor, and an image processing device that combines the plurality of slice images output from the image sensor to generate a composite image.
- The image processing device takes, as the effective range, the range at which a local maximum occurs when the pixels at the same position in each of the plurality of slice images are arranged in range order, and generates the pixel value at the same position in the composite image based on the pixel value at that position in the slice image corresponding to the effective range.
- A gating camera according to another aspect includes an illumination device that irradiates the field of view with pulsed illumination light, an image sensor, a camera controller that controls the light emission timing of the illumination device and the exposure timing of the image sensor, and an image processing device that receives the plurality of slice images from the image sensor and generates a distance map image whose pixels indicate the range in which an object is included.
- According to one aspect of the present disclosure, image processing can be made more efficient.
- According to another aspect, a high-quality composite image can be generated.
- FIG. 1 is a block diagram of a sensing system according to an embodiment.
- FIG. 2 is a diagram explaining the basic operation of the gating camera.
- FIGS. 3A and 3B are diagrams explaining slice images obtained by the gating camera.
- FIG. 4 is a diagram explaining generation of a composite image CIMG based on a plurality of slice images SIMG_1 to SIMG_N.
- FIG. 5 is a diagram explaining determination of the effective image and effective range of a pixel of interest.
- FIGS. 6A to 6D are diagrams showing the relationship between the pixel array and the effective range.
- FIG. 7 is a diagram showing a foggy driving scene.
- FIG. 9 is a diagram explaining generation of a composite image CIMG using a distance map image DIMG.
- FIG. 10 is a block diagram of a sensing system.
- FIGS. 11A and 11B are diagrams showing a vehicle equipped with a gating camera.
- FIG. 12 is a block diagram showing a vehicle lamp equipped with a sensing system.
- A gating camera according to one embodiment includes an illumination device that irradiates the field of view with pulsed illumination light, an image sensor, a camera controller that controls the light emission timing of the illumination device and the exposure timing of the image sensor, and an image processing device that combines the plurality of slice images output from the image sensor to generate a composite image.
- The image processing device takes, as the effective range, the range at which a local maximum occurs when the pixels at the same position in each of the plurality of slice images are arranged in range order, and generates the pixel value at the same position in the composite image based on the pixel value at that position in the slice image corresponding to the effective range.
- With this configuration, the range in which an object exists can be detected more accurately than when the object is simply assigned to the range having the maximum value.
- This processing can improve the image quality of the composite image.
- A gating camera according to another embodiment divides the field of view into a plurality of ranges in the depth direction and generates a plurality of slice images corresponding to the plurality of ranges.
- This gating camera includes an illumination device that irradiates the field of view with pulsed illumination light, an image sensor, a camera controller that controls the light emission timing of the illumination device and the exposure timing of the image sensor, and an image processing device that receives the plurality of slice images from the image sensor and generates a distance map image.
- Each pixel of the distance map image indicates the range in which an object is included.
- The distance map image thus indicates, for each position in the horizontal and vertical directions of the field of view, the range in which an object exists.
- The image processing device may determine that an object is included in the range at which a local maximum occurs when the pixels at the same position in each of the plurality of slice images are arranged in range order. As a result, the range in which the object exists can be detected more accurately than when it is determined that the object is included in the range having the maximum value.
- the image processing device may generate a synthesized image by synthesizing a plurality of slice images based on the distance map image.
- In a normal image obtained by a normal camera, an object is obscured by fog or the like in bad weather such as thick fog.
- In contrast, the composite image has the advantage that obstructions such as fog are removed and the object is captured clearly.
- When the N slice images are individually input to a discriminator, N rounds of processing are required to detect an object existing over the entire depth of the field of view.
- By inputting a single composite image to the discriminator instead, an object present over the entire depth of the field of view can be detected in a single round of processing.
- FIG. 1 is a block diagram of a sensing system 10 according to an embodiment.
- This sensing system 10 is mounted on a vehicle such as an automobile or motorcycle, and detects an object OBJ existing around the vehicle.
- the sensing system 10 mainly includes a gating camera 100.
- The gating camera 100 includes an illumination device 110, an image sensor 120, a camera controller 130, and an image processing device 140.
- Imaging by the gating camera 100 is performed by dividing the field of view into N (N ≥ 2) ranges RNG_1 to RNG_N in the depth direction. Adjacent ranges may overlap at their boundaries in the depth direction.
- the illumination device 110 emits pulsed illumination light L1 forward of the vehicle in synchronization with the light emission timing signal S1 given from the camera controller 130 .
- the pulsed illumination light L1 is preferably infrared light, but is not limited to this, and may be visible light or ultraviolet light having a predetermined wavelength.
- Since the gating camera 100 according to the present embodiment senses not only at night but also during the day, a wavelength longer than 0.9 μm is selected.
- the image sensor 120 includes a plurality of pixels, is capable of exposure control synchronized with the exposure timing signal S2 given from the camera controller 130, and generates a slice image SIMG composed of a plurality of pixels.
- the image sensor 120 has sensitivity to the same wavelength as the pulsed illumination light L1, and captures reflected light (return light) L2 reflected by the object OBJ.
- The camera controller 130 controls the irradiation timing (light emission timing) of the pulsed illumination light L1 by the illumination device 110 and the exposure timing of the image sensor 120, and causes the image sensor 120 to generate a plurality of slice images SIMG_1 to SIMG_N corresponding to the plurality of ranges RNG_1 to RNG_N.
- the functions of the camera controller 130 may be realized by software processing, hardware processing, or a combination of software processing and hardware processing.
- software processing is implemented by combining processors (hardware) such as CPUs (Central Processing Units), MPUs (Micro Processing Units), microcomputers, and software programs executed by the processors (hardware).
- camera controller 130 may be a combination of multiple processors and software programs.
- hardware processing is implemented by hardware such as an ASIC (Application Specific Integrated Circuit), a controller IC, and an FPGA (Field Programmable Gate Array).
- Images (slice images) SIMG 1 to SIMG N generated by the image sensor 120 are input to the image processing device 140 .
- the image processing device 140 processes a plurality of slice images SIMG 1 -SIMG N obtained for a plurality of ranges RNG 1 -RNG N to generate final output data CAMERAOUT.
- the output data CAMERAOUT can include a set of multiple slice images SIMG 1 to SIMG N and a synthesized image CIMG obtained by synthesizing them.
- the output data CAMERAOUT may further include a distance map image DIMG, which will be described later.
- the image processing device 140 may be implemented in the same hardware as the camera controller 130, or may be configured with separate hardware. Alternatively, part or all of the functions of the image processing device 140 may be implemented as a processor or digital circuit built into the same module as the image sensor 120 .
- the above is the basic configuration of the gating camera 100. Next, the operation will be explained.
- FIG. 2 is a diagram explaining the basic operation of the gating camera 100, showing how the i-th range RNG_i is sensed.
- The illumination device 110 emits light during a light emission period τ1 from time t0 to time t1, in synchronization with the light emission timing signal S1.
- Let d_MINi be the distance from the gating camera 100 to the front boundary of the range RNG_i, and d_MAXi the distance to its rear boundary.
- The round-trip time T_MINi for light that leaves the illumination device 110 at a certain time to reach the distance d_MINi and return to the image sensor 120 as reflected light is T_MINi = 2 × d_MINi / c, where c is the speed of light.
- Similarly, the round-trip time T_MAXi for light to reach the distance d_MAXi and return as reflected light is T_MAXi = 2 × d_MAXi / c.
- To capture only the objects contained in the range RNG_i, the camera controller 130 generates the exposure timing signal S2 so that the exposure covers the interval during which reflected light from that range arrives, that is, from t0 + T_MINi to t1 + T_MAXi. This is one sensing operation.
- Sensing of the i-th range RNG i can include multiple sets of emissions and exposures.
- In this case, the camera controller 130 repeats the above sensing operation multiple times at a predetermined period τ2.
- The image sensor 120 is capable of multiple exposure: the FD region (charge accumulation region) of each pixel is exposed multiple times by the reflected light resulting from the multiple pulse emissions, and a single slice image SIMG is generated from the accumulated charge.
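As an illustration of the timing relations above, here is a minimal sketch (an assumption for illustration, not code from the patent) of how the exposure gate for one range could be computed from its boundary distances:

```python
# Sketch of the gate-timing computation for the i-th range; the patent text
# defines only the round-trip times T_MIN = 2*d_MIN/c and T_MAX = 2*d_MAX/c.
C = 299_792_458.0  # speed of light [m/s]

def gate_timing(d_min: float, d_max: float, t0: float, t1: float):
    """Exposure window for a range with front/rear boundaries d_min, d_max [m],
    given a light pulse emitted from t0 to t1 [s]."""
    t_min = 2.0 * d_min / C  # round trip to the front boundary
    t_max = 2.0 * d_max / C  # round trip to the rear boundary
    # reflected light from this range arrives between t0 + T_MIN and t1 + T_MAX
    return t0 + t_min, t1 + t_max

# Example (illustrative values): range from 50 m to 75 m, 100 ns pulse at t0 = 0
t2, t3 = gate_timing(50.0, 75.0, 0.0, 100e-9)
print(f"gate opens at {t2 * 1e9:.1f} ns, closes at {t3 * 1e9:.1f} ns")
```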
- FIGS. 3A and 3B are diagrams explaining the slice images obtained by the gating camera 100.
- In FIG. 3A, an object (pedestrian) OBJ2 exists in the range RNG_2, and an object (vehicle) OBJ3 exists in the range RNG_3.
- FIG. 3B shows the plurality of slice images SIMG_1 to SIMG_3 obtained in the situation of FIG. 3A.
- When the slice image SIMG_1 is captured, the image sensor is exposed only by reflected light from the range RNG_1, so no object image appears in SIMG_1.
- When the slice image SIMG_2 is captured, the image sensor is exposed only by reflected light from the range RNG_2, so only the object image OBJ2 appears in SIMG_2.
- Likewise, when the slice image SIMG_3 is captured, the image sensor is exposed only by reflected light from the range RNG_3, so only the object image OBJ3 appears in SIMG_3.
- In this way, the gating camera can photograph objects separately for each range.
- the image processing device 140 generates a composite image CIMG by synthesizing a plurality of slice images SIMG 1 to SIMG N after sensing of all ranges RNG 1 to RNG N is completed.
- This composite image CIMG resembles a normal image captured by a normal camera in that it includes objects in all ranges RNG 1 to RNG N.
- FIG. 4 is a diagram illustrating generation of a composite image CIMG based on a plurality of slice images SIMG 1 to SIMG N.
- Ideally, an object OBJ_i existing in the i-th range RNG_i appears only in the corresponding slice image SIMG_i; in the other slice images, the pixel of interest has a pixel value unrelated to the object.
- A slice image in which the pixel of interest has a pixel value corresponding to reflected light from an object is called an effective image, and the corresponding range is called the effective range.
- A slice image in which the pixel of interest does not have a pixel value corresponding to reflected light from an object is called an invalid image, and the corresponding range is called an invalid range.
- An effective image (effective range) can be selected for each position of the pixel of interest.
- For one pixel of interest, the slice image SIMG_2 is the effective image and the rest are invalid images; for another pixel of interest, the slice image SIMG_{N-1} is the effective image and the rest are invalid images.
- The pixel value c1 of each pixel of the composite image CIMG is derived from the pixel value c2 of the corresponding pixel of interest in the effective image.
- The pixel value c1 may be equal to the pixel value c2, or may be a value obtained by applying predetermined arithmetic processing to the pixel value c2.
- FIG. 5 is a diagram for explaining determination of the effective image and effective range of the pixel of interest.
- The image processing device 140 generates an array (pixel array) in which the pixel values of the pixels of interest at the same position in each of the slice images SIMG_1 to SIMG_N are arranged in range order. This array is shown in the lower part of FIG. 5. The range at which the array takes a local maximum is then detected and set as the effective range.
- If the range RNG_j takes the local maximum, that range RNG_j is the effective range and the corresponding slice image SIMG_j is the effective image.
- Image processing device 140 generates a pixel value at the same position in composite image CIMG based on the pixel value at the same position as the pixel of interest in slice image SIMG j corresponding to effective range RNG j .
- the composite image CIMG has the advantage that the object is clearly captured by removing the obstructing objects such as fog.
- FIGS. 6A to 6D are diagrams showing the relationship between the pixel array and the effective range.
- In FIG. 6A, the array contains only a single local maximum v1, which is also the maximum value of the array.
- Since the local maximum v1 is the j-th element of the array, the j-th range RNG_j becomes the effective range.
- In FIG. 6B, the array contains a plurality of local maxima v1 and v2.
- In this case, the effective range may be determined based on the largest local maximum v1, in which case the jx-th range RNG_jx is the effective range.
- Alternatively, the effective range may be determined based on the local maximum v2 closest to the vehicle, in which case the jy-th range RNG_jy becomes the effective range.
- In FIG. 6C, the array contains a local maximum v1 that is different from the maximum value v2 of the array.
- In the present embodiment, the effective range is determined based on the largest local maximum v1 rather than the maximum value v2 of the array.
- In comparison technique 1, suppose that the effective range is determined based on the maximum value rather than the local maximum.
- In that case, the k-th range is judged to be the effective range instead of the j-th range.
- FIG. 7 is a diagram showing a foggy driving scene.
- the lower part of FIG. 7 shows a plot of the array of pixels of interest.
- Pulsed illumination light L1 emitted from gating camera 100 passes through fog 800 and reaches object OBJ.
- Part of the pulsed illumination light L1 is reflected by the fog 800, and only weakened pulsed illumination light reaches the object OBJ existing in the far range. Consequently, in the array of the pixel of interest at which the object OBJ should appear, the pixel value of a near range takes the maximum value, while the pixel value of the range RNG_8 where the object OBJ actually exists is small. If the comparison technique were employed, the first range RNG_1 would therefore be misjudged as the effective range.
- According to the present embodiment, which uses the local maximum, the far-side range RNG_8 where the object OBJ exists can be correctly determined as the effective range.
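The effective-range selection described above can be sketched as follows (an assumed implementation for illustration; the patent provides no code). The key point is that the largest local maximum, not the global maximum, selects the effective range, which keeps a strong near-range fog return from masking a far object:

```python
# Sketch of effective-range selection per pixel position. The input is the
# "pixel array": the values of one pixel position across the slice images
# SIMG_1..SIMG_N, ordered near to far.
import numpy as np

def effective_range(pixel_array: np.ndarray) -> int:
    """Index of the effective range, chosen as the largest local maximum."""
    v = pixel_array.astype(float)
    # interior local maxima: above the left neighbor, not below the right one
    is_peak = np.r_[False, (v[1:-1] > v[:-2]) & (v[1:-1] >= v[2:]), False]
    peaks = np.flatnonzero(is_peak)
    if peaks.size == 0:
        # fallback (an assumption): no interior peak, use the global maximum
        return int(np.argmax(v))
    return int(peaks[np.argmax(v[peaks])])  # largest local maximum

# Foggy scene like FIG. 7: strong near-range fog return decaying with
# distance, true object in RNG_8 (index 7)
fog = np.array([90, 60, 40, 30, 25, 22, 20, 55, 18, 15])
print(effective_range(fog))   # -> 7; plain argmax would wrongly return 0
```

The alternative policy mentioned for FIG. 6B, choosing the local maximum closest to the vehicle, would simply replace the last line of the function with `return int(peaks[0])`.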
- the image processing device 140 can generate the distance map image DIMG in the process of generating the composite image CIMG.
- Each pixel of the distance map image DIMG indicates the range in which an object is included, in other words, which range is the effective range for that pixel position.
- the image processing device 140 can generate a composite image CIMG based on the distance map image DIMG.
- FIG. 9 is a diagram illustrating generation of a composite image CIMG using a distance map image DIMG.
- a pixel value of the distance map image DIMG is a pointer indicating which slice image SIMG is a valid image.
- a pixel value at coordinates (x, y) of the distance map image DIMG is denoted as DIMG(x, y).
- The pixel value at coordinates (x, y) of the slice image SIMG_i is denoted as SIMG_i(x, y), and the pixel value at coordinates (x, y) of the composite image CIMG is denoted as CIMG(x, y).
- With j = DIMG(x, y), the composite image is generated as CIMG(x, y) = SIMG_j(x, y). Alternatively, CIMG(x, y) = f(SIMG_j(x, y)) may be used, where f is a predetermined function.
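A minimal sketch of this compositing step (assumed, not from the patent): DIMG acts as a per-pixel pointer into the stack of slice images:

```python
# Sketch of compositing with the distance map image.
import numpy as np

def composite_from_dimg(slices: np.ndarray, dimg: np.ndarray) -> np.ndarray:
    """slices: (N, H, W) stack of slice images SIMG_1..SIMG_N, near to far.
    dimg: (H, W) integer map with DIMG(x, y) = index j of the effective slice.
    Returns CIMG with CIMG(x, y) = SIMG_j(x, y)."""
    # gather slices[dimg[y, x], y, x] for every pixel in one vectorized step
    return np.take_along_axis(slices, dimg[None, ...], axis=0)[0]

# Usage with dummy data
rng = np.random.default_rng(0)
slices = rng.integers(0, 256, size=(8, 60, 80), dtype=np.uint16)
dimg = rng.integers(0, 8, size=(60, 80))
cimg = composite_from_dimg(slices, dimg)   # shape (60, 80)
```

Applying a predetermined function f, as in CIMG(x, y) = f(SIMG_j(x, y)), would simply wrap the gathered values.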
- FIG. 10 is a block diagram of the sensing system 10.
- the sensing system 10 includes an arithmetic processing unit 40 in addition to the gating camera 100 described above.
- the sensing system 10 is an object detection system that is mounted on a vehicle such as an automobile or a motorcycle and determines the type (category or class) of an object OBJ existing around the vehicle.
- Gating camera 100 generates a plurality of slice images SIMG 1 -SIMG N corresponding to a plurality of ranges RNG 1 -RNG N.
- Output data CAMERAOUT of the gating camera 100 includes a plurality of slice images SIMG 1 to SIMG N and a composite image CIMG.
- the arithmetic processing unit 40 is configured to be able to identify the type of object based on the output data CAMERAOUT of the gating camera 100 .
- the processing unit 40 has a classifier 42 implemented based on a trained model generated by machine learning.
- the processing unit 40 may include multiple classifiers 42 optimized for each range.
- The algorithm of the classifier 42 is not particularly limited; YOLO (You Only Look Once), SSD (Single Shot MultiBox Detector), R-CNN (Region-based Convolutional Neural Network), SPPnet (Spatial Pyramid Pooling), Faster R-CNN, DSSD (Deconvolutional SSD), Mask R-CNN, or algorithms developed in the future can be adopted.
- the functions of the arithmetic processing unit 40 may be realized by software processing, hardware processing, or a combination of software processing and hardware processing.
- software processing is implemented by combining processors (hardware) such as CPUs (Central Processing Units), MPUs (Micro Processing Units), microcomputers, and software programs executed by the processors (hardware).
- the arithmetic processing unit 40 may be a combination of multiple processors and software programs.
- hardware processing is implemented by hardware such as ASIC (Application Specific Integrated Circuit), controller IC, and FPGA (Field Programmable Gate Array).
- By inputting each of the plurality of slice images SIMG_1 to SIMG_N to the classifier 42, an object can be identified for each range. Alternatively, by inputting the composite image CIMG to the classifier 42, objects present in the entire range can be identified all at once.
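A short sketch of the two usage modes (`classify` is a hypothetical stand-in for a YOLO/SSD-style detector, not an API defined in the patent):

```python
# Sketch contrasting per-range identification on each slice image with
# one-shot identification on the composite image.
import numpy as np

def classify(image: np.ndarray) -> list[tuple[str, tuple[int, int, int, int]]]:
    """Hypothetical detector returning (label, bounding box) pairs."""
    return []  # stub

def identify_per_range(slices: np.ndarray):
    # N inference passes; each detection is tagged with its range index
    return [(i, classify(s)) for i, s in enumerate(slices)]

def identify_all_ranges(cimg: np.ndarray):
    # one inference pass covering the entire depth of the field of view
    return classify(cimg)
```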
- FIGS. 11A and 11B are diagrams showing an automobile 300 equipped with the gating camera 100.
- the automobile 300 includes headlamps (lamps) 302L and 302R.
- the illumination device 110 of the gating camera 100 may be built into at least one of the left and right headlamps 302L and 302R.
- the image sensor 120 can be mounted on a part of the vehicle, for example behind the rearview mirror. Alternatively, the image sensor 120 may be provided on the front grille or front bumper.
- the camera controller 130 may be provided inside the vehicle, in the engine room, or built in the headlamps 302L and 302R.
- the image sensor 120 may be incorporated in either of the left and right headlamps 302L, 302R together with the illumination device 110.
- Conversely, the lighting device 110 may be provided on a part of the vehicle, for example behind the rearview mirror, or on the front grille or front bumper.
- FIG. 12 is a block diagram showing a vehicle lamp 200 including the sensing system 10.
- the vehicle lamp 200 constitutes a lamp system 304 together with the vehicle-side ECU 310 .
- the vehicle lamp 200 includes a lamp side ECU 210 and a lamp unit 220 .
- The lamp unit 220 is a low-beam or high-beam unit, and includes a light source 222, a lighting circuit 224, and an optical system 226. The vehicle lamp 200 is further provided with the sensing system 10.
- Information about the object OBJ detected by the sensing system 10 may be used for light distribution control of the vehicle lamp 200 .
- the lamp-side ECU 210 generates an appropriate light distribution pattern based on the information about the type and position of the object OBJ generated by the sensing system 10 .
- the lighting circuit 224 and the optical system 226 operate so as to obtain the light distribution pattern generated by the lamp-side ECU 210 .
- the arithmetic processing unit 40 of the sensing system 10 may be provided outside the vehicle lamp 200, that is, on the vehicle side.
- Information regarding the object OBJ detected by the sensing system 10 may also be transmitted to the vehicle-side ECU 310 .
- the vehicle-side ECU 310 may use this information for automatic driving and driving assistance.
- (Modification 1) The output data OUT of the gating camera 100 may include only the plurality of slice images SIMG_1 to SIMG_N and the distance map image DIMG. That is, the image processing device 140 may generate only the distance map image DIMG without generating the composite image CIMG.
- In this case, a processing device external to the gating camera 100 can easily generate a composite image CIMG based on the plurality of slice images SIMG_1 to SIMG_N and the distance map image DIMG.
- (Modification 2) Only the composite image CIMG may be input to the classifier 42 of FIG. 10.
- The position (range) of an object detected from the composite image CIMG by the classifier 42 can then be obtained based on the distance map image DIMG. This processing can significantly reduce the computational load of the classifier 42.
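A minimal sketch of this modification (assumed, not from the patent): after the classifier returns a bounding box on the composite image, the object's range can be read off the distance map image, for example as the most frequent range index inside the box:

```python
# Sketch of Modification 2: classify the composite image once, then recover
# each detection's range from the distance map image.
import numpy as np

def detection_range(dimg: np.ndarray, box: tuple[int, int, int, int]) -> int:
    """Most frequent range index inside a detection box (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = box
    patch = dimg[y0:y1, x0:x1].ravel()
    return int(np.bincount(patch).argmax())
```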
- (Modification 3) Ideally, an object contained in a certain range appears only in the corresponding slice image, but in practice the object may also appear in the slice images of adjacent ranges. In that case, the composite image may be generated as CIMG(x, y) = g(SIMG_j(x, y), SIMG_j'(x, y)), where g(u, v) is a function taking u and v as arguments and SIMG_j' denotes the slice image of a range adjacent to the effective range.
- (Modification 4) The composite image CIMG may be generated directly from the largest local maxima, without generating a distance map image DIMG.
- (Modification 5) The method of determining the effective image or effective range is not limited to one using local maxima; the range having the maximum value in the array may simply be taken as the effective range.
- the present disclosure relates to gating cameras.
- REFERENCE SIGNS LIST: 10 sensing system, 100 gating camera, 110 illumination device, 120 image sensor, 130 camera controller, 140 image processing device, 40 arithmetic processing unit, 42 classifier, 200 vehicle lamp, 210 lamp-side ECU, 220 lamp unit, 222 light source, 224 lighting circuit, 226 optical system, 300 automobile, 302L headlamp, 304 lamp system, 310 vehicle-side ECU, L1 pulsed illumination light, L2 reflected light, S1 light emission timing signal, S2 exposure timing signal.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims (7)
- 1. A gating camera that divides a field of view into a plurality of ranges in the depth direction and generates a plurality of slice images corresponding to the plurality of ranges, the gating camera comprising: an illumination device that irradiates the field of view with pulsed illumination light; an image sensor; a camera controller that controls the light emission timing of the illumination device and the exposure timing of the image sensor; and an image processing device that combines the plurality of slice images output from the image sensor to generate a composite image, wherein the image processing device takes, as an effective range, the range at which a local maximum occurs when the pixels at the same position in each of the plurality of slice images are arranged in range order, and generates the pixel value at the same position in the composite image based on the pixel value at the same position in the slice image corresponding to the effective range.
- 2. The gating camera according to claim 1, wherein the image processing device generates a distance map image in which each pixel indicates the effective range.
- 3. A gating camera that divides a field of view into a plurality of ranges in the depth direction and generates a plurality of slice images corresponding to the plurality of ranges, the gating camera comprising: an illumination device that irradiates the field of view with pulsed illumination light; an image sensor; a camera controller that controls the light emission timing of the illumination device and the exposure timing of the image sensor; and an image processing device that receives the plurality of slice images from the image sensor and generates a distance map image, wherein each pixel of the distance map image indicates the range in which an object is included.
- 4. The gating camera according to claim 3, wherein the image processing device determines that the object is included in the range at which a local maximum occurs when the pixels at the same position in each of the plurality of slice images are arranged in range order.
- 5. The gating camera according to claim 3 or 4, wherein the image processing device combines the plurality of slice images based on the distance map image to generate a composite image.
- 6. A vehicular sensing system comprising: the gating camera according to claim 1 or 3; and a classifier that processes the plurality of slice images and the composite image generated by the gating camera.
- 7. A vehicular lamp comprising the gating camera according to claim 1 or 3.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22853194.3A EP4382968A1 (en) | 2021-08-05 | 2022-08-05 | Gating camera, vehicular sensing system, and vehicular lamp |
JP2023540434A JPWO2023013776A1 (ja) | 2021-08-05 | 2022-08-05 | |
CN202280054145.2A CN117795376A (zh) | 2021-08-05 | 2022-08-05 | 门控照相机、车辆用传感系统、车辆用灯具 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021129013 | 2021-08-05 | ||
JP2021-129013 | 2021-08-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023013776A1 true WO2023013776A1 (ja) | 2023-02-09 |
Family
ID=85154637
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/030165 WO2023013776A1 (ja) | 2021-08-05 | 2022-08-05 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4382968A1 (ja) |
JP (1) | JPWO2023013776A1 (ja) |
CN (1) | CN117795376A (ja) |
WO (1) | WO2023013776A1 (ja) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009257981A (ja) | 2008-04-18 | 2009-11-05 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
JP2010121995A (ja) * | 2008-11-18 | 2010-06-03 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
JP2010145255A (ja) * | 2008-12-19 | 2010-07-01 | Calsonic Kansei Corp | 車両用距離画像データ生成装置及び方法 |
JP2012113622A (ja) | 2010-11-26 | 2012-06-14 | Sony Corp | 画像処理装置および方法、並びにプログラム |
WO2017110417A1 (ja) | 2015-12-21 | 2017-06-29 | 株式会社小糸製作所 | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 |
US20170200044A1 (en) * | 2016-01-08 | 2017-07-13 | Electronics And Telecommunications Research Institute | Apparatus and method for providing surveillance image based on depth image |
JP2017126979A (ja) | 2016-01-13 | 2017-07-20 | キヤノン株式会社 | 画像処理装置および画像処理装置の制御方法、撮像装置、プログラム |
JP2018124890A (ja) * | 2017-02-03 | 2018-08-09 | 日本電信電話株式会社 | 画像処理装置、画像処理方法及び画像処理プログラム |
JP2019159503A (ja) | 2018-03-08 | 2019-09-19 | 富士ゼロックス株式会社 | 情報処理装置及びプログラム |
WO2020184447A1 (ja) * | 2019-03-11 | 2020-09-17 | 株式会社小糸製作所 | ゲーティングカメラ、自動車、車両用灯具、物体識別システム、演算処理装置、物体識別方法、画像表示システム、検査方法、撮像装置、画像処理装置 |
JP2020174154A (ja) * | 2019-04-12 | 2020-10-22 | 富士電機株式会社 | 検査装置、検査方法およびプログラム |
US20210165204A1 (en) * | 2017-11-24 | 2021-06-03 | Sigtuple Technologies Private Limited | Method and system for reconstructing a field of view |
2022
- 2022-08-05 EP EP22853194.3A patent/EP4382968A1/en active Pending
- 2022-08-05 JP JP2023540434A patent/JPWO2023013776A1/ja active Pending
- 2022-08-05 WO PCT/JP2022/030165 patent/WO2023013776A1/ja active Application Filing
- 2022-08-05 CN CN202280054145.2A patent/CN117795376A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
JPWO2023013776A1 (ja) | 2023-02-09 |
EP4382968A1 (en) | 2024-06-12 |
CN117795376A (zh) | 2024-03-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7369921B2 (ja) | 物体識別システム、演算処理装置、自動車、車両用灯具、分類器の学習方法 | |
WO2020184447A1 (ja) | ゲーティングカメラ、自動車、車両用灯具、物体識別システム、演算処理装置、物体識別方法、画像表示システム、検査方法、撮像装置、画像処理装置 | |
WO2017110415A1 (ja) | 車両用センサおよびそれを備えた車両 | |
JP2004514384A (ja) | 車両の周囲を監視するための方法及び装置 | |
JP6782433B2 (ja) | 画像認識装置 | |
US20170083775A1 (en) | Method and system for pattern detection, classification and tracking | |
CN109703555A (zh) | 用于探测道路交通中被遮蔽的对象的方法和设备 | |
US11303817B2 (en) | Active sensor, object identification system, vehicle and vehicle lamp | |
WO2021201269A1 (ja) | ゲーティングカメラ、車両用センシングシステム、車両用灯具 | |
WO2023013776A1 (ja) | ゲーティングカメラ、車両用センシングシステム、車両用灯具 | |
JP2012117872A (ja) | 物体検出装置及びこれを備えた車載機器制御装置 | |
WO2021193645A1 (ja) | ゲーティングカメラ、センシングシステム、車両用灯具 | |
WO2021060397A1 (ja) | ゲーティングカメラ、自動車、車両用灯具、画像処理装置、画像処理方法 | |
WO2023013777A1 (ja) | ゲーティングカメラ、車両用センシングシステム、車両用灯具 | |
WO2022145261A1 (ja) | ゲーティングカメラ、車両用センシングシステム、車両用灯具 | |
CN112153252B (zh) | 车辆用灯具 | |
WO2023189599A1 (ja) | 距離画像センサ、および車両用灯具 | |
US20240083346A1 (en) | Gating camera, vehicle sensing system, and vehicle lamp | |
WO2021015208A1 (ja) | アクティブセンサ、ゲーティングカメラ、自動車、車両用灯具 | |
WO2022004441A1 (ja) | 測距装置および測距方法 | |
WO2023224077A1 (ja) | ToFカメラ、車両用センシングシステム、および車両用灯具 | |
WO2021235033A1 (ja) | センシングシステム | |
WO2022014416A1 (ja) | ゲーティングカメラ、車両用センシングシステム、車両用灯具 | |
WO2023074903A1 (ja) | センシングシステムおよび自動車 | |
WO2021172478A1 (ja) | センサ、自動車および周囲環境のセンシング方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22853194 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023540434 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280054145.2 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022853194 Country of ref document: EP Effective date: 20240305 |