WO2022039229A1 - In-vehicle sensing system and gating camera - Google Patents
In-vehicle sensing system and gating camera
- Publication number
- WO2022039229A1 (PCT/JP2021/030393)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- gating camera
- gating
- main controller
- sensing system
- Prior art date
Links
- 238000000034 method Methods 0.000 claims abstract description 19
- 230000008569 process Effects 0.000 claims abstract description 16
- 238000005286 illumination Methods 0.000 claims description 13
- 230000007257 malfunction Effects 0.000 claims description 10
- 238000001514 detection method Methods 0.000 claims description 8
- 230000004044 response Effects 0.000 claims description 3
- 238000010586 diagram Methods 0.000 description 16
- 238000012545 processing Methods 0.000 description 15
- 230000004048 modification Effects 0.000 description 8
- 238000012986 modification Methods 0.000 description 8
- 238000012544 monitoring process Methods 0.000 description 7
- 230000036760 body temperature Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 4
- 230000006866 deterioration Effects 0.000 description 4
- 230000004064 dysfunction Effects 0.000 description 4
- 230000035900 sweating Effects 0.000 description 4
- 230000004424 eye movement Effects 0.000 description 3
- 230000035945 sensitivity Effects 0.000 description 2
- 230000008859 change Effects 0.000 description 1
- 239000000446 fuel Substances 0.000 description 1
- 230000004927 fusion Effects 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
- 239000002699 waste material Substances 0.000 description 1
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/18—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
- G03B15/03—Combinations of cameras with lighting apparatus; Flash units
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
- G03B15/03—Combinations of cameras with lighting apparatus; Flash units
- G03B15/05—Combinations of cameras with electronic flash apparatus; Electronic flash units
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/16—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly in accordance with both the intensity of the flash source and the distance of the flash source from the object, e.g. in accordance with the "guide number" of the flash bulb and the focusing of the camera
Definitions
- This disclosure relates to a sensing system for vehicles.
- for driving assistance and automatic driving, an object identification system that senses the position and type of objects existing around the vehicle is used.
- the object identification system includes a sensor and an arithmetic processing unit that analyzes the output of the sensor.
- the sensor is selected from among cameras, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), millimeter-wave radar, ultrasonic sonar, and the like in consideration of the application, the required accuracy, and the cost.
- a gating camera (Gating Camera or Gated Camera) has been proposed as an active sensor in place of the TOF camera (Patent Documents 1 and 2).
- the gating camera divides the shooting range into a plurality of ranges, changes the exposure timing and the exposure time for each range, and takes an image. As a result, slice images are obtained for each target range, and each slice image contains only objects included in the corresponding range.
- the present inventors have considered adding a gating camera (sensor fusion) to the conventional object identification system, and have come to recognize the following problems.
- the gating camera is an active sensor, and is equipped with a lighting device that irradiates the subject with pulsed illumination light and an image sensor that captures the reflected light from the subject.
- This disclosure was made in such a situation, and one of the exemplary purposes of an aspect thereof is to provide a sensing system fused with a gating camera.
- the sensing system of one aspect of the present disclosure includes a main sensor; a gating camera whose enable state and disable state are controlled according to the driving environment and which, in the enable state, divides the field of view into a plurality of ranges in the depth direction and generates a plurality of slice images corresponding to the plurality of ranges; and a main controller that processes the output of the main sensor and the output of the gating camera.
- This gating camera constitutes an in-vehicle sensing system together with a main sensor and a main controller that processes the output of the main sensor.
- the gating camera includes a lighting device that irradiates pulsed illumination light, an image sensor, and a camera controller that controls the light emission timing of the lighting device and the exposure timing of the image sensor and causes the image sensor to generate a plurality of image data corresponding to a plurality of ranges.
- the gating camera is controlled to be enabled / disabled according to the instruction from the main controller.
- This gating camera constitutes an in-vehicle sensing system together with a main sensor and a main controller that processes the output of the main sensor.
- the gating camera includes a lighting device that irradiates pulsed illumination light, an image sensor, and a camera controller that determines the enable state and disable state of the gating camera according to the driving environment and, in the enable state, controls the light emission timing of the lighting device and the exposure timing of the image sensor and causes the image sensor to generate a plurality of image data corresponding to a plurality of ranges.
- the sensing system of a certain aspect of the present disclosure is used for driving assistance or automatic driving.
- the sensing system includes a main sensor, a gating camera, and a main controller that processes the output of the main sensor and the output of the gating camera.
- the gating camera generates a slice image corresponding to a range of interest (target range) according to a control signal from the main controller.
- the gating camera of a certain aspect of the present disclosure constitutes an in-vehicle sensing system together with a main sensor and a main controller that processes the output of the main sensor.
- the gating camera includes a lighting device that irradiates pulsed illumination light, an image sensor, and a camera controller that controls the light emission timing of the lighting device and the exposure timing of the image sensor and causes the image sensor to generate a slice image corresponding to the range of interest according to the control signal.
- FIG. 1 is a block diagram of the sensing system according to Embodiment 1.
- FIG. 2 is a diagram explaining the operation of a gating camera.
- FIGS. 3(a) and 3(b) are diagrams illustrating images obtained by a gating camera.
- FIGS. 4(a) to 4(c) are diagrams illustrating the advantages of a gating camera in bad weather.
- FIG. 5 is a time chart explaining the operation of the sensing system according to Embodiment 1.
- FIG. 6 is a block diagram of the sensing system according to Embodiment 2.
- FIG. 7 is a block diagram of the sensing system according to Embodiment 3.
- FIGS. 8(a) and 8(b) are diagrams showing an automobile equipped with a sensing system.
- FIG. 9 is a block diagram showing a vehicle lamp according to an embodiment.
- the sensing system according to one embodiment includes a main sensor; a gating camera whose enable state and disable state are controlled according to the driving environment and which, in the enable state, divides the field of view into a plurality of ranges in the depth direction and generates a plurality of slice images corresponding to the plurality of ranges; and a main controller that processes the output of the main sensor and the output of the gating camera.
- by using the gating camera as an auxiliary sensor that assists the main sensor and operating it only in the situations that require it, based on the driving environment, rather than operating it at all times, the increase in power consumption can be suppressed.
- the main controller may control the enable / disable state of the gating camera according to the driving environment.
- the gating camera may control the enabled / disabled state of the gating camera itself according to the driving environment.
- the gating camera may be enabled in bad weather, such as rain, snowfall, or fog.
- the gating camera can remove rain, snow, and fog contained in ranges other than the range to be measured. That is, in bad weather, the slice images generated by the gating camera contain more information than an image from a general camera. Therefore, by determining in the main controller or in the gating camera whether the weather is good or bad, and by utilizing the gating camera in bad weather, it is possible to recover from the deterioration of the sensing ability of the main sensor.
- the gating camera may be enabled if the main sensor malfunctions.
- the malfunction of the main sensor may include a situation in which the target cannot be recognized from the output image, a situation in which the recognition rate is lowered, and the like.
- the gating camera may be enabled when the object recognition accuracy by the main controller falls below a predetermined threshold.
- the enabled/disabled state of the gating camera may be controlled based on at least one of the outputs of a rain sensor and a fog sensor and the states of the wiper and the fog lamp mounted on the vehicle.
- the enabled/disabled state of the gating camera may be controlled based on the state of the driver. Poor visibility due to bad weather increases the driver's tension, which manifests itself in the driver's condition, such as gestures, posture, and eye movements. Therefore, by monitoring the driver's condition, it is possible to estimate whether or not the weather is bad.
- the main controller may use the output of the gating camera for driving assistance or control of automatic driving.
- the main controller may display the output of the gating camera on the display.
- the sensing system includes a main sensor, a gating camera, and a main controller that processes the output of the main sensor and the output of the gating camera.
- the gating camera generates a slice image corresponding to the range of interest according to the control signal from the main controller.
- the gating camera does not operate at all times; under the control of the main controller, it shoots only the range requested by the main controller.
- by using the gating camera as an auxiliary sensor that assists the main sensor, it is possible to suppress an increase in power consumption.
- the main sensor may include a sensor that can detect the distance to an object, such as a millimeter-wave radar or a stereo camera.
- the main sensor may include a monocular camera and the main controller may acquire the distance to the object by image processing.
- the gating camera may supply the slice image itself corresponding to the range of interest to the main controller.
- the gating camera defines a plurality of ranges that divide the field of view in the depth direction, and one of the plurality of ranges may be selected as the range of interest based on the control signal.
- when the main controller detects an object based on the output of the main sensor, it may supply the gating camera with a control signal including position data that specifies the distance to the object or the depth-direction range containing the object.
- the gating camera may return information indicating whether or not an object is included in the range of interest to the main controller.
- the gating camera includes a classifier that detects the type (also referred to as a class or category) of an object existing in the range of interest based on the slice image, and the detection result by the classifier may be returned to the main controller.
- the main controller may supply the gating camera with data indicating the type of object detected based on the output of the main sensor, and the gating camera may include a classifier that detects, based on the slice image, the type of object existing in the range of interest and may return to the main controller whether the type detected by the classifier matches the type indicated by the data.
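The request/response exchange described above, in which the main controller specifies a distance and an expected object type and the gating camera replies with a match/mismatch, can be sketched as follows. The class names, data structures, and method names are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class ControlSignal:
    distance_m: float    # distance to the object detected by the main sensor
    expected_class: str  # object type detected by the main controller

@dataclass
class Reply:
    object_present: bool
    detected_class: Optional[str]
    matches_expected: bool

class GatingCameraStub:
    """Illustrative stand-in for the controller-facing side of the gating camera."""

    def __init__(self, ranges: List[Tuple[float, float]],
                 classify: Callable[[Tuple[float, float]], Optional[str]]):
        self.ranges = ranges      # (d_min, d_max) depth ranges
        self.classify = classify  # classifier run on the slice image of a range

    def handle(self, sig: ControlSignal) -> Reply:
        # Select the range of interest containing the specified distance.
        roi = next((r for r in self.ranges
                    if r[0] <= sig.distance_m < r[1]), None)
        if roi is None:
            return Reply(False, None, False)
        cls = self.classify(roi)  # capture and classify the slice image of roi
        return Reply(cls is not None, cls, cls == sig.expected_class)
```

A real implementation would capture an actual slice image for the selected range; here the classifier callback stands in for that step.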
- the gating camera includes a lighting device that irradiates pulsed illumination light, an image sensor, and a camera controller that controls the light emission timing of the lighting device and the exposure timing of the image sensor and causes the image sensor to generate a slice image corresponding to the range of interest according to a control signal from the main controller.
- the gating camera does not operate at all times; under the control of the main controller, it shoots only the range requested by the main controller.
- by using the gating camera as an auxiliary sensor that assists the main sensor, it is possible to suppress an increase in power consumption.
- FIG. 1 is a block diagram of the sensing system 10 according to the first embodiment.
- the sensing system 10 is mounted on a vehicle such as an automobile or a motorcycle for the purpose of driving support or automatic driving, and detects an object OBJ existing around the vehicle.
- the sensing system 10 includes a main sensor group 50, a main controller 60, and a gating camera 20.
- the main sensor group 50 may include one or more sensors.
- the main sensor group 50 includes a camera 52 and a millimeter wave radar 54.
- the main sensor group 50 may include a stereo camera.
- the main sensor group 50 may include LiDAR or the like.
- the main controller 60 detects the position and type of an object around the vehicle based on the output of the main sensor group 50, and outputs the detection result RESULT.
- the main controller 60 may include a classifier, and the detection result RESULT may include information regarding the type (category, class) and position of the target.
- the gating camera 20 divides the field of view into a plurality of ranges RNG 1 to RNG N in the depth direction, and generates a plurality of slice images IMGs 1 to IMGs N corresponding to the plurality of ranges RNG 1 to RNG N. Adjacent ranges may overlap in the depth direction at their boundaries.
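The division of the field of view into N depth ranges, with adjacent ranges optionally overlapping at their boundaries, can be sketched as follows; the uniform spacing and the overlap parameter are illustrative assumptions, since the disclosure does not fix how the boundaries are chosen.

```python
from dataclasses import dataclass

@dataclass
class DepthRange:
    d_min: float  # distance to the near boundary [m]
    d_max: float  # distance to the far boundary [m]

def make_ranges(d_start: float, d_end: float, n: int,
                overlap: float = 0.0) -> list:
    """Divide the field of view [d_start, d_end] into n depth ranges.

    Adjacent ranges may overlap at their boundaries by `overlap` metres,
    as the embodiment permits.
    """
    step = (d_end - d_start) / n
    ranges = []
    for i in range(n):
        lo = d_start + i * step - (overlap / 2 if i > 0 else 0.0)
        hi = d_start + (i + 1) * step + (overlap / 2 if i < n - 1 else 0.0)
        ranges.append(DepthRange(lo, hi))
    return ranges
```

For example, `make_ranges(0.0, 100.0, 4, overlap=2.0)` yields four ranges in which each interior boundary region is shared by the two adjacent ranges.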
- the gating camera 20 includes a lighting device 22, an image sensor 24, a camera controller 26, and an arithmetic processing device 28.
- the lighting device (floodlight) 22 irradiates the field of view in front of the vehicle with the pulsed illumination light L1 in synchronization with the light emission timing signal S1 given from the camera controller 26.
- the pulse illumination light L1 is preferably infrared light, but is not limited to this, and may be visible light having a predetermined wavelength.
- for the lighting device 22, a laser diode (LD) or an LED can be used.
- the wavelength of the pulse illumination light L1 can be set to near infrared in the vicinity of 800 nm.
- the pulse illumination light L1 may have a wavelength range longer than 1 μm.
- the image sensor 24 includes a plurality of pixels and is capable of exposure control synchronized with the exposure timing signal S2 given from the camera controller 26, and generates a slice image IMGr composed of the plurality of pixels.
- the image sensor 24 has sensitivity to the same wavelength as the pulse illumination light L1, and captures the reflected light (return light) L2 reflected by the object OBJ.
- the sliced image IMGr generated by the image sensor 24 with respect to the i-th range RNG i is referred to as a raw image or a primary image as necessary to distinguish it from the sliced image IMGs which is the final output of the gating camera 20.
- raw image IMGr and sliced image IMGs are generically referred to as simply sliced image IMG.
- the camera controller 26 changes the light emission timing signal S1 and the exposure timing signal S2 for each range RNG to change the time difference between the light emission by the lighting device 22 and the exposure of the image sensor 24.
- the light emission timing signal S1 defines the light emission start timing and the light emission time.
- the exposure timing signal S2 defines the exposure start timing (time difference from the light emission) and the exposure time.
- the arithmetic processing unit 28 can be implemented as a combination of a processor (hardware), such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a microcomputer, or a GPU (Graphics Processing Unit), and a software program executed by the processor.
- the arithmetic processing unit 28 may be configured only by hardware.
- the arithmetic processing unit 28 processes the raw image data IMGr generated by the image sensor 24, and outputs the final sliced image IMGs. When the output IMGr of the image sensor 24 is used as it is as sliced image IMGs, the arithmetic processing unit 28 can be omitted.
- FIG. 2 is a diagram illustrating the operation of the gating camera 20.
- FIG. 2 shows the state when the i-th range RNG i is measured as the range of interest.
- the lighting device 22 emits light during the light emission period τ 1 between the times t 0 and t 1 in synchronization with the light emission timing signal S1.
- let d MINi be the distance from the gating camera 20 to the near boundary of the range RNG i , and let d MAXi be the distance to the far boundary of the range RNG i .
- the round-trip time T MINi it takes for light leaving the lighting device 22 to reach the distance d MINi and return is T MINi = 2 × d MINi / c, where c is the speed of light.
- similarly, the round-trip time T MAXi for light to reach the distance d MAXi and return is T MAXi = 2 × d MAXi / c.
- to capture only the objects contained in the range RNG i , the camera controller 26 generates the exposure timing signal S2 so that the exposure starts at time t 2 = t 0 + T MINi and ends at time t 3 = t 1 + T MAXi . This is one exposure operation.
- the camera controller 26 may repeat the above-mentioned set of irradiation and exposure operations a plurality of times in a predetermined period τ 2 .
- the image sensor 24 outputs a slice image integrated by a plurality of exposures.
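The round-trip times above translate into an exposure gate as sketched below. The assumption that the gate opens at t 0 + T MINi and closes at t 1 + T MAXi is the standard gating relation; the function name is illustrative.

```python
C = 299_792_458.0  # speed of light [m/s]

def gate_window(d_min: float, d_max: float, t0: float, t1: float):
    """Exposure window for one range, per T_MINi = 2*d_MINi/c and
    T_MAXi = 2*d_MAXi/c.

    The light pulse is emitted from t0 to t1; opening the exposure at
    t0 + T_MINi and closing it at t1 + T_MAXi captures only reflections
    from objects between d_min and d_max.
    """
    t_min = 2.0 * d_min / C  # round trip to the near boundary
    t_max = 2.0 * d_max / C  # round trip to the far boundary
    return t0 + t_min, t1 + t_max
```

For a range from 25 m to 50 m, the gate opens roughly 167 ns after the pulse begins, which gives a sense of the timing precision a gating camera controller must achieve.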
- in the gating camera 20, the shooting parameters, that is, the shutter speed (exposure time), the number of exposures, the sensitivity, and the irradiation intensity of the pulsed illumination light, are optimized for each range so that the exposure (the brightness value of the object image in the slice image) does not vary from range to range.
- FIG. 3A and 3B are diagrams illustrating an image obtained by the gating camera 20.
- the object (pedestrian) OBJ 2 exists in the range RNG 2
- the object (vehicle) OBJ 3 exists in the range RNG 3 .
- FIG. 3 (b) shows a plurality of slice images IMG 1 to IMG 3 obtained in the situation of FIG. 3 (a).
- when the slice image IMG 1 is photographed, the image sensor 24 is exposed only by the reflected light from the range RNG 1 ; since no object exists in the range RNG 1 , no object image is captured in the slice image IMG 1 .
- when the slice image IMG 2 is photographed, the image sensor 24 is exposed only by the reflected light from the range RNG 2 , so that only the object image OBJ 2 is captured in the slice image IMG 2 . Similarly, when the slice image IMG 3 is photographed, the image sensor 24 is exposed only by the reflected light from the range RNG 3 , so that only the object image OBJ 3 is captured in the slice image IMG 3 . In this way, the gating camera 20 makes it possible to separate and photograph objects for each range.
- FIG. 4A shows an example of a driving scene in bad weather.
- An object (vehicle) OBJ 3 exists in the range RNG 3 .
- the points shown in the figure schematically show obstacles such as raindrops, snowflakes, and fog.
- FIG. 4 (b) shows the slice image IMG 3 in the third range obtained in the situation of FIG. 4 (a).
- when the slice image IMG 3 is photographed, the image sensor 24 is exposed only by the reflected light from the range RNG 3 , so the obstacles (raindrops, snowflakes, fog) in the ranges RNG 1 and RNG 2 do not appear in the slice image IMG 3 . That is, it is possible to remove rain, snow, and fog contained in ranges other than the range to be measured.
- FIG. 4C shows an image taken by a general camera in the same field of view.
- in the image taken by a general camera, the reflected light from all the obstacles in front of the range RNG 3 is also captured, so many obstacles appear so as to block the object OBJ 3 .
- the gating camera 20 is utilized as an auxiliary sensor that assists the main sensor group 50. Therefore, the gating camera 20 does not operate at all times; whether it operates or stops is adaptively selected according to the driving environment.
- when the gating camera 20 is in the operating state, the camera controller 26 generates the light emission timing signal S1 and the exposure timing signal S2, whereby slice images of a plurality of ranges are generated. When the gating camera 20 is stopped, the camera controller 26 generates neither the light emission timing signal S1 nor the exposure timing signal S2, and therefore no slice image is generated.
- the operation and stop of the gating camera 20 are controlled by the main controller 60.
- the gating camera 20 enters an operating state in response to the assertion of the enable signal EN from the main controller 60, and takes a picture.
- the main controller 60 determines the situation in which the gating camera 20 is required, and asserts the enable signal EN only when necessary.
- the camera controller 26 of the gating camera 20 enters an operating state in response to the assertion of the enable signal EN, and generates a light emission timing signal S1 and an exposure timing signal S2.
- the gating camera 20 becomes active (enabled), and the slice images IMGs generated by the gating camera 20 are supplied to the main controller 60. Then, the output of the gating camera 20 is used for driving support or control of automatic driving.
- FIG. 5 is a time chart illustrating the operation of the sensing system 10.
- the reliability of the main sensor group 50 is high under good weather (visibility) conditions.
- the gating camera 20 is disabled, and the main controller 60 detects the target based on the output of the main sensor group 50.
- in bad weather, the gating camera 20 is enabled, and the main controller 60 detects the target based on the detection result of the gating camera 20, in place of or in addition to the output of the main sensor group 50.
- the above is the operation of the sensing system 10. According to this sensing system 10, it is possible to suppress an increase in power consumption due to the gating camera 20 and to suppress a deterioration in the performance of the sensing system 10 in bad weather.
- the main controller 60 determines whether the weather is good or bad, and asserts the enable signal when it determines that the weather condition is bad. By utilizing the gating camera 20 in bad weather, the degraded sensing capability of the main sensor group 50 can be recovered.
- the main controller 60 may determine whether it is raining, whether there is snowfall, whether there is fog, in other words, whether the visibility is good or bad, based on the sensor output of a rain sensor, a fog sensor, or the like mounted on the vehicle.
- the main controller 60 can determine the presence or absence of rainfall or snowfall, as well as the amount of rainfall or snowfall, based on the operating state (on, off, or operating speed) of the wiper. Further, the main controller 60 can determine the presence or absence of fog based on whether the fog lamp is on or off.
- the main controller 60 may determine whether it is raining, whether there is snowfall, whether there is fog, in other words, whether the visibility is good or bad, based on the information given by wireless communication from outside the vehicle.
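As a rough illustration, the visibility checks listed above (rain sensor, wiper state, fog lamp) could be combined as follows; the thresholds, signal encoding, and function name are hypothetical assumptions, not values from the patent.

```python
def visibility_is_bad(wiper_speed: int, fog_lamp_on: bool,
                      rain_sensor_level: float,
                      rain_threshold: float = 0.5) -> bool:
    """Estimate poor visibility from vehicle signals (illustrative only).

    wiper_speed: 0 = off, 1 = low, 2 = high (assumed encoding).
    """
    heavy_precipitation = wiper_speed >= 2          # wiper running fast
    raining = rain_sensor_level > rain_threshold    # rain sensor reading
    return heavy_precipitation or fog_lamp_on or raining


# The main controller would assert the enable signal when this returns True.
enable = visibility_is_bad(wiper_speed=0, fog_lamp_on=True, rain_sensor_level=0.0)
```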
- the main controller 60 may enable the gating camera 20 when the main sensor group 50 malfunctions.
- the malfunction is not limited to one caused by bad weather, and may result from other factors such as a failure of the camera 52 of the main sensor group 50 or a malfunction due to dirt.
- the main controller 60 may regard the main sensor group 50 as malfunctioning, and enable the gating camera 20, when the recognition accuracy of an object falls below a predetermined threshold value. For example, the main controller 60 may determine that the recognition accuracy has dropped when none of the probabilities that an object belongs to each of a plurality of classes (types, categories) exceeds a predetermined threshold value.
- the sensing system including the main sensor group 50 and the main controller 60 outputs a fail signal (also referred to as an error signal) to the upper ECU (Electronic Control Unit) of the vehicle.
- the main controller 60 asserts a fail signal when the sensing based on the output of the main sensor group 50 fails. This fail signal may be used as an enable signal of the gating camera 20.
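The accuracy-based trigger described above could look like the following sketch, where the class-probability threshold is a hypothetical value:

```python
def recognition_degraded(class_probs, threshold: float = 0.6) -> bool:
    # Recognition is considered degraded when no class probability
    # exceeds the threshold, i.e. the classifier is unsure of every class.
    return max(class_probs) <= threshold


# A degraded result would assert the fail signal, which the patent notes
# can double as the enable signal of the gating camera.
fail_signal = recognition_degraded([0.30, 0.25, 0.20])
```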
- the main controller 60 may monitor the driver, estimate the quality of the weather based on the driver's condition, and control the gating camera 20 based on the result. For example, poor visibility due to bad weather can increase the driver's tension, which can manifest itself in the driver's behavior, such as gestures, posture, and eye movements. In this case, the behavior of the driver can be monitored and the weather can be estimated based on the monitoring result.
- FIG. 6 is a block diagram of the sensing system 10 according to the second embodiment.
- in the first embodiment, the main controller 60 controlled the operating and non-operating states of the gating camera 20.
- in contrast, in the second embodiment, the gating camera 20A switches between the operating and non-operating states by itself.
- the operation / non-operation control method of the gating camera 20A is the same as that described in the first embodiment.
- the camera controller 26A determines whether the weather is good or bad; the gating camera 20A is kept in the stopped state in good weather, in which the reliability of the main sensor group 50 is high, and is put into the operating state in bad weather, in which the reliability of the main sensor group 50 is lowered. By utilizing the gating camera 20A in bad weather, the degraded sensing capability of the main sensor group 50 can be recovered.
- the camera controller 26A may determine whether or not there is rain, whether or not there is snowfall, whether or not there is fog, in other words, whether or not the field of view is good, based on the sensor outputs of the rain sensor, fog sensor, and the like.
- the camera controller 26A can determine whether or not there is rainfall or snowfall, and the amount of rainfall and the amount of snowfall, based on the operating state (on, off, or operating speed) of the wiper. Further, the camera controller 26A can determine the presence / absence of fog based on the on / off of the fog lamp.
- the camera controller 26A may determine whether it is raining, whether there is snowfall, whether there is fog, in other words, whether the visibility is good or bad, based on the information given by wireless communication from outside the vehicle.
- the camera controller 26A may put the gating camera 20A into an operating state when the main sensor group 50 malfunctions.
- the malfunction is not limited to one caused by bad weather, and may result from other factors such as a failure of the camera 52 of the main sensor group 50 or a malfunction due to dirt.
- when the recognition accuracy of an object falls below a predetermined threshold value, the main controller 60 regards this as a malfunction and outputs a fail signal (also referred to as an error signal) to the upper ECU (Electronic Control Unit) of the vehicle.
- the camera controller 26A may select operation or non-operation of the gating camera 20A based on the fail signal generated by the main controller 60.
- the camera controller 26A may monitor the driver, estimate the quality of the weather based on the driver's condition, and control the operation and stop of the gating camera 20A based on the result.
- the camera controller 26A may output a status signal STATUS indicating whether the gating camera 20A is in the operating state or the non-operating state to the main controller 60.
- the main controller 60 can know whether or not the gating camera 20A is operating by referring to the status signal STATUS.
- while the gating camera 20A is operating, the main controller 60 can use the slice images IMGs generated by the gating camera 20A for object recognition and the like. Whether to use the slice images IMGs for object recognition is left to the judgment of the main controller 60.
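A sketch of the self-controlled camera of the second embodiment and its STATUS signal might look like this; the class name and the two status values are assumptions for illustration.

```python
class SelfControlledGatingCamera:
    """Camera controller 26A decides operation by itself (illustrative)."""

    def __init__(self) -> None:
        self.operating = False

    def update(self, bad_weather: bool) -> str:
        # The camera switches its own state from the estimated weather,
        # then reports the state via the STATUS signal.
        self.operating = bad_weather
        return "OPERATING" if self.operating else "STOPPED"


cam = SelfControlledGatingCamera()
status = cam.update(bad_weather=True)
# The main controller refers to STATUS to know whether slice images exist.
slices_available = (status == "OPERATING")
```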
- FIG. 7 is a block diagram of the sensing system 10 according to the third embodiment. Since the basic configuration of the sensing system 10 is the same as that of the first embodiment, the description thereof will be omitted, and the differences will be described.
- the gating camera 20 divides the field of view into a plurality of ranges RNG 1 to RNG N in the depth direction, and generates a plurality of slice images IMGs 1 to IMGs N corresponding to the plurality of ranges RNG 1 to RNG N. (Normal shooting mode). Adjacent ranges may overlap in the depth direction at their boundaries.
- the gating camera 20 supports an on-demand shooting mode in addition to the normal shooting mode in which shooting is performed for the entire range.
- the normal shooting mode is not essential, and the gating camera 20 may support only the on-demand shooting mode.
- when set to the on-demand shooting mode, the gating camera 20 generates a slice image corresponding to the interest range ROI specified by the control signal CTRL from the main controller 60.
- the interest range ROI is a section in the depth direction in front of the vehicle, and is set or selected by control from the main controller 60.
- assume that the vehicle is traveling in bad weather (for example, in thick fog).
- the camera 52 cannot accurately photograph a distant object.
- since the millimeter wave radar 54 operates effectively even in fog, it can detect the existence of some object, but it cannot determine the type of the object.
- the main controller 60 gives a control signal CTRL to the gating camera 20 so that the object detected by the millimeter wave radar 54 is captured.
- since the gating camera 20 can obtain a higher-quality image than the normal camera 52 even in bad weather, using the gating camera 20 makes it possible to acquire information on an object that the camera 52 cannot photograph.
- the gating camera 20 is not always operated, and only the range ROI requested by the main controller 60 is photographed under the control of the main controller 60.
- if shooting were performed over the entire range, ranges in which no object exists would also be shot, which is wasteful.
- in the on-demand shooting mode, only the range in which the main controller 60 estimates that an object is present is shot, which suppresses an increase in power consumption.
- the control method of the interest range ROI is not particularly limited, but some methods will be described below.
- a plurality of ranges RNG 1 to RNG N are defined in advance in the camera controller 26 of the gating camera 20.
- the main controller 60 estimates the distance to the object based on the output of the millimeter wave radar 54, and gives the distance information to the camera controller 26.
- the camera controller 26 selects one of a plurality of ranges RNG 1 to RNG N as the interest range ROI based on the distance indicated by the distance information.
- a plurality of ranges RNG 1 to RNG N are defined in advance in the camera controller 26 of the gating camera 20.
- the main controller 60 knows the range of each of the plurality of ranges RNG 1 to RNG N.
- the main controller 60 estimates the distance to the object based on the output of the millimeter wave radar 54, selects one of the plurality of ranges RNG 1 to RNG N as the interest range ROI based on the estimated distance, and gives the camera controller 26 data indicating which range should be shot.
- No range is predefined in the camera controller 26 of the gating camera 20.
- the main controller 60 estimates the distance to the object based on the output of the millimeter wave radar 54, and gives the distance information to the camera controller 26.
- the camera controller 26 dynamically determines the range of interest ROI so that the object detected by the main controller 60 is included.
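The first range-selection method above, in which predefined ranges are matched against the radar's distance estimate, can be sketched as follows; the range boundaries are hypothetical values, not from the patent.

```python
# Hypothetical predefined depth sections RNG_1..RNG_4, in meters.
RANGES = [(0.0, 25.0), (25.0, 50.0), (50.0, 100.0), (100.0, 200.0)]

def select_roi(distance_m: float):
    """Return the 1-based index of the range containing the estimated distance."""
    for i, (d_min, d_max) in enumerate(RANGES, start=1):
        if d_min <= distance_m < d_max:
            return i
    return None  # object outside all predefined ranges


# The radar estimates an object at 60 m; the third range becomes the interest range ROI.
roi = select_roi(60.0)
```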
- the gating camera 20 may supply the slice image IMGs themselves corresponding to the interest range ROI to the main controller 60.
- the main controller 60 may determine the type of an object existing in the range of interest ROI by using its own classifier.
- the gating camera 20 may include a classifier that analyzes the slice images IMGs. This classifier can be mounted on the arithmetic processing unit 28. The classifier may determine whether or not an object is included in the slice images IMGs of the interest range ROI, and return the determination result to the main controller 60. More sophisticatedly, the gating camera 20 may return the type of the object detected by the classifier to the main controller 60.
- the main controller 60 may transmit data indicating the type of the detected object based on the output of the main sensor group 50 to the gating camera 20.
- the gating camera 20 may return, to the main controller 60, a match or mismatch between the type of the object detected based on the slice images IMGs and the type indicated by the received data.
- the main controller 60 determines whether the weather is good or bad. Then, when it is determined that the weather condition is bad, the gating camera 20 may be operated in the on-demand mode.
- the main controller 60 may determine whether it is raining, whether there is snowfall, whether there is fog, in other words, whether the visibility is good or bad, based on the sensor output of a rain sensor, a fog sensor, or the like mounted on the vehicle.
- the main controller 60 can determine whether or not there is rainfall or snowfall, and the amount of rainfall and the amount of snowfall, based on the operating state (on, off, or operating speed) of the wiper. Further, the main controller 60 can determine the presence or absence of fog based on the on / off of the fog lamp.
- the main controller 60 may determine whether there is rainfall, snowfall, or fog, in other words whether the visibility is good or bad, based on information given by wireless communication from outside the vehicle, such as VICS (registered trademark) (Vehicle Information and Communication System) information.
- the main controller 60 may enable the gating camera 20 when a part of the main sensor group 50 malfunctions.
- the main controller 60 may regard the main sensor group 50 as malfunctioning, and enable the gating camera 20, when the recognition accuracy of an object falls below a predetermined threshold value. For example, the main controller 60 may determine that the recognition accuracy has dropped when none of the probabilities that an object belongs to each of a plurality of classes (types, categories) exceeds a predetermined threshold value.
- the sensing system including the main sensor group 50 and the main controller 60 outputs a fail signal (also referred to as an error signal) to the upper ECU (Electronic Control Unit) of the vehicle.
- the main controller 60 asserts a fail signal when the sensing based on the output of the main sensor group 50 fails.
- the gating camera 20 may be set to the on-demand mode by using the assertion of this fail signal as a trigger.
- the main controller 60 may monitor the driver, estimate the quality of the weather based on the driver's condition, and control the gating camera 20 based on the result. For example, poor visibility due to bad weather can increase the driver's tension, which can manifest itself in the driver's behavior, such as gestures, posture, and eye movements. In this case, the behavior of the driver can be monitored and the weather can be estimated based on the monitoring result.
- FIGS. 8(a) and 8(b) are diagrams showing an automobile 300 including the sensing system 10 according to any of the first to third embodiments. Refer first to FIG. 8(a).
- the automobile 300 includes headlamps (lamps) 302L and 302R.
- the camera 52 and the millimeter wave radar 54 of the main sensor group 50 are arranged at locations suitable for vehicle sensing.
- the camera 52 is provided behind the rear-view mirror, and the millimeter-wave radar 54 is arranged in front of the vehicle.
- the main controller 60 is arranged in the vehicle interior or the engine room.
- the lighting device 22 of the gating camera 20 is built in at least one of the left and right headlamps 302L and 302R.
- the image sensor 24 can be attached to a part of the vehicle, for example, the back side of the rear-view mirror. Alternatively, the image sensor 24 may be provided on the front grill or the front bumper.
- the camera controller 26 may be provided in the vehicle interior, in the engine room, or may be built in the headlamp.
- the image sensor 24 may be incorporated in either the left or right headlamps 302L or 302R together with the lighting device 22.
- FIG. 9 is a block diagram showing a vehicle lamp 200.
- the vehicle lamp 200 corresponds to the headlamp 302 shown in FIG. 8B, and includes a low beam unit 202, a high beam unit 204, a lamp ECU 210, and a gating camera 20.
- the lamp ECU 210 controls on / off or light distribution of the low beam unit 202 and the high beam unit 204 based on a control command from the vehicle side ECU 310.
- the gating camera 20 is built into the housing of the vehicle lamp 200. At least one of the image sensor 24, the camera controller 26, and the arithmetic processing unit 28 may be provided outside the housing of the vehicle lamp 200.
- the lamp ECU 210 may control the enable / disable state of the gating camera 20.
- in the embodiments, the slice images IMGs are output from the gating camera 20 to the main controller 60; however, this is not restrictive.
- for example, a classifier may be mounted on the arithmetic processing unit 28 of the gating camera 20, and the identification result, that is, the type (category) and position of the target, may be output to the main controller 60.
- in the embodiments, the output of the gating camera 20 is used for driving support or control of automatic driving; however, this is not restrictive.
- for example, the gating camera 20 may be activated in bad weather, and the slice images IMGs generated by the gating camera 20 may be displayed on a display device such as a HUD (Head Up Display) to assist the user's field of view.
- This disclosure can be used for sensing systems for vehicles.
- S2 Exposure timing signal
- 10 Sensing system
- 20 Gating camera
- 22 Lighting device
- 24 Image sensor
- 26 Camera controller
- 28 Arithmetic processing device
- 50 Main sensor group
- 52 Camera
- 54 Millimeter wave radar
- 60 Main controller
- 200 Vehicle lamp
- 202 Low beam unit
- 204 High beam unit
- 210 Lamp ECU
- 300 Automobile
- 302 Headlamp
- 310 Vehicle-side ECU
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Mechanical Engineering (AREA)
- Theoretical Computer Science (AREA)
- Transportation (AREA)
- Mathematical Physics (AREA)
- Automation & Control Theory (AREA)
- Computing Systems (AREA)
- Traffic Control Systems (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Abstract
Description
Hereinafter, preferred embodiments will be described with reference to the drawings. The same or equivalent components, members, and processes shown in the drawings are denoted by the same reference numerals, and duplicated description is omitted as appropriate. The embodiments are illustrative rather than limiting of the disclosure and the invention, and not all of the features described in the embodiments, or their combinations, are necessarily essential to the disclosure and the invention.
FIG. 1 is a block diagram of the sensing system 10 according to the first embodiment. The sensing system 10 is mounted on a vehicle such as an automobile or a motorcycle for the purpose of driving support or automatic driving, and detects an object OBJ present around the vehicle.
TMINi = 2 × dMINi / c
TMAXi = 2 × dMAXi / c
where c is the speed of light.
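These round-trip times follow directly from the range boundaries. As a numeric illustration (the 25 m and 50 m boundaries are hypothetical values, not from the patent):

```python
C = 299_792_458.0  # speed of light in m/s

def gate_times(d_min_m: float, d_max_m: float):
    """Round-trip light travel times bounding the exposure gate for one range."""
    t_min = 2.0 * d_min_m / C   # TMINi = 2 * dMINi / c
    t_max = 2.0 * d_max_m / C   # TMAXi = 2 * dMAXi / c
    return t_min, t_max


# For a hypothetical range from 25 m to 50 m, the gate opens after roughly
# 167 ns and closes after roughly 334 ns.
t_min, t_max = gate_times(25.0, 50.0)
```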
Claims (31)
- A sensing system for driving support or automatic driving, the sensing system comprising:
a main sensor;
a gating camera whose enabled state/disabled state is controlled according to a traveling environment, and which, in the enabled state, divides a field of view into a plurality of ranges in a depth direction and generates a plurality of slice images corresponding to the plurality of ranges; and
a main controller that processes an output of the main sensor and an output of the gating camera.
- The sensing system according to claim 1, wherein the main controller controls the enabled state/disabled state of the gating camera according to the traveling environment.
- The sensing system according to claim 1, wherein the gating camera controls its own enabled state/disabled state according to the traveling environment.
- The sensing system according to claim 1, wherein at least a part of the gating camera is built into a vehicle lamp, and
the enabled state/disabled state of the gating camera is controlled by a control unit of the vehicle lamp.
- The sensing system according to any one of claims 1 to 4, wherein the gating camera is enabled in bad weather.
- The sensing system according to any one of claims 1 to 5, wherein the gating camera is enabled when the main sensor malfunctions.
- The sensing system according to any one of claims 1 to 6, wherein the gating camera is enabled when recognition accuracy of an object in the main controller falls below a predetermined threshold value.
- The sensing system according to any one of claims 1 to 7, wherein the enabled state/disabled state of the gating camera is controlled based on at least one of an output of a rain sensor or a fog sensor mounted on the vehicle and an operating state of a wiper or a fog lamp.
- The sensing system according to any one of claims 1 to 8, wherein the enabled state/disabled state of the gating camera is controlled based on a state of a driver.
- The sensing system according to any one of claims 1 to 9, wherein the main controller uses the output of the gating camera for driving support or control of automatic driving.
- A gating camera that constitutes an in-vehicle sensing system together with a main sensor and a main controller that processes an output of the main sensor, the gating camera comprising:
an illumination device that emits pulsed illumination light;
an image sensor; and
a camera controller that controls a light emission timing of the illumination device and an exposure timing of the image sensor and causes the image sensor to generate a plurality of image data corresponding to a plurality of ranges,
wherein the gating camera is enabled/disabled in accordance with an instruction from the main controller.
- A gating camera that constitutes an in-vehicle sensing system together with a main sensor and a main controller that processes an output of the main sensor, the gating camera comprising:
an illumination device that emits pulsed illumination light;
an image sensor; and
a camera controller that determines an enabled state or a disabled state of the gating camera according to a traveling environment and, in the enabled state, controls a light emission timing of the illumination device and an exposure timing of the image sensor and causes the image sensor to generate a plurality of image data corresponding to a plurality of ranges.
- The gating camera according to claim 12, wherein the camera controller puts the gating camera into the enabled state in bad weather.
- The gating camera according to claim 12 or 13, wherein the camera controller puts the gating camera into the enabled state when the main sensor malfunctions.
- The gating camera according to any one of claims 12 to 14, wherein the camera controller puts the gating camera into the enabled state when recognition accuracy of an object in the main controller falls below a predetermined threshold value.
- The gating camera according to any one of claims 12 to 15, wherein the camera controller controls the enabled state/disabled state of the gating camera based on at least one of an output of a rain sensor or a fog sensor mounted on the vehicle and an operating state of a wiper or a fog lamp.
- The gating camera according to any one of claims 12 to 16, wherein the camera controller controls the enabled state/disabled state of the gating camera based on a state of a driver.
- A sensing system for driving support or automatic driving, the sensing system comprising:
a main sensor;
a gating camera; and
a main controller that processes an output of the main sensor and an output of the gating camera,
wherein the gating camera generates a slice image corresponding to a range of interest according to a control signal from the main controller.
- The sensing system according to claim 18, wherein the main sensor includes a millimeter wave radar.
- The sensing system according to claim 18 or 19, wherein the gating camera supplies the slice image itself corresponding to the range of interest to the main controller.
- The sensing system according to claim 18 or 19, wherein the gating camera returns, to the main controller, information indicating whether an object is included in the range of interest.
- The sensing system according to claim 18 or 19, wherein the gating camera includes a classifier that detects a type of an object present in the range of interest based on the slice image, and returns a detection result of the classifier to the main controller.
- The sensing system according to claim 18 or 19, wherein the main controller supplies, to the gating camera, data indicating a type of an object detected based on the output of the main sensor, and
the gating camera includes a classifier that detects a type of an object present in the range of interest based on the slice image, and returns, to the main controller, a match or mismatch between the type detected by the classifier and the type indicated by the data.
- The sensing system according to any one of claims 18 to 23, wherein a plurality of ranges obtained by dividing a field of view in a depth direction are defined in the gating camera, and one of the plurality of ranges is selected as the range of interest based on the control signal.
- The sensing system according to any one of claims 18 to 23, wherein, upon detecting an object based on the output of the main sensor, the main controller supplies, to the gating camera, the control signal including position data that specifies a distance to the object or a range in the depth direction that includes the object.
- A gating camera that constitutes an in-vehicle sensing system together with a main sensor and a main controller that processes an output of the main sensor, the gating camera comprising:
an illumination device that emits pulsed illumination light;
an image sensor; and
a camera controller that controls a light emission timing of the illumination device and an exposure timing of the image sensor and causes the image sensor to generate a slice image corresponding to a range of interest according to a control signal from the main controller.
- The gating camera according to claim 26, wherein the gating camera supplies the slice image itself corresponding to the range of interest to the main controller.
- The gating camera according to claim 26, wherein the gating camera includes a classifier that detects a type of an object present in the range of interest based on the slice image, and returns a detection result of the classifier to the main controller.
- The gating camera according to claim 26, wherein the main controller supplies, to the gating camera, data indicating a type of an object detected based on the output of the main sensor, and
the gating camera includes a classifier that detects a type of an object present in the range of interest based on the slice image, and returns, to the main controller, a match or mismatch between the type detected by the classifier and the type indicated by the data.
- The gating camera according to any one of claims 26 to 29, wherein a plurality of ranges obtained by dividing a field of view in a depth direction are defined in the camera controller, and one of the plurality of ranges is selected as the range of interest based on the control signal.
- The gating camera according to any one of claims 26 to 29, wherein the gating camera returns, to the main controller, information indicating whether an object is included in the range of interest.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21858373.0A EP4201745A4 (en) | 2020-08-21 | 2021-08-19 | AUTOMOTIVE DETECTION SYSTEM AND TRIGGER CAMERA |
IL300789A IL300789A (en) | 2020-08-21 | 2021-08-19 | Vehicle sensing system and distance measuring camera |
CN202180051616.XA CN116710838A (zh) | 2020-08-21 | 2021-08-19 | 车载用感测系统及门控照相机 |
US18/042,428 US20230311897A1 (en) | 2020-08-21 | 2021-08-19 | Automotive sensing system and gating camera |
JP2022543996A JPWO2022039229A1 (ja) | 2020-08-21 | 2021-08-19 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020140273 | 2020-08-21 | ||
JP2020-140273 | 2020-08-21 | ||
JP2020-140277 | 2020-08-21 | ||
JP2020140277 | 2020-08-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022039229A1 true WO2022039229A1 (ja) | 2022-02-24 |
Family
ID=80323537
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/030393 WO2022039229A1 (ja) | 2020-08-21 | 2021-08-19 | 車載用センシングシステムおよびゲーティングカメラ |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230311897A1 (ja) |
EP (1) | EP4201745A4 (ja) |
JP (1) | JPWO2022039229A1 (ja) |
IL (1) | IL300789A (ja) |
WO (1) | WO2022039229A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003061939A (ja) * | 2001-08-28 | 2003-03-04 | Pioneer Electronic Corp | 情報提供システム、情報提供方法、情報提供プログラム、情報提供システムにおけるサーバ装置および、情報提供システムにおける端末装置 |
JP2009257983A (ja) | 2008-04-18 | 2009-11-05 | Calsonic Kansei Corp | 車両用距離画像データ生成装置および車両用距離画像データの生成方法 |
WO2017110413A1 (ja) | 2015-12-21 | 2017-06-29 | 株式会社小糸製作所 | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 |
WO2018003532A1 (ja) * | 2016-06-29 | 2018-01-04 | 京セラ株式会社 | 物体検出表示装置、移動体及び物体検出表示方法 |
JP2018131084A (ja) * | 2017-02-16 | 2018-08-23 | 日立オートモティブシステムズ株式会社 | 車載制御装置 |
WO2020121973A1 (ja) * | 2018-12-10 | 2020-06-18 | 株式会社小糸製作所 | 物体識別システム、演算処理装置、自動車、車両用灯具、分類器の学習方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9254846B2 (en) * | 2013-05-03 | 2016-02-09 | Google Inc. | Predictive reasoning for controlling speed of a vehicle |
EP3423865B1 (en) * | 2016-03-01 | 2024-03-06 | Brightway Vision Ltd. | Gated imaging apparatus, system and method |
JP7472571B2 (ja) * | 2020-03-19 | 2024-04-23 | マツダ株式会社 | ヘッドライト制御装置 |
-
2021
- 2021-08-19 EP EP21858373.0A patent/EP4201745A4/en active Pending
- 2021-08-19 US US18/042,428 patent/US20230311897A1/en active Pending
- 2021-08-19 JP JP2022543996A patent/JPWO2022039229A1/ja active Pending
- 2021-08-19 WO PCT/JP2021/030393 patent/WO2022039229A1/ja active Application Filing
- 2021-08-19 IL IL300789A patent/IL300789A/en unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP4201745A4 |
Also Published As
Publication number | Publication date |
---|---|
IL300789A (en) | 2023-04-01 |
US20230311897A1 (en) | 2023-10-05 |
EP4201745A4 (en) | 2024-02-28 |
EP4201745A1 (en) | 2023-06-28 |
JPWO2022039229A1 (ja) | 2022-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9904859B2 (en) | Object detection enhancement of reflection-based imaging unit | |
US20200174100A1 (en) | Lamp device, sensor system, and sensor device | |
US9827956B2 (en) | Method and device for detecting a braking situation | |
US20150358540A1 (en) | Method and device for generating a surround-view image of the surroundings of a vehicle, method for providing at least one driver-assistance function for a vehicle, surround-view system for a vehicle | |
CN105270254B (zh) | 用于控制车辆的至少一个大灯的光发射的方法和设备 | |
US20070069135A1 (en) | Method and device for controlling a radiation source | |
US20210053483A1 (en) | Information display device and information display method | |
US20230117346A1 (en) | System for Monitoring the Surround of a Motor Vehicle | |
JP2024103607A (ja) | 車両用赤外線センサシステム | |
JP2007015660A (ja) | 赤外画像撮像装置 | |
WO2022039229A1 (ja) | 車載用センシングシステムおよびゲーティングカメラ | |
WO2022039231A1 (ja) | 車載用センシングシステムおよびゲーティングカメラ | |
CN111845347B (zh) | 车辆行车安全的提示方法、车辆和存储介质 | |
CN116710838A (zh) | 车载用感测系统及门控照相机 | |
WO2022039230A1 (ja) | 車載用センシングシステムおよびゲーティングカメラ | |
US10990834B2 (en) | Method and apparatus for object detection in camera blind zones | |
EP3227742B1 (en) | Object detection enhancement of reflection-based imaging unit | |
WO2023085403A1 (ja) | センシングシステム | |
CN116648664A (zh) | 车载用传感系统以及门控照相机 | |
WO2023074903A1 (ja) | センシングシステムおよび自動車 | |
US20230311818A1 (en) | Sensing system and vehicle | |
WO2021172478A1 (ja) | センサ、自動車および周囲環境のセンシング方法 | |
US20240302510A1 (en) | Sensor system | |
EP4382968A1 (en) | Gating camera, vehicular sensing system, and vehicular lamp | |
US20230022313A1 (en) | Object detection system for a motor vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21858373 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022543996 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180051616.X Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021858373 Country of ref document: EP Effective date: 20230321 |