EP4440902A1 - High-performance object detection system using HDR images obtained from LDR cameras in autonomous vehicles - Google Patents
Info
- Publication number
- EP4440902A1 (application EP22917078.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- cameras
- ldr
- hdr
- object detection
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/92—Dynamic range modification of images or parts thereof based on global image properties
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0475—Generative networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20208—High dynamic range [HDR] image processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/16—Image acquisition using multiple overlapping images; Image stitching
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
Definitions
- The present invention relates to a high-performance object detection system in autonomous vehicles using HDR (High Dynamic Range) images obtained from LDR (Low Dynamic Range) cameras.
- HDR High Dynamic Range
- LDR Low Dynamic Range
- Image processing and enhancement, which is used to obtain information about the nature of an object, is one of the main tools for object identification [1], tracking [2], detection [3] and classification [4].
- Image processing methods are frequently used in many fields such as the defense industry, security, medicine, robotics, physics, biomedical imaging and satellite imagery.
- The presence of the target object in a scene with a large illumination difference is one of the main problems complicating object tracking and analysis. Different methods have been developed to solve this problem, track the object successfully and reconstruct the 3D structure of the scene [5,6,7].
- Document No. US8811811 describes a system for generating an output image.
- The first camera of a camera pair is configured to record a first part of a scene to obtain a first recorded image.
- The second camera of the camera pair is configured to record a second part of the scene to obtain a second recorded image.
- A central camera is configured to record another part of the scene to obtain a central image.
- A processor is configured to generate the output image.
- The initial brightness range of the first camera of each camera pair differs from the brightness range of the central camera and from the first brightness range of the first camera of any other camera pair.
- High dynamic range 3D images are generated with relatively narrow dynamic range image sensors.
- The input frames of the different views can be adjusted to different exposure settings. Pixels in the input frames can be normalized to a common range of brightness levels. The differences between normalized pixels in the input frames can be calculated and interpolated, and pixels in different input frames can be shifted to, or remain in, a common frame of reference.
- The pre-normalized brightness levels of the pixels can be used to generate high dynamic range pixels that form one, two or more output frames of different views.
- A modulated synopter with electronic mirrors is combined with a stereoscopic camera to capture monoscopic HDR, variable monoscopic HDR, stereoscopic LDR or stereoscopic HDR images.
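The normalize-then-merge steps just described can be sketched in a few lines. The following numpy sketch is illustrative only, not the cited document's actual implementation: it assumes aligned float frames in [0, 1] with known exposure times, and the hat-shaped reliability weight is a common textbook choice introduced here for the example.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge differently exposed, aligned LDR frames into one HDR radiance map."""
    numerator = np.zeros_like(images[0], dtype=np.float64)
    denominator = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # Normalize each pixel to a common brightness scale: radiance ~ value / time.
        radiance = img / t
        # Hat-shaped weight: 0 at the clipped extremes, 1 at mid-grey.
        weight = 1.0 - np.abs(2.0 * img - 1.0)
        numerator += weight * radiance
        denominator += weight
    return numerator / np.maximum(denominator, 1e-8)
```

A pixel that is well exposed in a short exposure but clipped in a long one is then dominated by the short-exposure estimate, which matches the behaviour attributed to normalized-and-merged input frames.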
- The system that is the subject of the invention can detect, by means of a computer system, a number of objects captured in the overlapping area between a first field of view associated with the first camera and a second field of view associated with a second camera.
- The system can set a corresponding priority order for each of the objects.
- The system can select an object from among the objects according to its corresponding priority order.
- The system may determine a first illumination condition for the first camera associated with the first field of view.
- The system can determine a second illumination condition for the second camera associated with the second field of view.
- The system can determine a shared exposure time for the selected object based on the first illumination condition and the second illumination condition.
- The system can cause at least one image of the selected object to be captured using the shared exposure time.
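Putting the two illumination determinations together, a shared exposure could be chosen by a simple worst-case rule. Everything below (the function name, `target_lum`, `base_time` and the `max()` heuristic) is a hypothetical sketch, not the algorithm the text describes:

```python
def shared_exposure_time(lum_a, lum_b, target_lum=0.18, base_time=1 / 60):
    """Pick one exposure time usable by both cameras (hypothetical heuristic).

    lum_a, lum_b: mean relative luminance each camera measures for the object.
    The brighter view dominates so that neither camera clips the highlights.
    """
    worst_case = max(lum_a, lum_b)
    return base_time * target_lum / worst_case
```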
- Document No. US 11094043 describes devices, systems and methods for generating high dynamic range images and video from a series of low dynamic range images and video using convolutional neural networks (CNNs).
- An exemplary method for generating high dynamic range visual media comprises using a first CNN to combine a first set of images having a first dynamic range into a final image with a second dynamic range greater than the first.
- Another exemplary method, for generating training data, comprises generating static and dynamic image sets with the first dynamic range, generating a ground-truth image set with a second dynamic range greater than the first based on a weighted sum of the static image set, and replacing at least one image of the dynamic image set with an image from the static image set to generate a set of training images.
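A data-generation recipe of that general shape might look as follows. This is a loose sketch under stated assumptions (hat weights for the ground-truth merge, a single swapped frame); it is not the procedure actually claimed in US 11094043:

```python
import numpy as np

def make_training_sample(static_set, dynamic_set, exposure_times, swap_index=0):
    """Build one (input, ground-truth) pair from aligned LDR float frames in [0, 1].

    Ground truth: weighted sum of the static set, normalized by exposure time.
    Input: the dynamic set with one frame replaced by its static counterpart.
    """
    weights = [1.0 - np.abs(2.0 * s - 1.0) for s in static_set]
    ground_truth = sum(w * (s / t) for w, s, t in zip(weights, static_set, exposure_times))
    ground_truth = ground_truth / np.maximum(sum(weights), 1e-8)
    inputs = list(dynamic_set)
    inputs[swap_index] = static_set[swap_index]  # inject one static frame
    return inputs, ground_truth
```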
- In the state of the art, LDR cameras are widely used in autonomous vehicles; as a result, it is not possible to distinguish and recognize objects in scenes with a high illumination difference (tunnels, sunrise or sunset, etc.).
- Because High Dynamic Range (HDR) sensors and cameras are expensive for consumers, images of the same quality must be obtained with economical LDR (Low Dynamic Range) cameras.
- Our invention relates to a high-performance object detection system using HDR images obtained from LDR cameras, which allows for the separation and recognition of objects in images under high illumination difference conditions (tunnels, sunrise or sunset, etc.) and prevents autonomous vehicles from causing undesired accidents.
- The invention aims to eliminate this fundamental problem.
- Our invention presents an integrated solution for automatically finding people, vehicles and objects that cannot be detected by eye, due to dark areas or high glare in the scene, from input captured by economical cameras mounted on autonomous vehicles.
- LDR standard
- HDR high dynamic range
- The exposure fusion block in Figure 2 covers the steps by which the pixel values of the two side cameras (1, 3), which record the difficult-to-see points of the scene with exposure values that make the details noticeable, are transferred onto the pixels of the middle camera to generate an HDR image.
- The 3 images, normalized according to their exposure times, are combined by weighting onto the middle camera (2), taking the disparity values into account, and an HDR image is created.
- The usable pixels of the middle camera (2) are detected and provide direct input to the HDR image.
- The unusable pixels (too dark or too bright) are instead filled in from the side cameras.
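The pixel-selection logic of this block can be sketched roughly as below. It assumes the side images have already been disparity-warped into the middle camera's frame; the usability thresholds and Gaussian well-exposedness weights are illustrative choices, not values from the document:

```python
import numpy as np

def fuse_onto_middle(middle, left, right, low=0.05, high=0.95):
    """Keep usable middle-camera pixels; fill the rest from the side views.

    All inputs: aligned float images in [0, 1] (side views already warped
    into the middle camera's frame using the estimated disparity).
    """
    usable = (middle > low) & (middle < high)
    # Well-exposedness weights for the side views, peaked at mid-grey.
    w_left = np.exp(-((left - 0.5) ** 2) / 0.08)
    w_right = np.exp(-((right - 0.5) ** 2) / 0.08)
    side = (w_left * left + w_right * right) / np.maximum(w_left + w_right, 1e-8)
    # Usable middle pixels feed the HDR image directly; the rest come from the sides.
    return np.where(usable, middle, side)
```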
- Figure 3 shows the block diagram of the end-to-end jointly trained system for detecting objects from HDR images.
- The HDR image obtained in the previous step by combining 3 standard cameras with different exposure values is used to improve automatic object detection.
- The system can work with two different approaches:
- In the first, the obtained HDR images are given as raw input to object detection algorithms trained with similarly labeled HDR data.
- The other solution enlists a tone-mapping algorithm that automatically extracts detail-rich information from the HDR data.
- Here, the tone-mapping and object-detection sub-blocks are trained together end-to-end in a unique way to increase performance.
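As a rough picture of what the tone-mapping sub-block computes, here is a fixed-parameter global operator in the Reinhard style. In the invention this stage would instead be learnable and trained jointly with the detector, so the version below only illustrates the kind of compression being learned:

```python
import numpy as np

def reinhard_tonemap(hdr, key=0.18, eps=1e-6):
    """Global Reinhard-style tone mapping: compress HDR radiance into [0, 1)."""
    # Log-average ("key") luminance of the scene.
    log_avg = np.exp(np.mean(np.log(hdr + eps)))
    scaled = key * hdr / log_avg
    # Smoothly compress: bright radiances saturate toward 1, dark ones stay dark.
    return scaled / (1.0 + scaled)
```

The tone-mapped output keeps the relative ordering of radiances while fitting them into the range a standard detector expects.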
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TR202121665 | 2021-12-29 | ||
| PCT/TR2022/051657 WO2023129079A1 (fr) | 2021-12-29 | 2022-12-28 | High-performance object detection system using HDR images obtained from LDR cameras in autonomous vehicles |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| EP4440902A1 (fr) | 2024-10-09 |
| EP4440902A4 (fr) | 2025-03-19 |
Family
ID=92710311
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP22917078.2A Pending EP4440902A4 (fr) | High-performance object detection system using HDR images obtained from LDR cameras in autonomous vehicles | 2021-12-29 | 2022-12-28 |
Country Status (1)
| Country | Link |
|---|---|
| EP (1) | EP4440902A4 (fr) |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10757320B2 (en) * | 2017-12-28 | 2020-08-25 | Waymo Llc | Multiple operating modes to expand dynamic range |
2022
- 2022-12-28: EP application EP22917078.2A filed; published as EP4440902A4 (fr); status: active, Pending
Also Published As
| Publication number | Publication date |
|---|---|
| EP4440902A4 (fr) | 2025-03-19 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230298141A1 (en) | Bright Spot Removal Using A Neural Network | |
| US10771697B2 (en) | Still image stabilization/optical image stabilization synchronization in multi-camera image capture | |
| CN109636754B (zh) | Extremely-low-illumination image enhancement method based on a generative adversarial network | |
| EP4109392A1 (fr) | Image processing method and image processing device | |
| Trinidad et al. | Multi-view image fusion | |
| CN113112403B (zh) | Infrared image stitching method, system, medium and electronic device | |
| US11037308B2 (en) | Intelligent method for viewing surveillance videos with improved efficiency | |
| CN110210541B (zh) | Image fusion method and device, and storage apparatus | |
| WO2018163843A1 (fr) | Imaging device and imaging method, and image processing device and image processing method | |
| EP4139840A2 (fr) | Image signal processing of objects joined in the time domain | |
| CN111582074B (zh) | Surveillance-video leaf-occlusion detection method based on scene depth information perception | |
| CN110930440B (zh) | Image alignment method and apparatus, storage medium and electronic device | |
| WO2021214712A1 (fr) | Neural-network-supported camera image or video processing pipelines | |
| US20110085026A1 (en) | Detection method and detection system of moving object | |
| Sheng et al. | Guided colorization using mono-color image pairs | |
| US11044399B2 (en) | Video surveillance system | |
| US9860456B1 (en) | Bayer-clear image fusion for dual camera | |
| WO2020237366A1 (fr) | System and method for reflection removal using a dual-pixel sensor | |
| US20250078491A1 (en) | High-performance object detection system using hdr images obtained from ldr cameras in autonomous vehicles | |
| KR102389284B1 (ko) | Artificial-intelligence-based image inpainting method and device | |
| EP4440902A1 (fr) | High-performance object detection system using HDR images obtained from LDR cameras in autonomous vehicles | |
| CN109348140A (zh) | Real-time video stitching method in surveillance scenarios | |
| Sippel et al. | Multispectral snapshot image registration using learned cross spectral disparity estimation and a deep guided occlusion reconstruction network | |
| Wang et al. | Asymmetric stereo color transfer | |
| Zhao et al. | LUCK: Lighting up colors in the dark |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 2024-07-02 | 17P | Request for examination filed | Effective date: 20240702 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R079; Free format text: PREVIOUS MAIN CLASS: B60W0060000000; Ipc: G06T0005500000 |
| 2025-02-19 | A4 | Supplementary search report drawn up and despatched | Effective date: 20250219 |
| | RIC1 | Information provided on ipc code assigned before grant | Ipc: G03B 29/00 20210101ALN20250213BHEP; G06N 3/045 20230101ALN20250213BHEP; G06N 3/0475 20230101ALN20250213BHEP; G06N 3/09 20230101ALN20250213BHEP; G06T 5/92 20240101ALI20250213BHEP; H04N 23/741 20230101ALI20250213BHEP; H04N 23/45 20230101ALI20250213BHEP; G06V 20/56 20220101ALI20250213BHEP; B60W 60/00 20200101ALI20250213BHEP; G06V 20/80 20220101ALI20250213BHEP; G06V 10/70 20220101ALI20250213BHEP; G06T 7/70 20170101ALI20250213BHEP; G06V 40/10 20220101ALI20250213BHEP; G06V 40/00 20220101ALI20250213BHEP; G06T 5/90 20240101ALI20250213BHEP; G06T 5/50 20060101AFI20250213BHEP |
| | DAV | Request for validation of the european patent (deleted) | |
| | DAX | Request for extension of the european patent (deleted) | |