WO2006034924A1 - Method for displaying an image recorded by a video camera - Google Patents
Method for displaying an image recorded by a video camera
- Publication number
- WO2006034924A1 (PCT/EP2005/054016, EP2005054016W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- factor
- image
- vehicle
- gray values
- picture elements
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/103—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using camera systems provided with artificial illumination device, e.g. IR light source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/106—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using night vision cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/804—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for lane monitoring
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8053—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for bad weather conditions or night vision
Definitions
- The invention relates to a method for displaying an image, recorded by a video camera, of a scene in front of a vehicle, the scene containing a roadway illuminated by the headlights of the vehicle.
- The healthy human eye is able to handle such high-contrast situations relatively well. Modern video cameras, too, can process a high contrast range.
- One problem, however, is that it is currently not technically possible to display the full contrast range on a screen in such a way that the human eye can fully capture it. This leads to the situation described above: the roadway and nearby objects within the headlight cone appear overexposed in the image, while distant, dimly lit objects are barely perceptible.
- The object of the present invention is therefore to display even scenes with high contrast in such a way that both brightly and dimly lit objects remain perceptible.
- This object is achieved according to the invention in that the gray values of the picture elements of the image data generated by the video camera are scaled by a reduction factor which depends both on the location of the respective picture element within the image and on the brightness of objects in the near region in front of the vehicle, so that the contrast between the reproduction of the near region in front of the vehicle and the reproduction of other parts of the image is reduced.
- The objects in the near region are usually formed by the roadway, the near region being that distance range which, owing to the headlamp illumination, clearly differs in brightness from areas farther away.
- The method also allows other illuminated objects at close range to influence the reduction factor.
- In an advantageous embodiment, the reduction factor for the individual picture elements is calculated from a first factor, which determines the distribution of the reduction across the image, and a second factor, which determines the degree of the reduction independently of the location within the image.
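The text does not give the combination of the two factors as a formula. A minimal sketch, assuming the gray values are a NumPy array, the spatial first factor h is normalized to 0..1, and the multiplicative combination r = 1 − h·(1 − f) (an assumption of this sketch, not stated in the text), could look like this:

```python
import numpy as np

def apply_reduction(gray, h, f):
    """Scale gray values by a location-dependent reduction factor.

    gray : 2-D array of gray values
    h    : first factor (spatial distribution of the reduction, 0..1)
    f    : second factor (degree of reduction, scalar in (0, 1])

    The combination r = 1 - h*(1 - f) is an assumption: where h = 1
    (center of the headlight cone) the gray value is scaled by f,
    where h = 0 it is left unchanged.
    """
    r = 1.0 - h * (1.0 - f)   # per-pixel reduction factor
    return gray * r

# toy example: uniform image, reduction concentrated on an inner area
gray = np.full((4, 4), 200.0)
h = np.zeros((4, 4)); h[1:3, 1:3] = 1.0
out = apply_reduction(gray, h, f=0.5)
```

With this form the reduction blends smoothly from full strength in the illuminated center to no change at the image edges, matching the description that the factor acts over the entire image without producing edges.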
- The invention is not limited to use with infrared headlamps and an infrared-sensitive camera, but can equally be used with visible light.
- An advantageous embodiment of the method according to the invention is that the first factor substantially corresponds to the camera image of the brightness distribution of a well-reflecting roadway illuminated by the headlamps.
- The first factor is stored as a matrix of values.
- The first factor can be determined experimentally by directing the headlights of a vehicle onto a roadway with a light surface; an average can be taken over a series of images, the vehicle preferably moving along the roadway.
- The image thus obtained can be filtered with a low-pass filter, so that irregularities in the illumination are smoothed and do not affect the first factor.
- Alternatively, the first factor can be determined analytically, for example by means of a ray-tracing method in which the illumination by the headlights, the reflection at the road surface, and the image acquisition are modeled mathematically.
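A rough sketch of the experimental determination, assuming a stack of recorded frames and a simple separable box low-pass in place of whatever filter is actually used; the helper name `estimate_first_factor` is invented for illustration:

```python
import numpy as np

def estimate_first_factor(frames, blur=5):
    """Estimate the spatial first factor h from camera frames of a
    bright, evenly reflecting roadway (hypothetical helper; the text
    says the result is stored as a matrix of values)."""
    mean = np.mean(np.stack(list(frames)), axis=0)  # average over the series
    k = np.ones(blur) / blur                        # simple box low-pass
    for axis in (0, 1):                             # separable, both directions
        mean = np.apply_along_axis(
            lambda v: np.convolve(v, k, mode="same"), axis, mean)
    return mean / mean.max()                        # normalize to 0..1

# three identical frames of a uniformly lit surface, just to exercise it
h = estimate_first_factor([np.full((10, 10), 100.0)] * 3)
```

Averaging over many frames while the vehicle moves suppresses local irregularities of the surface; the low-pass then smooths whatever remains, as the text describes.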
- The second factor is formed from the ratio of a first and a second mean value, the assignment of picture elements for forming the first and second mean value being made inversely as a function of the first factor.
- A first variant of this embodiment is that the assignment takes place in such a way that the gray values of picture elements for which the first factor is smaller than a threshold value are included in the first mean value, and the gray values of picture elements for which the first factor is greater than the threshold value are included in the second mean value.
- The threshold value can lie at half the maximum value of the first factor, but other threshold values are also possible. This variant allows the second factor to be formed with relatively little computational effort.
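The threshold variant could be sketched as follows; the default threshold of half the maximum of h follows the text, the clamp anticipates the limitation against brightening mentioned further below, and all names are illustrative:

```python
import numpy as np

def second_factor_threshold(gray, h, thresh=None):
    """First variant: hard assignment of picture elements by comparing
    the first factor h with a threshold (half the maximum of h by
    default, as the text suggests)."""
    if thresh is None:
        thresh = 0.5 * h.max()
    m1 = gray[h < thresh].mean()     # first mean: outer, darker area
    m2 = gray[h >= thresh].mean()    # second mean: inner, bright roadway
    return min(m1 / m2, 1.0)         # limited so the image is not brightened

# bright inner block (the headlight cone), darker surroundings
gray = np.full((4, 4), 50.0); gray[1:3, 1:3] = 200.0
h = np.zeros((4, 4)); h[1:3, 1:3] = 1.0
f = second_factor_threshold(gray, h)
```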
- A second variant of this embodiment is adapted more precisely to the gradual variation of the values of the first factor: to form the first mean value, the gray values of the picture elements are multiplied by the complement of the first factor, and to form the second mean value they are multiplied by the first factor itself, and summed in each case.
- To reduce the computational effort, for example only every tenth picture element in both the horizontal and vertical directions can be used.
- The second factor is limited to a value which does not cause brightening of the image.
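The weighted variant with subsampling might be sketched as follows; taking w1 = 1 − h is an inference from the "inverse" assignment described above, and every tenth element in each direction follows the text:

```python
import numpy as np

def second_factor_weighted(gray, h, step=10):
    """Second variant: soft weighting by the first factor itself
    (w2 = h; w1 = 1 - h is an inferred complement), using only every
    `step`-th picture element in both directions."""
    g = gray[::step, ::step].astype(float)
    w2 = h[::step, ::step]              # weight toward the inner, brighter area
    w1 = 1.0 - w2                       # weight toward the outer, darker area
    m1 = (w1 * g).sum() / w1.sum()      # mean of the outer area
    m2 = (w2 * g).sum() / w2.sum()      # mean of the inner area
    return min(m1 / m2, 1.0)            # limited so the image is not brightened

gray = np.full((20, 20), 50.0); gray[:10, :] = 200.0   # bright upper half
h = np.zeros((20, 20)); h[:10, :] = 1.0
f = second_factor_weighted(gray, h)
```

Because the weights follow the smooth factor h instead of a hard threshold, no artificial boundary between the image areas enters the computation.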
- The weighting which determines the extent to which the picture elements used in forming the second factor belong to an inner, brighter image area or to an outer, darker image area need not be continually recalculated.
- Instead, the weights for the picture elements used to form the second factor are stored in a memory.
- FIG. 2 shows a schematic representation of a screen for displaying the first factor
- FIGS. 3 and 4 show the course of the gray values of the first factor in a selected column and a selected row
- FIGS. 5 and 6 show weighting factors in the course of the selected line according to a first exemplary embodiment of the method according to the invention
- FIG. 10 shows the course of the gray values in a column of the image according to FIG. 9 without reduced contrast
- Fig. 1 illustrates a device for carrying out the method according to the invention, in which some blocks represent circuits and other hardware parts, while other blocks represent parts of a program executed in a microcomputer.
- A camera 1 "sees", for example, through the windscreen 2 of a vehicle (not shown). It is preferably a camera sensitive in the near-infrared range.
- The scene in front of the vehicle is illuminated by headlamps 12, 13, which are switched on and off by a control device 11 according to the driver's input or automatically. Pivotable headlights, for example those used for cornering light, can also be operated together with the method according to the invention.
- The video camera 1 is provided with a stray-light aperture (not shown) and connected to a control device 3, which in particular adapts the camera to the different lighting conditions.
- The video camera 1 generates digital video signals, also called image data. Since it is usually a monochrome camera, the image data contain a gray value g for each picture element (pixel) at the coordinates x, y.
- The image data are fed in each case to a weighted averaging of gray values, the weighted averaging 4 essentially taking into account the picture elements lying outside the illuminated roadway, while in the weighted averaging 5 the gray values of the picture elements in the area of the illuminated roadway are essentially averaged.
- The two mean values m1 and m2 are divided at 6, resulting in the second factor f.
- The first factor h is stored in a memory 7, as are the parameters w1 and w2. Since the illumination depends on the respective headlamp setting, a plurality of parameter sets w1, w2, h are stored in the memory 7, one of which is read out of the memory in accordance with the control device 11 of the headlights 12, 13.
- the weighted gray values are then fed to a driver 9 of a screen 10.
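Putting the blocks of Fig. 1 together, a hypothetical end-to-end sketch; the dict-based parameter memory and the combination r = 1 − h·(1 − f) are assumptions of this sketch, not taken from the text:

```python
import numpy as np

def process(gray, memory, headlamp_setting):
    """One frame through the signal path sketched in Fig. 1."""
    h = memory[headlamp_setting]           # first factor read from memory 7
    w1, w2 = 1.0 - h, h                    # weights for the averagings 4 and 5
    m1 = (w1 * gray).sum() / w1.sum()      # mean m1 outside the roadway
    m2 = (w2 * gray).sum() / w2.sum()      # mean m2 on the illuminated roadway
    f = min(m1 / m2, 1.0)                  # division at 6, no brightening
    return gray * (1.0 - h * (1.0 - f))    # weighted gray values for driver 9

h_low = np.zeros((4, 4)); h_low[1:3, 1:3] = 1.0
memory = {"low_beam": h_low}               # one stored parameter set
frame = np.full((4, 4), 50.0); frame[1:3, 1:3] = 200.0
out = process(frame, memory, "low_beam")   # bright center is pulled down to 50
```

With f = m1/m2 the illuminated near region is pulled roughly to the brightness level of the surroundings, which matches the equalizing effect described for FIGS. 10 and 11.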
- To determine the first factor, the video camera 1 is directed onto a well-reflecting surface which is illuminated by the headlights in the same way as a roadway. The image shown in Fig. 2 then results, the brightness distribution being represented as lines of equal brightness. In simplified terms, this yields an inner, brighter area and an outer, darker area.
- FIGS. 3 to 8 as well as FIGS. 10 and 11 respectively show the course of the gray values and parameters for a selected column V and a selected row H.
- The gray values in the selected column gradually increase from the center toward the bottom, while in the selected row the gray values are high in a middle region and gradually decrease toward the edges.
- The recorded values are stored in the memory as the first factor h(x, y) for all picture elements.
- One possibility, shown in FIGS. 5 and 6, is that, by comparing the factor h with a threshold value, all parameters w2 take the value 1 inside the image area 21 and 0 outside it, while the parameter w1 is 1 for picture elements outside the area and 0 for picture elements within it.
- A 1 means that the corresponding picture element is included in the averaging, and a 0 that it is not.
- The second factor calculated with the aid of the parameters w1 and w2, and thus also the reduction factor, acts over the entire image, so that no edges occur at the points where w1 and w2 have jumps. At most, sudden changes in the reduction factor can occur if a brightness jump in the image crosses the boundary between the areas 21 and 22 defined by the jumps in the parameters w1 and w2. Otherwise, this embodiment is characterized by a relatively low demand for computing power.
- In a second exemplary embodiment, the image areas 21 and 22 are not artificially separated by a sharp boundary; instead, the actually existing gradual transitions of the first factor h are used to produce the parameters w1 and w2. This is done by letting w2 correspond to the first factor h (FIG. 7) and forming w1 as its complement.
- Here, n1 and n2 are the numbers of representative picture elements in the respective areas.
- FIG. 9 illustrates the simplified image of a real scene with the image of a lane 24 on which a person 23 is located.
- Also shown are the image areas 21 and 22 and the selected column V and row H from FIG. 2.
- FIG. 10 shows the gray values G of the recorded signal.
- The gray values 25 which represent the person are typically substantially lower than the gray values 26 of the roadway 24 illuminated in the foreground, provided the roadway 24 is light, for example a dry concrete or gravel road.
- After application of the method, the gray values 25 are essentially retained, while the gray values 26' are markedly lowered compared with the gray values 26 (FIG. 11).
- FIGS. 12 and 13 show screen images, FIG. 12 having been taken without applying the measures according to the invention. It can be clearly seen that the roadway, in this case a forest track, outshines other image details, so that the images of the persons on the track do not stand out as clearly as in the screen image of FIG. 13, in which the brightness of the roadway is significantly lowered. Even at a glance, the driver immediately perceives the persons on the track. A further effect shown in FIG. 13 is that the structures of the track are more recognizable than without the contrast reduction.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/664,357 US8487997B2 (en) | 2004-09-30 | 2005-08-16 | Method for displaying an image recorded by a video camera |
EP05774079A EP1797710A1 (de) | 2004-09-30 | 2005-08-16 | Verfahren zur darstellung eines von einer videokamera aufgenommenen bildes |
JP2007533983A JP4457150B2 (ja) | 2004-09-30 | 2005-08-16 | ビデオカメラにより撮影された画像を表示する方法 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102004047474 | 2004-09-30 | ||
DE102004047474.5 | 2004-09-30 | ||
DE102004050990.5 | 2004-10-20 | ||
DE102004050990A DE102004050990A1 (de) | 2004-09-30 | 2004-10-20 | Verfahren zur Darstellung eines von einer Videokamera aufgenommenen Bildes |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006034924A1 true WO2006034924A1 (de) | 2006-04-06 |
Family
ID=36062246
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2005/054016 WO2006034924A1 (de) | 2004-09-30 | 2005-08-16 | Verfahren zur darstellung eines von einer videokamera aufgenommenen bildes |
Country Status (5)
Country | Link |
---|---|
US (1) | US8487997B2 (de) |
EP (1) | EP1797710A1 (de) |
JP (1) | JP4457150B2 (de) |
DE (1) | DE102004050990A1 (de) |
WO (1) | WO2006034924A1 (de) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102006027121A1 (de) * | 2006-06-12 | 2007-12-13 | Robert Bosch Gmbh | Bildaufnahmesystem und Verfahren für die Entfernungsbestimmung mit einem Bildaufnahmesystem |
DE102008043880A1 (de) * | 2008-11-19 | 2010-05-20 | Robert Bosch Gmbh | Beleuchtungseinheit für ein Fahrzeug, Fahrzeug und Verfahren hierfür |
DE102020213270A1 (de) | 2020-10-21 | 2022-04-21 | Conti Temic Microelectronic Gmbh | System zur Vermeidung von Unfällen durch Wildwechsel bei Dämmerung und Nacht |
EP4342735A1 (de) * | 2022-09-23 | 2024-03-27 | Volvo Truck Corporation | Hilfssicherheitsvorrichtung für fahrzeuge und fahrzeug mit solch einer hilfssicherheitsvorrichtung |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06233309A (ja) * | 1993-02-03 | 1994-08-19 | Daihatsu Motor Co Ltd | 表示画像のコントラスト改善方法 |
FR2726144A1 (fr) * | 1994-10-24 | 1996-04-26 | Valeo Vision | Procede et dispositif d'amelioration de la vision nocturne, notamment pour vehicule automobile |
DE10261290A1 (de) | 2001-12-28 | 2003-07-17 | Yazaki Corp | Im-Fahrzeug-Bildkorrektur-Einrichtung und Nachtfahrt-Sichtfeld-Unterstützungs-Einrichtung |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4238949B2 (ja) | 1999-06-30 | 2009-03-18 | ソニー株式会社 | 赤外線式監視装置 |
JP2003259363A (ja) * | 2002-02-27 | 2003-09-12 | Denso Corp | ナイトビジョン装置 |
JP4798945B2 (ja) * | 2003-03-05 | 2011-10-19 | トヨタ自動車株式会社 | 撮像装置 |
-
2004
- 2004-10-20 DE DE102004050990A patent/DE102004050990A1/de not_active Withdrawn
-
2005
- 2005-08-16 US US11/664,357 patent/US8487997B2/en not_active Expired - Fee Related
- 2005-08-16 EP EP05774079A patent/EP1797710A1/de not_active Ceased
- 2005-08-16 JP JP2007533983A patent/JP4457150B2/ja not_active Expired - Fee Related
- 2005-08-16 WO PCT/EP2005/054016 patent/WO2006034924A1/de active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06233309A (ja) * | 1993-02-03 | 1994-08-19 | Daihatsu Motor Co Ltd | 表示画像のコントラスト改善方法 |
FR2726144A1 (fr) * | 1994-10-24 | 1996-04-26 | Valeo Vision | Procede et dispositif d'amelioration de la vision nocturne, notamment pour vehicule automobile |
DE10261290A1 (de) | 2001-12-28 | 2003-07-17 | Yazaki Corp | Im-Fahrzeug-Bildkorrektur-Einrichtung und Nachtfahrt-Sichtfeld-Unterstützungs-Einrichtung |
Non-Patent Citations (1)
Title |
---|
PATENT ABSTRACTS OF JAPAN vol. 018, no. 608 (E - 1633) 18 November 1994 (1994-11-18) * |
Also Published As
Publication number | Publication date |
---|---|
DE102004050990A1 (de) | 2006-04-06 |
US20090295920A1 (en) | 2009-12-03 |
JP2008515305A (ja) | 2008-05-08 |
EP1797710A1 (de) | 2007-06-20 |
US8487997B2 (en) | 2013-07-16 |
JP4457150B2 (ja) | 2010-04-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
REEP | Request for entry into the european phase |
Ref document number: 2005774079 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005774079 Country of ref document: EP |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2007533983 Country of ref document: JP |
|
WWP | Wipo information: published in national office |
Ref document number: 2005774079 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11664357 Country of ref document: US |