WO2007012300A1 - Method for determining a visibility range using a camera - Google Patents

Method for determining a visibility range using a camera Download PDF

Info

Publication number
WO2007012300A1
WO2007012300A1 (PCT/DE2006/000782)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
measurement
visibility
objects
distance
Prior art date
Application number
PCT/DE2006/000782
Other languages
German (de)
French (fr)
Inventor
Michael Walter
Matthias Zobel
Original Assignee
Adc Automotive Distance Control Systems Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Adc Automotive Distance Control Systems Gmbh filed Critical Adc Automotive Distance Control Systems Gmbh
Priority to DE112006001119T priority Critical patent/DE112006001119A5/en
Publication of WO2007012300A1 publication Critical patent/WO2007012300A1/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures

Definitions

  • The present invention relates to a method for determining the visibility range.
  • This invention finds application, for example, in motor vehicles.
  • Camera sensors are increasingly used here for environmental detection, in particular for lane detection and/or object recognition, with a view in the direction of travel or into the rear area.
  • The camera sensors are usually mounted inside the vehicle behind the windshield.
  • Error-free operation, however, is only possible if the visibility range is known. For example, at visibility ranges below 40 m it is unnecessary to place measurement windows at greater distances.
  • Common methods of visibility determination rely on transmission or reflection measurements. In a transmission measurement, the visibility range is calculated from the attenuation between the intensity emitted by a known source and the intensity measured at a fixed distance. In a reflection measurement, the visibility range is determined from the intensity of the scattered light.
  • Both methods, however, have the disadvantage of requiring a reference of known intensity at a known distance, which is not available in a moving vehicle due to the dynamic environment.
  • The object of the invention is to determine the visibility range cost-effectively with a camera that is also used for other applications, in particular for environmental detection.
  • The object is achieved by a passive camera-based approach.
  • A camera is provided which is aligned to the vehicle environment.
  • An advantage of this arrangement is that the camera can be used both for determining the visibility range and for environmental detection. Images are read out from the camera at a given rate, e.g. 25 frames/s. The recorded images are evaluated with regard to determining the visibility range. Thus, no additional artificial reference images or reference objects are required, so that no further costs and/or adjustment effort arise for the user.
  • For the application of the method, the relative speed v_rel of the camera and at least one recorded object in the environment must be non-zero.
  • The relative speed to static objects in the vehicle environment is then v_rel > 0 m/s.
  • Objects with v_rel > 0 m/s continuously change their position relative to the camera and thus their position in the image.
  • The calculated course of this movement in the image is referred to below as a trajectory.
  • The same object is thus imaged, offset in time, in different image regions.
  • The object may be a single object, such as an oncoming vehicle, or repetitive objects, such as lane markings.
  • Along the predicted trajectory, a number of measurement windows are set, which the object passes through at a suitable read-out frequency and/or a suitable number of averaged measurements at different times.
  • In the case of rectilinear ego-motion of the camera, the trajectories of static objects in the imaging region of the camera run along straight lines that extend radially from the vanishing point of the image.
  • In the case of non-rectilinear camera motion, actual or expected, an advantageous embodiment of the invention calculates the trajectory of objects from environmental features and/or the current camera motion and/or the installation position of the camera.
  • An object trajectory, in particular that of a static object, can be calculated from the estimated lane parameters such as curvature and offset, from the mounting position of the camera (its height and its pitch, yaw and roll angles), and from the intrinsic camera parameters (focal length, principal point, etc.). If no lane parameters are available because the system cannot estimate a lane due to contamination or missing road markings, the curvature can be determined from the steering angle or from a lateral-acceleration/yaw-rate sensor.
  • The distance of the objects recorded in a measurement window to the camera is determined. The visibility range is calculated from an image contrast of a measurement window, or of parts of a measurement window, and the associated distance. At least two contrast measurements at different distances are used for the evaluation.
  • the image contrast is preferably determined over an entire measurement window in order to avoid computation-intensive object recognition.
  • In an advantageous exemplary embodiment, the trajectories of lane markings in the image are predicted and the measurement windows are set along these trajectories. From a contrast measurement in the measurement windows, or an averaging over several measurements, the visibility range can be determined. Due to the periodically recurring markings, it is not necessary for a specific marking section to be imaged in every measurement window, so that no computation-intensive object recognition is necessary in this exemplary embodiment either.
  • In an advantageous embodiment of the invention, the distance between the objects recorded in the measurement windows and the camera is estimated from the position of the measurement window in the calibrated camera image. This method has the advantage that no complex stereo method or additional distance-measuring sensor is needed.
  • An advantageous embodiment of the invention is a motor vehicle with a camera for determining the visibility range and for observing the environment, e.g. for lane detection and/or object recognition, with a view in the direction of travel or into the rear area.
  • Fig. 1: Determination of the column c of a measurement window placed in a row y with lateral offset x.
  • Fig. 2: Determination of the distance d of a point to the camera.
  • The camera is aligned to the vehicle environment and is also used for environmental detection.
  • The vehicle moves on a straight line on average over time.
  • Static objects in the environment thus move in the imaging range of the camera along straight lines that extend radially from the vanishing point of the image. This of course also applies to the median strip and the side boundaries of a road, to the course of the roadway, and to oncoming vehicles.
  • If measurement windows are positioned along the radially running straight lines, then under ideal conditions, i.e. no attenuation or soiling, identical contrasts are expected in the corresponding measurement windows, e.g. in the imaging area of the median strip or the roadway.
  • To determine the size and position of the measurement windows in the image, the roadway can be subdivided into areas of equidistant width. The size and position of the measurement windows in the image result from the imaging model of the camera.
  • FIG. 1 shows the simplest case of a downwardly directed camera.
  • The column c of a measurement window placed in a row y with lateral offset x is calculated as c = x · f / (d · η), where d is the distance of the point from the camera, η the pixel size and f the camera focal length.
  • Fig. 2 illustrates the determination of the distance d of a point to the camera. The distance is determined from h, the camera installation height, α, the camera pitch angle, y, the image row of the point, η, the pixel size, and f, the camera focal length, as d = h · (1 − t · tan α) / (t + tan α), with t = y · η / f.
  • The visibility range can be determined as follows. Below a contrast threshold S, the human eye cannot distinguish two gray values; as a rule, a contrast of about 5% is sufficient. If a reference object with known contrast is present, the visibility range can be calculated from the contrast difference under different weather conditions according to C = C0 · exp(−K · d).
  • Here C0 and C are the known and the measured contrast of a reference measured at an attenuation K.
  • d is the distance between the camera and the reference.
  • The contrast is calculated from the quotient of the difference and the sum of the foreground intensity I0 and the background intensity Ib: C = (I0 − Ib) / (I0 + Ib).
  • If C1 and C2 are available from two contrast measurements, e.g. the averaged contrast in two measurement windows along the radials or the averaged contrast of a lane marking, at the distances d1 and d2, the attenuation follows as K = ln(C1 / C2) / (d2 − d1), and the visibility range is the distance at which the contrast falls below the threshold S.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Method for determining a visibility range using a camera that is preferably provided for detection of the surroundings as well. A prerequisite for the application of the method is that the relative velocity vrel of the camera and at least one recorded object in the surroundings is not equal to zero. A trajectory of at least one object where vrel is not equal to zero is predicted and a plurality of measurement windows are placed along the trajectory. The distance between the objects imaged in the measurement windows and the camera is determined and the visibility range is ascertained from an image contrast of a measurement window or parts of a measurement window and the associated distance. At least two contrast measurements at different distances are used for evaluation.

Description

Method for determining the visibility range with a camera

The present invention relates to a method for determining the visibility range. This invention finds application, for example, in motor vehicles. Camera sensors are increasingly used here for environmental detection, in particular for lane detection and/or object recognition, with a view in the direction of travel or into the rear area. The camera sensors are usually mounted inside the vehicle behind the windshield. Error-free operation, however, is only possible if the visibility range is known. For example, at visibility ranges below 40 m it is unnecessary to place measurement windows at greater distances. Common methods of visibility determination rely on transmission or reflection measurements. In a transmission measurement, the visibility range is calculated from the attenuation between the intensity emitted by a known source and the intensity measured at a fixed distance. In a reflection measurement, the visibility range is determined from the intensity of the scattered light. However, both methods have the disadvantage of requiring a reference of known intensity at a known distance, which is not available in a moving vehicle due to the dynamic environment.
The object of the invention is to determine the visibility range cost-effectively with a camera that is also used for other applications, in particular for environmental detection.
This object is achieved by a method according to claim 1. The dependent claims show advantageous embodiments and further developments of the invention.
According to the invention, the object is achieved by a passive camera-based approach. For this purpose, a camera is provided which is aligned to the vehicle environment. An advantage of this arrangement is that the camera can be used both for determining the visibility range and for environmental detection. Images are read out from the camera at a given rate, e.g. 25 frames/s. The recorded images are evaluated with regard to determining the visibility range. Thus, no additional artificial reference images or reference objects are required, so that no further costs and/or adjustment effort arise for the user. For the application of the method, the relative speed v_rel of the camera and at least one recorded object in the environment must be non-zero. In a motor-vehicle application, for example, this is ensured by the vehicle's own speed v_vehicle > 0 m/s. The relative speed to static objects in the vehicle environment is then v_rel > 0 m/s. Objects with v_rel > 0 m/s continuously change their position relative to the camera and thus their position in the image.
The calculated course of this movement in the image is referred to below as a trajectory. The same object is thus imaged, offset in time, in different image regions. The object may be, for example, a single object such as an oncoming vehicle, or repetitive objects such as lane markings. Along the predicted trajectory, a number of measurement windows are set, which the object passes through at a suitable read-out frequency and/or a suitable number of averaged measurements at different times. In the case of rectilinear ego-motion of the camera, the trajectories of static objects in the imaging region of the camera run along straight lines that extend radially from the vanishing point of the image. In the case of non-rectilinear camera motion, actual or expected, an advantageous embodiment of the invention calculates the trajectory of objects from environmental features and/or the current camera motion and/or the installation position of the camera. For example, an object trajectory, in particular that of a static object, can be calculated from the estimated lane parameters such as curvature and offset, from the mounting position of the camera (its height and its pitch, yaw and roll angles), and from the intrinsic camera parameters (focal length, principal point, etc.). If no lane parameters are available because the system cannot estimate a lane due to contamination or missing road markings, the curvature can be determined from the steering angle or from a lateral-acceleration/yaw-rate sensor.
The distance of the objects recorded in a measurement window to the camera is determined. The visibility range is calculated from the image contrast of a measurement window, or of parts of a measurement window, and the associated distance. At least two contrast measurements at different distances are used for the evaluation. The image contrast is preferably determined over an entire measurement window in order to avoid computation-intensive object recognition. In an advantageous exemplary embodiment, the trajectories of lane markings in the image are predicted and the measurement windows are set along these trajectories.
From a contrast measurement in the measurement windows, or an averaging over several measurements, the visibility range can be determined. Due to the periodically recurring markings, it is not necessary for a specific marking section to be imaged in every measurement window, so that no computation-intensive object recognition is necessary in this exemplary embodiment either. In an advantageous embodiment of the invention, the distance between the objects recorded in the measurement windows and the camera is estimated from the position of the measurement window in the calibrated camera image. This method has the advantage that no complex stereo method or additional distance-measuring sensor is needed.
Preferably, if a value for the current visibility range is known, the position of the measurement windows is adapted to the visibility range. For example, at visibility ranges below 60 m it is unnecessary to place measurement windows at greater distances. An advantageous embodiment of the invention is a motor vehicle with a camera for determining the visibility range and for observing the environment, e.g. for lane detection and/or object recognition, with a view in the direction of travel or into the rear area.
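The per-window contrast measurement described here can be illustrated with a short sketch that computes a Michelson-style contrast over an entire window, so that no object recognition is needed. NumPy, the min/max intensity estimate, and all names are illustrative assumptions, not part of the patent:

```python
import numpy as np

def window_contrast(image: np.ndarray, row: int, col: int, h: int, w: int) -> float:
    """Contrast of a measurement window, computed over the whole window
    so that no per-object segmentation is required."""
    win = image[row:row + h, col:col + w].astype(float)
    i_fg = win.max()  # foreground intensity I0 (assumed: brightest pixels, e.g. a lane marking)
    i_bg = win.min()  # background intensity Ib (assumed: darkest pixels, e.g. the road surface)
    if i_fg + i_bg == 0:
        return 0.0
    return (i_fg - i_bg) / (i_fg + i_bg)

# Example: synthetic 8-bit image with a bright marking on dark asphalt
img = np.full((100, 100), 40, dtype=np.uint8)
img[40:60, 45:55] = 200  # marking patch
print(round(window_contrast(img, 30, 40, 40, 30), 3))  # (200-40)/(200+40) -> 0.667
```

In a full system this function would be evaluated in each measurement window along the predicted trajectory, and the results averaged over several frames as the text suggests.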
Further advantages and features of the invention are explained in more detail by way of an exemplary embodiment and two figures.
The figures show:
Fig. 1: Determination of the column c of a measurement window placed in a row y with lateral offset x.
Fig. 2: Determination of the distance d of a point to the camera.
The exemplary embodiment describes a motor vehicle with a camera for determining the visibility range. The camera is aligned to the vehicle environment and is also used for environmental detection. The vehicle moves on a straight line on average over time. Static objects in the environment thus move in the imaging range of the camera along straight lines that extend radially from the vanishing point of the image. This of course also applies to the median strip and the side boundaries of a road, to the course of the roadway, and to oncoming vehicles. If measurement windows are positioned along the radially running straight lines, then under ideal conditions, i.e. no attenuation or soiling, identical contrasts are expected in the corresponding measurement windows, e.g. in the imaging area of the median strip or the roadway.
To determine the size and position of the measurement windows in the image, the roadway can be subdivided into areas of equidistant width. The size and position of the measurement windows in the image result from the imaging model of the camera. Figure 1 shows the simplest case of a downward-directed camera. The column c of a measurement window placed in a row y with lateral offset x is calculated as

c = x · f / (d · η).

Here d is the distance of the point from the camera, c the image column of the point, η the pixel size and f the camera focal length. Figure 2 illustrates the determination of the distance d of a point to the camera. The distance is determined from h, the camera installation height, α, the camera pitch angle, y, the image row of the point, η, the pixel size, and f, the camera focal length, as

d = h · (1 − t · tan α) / (t + tan α), with t = y · η / f.
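The projection geometry just described, column c = x·f/(d·η) and distance d = h·(1 − t·tan α)/(t + tan α) with t = y·η/f, can be sketched in code. This is a hedged reconstruction of the garbled formulas; the parameter values and the inversion used to place windows at chosen distances are illustrative assumptions:

```python
import math

def point_distance(y: float, h: float, alpha: float, eta: float, f: float) -> float:
    """Distance d of a ground point from its image row y (row measured from the
    principal point, positive downward), camera height h, pitch alpha (rad),
    pixel size eta, focal length f."""
    t = y * eta / f
    return h * (1.0 - t * math.tan(alpha)) / (t + math.tan(alpha))

def window_column(x: float, d: float, eta: float, f: float) -> float:
    """Image column c of a measurement window with lateral offset x at distance d."""
    return x * f / (d * eta)

def row_for_distance(d: float, h: float, alpha: float, eta: float, f: float) -> float:
    """Inverse of point_distance: image row at which ground distance d appears."""
    ta = math.tan(alpha)
    t = (h - d * ta) / (d + h * ta)
    return t * f / eta

# Illustrative parameters (assumed): h = 1.2 m, pitch 0.05 rad,
# pixel size 10 um, focal length 8 mm
h, alpha, eta, f = 1.2, 0.05, 1e-5, 8e-3
for d in (5.0, 10.0, 20.0):
    y = row_for_distance(d, h, alpha, eta, f)
    print(d, round(y, 1), round(point_distance(y, h, alpha, eta, f), 6))
```

The round trip row_for_distance → point_distance recovers the chosen distances, which is how the equidistant roadway areas of the text would be mapped to window positions in the image.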
From the distance of the objects imaged in a measurement window to the camera and the contrast, which is determined e.g. over the entire measurement window, the visibility range can be determined as follows.
Below a contrast threshold S, it is not possible for the human eye to distinguish two gray values. As a rule, a contrast of about 5% is sufficient. For the visibility determination, the point at which the contrast falls below the threshold S must therefore be found. If a reference object with known contrast is present, the visibility range can be calculated from the contrast difference under different weather conditions according to

C = C0 · exp(−K · d).

Here C0 and C are the known and the measured contrast of a reference measured at an attenuation K, and d is the distance between the camera and the reference. The contrast is calculated from the quotient of the difference and the sum of the foreground intensity I0 and the background intensity Ib:

C = (I0 − Ib) / (I0 + Ib).

If C1 and C2 are available from two contrast measurements, e.g. the averaged contrast in two measurement windows along the radials or the averaged contrast of a lane marking, at the distances d1 and d2, the attenuation is obtained as

K = ln(C1 / C2) / (d2 − d1),

and the visibility range is the distance at which the contrast falls below the threshold S.
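Combining the attenuation model C = C0·exp(−K·d) with two contrast measurements yields a visibility estimate. The final step of solving for the distance at which the contrast reaches the ~5% threshold S is an assumed combination of the stated equations (the original formula is garbled in this extraction), and all names and values are illustrative:

```python
import math

def visibility_range(c1: float, c2: float, d1: float, d2: float, s: float = 0.05) -> float:
    """Visibility range from two contrast measurements c1, c2 at distances d1 < d2.
    Model: C(d) = C0 * exp(-K*d); threshold s ~ 5% (limit of the human eye)."""
    k = math.log(c1 / c2) / (d2 - d1)  # attenuation coefficient K
    c0 = c1 * math.exp(k * d1)         # intrinsic contrast, back-extrapolated to d = 0
    return math.log(c0 / s) / k        # solve s = c0 * exp(-K * d_vis) for d_vis

# Example: contrast 0.6 measured at 10 m drops to 0.3 at 30 m
print(round(visibility_range(0.6, 0.3, 10.0, 30.0), 1))  # -> 81.7
```

In practice c1 and c2 would be contrasts averaged over several measurement windows along the radial trajectories, as the description suggests, rather than single-frame values.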

Claims

Patent claims
1) Method for determining the visibility range with a camera, the camera being provided in particular for environmental detection in a motor vehicle, characterized in that
o at least one object is detected for which the relative speed (v_rel) of the camera and the recorded object is non-zero,
o a trajectory is predicted for at least this object and a number of measurement windows are set along the trajectory,
o the distance of the objects imaged in the measurement windows to the camera is determined, and
o the visibility range is determined from an image contrast of a measurement window, or parts of a measurement window, and the associated distance, wherein at least two contrast measurements at different distances are used for the evaluation.
2) Method according to claim 1, characterized in that the image contrast is averaged over time over an entire measurement window.
3) Method according to one of the preceding claims, characterized in that the trajectories of static objects, in particular lane markings or guardrails, or of dynamic objects, in particular objects moving on the roadway, are predicted in the image and the measurement windows are set along these trajectories.
4) Method according to one of the preceding claims, characterized in that the distance of the objects recorded in the measurement windows to the camera is determined from the position of the measurement window in the calibrated camera image.
5) Method according to one of the preceding claims, characterized in that, when a value for the visibility range is already available, the position of the measurement windows is adapted to this value.
6) Motor vehicle with a camera, wherein image data regarding the visibility range in the vehicle surroundings are evaluated with a method according to any one of claims 1-4.
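The claims above combine two steps: obtaining the camera-to-object distance from the measurement window's position in a calibrated image (claim 4), and estimating the visibility range from at least two contrast measurements taken at different distances (claim 1). A minimal sketch of how these steps could fit together is given below; it is not the patented implementation — the flat-road pinhole camera model, the Koschmieder-type exponential contrast attenuation C(d) = C0·exp(-k·d), the 5% contrast threshold of the meteorological visibility definition, and all function names and numbers are illustrative assumptions.

```python
import math


def distance_from_row(y_px, y_horizon_px, focal_px, cam_height_m):
    """Flat-road pinhole model (assumption, not from the patent):
    distance to the road point imaged at pixel row y_px, which must
    lie below the horizon row of a calibrated camera."""
    dy = y_px - y_horizon_px
    if dy <= 0:
        raise ValueError("measurement window must lie below the horizon")
    return cam_height_m * focal_px / dy


def visibility_from_contrasts(measurements, threshold=0.05):
    """Estimate the visibility range from >= 2 (distance, contrast) pairs.

    Assumes Koschmieder-type attenuation C(d) = C0 * exp(-k * d).
    The extinction coefficient k is fitted by least squares on the
    log-contrast values; the visibility range is the distance at which
    contrast falls to `threshold` (5% = common meteorological definition).
    """
    if len(measurements) < 2:
        raise ValueError("need at least two contrast measurements")
    ds = [d for d, _ in measurements]
    lcs = [math.log(c) for _, c in measurements]
    n = len(ds)
    d_mean = sum(ds) / n
    lc_mean = sum(lcs) / n
    # slope of ln(C) over d is -k
    num = sum((d - d_mean) * (lc - lc_mean) for d, lc in zip(ds, lcs))
    den = sum((d - d_mean) ** 2 for d in ds)
    k = -num / den
    if k <= 0:
        return float("inf")  # no measurable attenuation
    return -math.log(threshold) / k


# Illustrative use: two measurement windows along a predicted trajectory.
d1 = distance_from_row(432, y_horizon_px=400, focal_px=800.0, cam_height_m=1.2)
d2 = distance_from_row(416, y_horizon_px=400, focal_px=800.0, cam_height_m=1.2)
vis = visibility_from_contrasts([(d1, 0.40), (d2, 0.25)])
```

With more than two windows along the trajectory, the least-squares fit over all (distance, contrast) pairs naturally averages out per-window contrast noise, which is one plausible reason the claims require "at least two" measurements rather than exactly two.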
PCT/DE2006/000782 2005-07-27 2006-05-06 Method for determining a visibility range using a camera WO2007012300A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112006001119T DE112006001119A5 (en) 2005-07-27 2006-05-06 Method of determining visibility with a camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102005035810.1 2005-07-27
DE102005035810A DE102005035810A1 (en) 2005-07-27 2005-07-27 Method of determining visibility with a camera

Publications (1)

Publication Number Publication Date
WO2007012300A1 true WO2007012300A1 (en) 2007-02-01

Family

ID=36646192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2006/000782 WO2007012300A1 (en) 2005-07-27 2006-05-06 Method for determining a visibility range using a camera

Country Status (2)

Country Link
DE (2) DE102005035810A1 (en)
WO (1) WO2007012300A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2050644A1 (en) * 2007-10-18 2009-04-22 Renault S.A.S. Methods for measuring the visibility of an automobile driver and calculating speed instructions for the vehicle, and method for implementing same

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
JP4784452B2 (en) * 2006-09-12 2011-10-05 株式会社デンソー In-vehicle fog determination device
DE102011105074A1 (en) 2011-06-21 2011-12-22 Daimler Ag Method for determining visual range for vehicle, involves determining surrounding of vehicle by camera, where contrast dimensions are determined for object depicted in images, where visual range is determined from contrast dimensions
EP3215807B1 (en) 2014-11-06 2020-04-15 Gentex Corporation System and method for visibility range detection
US10803570B2 (en) 2018-05-10 2020-10-13 Eagle Technology, Llc Method and system for a measure of visibility from a single daytime image

Citations (3)

Publication number Priority date Publication date Assignee Title
US5987152A (en) * 1994-07-06 1999-11-16 Volkswagen Ag Method for measuring visibility from a moving vehicle
EP1067399A2 (en) * 1999-06-24 2001-01-10 Robert Bosch Gmbh Method of visibility determination
DE10034461A1 (en) * 2000-07-15 2002-01-31 Bosch Gmbh Robert Procedure for determining visibility

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US4921349A (en) * 1987-06-30 1990-05-01 Sonoma Technology Photographic method for monitoring visibility through measurement of transmittance and path radiance
DE10303046A1 (en) * 2003-01-24 2004-10-21 Daimlerchrysler Ag Quantitative estimation of visibility in motor vehicle, by using e.g. measurement of sharpness or contrast of image obtained from intensity differences of adjacent pixels


Non-Patent Citations (1)

Title
POMERLEAU D: "Visibility estimation from a moving vehicle using the RALPH vision system", INTELLIGENT TRANSPORTATION SYSTEM, 1997. ITSC '97., IEEE CONFERENCE ON BOSTON, MA, USA 9-12 NOV. 1997, NEW YORK, NY, USA,IEEE, US, 9 November 1997 (1997-11-09), pages 906 - 911, XP010270909, ISBN: 0-7803-4269-0 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
EP2050644A1 (en) * 2007-10-18 2009-04-22 Renault S.A.S. Methods for measuring the visibility of an automobile driver and calculating speed instructions for the vehicle, and method for implementing same
FR2922506A1 (en) * 2007-10-18 2009-04-24 Renault Sas METHODS FOR MEASURING THE VISIBILITY DISTANCE OF A MOTOR VEHICLE DRIVER AND CALCULATING A VEHICLE SPEED SET, AND SYSTEMS FOR THEIR IMPLEMENTATION

Also Published As

Publication number Publication date
DE102005035810A1 (en) 2007-02-01
DE112006001119A5 (en) 2008-02-07

Similar Documents

Publication Publication Date Title
DE112006001291B4 (en) Method of detecting soiling on a transparent pane
DE102007011616B4 (en) Vehicle environment monitoring device
DE102009006113B4 (en) Device and method for sensor fusion with dynamic objects
EP2057581B1 (en) Detection and categorization of light spots using a camera in a vehicle environment
DE102018203807A1 (en) Method and device for detecting and evaluating road conditions and weather-related environmental influences
EP2788245B1 (en) Method and device for locating a predefined parking position
DE102017221691A1 (en) Method and device for self-localization of a vehicle
EP2033165B1 (en) Method for picking up a traffic space
DE102018104243B3 (en) Method and system for detecting parking spaces suitable for a vehicle
WO2017017077A1 (en) Method for identifying an object in a surrounding region of a motor vehicle, driver assistance system and motor vehicle
EP2707862B1 (en) Distance measurement by means of a camera sensor
EP3671546A1 (en) Method and system for determining landmarks in an environment of a vehicle
EP1787847B1 (en) Driver assistance system comprising distance to obstacle detection
DE102015209147A1 (en) Method for parking area detection
DE102007025147B4 (en) Lane departure warning and / or lane departure warning system
WO2007012300A1 (en) Method for determining a visibility range using a camera
DE102006044615A1 (en) Image capturing device calibrating method for vehicle, involves locating objects in vehicle surrounding based on image information detected by image capturing device, and evaluating geometrical relations of objects
DE102008017833A1 (en) A method of operating an image pickup device and an image pickup device
EP1944212B1 (en) Method and device for recognising potentially dangerous objects for a vehicle
EP2562685B1 (en) Method and device for classifying a light object located in front of a vehicle
DE102018003784A1 (en) Method for determining a range of an environmental sensor for a vehicle
DE102007013501B4 (en) Driver assistance system with differently oriented cameras
WO2013026599A1 (en) Method and device for detecting disturbing objects in the surrounding air of a vehicle
EP1962245B1 (en) Method and device for detecting the movement state of objects
DE102018213994A1 (en) Method and system for determining the movement of a motor vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 1120060011197

Country of ref document: DE

REF Corresponds to

Ref document number: 112006001119

Country of ref document: DE

Date of ref document: 20080207

Kind code of ref document: P

122 Ep: pct application non-entry in european phase

Ref document number: 06722840

Country of ref document: EP

Kind code of ref document: A1