EP2622324A2 - Method and device for detecting fog at night - Google Patents

Method and device for detecting fog at night

Info

Publication number
EP2622324A2
EP2622324A2
Authority
EP
European Patent Office
Prior art keywords
image
light source
scene
camera
fog
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11779751.4A
Other languages
English (en)
French (fr)
Inventor
Romain Gallen
Aurélien CORD
Nicolas Hautiere
Didier Aubert
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institut Français des Sciences et Technologies des Transports, de l'Aménagement et des Réseaux
Original Assignee
Institut Français des Sciences et Technologies des Transports, de l'Aménagement et des Réseaux
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institut Français des Sciences et Technologies des Transports, de l'Aménagement et des Réseaux
Publication of EP2622324A2

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 - Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/47 - Scattering, i.e. diffuse reflection
    • G01N 21/49 - Scattering, i.e. diffuse reflection within a body or fluid
    • G01N 21/53 - Scattering, i.e. diffuse reflection within a body or fluid within a flowing fluid, e.g. smoke
    • G01N 21/538 - Scattering within a flowing fluid, e.g. smoke, for determining atmospheric attenuation and visibility
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/584 - Recognition of vehicle lights or traffic lights
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60Q - ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 1/00 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q 1/02 - Devices primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q 1/04 - Such devices being headlights
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of such circuits
    • B60R 16/02 - Electric constitutive elements
    • B60R 16/023 - Electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • G01W - METEOROLOGY
    • G01W 1/00 - Meteorology
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • B60Q 2300/00 - Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q 2300/30 - Indexing codes relating to the vehicle environment
    • B60Q 2300/31 - Atmospheric conditions
    • B60Q 2300/312 - Adverse weather
    • G01N 2021/4704 - Angular selective
    • G01N 2021/4709 - Backscatter

Definitions

  • The invention relates to a method for detecting, at night, the presence of an element, such as fog, that disrupts the visibility of a scene, the scene being illuminated by one or more light sources.
  • The invention relates in particular to the detection of such a disturbing element when the scene is the one appearing in the field of vision of a vehicle driver, in particular of a road vehicle; by detecting such an element, it helps determine the driver's visibility distance, and makes it possible to adapt, automatically or not, the driving or the behavior of the vehicle to the visibility conditions.
  • the invention also relates to a computer program for implementing the method, a device for implementing this method, and finally a vehicle comprising such a device.
  • the invention thus has applications in the field of the automobile and, in particular, in the field of lighting and signaling of road vehicles, but also in the field of video surveillance.
  • One known approach uses a LIDAR (Light Detection and Ranging).
  • Another device, proposed by document US6853453, uses targets arranged in the studied scene to detect the presence of a disturbing element. This method requires targets to be present in the scene, which is very restrictive, and impossible for mobile applications.
  • Another device proposed by the document JP11278182, is based on the identification and characterization of the luminous halo appearing around the taillights of a vehicle appearing in the scene studied. The disadvantage of this system is that it does not work when no vehicle light appears in the scene.
  • The objective of the invention is therefore to remedy the shortcomings of these various devices, and to propose a method for detecting, at night, the presence of an element disturbing the visibility of a scene illuminated by at least one light source; said element belonging to the group comprising fog, smoke, rain and snow;
  • the method comprising a step a) in which at least one image of the scene is acquired by means of a camera, said at least one light source being fixed or pivotable with respect to the camera;
  • the method combines two complementary techniques to detect the presence of the disturbing element:
  • In an illuminated environment (typically, an urban environment), the techniques analyzing the backscattering of the vehicle's lighting work very badly, because the intensity of the backscattered lighting is weak and thus hard to identify among the different radiations reflected towards the camera lens.
  • In such an environment, on the other hand, the halo analysis techniques are effective.
  • The image of the scene largely depends on how the scene is lit by the fixed or pivoting light source(s).
  • By exploiting properties, known in advance, of the light beam produced by the fixed or pivoting source(s), it is possible to effectively implement the backscattering analysis of the emitted light, so as to detect the presence of the disturbing element.
  • The light source identified during step b1) is an active component that actually emits light, generally from electrical energy, and is not limited to re-emitting radiation received elsewhere. It is therefore not a passive component, whose role is only to reflect and/or diffuse the light it receives, and which would most of the time be very difficult to discern at night.
  • each of steps b1), b2), and c) can be performed from a single image, or possibly from a few smoothed successive images, acquired sufficiently rapidly to represent substantially the same view of the studied scene (possibly after a slight computer registration).
  • the processing constituted by steps b1), b2) and / or c) mentioned above requires only a substantially instantaneous acquisition.
  • the information of presence of a disturbing element can advantageously be obtained very quickly, for example as soon as the vehicle enters a fog sheet, which is of considerable interest from the point of view of driving safety.
  • Steps b1), b2), and c) can be performed just as well with a monochrome camera as with a color camera, the color information being optional for the implementation of the invention.
  • the indication provided by the device according to the invention as to the presence of a disturbing element can be binary or digital, such as a percentage in a range between 0% and 100%.
  • the light source which illuminates the scene studied is most often fixed with respect to the camera.
  • the light source and the camera are fixed on the vehicle.
  • The light source is then typically a lamp of the vehicle; in many cases, its focus (the radiation-emitting part) does not appear in the image of the camera.
  • The light source can be any light source fixed relative to the ground, for example a street lamp.
  • the light source is pivotable, or orientable, relative to the camera.
  • the rotating light sources may be the steerable headlights of the vehicle.
  • The method may further advantageously have one or more of the following features:
  • In step a), a plurality of images can be acquired, in which case the image of the scene is an average of said plurality of images.
  • Step b2) is preferably a step during which the luminous intensity decrease in the halo is characterized for the various halos studied.
  • Step b2) can comprise the intermediate step:
  • the image zones corresponding to the light sources detected in step b1) are segmented so as to identify the light sources one by one rather than as groups.
  • Step b2) can comprise the following intermediate step
  • the presence of a disturbing element is detected by identifying a center of the light source in said image portion.
  • Step b2) can comprise the intermediate step:
  • the presence of a disturbing element is detected by analyzing an intensity profile along a segment traced in said at least one image portion containing a light source.
  • This segment (the analysis segment) can be determined by interpolating the centers of the light source, calculated for different values of the extraction threshold of the halo surrounding the light source.
  • Step c) can comprise the intermediate step:
  • c1) at least one reference image is provided, produced either by a camera or by an optical simulation calculation.
  • Step c) can comprise the intermediate step:
  • A second object of the invention is to propose a computer program comprising instructions for implementing a method for detecting, at night, the presence of an element disturbing the visibility of a scene illuminated by at least one light source; said element belonging to the group comprising fog, smoke, rain and snow;
  • The term computer includes any type of computer, including the on-board computers of vehicles.
  • the invention also provides a computer-readable recording medium on which a computer program as described above is recorded.
  • A third object of the invention is to provide a device for detecting, at night, the presence of an element disturbing the visibility of a scene illuminated by at least one light source; said element belonging to the group comprising fog, smoke, rain and snow;
  • the device comprising a camera capable of acquiring at least one image of the scene, and calculation means able to execute a program for processing the images supplied by the camera; said at least one light source being a fixed or rotatable light source with respect to the camera;
  • the calculation means are able to execute the image processing program provided by the camera so as to perform the following operations:
  • The processing program may be able to perform the following operation:
  • detecting the presence of a disturbing element by analyzing an intensity profile along a segment drawn in said at least one image portion.
  • This segment can in particular be determined by interpolating the centers of the light source, calculated for different values of the extraction threshold of the halo surrounding the light source.
  • the calculation means may be able to execute the processing program so as to perform the following operation:
  • The invention also relates to a device for detecting the presence of an element disturbing the visibility of a scene illuminated by at least one light source, at night; said element belonging to the group comprising fog, smoke, rain and snow;
  • the device comprising:
  • a camera capable of acquiring at least one image of the scene, said at least one light source being a fixed or pivoting light source with respect to the camera;
  • b2) means for detecting the presence of a disturbing element, as a function of the halo appearing in said at least one image in the vicinity of said at least one light source; c) means for detecting, in said at least one image or in a part thereof, the presence of a disturbing element, as a function of the backscattering of the light emitted by said on-board light source(s);
  • The invention finally relates to a vehicle, in particular a road vehicle, whose driving is made safer because it is able to detect the presence of an element disrupting the visibility, at night, of a scene illuminated by at least one light source on board the vehicle; said element belonging to the group comprising fog, smoke, rain and snow.
  • Such a vehicle is characterized in that it incorporates a device as described above.
  • The term "fog" will be used to refer to the disturbing element, it being understood that the disturbing element may also be rain, snow, smoke, etc.
  • FIG. 1 is a schematic view of a road vehicle comprising a device according to the invention
  • FIG. 2 is a diagram showing the steps of the method according to the invention, in one embodiment
  • FIGS. 3A and 3B are simplified images of a night scene, respectively in dry weather and in foggy weather;
  • FIGS. 4A, 4B and 4C represent the same portion of the image of FIG. 3B, but for different values of an extraction threshold serving to extract the different zones respectively corresponding to the different light sources of this image portion; ;
  • FIGS. 5A and 5B show an image portion in which a light source appears, respectively in foggy weather and in dry weather;
  • FIGS. 6A and 6B show the evolution curves of the light intensity along segments shown in FIGS. 5A and 5B;
  • FIG. 7 is a reference image, representative of the image obtained or likely to be obtained in an unlit environment, in foggy weather.
  • A vehicle 100 comprising a fog detection device 110 according to the invention is illustrated in FIG. 1.
  • This vehicle comprises headlights 105 as on-board light sources. When these headlights are on, they illuminate the road in front of the vehicle. The road is illuminated further by other light sources, namely lampposts 107.
  • the device 110 includes an on-board camera 120, and an on-board computer 130.
  • the computer is provided to execute a fog detection program, from the images provided by the camera 120, in the case of a night ride.
  • This computer 130 comprises a read-only memory, which constitutes a recording medium within the meaning of the invention, and in which a program within the meaning of the invention is recorded.
  • the information obtained as to the presence of fog is transmitted to the driver, and / or used by the on-board computer 130 to control other equipment of the vehicle 100.
  • the method implemented by the on-board computer 130 through a computer program comprises three phases (FIG. 2).
  • In a first phase a), one or more images of the scene are acquired by means of the camera.
  • An image is usually sufficient.
  • From several images, one can also produce an image that is their average; a single image (the initial image) is thus obtained.
  • The first processing b) is fog detection by halo analysis.
  • In a first step b1) of this first processing, the light sources visible in the scene, that is to say appearing in the initial image, are detected.
  • the pixels illuminated by the different light sources are identified and grouped into groups of pixels called 'halos' and corresponding to the different light sources.
  • Pixels included in the halo or halos of light sources are identified by the fact that they have a higher light intensity than their environment.
  • By a thresholding operation applied to the initial image, the various halos appearing in the image are identified; these are considered to represent the light sources visible in the scene.
  • For example, an extraction threshold of between 80% and 99% of the maximum intensity perceptible by the camera can be chosen.
  • a halo may possibly encompass several light sources, if they are close (in the picture).
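  • The thresholding and pixel grouping just described can be sketched as follows; this is a minimal illustration assuming the image is a NumPy array of intensities, with a 4-connected flood fill standing in for whatever grouping the actual implementation uses (the function name `extract_halos` is ours, not the patent's):

```python
import numpy as np
from collections import deque

def extract_halos(img, rel_threshold=0.9):
    """Binarise the image at a fraction of its maximum intensity and group
    the bright pixels into 4-connected components ("halos")."""
    mask = img >= rel_threshold * img.max()
    labels = np.zeros(img.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue                       # pixel already assigned to a halo
        current += 1
        queue = deque([seed])
        labels[seed] = current
        while queue:                       # flood fill of one connected halo
            y, x = queue.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    queue.append((ny, nx))
    return labels, current
```

Two well-separated bright blobs then yield two distinct halo labels, while blobs that touch (as happens in fog) are merged into one, which is exactly the situation FIGS. 3A and 3B illustrate.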
  • FIGS. 3A and 3B illustrate the considerable differences that occur in the detection of light sources, depending on the fog, for the same scene:
  • Figure 3A shows the scene without fog. Only the illuminating portion of each lamppost is identified (bulb 108 and cover 109): a halo (and therefore a light source) is identified for each lamppost 107.
  • Figure 3B represents the same scene on a foggy night.
  • the parts of the image illuminated by the light of the lampposts merge and thus form only a single halo H, extending on both sides of the road.
  • The light source detection step therefore leads, at first, to identifying only one light source.
  • In a second step b2), the presence of fog is detected as a function of the halo(s) appearing in the image(s) in the vicinity of the light sources.
  • the halos identified in step b1) are analyzed; thanks to this analysis, the presence of fog is detected.
  • This halo analysis step comprises several operations.
  • In a first operation b21), the light sources identified in step b1) are segmented, so as to identify halos H1, H2, H3, each corresponding to a single light source, rather than a halo H corresponding in fact to a set of light sources.
  • several thresholding operations are performed on the image by varying the value of the extraction threshold, until the halos corresponding to distinct light sources have been separated.
  • Connected-component analysis algorithms can be used in particular.
  • FIGS. 4A to 4C show the halos obtained for different values of the extraction threshold.
  • FIG. 4A corresponds to a fairly low threshold: The different light sources form a single halo H.
  • FIG. 4B corresponds to an intermediate value.
  • FIG. 4C corresponds to a fairly high threshold, which makes it possible to segment the initial halo, and to distinguish within it the halos H1, H2, H3 corresponding to the three light sources actually present in this part of the image.
  • tests may be performed to eliminate false detections of light sources in the scene. These tests can be based on the complexity of the identified halo (which normally remains limited), on its size (elimination of the smallest light sources, considered as artifacts), etc.
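  • The idea behind operation b21), raising the extraction threshold until merged halos separate, can be illustrated on a one-dimensional intensity profile (a purely synthetic example; the real processing works on 2-D images with connected-component analysis):

```python
import numpy as np

def count_halos_1d(profile, threshold):
    """Count the contiguous runs of the profile lying above the threshold."""
    above = profile > threshold
    # a run starts wherever `above` switches from False to True
    starts = np.flatnonzero(above & ~np.concatenate(([False], above[:-1])))
    return len(starts)

# Two light sources whose halos overlap at low intensity:
x = np.arange(100, dtype=float)
profile = (np.exp(-((x - 35.0) ** 2) / 200.0)
           + np.exp(-((x - 65.0) ** 2) / 200.0))

low_threshold_count = count_halos_1d(profile, 0.2)   # halos merged into one
high_threshold_count = count_halos_1d(profile, 0.9)  # halos separated
```

At the low threshold the two overlapping halos are counted as a single source; at the high threshold they split into two, mirroring the transition between FIGS. 4A and 4C.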
  • In a second operation b22), an analysis segment is determined.
  • By "analysis segment" we designate a line segment considered in the image, along which an intensity profile analysis will be performed.
  • The analysis segment is determined through an iterative procedure. The proposed procedure starts from a tight halo H11 (FIG. 5A), obtained with a very high, and thus very selective, intensity extraction threshold.
  • The center of the halo may be: the center of gravity of the pixels of the light source; the center of gravity of these same pixels, weighted by the luminous intensity; the center of the ellipse optimally approximating the outer contour of the pixels of the light source; the center of the circle comprising the pixels of the light source; etc.
  • a line segment is drawn in the general direction of this series of centers, from the center corresponding to the smallest halo.
  • the analysis segment is oriented along the direction in which the center of the halo of the light source considered moves when the value of the halo extraction threshold is varied. It is calculated by interpolation of the position of the determined centers C11, C12, C13, or by a similar method.
  • the analysis segment thus corresponds to the direction in which the halo extends, which generally corresponds to the lighting direction of the light source considered.
  • FIG. 5A thus presents an analysis segment M defined from the centers C11, C12 and C13 of the halos H11, H12 and H13 corresponding to three different values of the extraction threshold.
  • Figure 5B shows a second analysis segment obtained for the same source, in fog-free weather.
  • In that case, only a single halo H11', corresponding to the area directly illuminated by the bulb, can be extracted.
  • The latter case, in which no preferred direction of extension of the halo can be identified, may correspond, for example, to a non-directional light source, as is often the case for the rear lights of road vehicles. If no preferential direction appears, one can preferably choose an analysis segment whose end points towards the center and the top of the image, so that the end of the segment is likely to correspond to a zone of sky in the analyzed image.
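  • Operation b22) can be sketched as follows: compute the intensity-weighted halo center for several extraction thresholds, and take the displacement between the loosest and the tightest center as the direction of the analysis segment. The synthetic halo below, a Gaussian clipped by the image border to make it asymmetric, is only an illustration of the geometry:

```python
import numpy as np

def halo_center(img, rel_threshold):
    """Intensity-weighted center of gravity of the pixels above the threshold."""
    ys, xs = np.nonzero(img >= rel_threshold * img.max())
    w = img[ys, xs]
    return np.array([ys @ w, xs @ w]) / w.sum()

# Synthetic asymmetric halo around (y=20, x=10), truncated by the left
# image border, so the center drifts as the threshold is loosened:
yy, xx = np.mgrid[0:40, 0:80]
img = np.exp(-((yy - 20.0) ** 2) / 50.0 - ((xx - 10.0) ** 2) / 800.0)

centers = np.array([halo_center(img, t) for t in (0.5, 0.7, 0.9)])
# The analysis segment starts at the tightest center and follows the drift:
direction = centers[0] - centers[-1]
```

The centers computed for the three thresholds play the role of C11, C12 and C13 in FIG. 5A; a line fitted through them (here reduced to the end-to-end displacement) gives the analysis segment.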
  • In a third operation b23), for each light source, the variations of the light intensity profile along the analysis segment defined in step b22) are analyzed. Smoothing can optionally be applied to the intensity profile curve to reduce noise, for example by convolution with a smoothing window, or by computing an average or median curve over a window sliding along the curve, etc.
  • the intensity profile variations are calculated for each of the light sources identified in the scene.
  • a fog presence index is calculated. This can be based on, for example but not exclusively: the width at half height; the width at different heights; the inverse of the slope at different heights; etc. Note that for each of these indicators, the higher the value, the greater the probability of fog.
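  • One of the indicators listed above, the width of the intensity profile at half height, can be computed as in the sketch below (the two profiles are synthetic: in fog the saturation plateau is followed by a slow decay, in clear weather by a sharp step):

```python
import numpy as np

def half_height_width(profile):
    """Width-at-half-height fog indicator: the wider the halo profile,
    the higher the probability of fog."""
    return int(np.count_nonzero(profile >= 0.5 * profile.max()))

x = np.arange(60, dtype=float)
# plateau, then a sharp step (clear weather, cf. FIG. 6B):
clear = np.where(x < 10, 255.0, 0.0)
# plateau, then a slow exponential decay (fog, cf. FIG. 6A):
foggy = np.minimum(255.0, 255.0 * np.exp(-(x - 10.0) / 15.0))
```

`half_height_width(foggy)` exceeds `half_height_width(clear)`, matching the behaviour of the curves in FIGS. 6A and 6B.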
  • FIGS. 6A and 6B illustrate the results obtained, respectively in foggy weather and in fog-free weather.
  • the luminous intensity is indicated on the ordinate, and the distance expressed in pixels, on the abscissa. These curves therefore correspond to FIGS. 5A and 5B respectively.
  • In foggy weather (FIG. 6A), the luminous intensity curve comprises a first plateau portion, for which the pixels are saturated (CSAT), then a progressive downward slope to a zero intensity value.
  • In fog-free weather (FIG. 6B), the luminous intensity curve also includes a first plateau portion, but almost no progressive slope: the plateau is followed by a stair step in which the intensity drops sharply. It can therefore easily be verified that, for the various criteria previously mentioned (width at half height or at several heights, etc.), the curves of FIGS. 6A and 6B provide significantly different results.
  • an overall fog presence index is calculated. For this we aggregate the different indices obtained for each of the light sources. This aggregation or combination of indices can take into account the indices coming not from a single initial image, but from a set of initial images acquired in step a.
  • the aggregate index may for example be an average or median value of the various presence indices; it can be obtained by rank filtering, with different levels; it can still be a linear combination, weighted according to a criterion for example related to the surface or the shape of the halo of the light sources; etc.
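  • The aggregation options above can be sketched as follows (the function and parameter names are illustrative, not the patent's):

```python
import numpy as np

def aggregate_indices(indices, weights=None, method="median"):
    """Combine the per-source fog-presence indices into one overall index.

    method="median"   : rank-filter style aggregation, robust to outliers
    method="mean"     : plain average of the per-source indices
    method="weighted" : linear combination, e.g. weighted by halo area
    """
    indices = np.asarray(indices, dtype=float)
    if method == "median":
        return float(np.median(indices))
    if method == "mean":
        return float(np.mean(indices))
    if method == "weighted":
        w = np.asarray(weights, dtype=float)
        return float(indices @ w / w.sum())
    raise ValueError(f"unknown method: {method}")
```

For instance, a median over `[0.2, 0.8, 0.9]` discards the single low outlier, while the weighted variant lets a large halo dominate the overall index.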
  • The first fog detection process can be concluded by a decision step, that is to say a step during which it is decided whether or not fog is present, in order to obtain a binary indicator.
  • This decision is made by comparing the overall index of presence of fog with a predefined minimum threshold, in relation to usual thresholds, such as those recommended in the French standard NF-P-99-320 on road weather.
  • The second processing c) of the second phase is fog detection by backscattering analysis.
  • This processing is based on the principle that, in case of fog, a portion of the radiation of the on-board light sources (in this case, the headlights of the vehicle) is backscattered towards the camera.
  • In a first step c1), one or more reference images of the scene are produced (FIG. 7). These images represent the scene as it can be observed in case of fog. If the scene is fixed (the CCTV case), the reference image(s) may include shapes of elements actually present in the scene. On the other hand, if the scene is variable (the case of a device on board a vehicle), the reference images are based only on the backscattering of the light emitted by the on-board light sources.
  • FIG. 7 thus represents a reference image that can be used for a motor vehicle with two headlights.
  • This is a grayscale (or at least monochrome) image, showing only the variations in light intensity produced in the image by the two beams of the headlights.
  • Curves have been shown arbitrarily in FIG. 7 to separate the image portions of different brightnesses. The curves appearing in FIG. 7 are therefore iso-luminosity curves. These curves, and more generally the brightness distribution in the reference image, are characteristic of the illumination produced by the fixed or rotating light sources (the headlights of the vehicle), in association with defined fog conditions.
  • a picture acquired by the camera, or a computed synthetic image can be used as reference image.
  • any rendering method known to provide a relatively realistic rendering of the scene may be used, provided that it is able to take into account the backscattering of the light in the fog.
  • The synthetic image must be calculated in such a way as to represent, as realistically as possible, the image that the camera could acquire in the presence of fog.
  • the positions and lighting properties of the various light sources on board the vehicle and illuminating the scene must be taken into account.
  • reference images can be used to represent the scene for different atmospheric conditions: more or less dense fog, but also rain, snow, etc.
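  • Purely as an illustration of what such a reference image might look like for a two-headlight vehicle (FIG. 7 is the authoritative example), one can generate a synthetic brightness pattern whose overall level grows with the assumed fog density; every shape parameter below is an arbitrary assumption, not taken from the patent:

```python
import numpy as np

def synthetic_reference(h=120, w=160, density=1.0):
    """Illustrative stand-in for a backscatter reference image: two smooth
    intensity lobes in the lower half of the frame, one per headlight,
    whose brightness scales with the assumed fog density."""
    yy, xx = np.mgrid[0:h, 0:w]
    left = np.exp(-((xx - w * 0.3) ** 2) / (2 * (w / 8) ** 2)
                  - ((yy - h) ** 2) / (2 * (h / 3) ** 2))
    right = np.exp(-((xx - w * 0.7) ** 2) / (2 * (w / 8) ** 2)
                   - ((yy - h) ** 2) / (2 * (h / 3) ** 2))
    return density * (left + right)
```

Thresholding such a pattern at a few levels would reproduce iso-luminosity curves of the kind drawn in FIG. 7.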
  • the second and last step c2 of the second treatment c is a comparison step.
  • the image or the images acquired by the camera are compared with the reference image or images retained in step c1.
  • several images provided by the camera can be aggregated to provide a single initial image, which is then compared to the reference image(s).
  • each of the pixels can be calculated either as an average of the starting images, or by means of a rank filter (for example a median filter) applied to a series of successive images, or by another calculation method.
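  • The rank-filter option can be sketched with a pixelwise median over a few successive frames, which suppresses transient outliers such as a passing glare (the frames below are synthetic, for illustration only):

```python
import numpy as np

# Three successive frames of a (nearly) static scene:
frames = np.stack([
    np.full((4, 4), 100.0),
    np.full((4, 4), 102.0),
    np.full((4, 4), 98.0),
])
frames[1, 0, 0] = 255.0            # transient glare in the middle frame

# A pixelwise median (a rank filter) is robust to such outliers,
# unlike the pixelwise mean:
initial = np.median(frames, axis=0)
```

At the corrupted pixel the median keeps the typical value while the mean would be pulled up by the glare.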
  • The comparison between the images from the camera and the reference images is made using image correlation methods. Various methods can be used.
  • The comparison method can thus be based on the sum of absolute differences (SAD), the zero-mean sum of absolute differences (ZSAD), the sum of squared differences (SSD), the zero-mean normalized sum of squared differences (ZNSSD), etc., or on a method quantifying the difference between two images by a distance-type function.
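  • Two of the listed measures, SAD and ZNSSD, can be written as follows; the zero-mean normalisation of ZNSSD makes it insensitive to global brightness offsets, which SAD is not (a sketch, not the patent's exact formulation):

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences: sensitive to any intensity change."""
    return float(np.abs(a - b).sum())

def znssd(a, b):
    """Zero-mean normalised sum of squared differences: (near) zero for two
    images that differ only by a global brightness offset and gain."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(((a - b) ** 2).sum())
```

A uniform +10 brightness shift leaves ZNSSD at essentially zero while SAD grows linearly with the number of pixels, which matters when comparing a camera image against a reference image acquired under different exposure.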
  • This comparison can be made on the entire image, or on one or more image portions, for example the lower half of the image (in which the illumination produced by the headlights is the most marked).
  • This comparison provides one or more indices of the presence of fog in the scene.
  • An overall index of fog presence can then be calculated by aggregating the different indices. To calculate this aggregate, especially when the indices are obtained by comparing a single camera image with several reference images corresponding to different fog densities, it is possible, for example, to choose the presence index having the highest value: it should correspond to the reference image whose fog density is close to the actual fog density in the scene at the instant of observation. In this way a characterization of the fog is obtained.
  • The overall index of the presence of fog, like the indices provided directly by the comparisons performed, has a value that increases as the probability of the presence of fog increases. If necessary, a binary fog indicator (with/without fog) can then be established by comparing the overall index with a predetermined threshold.
  • the second phase of the method according to the invention is thus completed by obtaining two families of fog presence indices;
  • Within each family, these indices can be aggregated to provide a single overall index, which can be binary or not.
  • the fog presence indices are weighted using a decision criterion and a final index is calculated.
  • As a decision criterion one can choose, for example, the number of light sources identified during step b21: if at least two sources are identified, the overall index at the end of treatment b) is selected as indicating the presence or absence of fog. If fewer than two sources are observed, the presence index retained may be the overall index resulting from treatment c). Of course, other criteria may be retained while remaining within the scope of the invention.

EP11779751.4A 2010-09-28 2011-09-28 Verfahren und vorrichtung zur erkennung von nebel in der nacht Withdrawn EP2622324A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1057802A FR2965354B1 (fr) 2010-09-28 2010-09-28 Procede et dispositif de detection de brouillard, la nuit
PCT/FR2011/052262 WO2012042171A2 (fr) 2010-09-28 2011-09-28 Procede et dispositif de detection de brouillard, la nuit

Publications (1)

Publication Number Publication Date
EP2622324A2 true EP2622324A2 (de) 2013-08-07

Family

ID=43432007

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11779751.4A Withdrawn EP2622324A2 (de) 2010-09-28 2011-09-28 Verfahren und vorrichtung zur erkennung von nebel in der nacht

Country Status (6)

Country Link
US (1) US9171216B2 (de)
EP (1) EP2622324A2 (de)
JP (1) JP5952822B2 (de)
KR (1) KR20130138791A (de)
FR (1) FR2965354B1 (de)
WO (1) WO2012042171A2 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6125900B2 (ja) * 2013-05-16 2017-05-10 株式会社小糸製作所 リアフォグランプ制御装置及びリアフォグランプシステム
WO2014207592A2 (en) * 2013-06-26 2014-12-31 Koninklijke Philips N.V. An apparatus and method employing sensor-based luminaires to detect areas of reduced visibility and their direction of movement
US9514373B2 (en) * 2013-08-28 2016-12-06 Gentex Corporation Imaging system and method for fog detection
JP6299720B2 (ja) * 2015-10-02 2018-03-28 トヨタ自動車株式会社 物体認識装置及び煙判定方法
DE102016213059A1 (de) * 2016-07-18 2018-01-18 Robert Bosch Gmbh Verfahren und Steuergerät zum Verarbeiten eines zumindest einen Lichthof repräsentierenden Bildes sowie Bildaufnahmesystem
FR3079614B1 (fr) 2018-03-30 2024-04-12 Syscience Procede et dispositif de mesure des conditions de visibilite

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040143380A1 (en) * 2002-08-21 2004-07-22 Stam Joseph S. Image acquisition and processing methods for automatic vehicular exterior lighting control
JP2008267837A (ja) * 2007-04-16 2008-11-06 Toyota Motor Corp 車両の排気状態検出装置
EP2195688A2 (de) * 2007-08-30 2010-06-16 Valeo Schalter und Sensoren GmbH Verfahren und system zur wetterbedingungsdetektion mit auf bildern basierender strassencharakterisierung

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0643587U (ja) * 1992-11-12 1994-06-10 コーア株式会社 霧の濃度の測定装置
US5796094A (en) * 1993-02-26 1998-08-18 Donnelly Corporation Vehicle headlight control using imaging sensor
JP3232502B2 (ja) * 1996-06-11 2001-11-26 株式会社日立製作所 霧監視システム
JPH11278182A (ja) * 1998-03-31 1999-10-12 Nissan Motor Co Ltd 車両用霧状況検出装置
US6853453B2 (en) 1999-03-12 2005-02-08 Regents Of The University Of Minnesota Video camera-based visibility measurement system
ITTO20020950A1 (it) * 2002-11-05 2004-05-06 Fiat Ricerche Sistema di visione integrato multifunzionale, con matrice
FR2884637B1 (fr) 2005-04-19 2007-06-29 Valeo Vision Sa Procede de detection de brouillard nocturne et systeme de mise en oeuvre de ce procede
JP4730267B2 (ja) 2006-07-04 2011-07-20 株式会社デンソー 車両用視界状況判定装置
US8023760B1 (en) * 2007-12-06 2011-09-20 The United States Of America As Represented By The Secretary Of The Navy System and method for enhancing low-visibility imagery
US8254635B2 (en) * 2007-12-06 2012-08-28 Gideon Stein Bundling of driver assistance systems
WO2010057170A1 (en) * 2008-11-17 2010-05-20 Cernium Corporation Analytics-modulated coding of surveillance video

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2012042171A2 *

Also Published As

Publication number Publication date
WO2012042171A2 (fr) 2012-04-05
WO2012042171A3 (fr) 2013-07-18
US20140029790A1 (en) 2014-01-30
FR2965354A1 (fr) 2012-03-30
KR20130138791A (ko) 2013-12-19
FR2965354B1 (fr) 2012-10-12
JP2013546038A (ja) 2013-12-26
US9171216B2 (en) 2015-10-27
JP5952822B2 (ja) 2016-07-13

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130325

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

R17D Deferred search report published (corrected)

Effective date: 20130718

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20190314

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20190725