FR3120258A1 - Procédé de commande d'un dispositif d'éclairage automobile - Google Patents

Procédé de commande d'un dispositif d'éclairage automobile (Method for controlling an automotive lighting device)

Info

Publication number
FR3120258A1
FR3120258A1 (application FR2101885A)
Authority
FR
France
Prior art keywords
reliable
objects
contours
luminance
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
FR2101885A
Other languages
English (en)
Inventor
Yasser Almehio
Hafid El Idrissi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Valeo Vision SAS
Original Assignee
Valeo Vision SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Vision SAS filed Critical Valeo Vision SAS
Priority to FR2101885A priority Critical patent/FR3120258A1/fr
Priority to FR2107801A priority patent/FR3120257B3/fr
Priority to EP22707428.3A priority patent/EP4298867A1/fr
Priority to US18/547,706 priority patent/US20240130025A1/en
Priority to PCT/EP2022/054441 priority patent/WO2022180054A1/fr
Publication of FR3120258A1 publication Critical patent/FR3120258A1/fr
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/0017Devices integrating an element dedicated to another function
    • B60Q1/0023Devices integrating an element dedicated to another function the element being a sensor, e.g. distance sensor, camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • B60Q1/14Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights having dimming means
    • B60Q1/1415Dimming circuits
    • B60Q1/1423Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic
    • B60Q1/143Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic combined with another condition, e.g. using vehicle recognition from camera images or activation of wipers
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21SNON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • F21S41/10Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by the light source
    • F21S41/14Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by the light source characterised by the type of light source
    • F21S41/141Light emitting diodes [LED]
    • F21S41/151Light emitting diodes [LED] arranged in one or more lines
    • F21S41/153Light emitting diodes [LED] arranged in one or more lines arranged in a matrix
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21SNON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • F21S41/10Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by the light source
    • F21S41/14Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by the light source characterised by the type of light source
    • F21S41/16Laser light sources
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21SNON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • F21S41/60Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution
    • F21S41/65Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution by acting on light sources
    • F21S41/663Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution by acting on light sources by switching light sources
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21SNON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • F21S41/60Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution
    • F21S41/67Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution by acting on reflectors
    • F21S41/675Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution by acting on reflectors by moving reflectors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2300/00Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q2300/40Indexing codes relating to other road users or special conditions
    • B60Q2300/45Special conditions, e.g. pedestrians, road signs or potential dangers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2800/00Features related to particular types of vehicles not otherwise provided for
    • B60Q2800/10Autonomous vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Mechanical Engineering (AREA)
  • Mathematical Physics (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The invention provides a method for controlling an automotive lighting device, the method comprising the steps of projecting a first light pattern, capturing an image of a region in front of the lighting device, obtaining a luminance map from the captured image, identifying objects in the luminance map and classifying them as reliable or non-reliable according to at least one reliability criterion, and modifying the first light pattern to modify the luminous intensity in at least one zone intended to project light on a non-reliable object. Figure for the abstract: Figure 3

Description

Method for controlling an automotive lighting device
This invention is related to the field of automotive lighting devices, and more particularly, to the control thereof in low visibility conditions.
Automotive luminous devices comprise light sources, so that the lighting device may provide light, either for lighting and/or for signalling. Several families of light sources are used nowadays, each of them having advantages and disadvantages.
In some low visibility scenarios, more light is needed so that the user of the vehicle (or the sensor in the event of an autonomous driving situation) may clearly identify any person or object that is on the road, so that the user may react properly.
However, this light addition may introduce other visibility problems, such as self-glaring or dazzling to other vehicles.
A solution for this problem is therefore sought.
The invention provides a solution for these problems by means of a method for controlling an automotive lighting device, the method comprising the steps of
  • projecting a first light pattern;
  • capturing an image of a region in front of the lighting device;
  • obtaining a luminance map from the captured image;
  • identifying objects in the luminance map and classifying them as reliable or non-reliable, according to at least one reliability criterion; and
  • modifying the first light pattern to modify the luminous intensity in at least one zone intended to project light on a non-reliable object.
With such a method, the lighting device provides a way of improving the visibility of the zone in front of the vehicle by identifying the visibility problem of each object. The visibility may be adversely affected by low visibility conditions and/or by excessively reflective surfaces, such as a wet road.
Both a sensor for autonomous driving and the human eye benefit from this method through a better identification of the objects located in the path of the vehicle.
In some particular embodiments, the step of capturing the image is carried out by a luminance sensor, so that the luminance map is directly obtained when the image is captured.
A luminance camera is expensive, but it offers a luminance map directly, without the need to convert the original image into a luminance map. Such a conversion is also possible, according to known methods, such as the one shown in Bellia, L., Spada, G., Pedace, A., & Fragliasso, F. (2015). Methods to Evaluate Lighting Quality in Educational Environments. Energy Procedia, 78, 3138-3143. doi:10.1016/j.egypro.2015.11.770.
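By way of illustration (this sketch is not part of the patent), a luminance map may be approximated from a standard camera image using the Rec. 709 relative-luminance weighting — a common simplification of the calibrated photometric methods the description points to:

```python
import numpy as np

def luminance_map(rgb):
    """Approximate relative luminance from a linear RGB image of shape (H, W, 3).

    Uses the Rec. 709 channel weights -- a common stand-in; the patent
    itself only refers to published photometric conversion methods."""
    weights = np.array([0.2126, 0.7152, 0.0722])
    return rgb @ weights  # weighted sum over the colour axis -> (H, W)

# A pure-white pixel maps to luminance 1.0, a pure-red pixel to ~0.21.
frame = np.zeros((2, 2, 3))
frame[0, 0] = [1.0, 1.0, 1.0]   # white
frame[0, 1] = [1.0, 0.0, 0.0]   # red
lum = luminance_map(frame)
```

The weights assume linear (non-gamma-encoded) RGB; a real pipeline would first linearize the camera output.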
In some particular embodiments, the method further comprises the step of sub-classifying the non-reliable objects into dark objects or overexposed objects.
Dark objects would need an increase in the luminous intensity employed to light them, while overexposed objects would need less light so that the shape of the object may be more clearly appreciated. The method of the invention detects these two types of non-recognizable objects so that it may adapt the first light pattern to each situation. In some particular embodiments, the step of modifying the first light pattern includes increasing the light in the zone of a dark object and/or decreasing the light in an overexposed object.
In some particular embodiments
  • the method further comprises the step of identifying contours before the step of identifying objects;
  • the step of identifying objects comprises grouping contours into sets, so that each set of contours is defined as an object; and
  • each reliability criterion includes choosing one reliability feature and comparing the value of this reliability feature in each contour with the sum of the mean value plus one standard deviation of this reliability feature over the whole set of contours.
In these particular cases, the reliability criterion is based on some features of the contours of the objects. The objects are identified by first identifying the contours and then grouping the contours into objects by their proximity or continuity. Some features may be easily quantified in the contours, and the evaluation of these features is used to check the reliability of the object.
In these cases, the criterion is related to the relative quantification of the features with respect to the value of these features in the rest of the contours. A statistical distribution of the quantified features defines a mean value and a standard deviation for each feature. Those contours which have a value higher than the sum of said mean value plus said standard deviation are considered to be reliable.
The contour detection may be performed by an available method, such as the Laplacian operator or the difference of Gaussians.
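As a sketch of one such available method (a plain-numpy discrete Laplacian; kernel and threshold are illustrative, not taken from the patent), contour pixels can be marked where the Laplacian response is large:

```python
import numpy as np

def laplacian_edges(lum, thresh=0.5):
    """Edge map from a luminance image via a discrete Laplacian kernel --
    one of the classic contour detectors the description mentions.
    Sketch only; a production pipeline would use an optimized convolution."""
    k = np.array([[0,  1, 0],
                  [1, -4, 1],
                  [0,  1, 0]], dtype=float)
    h, w = lum.shape
    out = np.zeros_like(lum)
    for i in range(1, h - 1):          # skip the 1-pixel border
        for j in range(1, w - 1):
            out[i, j] = np.sum(k * lum[i-1:i+2, j-1:j+2])
    return np.abs(out) > thresh

# A bright square on a dark background: the response fires on the
# border of the square and stays silent in its flat interior.
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0
edges = laplacian_edges(img)
```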
In some particular embodiments, the reliability feature comprises at least one of the shape, the size and the contrast of said contours.
The reliability may be measured in n dimensions. In some cases, the reliability is accepted when the values of two or more features fall within the reliability zone defined by the mean value and the standard deviation. The shape, the size and the contrast of the contours are magnitudes which are easily quantifiable in the luminance map, so the statistical distribution of the data may be easily built.
In some particular embodiments, the step of identifying the objects is carried out by dividing the luminance map according to contours, and evaluating each contour according to the reliability criterion.
In some particular embodiments, the step of modifying the first light pattern comprises
  • evaluating the luminance in a non-reliable object; and
  • if the luminance in a first contour of the non-reliable object is lower than a predetermined threshold, increasing the luminous intensity in a zone of the first light pattern corresponding to the first contour, with an intensity proportional to the luminance of that region.
This is the way to react to a non-reliable object when the sensor is a sensor for autonomous driving and the object has a low visibility. The luminance provided by the luminance map for each contour is chosen as the criterion for a correct visibility. Since each sensor defines the predetermined luminance threshold used to discern whether an object, or part of an object, can be properly identified by said sensor, the method identifies which zones should be further illuminated by increasing the luminous intensity of the lighting device.
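The threshold rule above can be sketched as follows; the function name, the linear deficit-based gain and its parameters are illustrative assumptions, since the patent only states that the increase is proportional:

```python
def adjusted_intensity(current, luminance, threshold, gain=1.0):
    """Per-zone correction: when a contour's measured luminance falls
    below the sensor's predetermined threshold, raise the pixel drive in
    proportion to the deficit (one plausible reading of the
    'proportional' rule; all names here are illustrative)."""
    if luminance >= threshold:
        return current          # zone already visible enough: no change
    deficit = threshold - luminance
    return current + gain * deficit

# A dark contour (luminance 0.1, threshold 0.3) gets a proportional boost.
zone = adjusted_intensity(current=0.4, luminance=0.1, threshold=0.3, gain=2.0)
```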
In some particular embodiments, the method further comprises a step of filtering the luminance map after the step of obtaining the luminance map.
With this filtering step, most noisy features are eliminated from the image, so that it may be more clearly appreciated.
In some particular embodiments, the step of filtering comprises performing a luminous intensity histogram equalization.
This histogram equalization is useful to obtain richer luminous information about the objects involved in the image.
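A minimal global histogram-equalization sketch for an 8-bit luminance channel (plain numpy; the patent does not prescribe an implementation, and a local variant such as the one it illustrates would work tile by tile):

```python
import numpy as np

def equalize(channel, levels=256):
    """Global histogram equalization of an 8-bit channel: map each value
    through the normalized cumulative histogram, spreading a narrow band
    of intensities across the full range."""
    hist = np.bincount(channel.ravel(), minlength=levels)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Lookup table; only entries for values present in the image are used.
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * (levels - 1))
    return lut.astype(np.uint8)[channel]

# Values crowded in a dark band get spread across the full 0..255 range.
dark = np.array([[10, 10, 12], [12, 14, 14]], dtype=np.uint8)
eq = equalize(dark)
```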
In some particular embodiments, the step of filtering comprises performing a contrast sensitivity function suitable to discern if a contour is seen by the human eye or not.
A contrast sensitivity function is a function intended to identify the contrast of the contours, to discern whether they may be seen by the human eye or not. There are different examples of contrast sensitivity functions that may be used in these embodiments, so that a contour (and hence the object associated with this contour) may be classified as reliable or non-reliable.
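As an example of such a function (the patent names none; the Mannos-Sakrison model is one published CSF, and the visibility floor below is an illustrative assumption):

```python
import math

def csf_mannos_sakrison(f):
    """Mannos-Sakrison contrast sensitivity model. f is spatial frequency
    in cycles/degree; the output is a relative sensitivity that peaks at
    mid frequencies and falls off at both extremes."""
    return 2.6 * (0.0192 + 0.114 * f) * math.exp(-((0.114 * f) ** 1.1))

def visible_to_eye(contrast, f, floor=0.5):
    """A contour is deemed human-visible when its contrast, weighted by
    the CSF response at its spatial frequency, clears an (illustrative)
    sensitivity floor."""
    return contrast * csf_mannos_sakrison(f) >= floor
```

A high-contrast contour at mid frequency passes; a faint, very fine contour does not — the reliable / non-reliable split the embodiment describes.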
In some particular embodiments, an object is classified as non-reliable according to a machine learning process. In more particular embodiments, the machine learning process comprises training the lighting device to perform the step of classifying the objects as reliable or non-reliable, by providing a training dataset of reliable and non-reliable objects. In more particular embodiments, the machine learning process comprises the use of an activation function from at least one of Softmax, ReLU, LeakyReLU, Sigmoid or Tanh.
In a second inventive aspect, the invention provides an automotive lighting device comprising
  • a plurality of light sources;
  • a control unit configured to selectively control the activation of the plurality of light sources; and
  • a camera configured to acquire images from the exterior of the lighting device;
  • wherein the control unit is configured to carry out a method according to the first inventive aspect.
In some particular embodiments, the control unit comprises at least part of a convolutional neural network, wherein the convolutional neural network comprises a convolutional layer, a pooling layer and a support vector machine layer, the support vector machine layer being configured to classify the descriptors exiting from the convolutional neural network in order to optimize some weights used in the network.
Unless otherwise defined, all terms (including technical and scientific terms) used herein are to be interpreted as is customary in the art. It will be further understood that terms in common usage should also be interpreted as is customary in the relevant art and not in an idealised or overly formal sense unless expressly so defined herein.
In this text, the term “comprises” and its derivations (such as “comprising”, etc.) should not be understood in an excluding sense, that is, these terms should not be interpreted as excluding the possibility that what is described and defined may include further elements, steps, etc.
To complete the description and in order to provide for a better understanding of the invention, a set of drawings is provided. Said drawings form an integral part of the description and illustrate an embodiment of the invention, which should not be interpreted as restricting the scope of the invention, but just as an example of how the invention can be carried out. The drawings comprise the following figures:
shows an external view of an automotive luminous device according to the invention.
shows a first luminance map as calculated by the software.
shows an example of a local histogram equalization.
shows the statistical distribution of the length and contrast values of the contours.
The example embodiments are described in sufficient detail to enable those of ordinary skill in the art to embody and implement the systems and processes herein described. It is important to understand that embodiments can be provided in many alternate forms and should not be construed as limited to the examples set forth herein.
Accordingly, while embodiments can be modified in various ways and take on various alternative forms, specific embodiments thereof are shown in the drawings and described in detail below as examples. There is no intent to limit the invention to the particular forms disclosed. On the contrary, all modifications, equivalents, and alternatives falling within the scope of the appended claims should be included.
shows a general perspective view of an automotive lighting device according to the invention.
This headlamp 1 is installed in an automotive vehicle 100 and comprises
  • a matrix arrangement of LEDs 2, intended to provide a light pattern;
  • a control unit 3 to perform a control of the operation of the LEDs 2; and
  • a camera 4 intended to provide some external data.
This matrix configuration is a high-resolution module, having a resolution greater than 2000 pixels. However, no restriction is attached to the technology used for producing the projection modules.
A first example of this matrix configuration comprises a monolithic source. This monolithic source comprises a matrix of monolithic electroluminescent elements arranged in several columns by several rows. In a monolithic matrix, the electroluminescent elements can be grown from a common substrate and are electrically connected to be selectively activatable either individually or by a subset of electroluminescent elements. The substrate may be predominantly made of a semiconductor material. The substrate may comprise one or more other materials, for example non-semiconductors (metals and insulators). Thus, each electroluminescent element/group can form a light pixel and can therefore emit light when its/their material is supplied with electricity. The configuration of such a monolithic matrix allows the arrangement of selectively activatable pixels very close to each other, compared to conventional light-emitting diodes intended to be soldered to printed circuit boards. The monolithic matrix may comprise electroluminescent elements whose main dimension of height, measured perpendicularly to the common substrate, is substantially equal to one micrometre.
The monolithic matrix is coupled to the control centre so as to control the generation and/or the projection of a pixelated light beam by the matrix arrangement. The control centre is thus able to individually control the light emission of each pixel of the matrix arrangement.
Alternatively to what has been presented above, the matrix arrangement may comprise a main light source coupled to a matrix of mirrors. Thus, the pixelated light source is formed by the assembly of at least one main light source formed of at least one light emitting diode emitting light and an array of optoelectronic elements, for example a matrix of micro-mirrors, also known by the acronym DMD, for "Digital Micro-mirror Device", which directs the light rays from the main light source by reflection to a projection optical element. Where appropriate, an auxiliary optical element can collect the rays of at least one light source to focus and direct them to the surface of the micro-mirror array.
Each micro-mirror can pivot between two fixed positions, a first position in which the light rays are reflected towards the projection optical element, and a second position in which the light rays are reflected in a different direction from the projection optical element. The two fixed positions are oriented in the same manner for all the micro-mirrors and form, with respect to a reference plane supporting the matrix of micro-mirrors, a characteristic angle of the matrix of micro-mirrors defined in its specifications. Such an angle is generally less than 20° and is usually about 12°. Thus, each micro-mirror reflecting a part of the light beams which are incident on the matrix of micro-mirrors forms an elementary emitter of the pixelated light source. The actuation and control of the change of position of the mirrors, for selectively activating this elementary emitter to emit or not an elementary light beam, is controlled by the control centre.
In different embodiments, the matrix arrangement may comprise a scanning laser system wherein a laser light source emits a laser beam towards a scanning element which is configured to explore the surface of a wavelength converter with the laser beam. An image of this surface is captured by the projection optical element.
The exploration of the scanning element may be performed at a speed sufficiently high so that the human eye does not perceive any displacement in the projected image.
The synchronized control of the switching-on of the laser source and of the scanning movement of the beam makes it possible to generate a matrix of elementary emitters that can be activated selectively at the surface of the wavelength converter element. The scanning means may be a mobile micro-mirror for scanning the surface of the wavelength converter element by reflection of the laser beam. The micro-mirrors mentioned as scanning means are, for example, of the MEMS type ("Micro-Electro-Mechanical Systems"). However, the invention is not limited to such a scanning means and can use other kinds of scanning means, such as a series of mirrors arranged on a rotating element, the rotation of the element causing a scanning of the transmission surface by the laser beam.
In another variant, the light source may be complex and include both at least one segment of light elements, such as light emitting diodes, and a surface portion of a monolithic light source.
A particular embodiment of the method of the invention comprises that a lighting device first projects a first light pattern, to light the road that is ahead. The automotive vehicle comprises sensors for autonomous driving, so the images must provide all necessary information for the sensors to detect and identify every object which is in the surroundings of the vehicle.
Then, the camera captures an image of this region in front of the automotive vehicle. This image is acquired and sent to a control unit, which obtains a luminance map from the acquired image. This luminance map could be obtained directly by a luminance camera in different embodiments, but it is less expensive to use a standard camera and then calculate the luminance map by software.
shows a first luminance map 5 as calculated by the software. In this figure, the luminance map 5 provides the method with a plurality of contours 6. These contours 6 are analysed to evaluate some of their features. Length and contrast are two features which will be very useful for the rest of this particular embodiment of the method.
In some particular embodiments, an intermediate step of histogram equalization is performed before the identification of contours, since it is particularly advantageous in this case, contributing to a better contour recognition.
shows an example of a local histogram equalization 7. Although in this figure it is shown from a real image, the histogram equalization may be also performed on the luminance map.
The histogram equalization allows a better recognition of contours by re-scaling the luminous intensity values in the figure. Since it is a very dark image, intensity values are confined to a narrow interval. By re-scaling these values, much more detail may be obtained from the original image. This can also be applied to luminance in a luminance map.
Returning to the method, once the contours have been identified, they are grouped by proximity to form objects. These objects are therefore formed by contours. The contours of an object may be classified as reliable or non-reliable. This classification originates in the ability of the sensors to clearly identify the object (a person, a tree, a dog, a fence, a traffic signal…). A criterion for reliability is used for this classification.
shows the statistical distribution of the length and contrast values of these contours. Every contour has a length and a contrast value. With this statistical distribution, the mean value and the standard deviation are calculated for both features (length and contrast). Those contours having a contrast higher than the mean contrast value plus one standard deviation are considered to be “visible”, and those contours having a length higher than the mean length value plus one standard deviation are considered to be “long”. Those contours which are “visible” and “long” are considered to be reliable 8. Those contours which are “non-visible” and “non-long” are considered to be non-reliable 9. Contours which are “visible” but “non-long”, or “non-visible” but “long”, will also be treated as non-reliable 10, but in a different manner.
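The two-feature split just described can be sketched as follows (function and label names are illustrative; the mean-plus-one-standard-deviation rule is the one stated in the text):

```python
import numpy as np

def classify_contours(lengths, contrasts):
    """Classify contours from their length and contrast values: a contour
    is 'long' / 'visible' when the feature exceeds the population mean
    plus one standard deviation; only contours that are both are fully
    reliable, the rest are non-reliable (the mixed cases being handled
    differently, per the description)."""
    lengths = np.asarray(lengths, dtype=float)
    contrasts = np.asarray(contrasts, dtype=float)
    long_ = lengths > lengths.mean() + lengths.std()
    visible = contrasts > contrasts.mean() + contrasts.std()
    labels = []
    for lo, vi in zip(long_, visible):
        if lo and vi:
            labels.append("reliable")
        elif not lo and not vi:
            labels.append("non-reliable")
        else:
            labels.append("non-reliable (mixed)")
    return labels

# Three short, faint contours and one long, high-contrast outlier:
labels = classify_contours([1, 1, 1, 10], [0.1, 0.1, 0.1, 0.9])
```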
Once the contours have been classified as reliable or non-reliable, a contour map is elaborated, where the contours of the non-reliable objects are identified.
The control unit has the information of which light pixels are in charge of lighting the non-reliable zones. When a non-reliable zone is detected, and the non-reliability is caused by a low lighting, the control unit controls the light pixels in charge of this zone to increase the luminous intensity so that the light provided to this dark zone is increased.
If the non-reliability comes from glaring, the control unit controls the light pixels in charge of this zone to decrease the luminous intensity.
This method may also be applied to non-autonomous driving. In this case, there is an additional filtering step based on a contrast sensitivity function, which, as described above, identifies the contrast of the contours to discern whether they may be seen by the human eye or not, so that a contour (and hence the object associated with this contour) may be classified as reliable or non-reliable.
In some cases, while training the system, a machine learning algorithm is used to improve the reliability ranking of objects. This machine learning process is carried out by a convolutional neural network.
Convolutional layers use filters to extract data from the acquired image, while pooling layers perform a sampling process over the data obtained in the convolutional layers. This network is fully-connected (FC), so every input neuron is connected to every neuron of the next layer.
While the activation function may be chosen among Softmax, ReLU, LeakyReLU, Sigmoid or Tanh, there is an advantage in replacing the last activation layer by a support vector machine, which learns how to minimize the error in an adaptive way. The support vector machine layer is also configured to classify the descriptors exiting from the convolutional neural network in order to optimize some weights used in the network.
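A toy sketch of the two named activations and the SVM-style final decision (the descriptor and weights below are illustrative, not learned from any training set mentioned in the patent):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: one of the activation choices listed."""
    return np.maximum(0.0, x)

def softmax(x):
    """Softmax: the activation the SVM head would replace."""
    e = np.exp(x - x.max())   # shift for numerical stability
    return e / e.sum()

def svm_head(descriptor, w, b):
    """Linear SVM decision in place of the final softmax layer: the sign
    of w.x + b separates reliable (+1) from non-reliable (-1) descriptors.
    The weights here are illustrative, not trained."""
    return 1 if float(w @ descriptor + b) >= 0 else -1

d = relu(np.array([-1.0, 2.0, 0.5]))                    # clipped descriptor
p = softmax(d)                                          # class probabilities
y = svm_head(d, w=np.array([1.0, 1.0, 1.0]), b=-1.0)    # SVM verdict
```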

Claims (15)

  1. Method for controlling an automotive lighting device (1), the method comprising the steps of:
    • projecting a first light pattern;
    • capturing an image of a region in front of the lighting device (1);
    • obtaining a luminance map (5) from the captured image;
    • identifying objects in the luminance map and classifying them as reliable or non-reliable, according to at least one reliability criterion; and
    • modifying the first light pattern to modify the luminous intensity in at least one zone intended to project light on a non-reliable object.
  2. Method according to claim 1, wherein the step of capturing the image is carried out by a luminance sensor, so that the luminance map is directly obtained when the image is captured.
  3. Method according to any of the preceding claims, further comprising the step of sub-classifying the non-reliable objects into dark objects or overexposed objects.
  4. Method according to claim 3, wherein the step of modifying the first light pattern includes increasing the light in the zone of a dark object and/or decreasing the light in an overexposed object.
  5. Method according to any of the preceding claims, wherein
    • the method further comprises the step of identifying contours (6) before the step of identifying objects;
    • the step of identifying objects comprises grouping contours (6) into sets, so that each set of contours is defined as an object; and
    • each reliability criterion includes choosing one reliability feature and comparing the value of this reliability feature in each contour with the sum of the mean value plus one standard deviation of this reliability feature over the whole set of contours.
  6. Method according to claim 5, wherein the reliability feature comprises at least one of the shape, the size and the contrast of said contours.
  7. Method according to any of claims 5 or 6, wherein the step of identifying the objects is carried out by dividing the luminance map (5) according to contours, and evaluating each contour according to the reliability criterion.
  8. Method according to any of claims 5 to 7, wherein the step of modifying the first light pattern comprises
    • evaluating the luminance in a non-reliable object; and
    • if the luminance in a first contour of the non-reliable object is lower than a predetermined threshold, increasing the luminous intensity in a zone of the first light pattern corresponding to the first contour, with an intensity proportional to the luminance of that region.
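One plausible reading of claim 8 scales the intensity boost with the luminance deficit of the dark contour. The proportionality constant `gain` and the function shape are illustrative assumptions:

```python
def adjust_intensity(current_intensity, contour_luminance, threshold, gain=0.5):
    """If the contour is darker than the threshold, raise the beam
    intensity in the corresponding zone proportionally to the deficit
    (one reading of claim 8); otherwise leave it unchanged."""
    if contour_luminance < threshold:
        return current_intensity + gain * (threshold - contour_luminance)
    return current_intensity
```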
  9. Method according to any of the preceding claims, further comprising a step of filtering the luminance map after the step of obtaining the luminance map.
  10. Method according to claim 9, wherein the step of filtering comprises performing a contrast sensitivity function suitable to discern if a contour is seen by the human eye or not.
  11. Method according to any of the preceding claims, wherein an object is classified as non-reliable according to a machine learning process.
  12. Method according to claim 11, wherein the machine learning process comprises training the lighting device to perform the step of classifying the objects as reliable or non-reliable, by providing a training dataset of reliable and non-reliable objects.
  13. Method according to claim 12, wherein the machine learning process comprises the use of an activation function from at least one of Softmax, ReLU, LeakyReLU, Sigmoid or Tanh.
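The activation functions listed in claim 13 have standard definitions; a minimal reference implementation:

```python
import math

def relu(x):
    """ReLU: zero for negative inputs, identity otherwise."""
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    """LeakyReLU: small slope alpha for negative inputs."""
    return x if x > 0 else alpha * x

def sigmoid(x):
    """Sigmoid: maps any real to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    """Softmax: turns a vector of scores into a probability distribution
    (max is subtracted first for numerical stability)."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]
```

(`tanh` is available directly as `math.tanh`.)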
  14. Automotive lighting device (1) comprising
    • a plurality of light sources (2); and
    • a control unit (3) configured to selectively control the activation of the plurality of light sources (2); and
    • a camera (4) configured to acquire images from the exterior of the lighting device (1);
    • wherein the control unit (3) is configured to carry out a method according to any of the preceding claims.
  15. Automotive lighting device according to claim 14, wherein the control unit comprises at least part of a convolutional neural network, wherein the convolutional neural network comprises a convolutional layer, a pooling layer and a support vector machine layer, the support vector machine layer being configured to classify the descriptors exiting from the convolutional neural network in order to optimize some of the weights used in the network.
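The claim-15 pipeline (convolutional layer, pooling layer, then a support-vector-machine head classifying the descriptors) can be sketched in plain Python. Kernel size, pooling window and the linear SVM weights are all illustrative assumptions:

```python
def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation of a single-channel image
    (both given as lists of rows) -- the convolutional layer."""
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(len(image[0]) - kw + 1)]
            for i in range(len(image) - kh + 1)]

def max_pool(fmap, size=2):
    """Non-overlapping max pooling over size x size windows."""
    return [[max(fmap[i + di][j + dj] for di in range(size) for dj in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]

def svm_decision(descriptor, weights, bias):
    """Linear SVM head: the sign of this score classifies the
    flattened descriptor exiting the network."""
    return sum(d * w for d, w in zip(descriptor, weights)) + bias
```

A 4x4 image of ones convolved with a 2x2 kernel of ones gives a 3x3 map of fours; pooling reduces it to a single descriptor value, which the SVM head then scores.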
FR2101885A 2021-02-26 2021-02-26 Method for controlling an automotive lighting device Pending FR3120258A1 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
FR2101885A FR3120258A1 (fr) 2021-02-26 2021-02-26 Method for controlling an automotive lighting device
FR2107801A FR3120257B3 (fr) 2021-02-26 2021-07-20 Method for controlling an automotive lighting device
EP22707428.3A EP4298867A1 (fr) 2021-02-26 2022-02-22 Method for controlling an automotive lighting device
US18/547,706 US20240130025A1 (en) 2021-02-26 2022-02-22 Method for controlling an automotive lighting device
PCT/EP2022/054441 WO2022180054A1 (fr) 2021-02-26 2022-02-22 Method for controlling an automotive lighting device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2101885A FR3120258A1 (fr) 2021-02-26 2021-02-26 Method for controlling an automotive lighting device
FR2101885 2021-02-26

Publications (1)

Publication Number Publication Date
FR3120258A1 true FR3120258A1 (fr) 2022-09-02

Family

ID=83050153

Family Applications (2)

Application Number Title Priority Date Filing Date
FR2101885A Pending FR3120258A1 (fr) 2021-02-26 2021-02-26 Method for controlling an automotive lighting device
FR2107801A Active FR3120257B3 (fr) 2021-02-26 2021-07-20 Method for controlling an automotive lighting device

Family Applications After (1)

Application Number Title Priority Date Filing Date
FR2107801A Active FR3120257B3 (fr) 2021-02-26 2021-07-20 Method for controlling an automotive lighting device

Country Status (1)

Country Link
FR (2) FR3120258A1 (fr)

Also Published As

Publication number Publication date
FR3120257A3 (fr) 2022-09-02
FR3120257B3 (fr) 2023-08-25

Similar Documents

Publication Publication Date Title
JP7369921B2 (ja) Object identification system, processing device, automobile, vehicle lamp, and classifier training method
CN105473393B (zh) Sensor mechanism for detecting control gestures on a vehicle
KR100682067B1 (ko) Image processing system for controlling vehicle headlamps or other vehicle equipment
CN114616489A (zh) Lidar image processing
US20130050710A1 (en) Object detecting device and information acquiring device
US8254632B2 (en) Detection of motor vehicle lights with a camera
JP2005534903A (ja) Light source detection and classification system for automatic control of vehicle exterior lighting, and manufacturing method
RU2691939C1 Headlight control system
CN113227838A (zh) Vehicle lamp and vehicle
US12047667B2 (en) Imaging device
KR20220139933A (ko) 자동차의 주변 모니터링 시스템
KR102481310B1 (ko) 자동차 헤드램프용 조명 장치
CN109969073A (zh) 车辆用灯具系统、车辆用灯具的控制装置以及控制方法
US11993201B2 (en) Method for controlling modules for projecting pixelated light beams for a vehicle
US20240130025A1 (en) Method for controlling an automotive lighting device
JP2020106376A (ja) Active sensor, object identification system, vehicle, and vehicle lamp
FR3120258A1 (fr) Method for controlling an automotive lighting device
US20100085580A1 (en) Optical Sensor
CN116918457 (zh) Method for controlling a lighting device of a motor vehicle
US20220207884A1 (en) Object recognition apparatus and object recognition program product
JP2019156276A (ja) Vehicle detection method and vehicle detection device
WO2022263683A1 (fr) Method for detecting an object on the road surface, autonomous driving method, and automotive lighting device
WO2022263684A1 (fr) Method for detecting an object on a driving surface, autonomous driving method, and automotive lighting device
WO2022263685A1 (fr) Method for detecting an object in a driving surface, autonomous driving method, and automotive lighting device
US11284050B1 (en) Calibration of camera and projector