US11993201B2 - Method for controlling modules for projecting pixelated light beams for a vehicle
- Publication number
- US11993201B2 (application US17/281,859; US201917281859A)
- Authority
- US
- United States
- Prior art keywords
- projection
- controlling
- road
- control device
- projection modules
- Prior art date
- Legal status (an assumption, not a legal conclusion): Active, expires
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/04—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
- B60Q1/06—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
- B60Q1/08—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/0017—Devices integrating an element dedicated to another function
- B60Q1/0023—Devices integrating an element dedicated to another function the element being a sensor, e.g. distance sensor, camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
- B60W40/11—Pitch movement
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators; Steering position determination; Steering aids
- B62D15/025—Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
- B62D15/0265—Automatic obstacle avoidance by steering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators; Steering position determination; Steering aids
- B62D15/029—Steering assistants using warnings or proposing actions to the driver without influencing the steering system
- B62D15/0295—Steering assistants using warnings or proposing actions to the driver without influencing the steering system by overlaying a vehicle path based on present steering angle over an image without processing that image
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q2300/00—Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
- B60Q2300/10—Indexing codes relating to particular vehicle conditions
- B60Q2300/13—Attitude of the vehicle body
- B60Q2300/136—Roll
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q2400/00—Special features or arrangements of exterior signal lamps for vehicles
- B60Q2400/50—Projected symbol or information, e.g. onto the road or car body
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/16—Pitch
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/20—Road profile, i.e. the change in elevation or curvature of a plurality of continuous road segments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/50—Barriers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2300/00—Purposes or special features of road vehicle drive control systems
- B60Y2300/10—Path keeping
- B60Y2300/12—Lane keeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2400/00—Special features of vehicle units
- B60Y2400/30—Sensors
Definitions
- The present invention relates to a method for controlling modules for projecting pixelated light beams for a vehicle. It is particularly applicable to controlling these projection modules so as to provide assistance in driving a vehicle.
- A motor vehicle generally comprises a set of light beam projection modules, typically left and right, limited to the basic lighting and/or signaling functions, smart to a certain extent, for driving at night or with low visibility and/or in adverse weather conditions.
- A light beam projection module may be associated with one or more functions, such as the "high-beam" function for illuminating the road or its edges with high intensity, and/or the "low-beam" function for illuminating the road or its edges at shorter range without dazzling oncoming users.
- The present invention aims to provide better visual comfort to the driver and/or the passengers of a motor vehicle by providing them with new driving assistance functionalities and new user experiences.
- A first aspect of the invention relates to a method for controlling modules for projecting pixelated light beams from a host vehicle, comprising, according to various optional features:
- the data/image acquisition means able to collect the data required to model the profile of the road is a camera, a radar, and/or a lidar;
- when the image/data acquisition means detects an obstacle, it transmits, to the control device, the data relating to the obstacle in order to define a safety margin that prevents patterns from being projected onto the obstacle;
- the distance Dc is parameterizable according to the type of pattern projected;
- the control device controls the projection modules such that the patterns of the right and left projection zones are superposed, in one and the same projection zone;
- the patterns able to be projected into the projection zone may be circles, squares, triangles, rectangles, chevrons, arrows, or more complex shapes, or numbers such as the display of a speedometer, or continuous or broken lines;
- the control device is able to dynamically increase the width Lm of the patterns projected farthest away onto the road in order to correct the effect of perspective;
- the control device is associated with a set of sensors able to determine the pitch of the host vehicle, configured so as to compensate the mechanical and/or digital calibration of the projection modules;
- the control device is able to compensate the light intensity according to the projection distance of the patterns and the "flat-beam" base beam;
- the control device, associated with the data/image acquisition means, is configured so as to determine whether the outline of the host vehicle is able to pass between two obstacles by projecting said outline between the two obstacles;
- the orientation of the projection of the outline of the host vehicle is dynamically related to the angle of the steering system of said host vehicle;
- the control device, associated with the data/image acquisition means, is able to project an obstacle avoidance strategy;
- the control device, associated with the data/image acquisition means, is able to project a set of patterns configured so as to establish a trajectory for the host vehicle when the lane narrows in a works area;
- the control device, associated with the "GPS" navigation system of the vehicle, is able to project a change of trajectory, in the form of arrows on the ground, for the host vehicle;
- Another aspect of the invention relates to a lighting device for a motor vehicle intended to be controlled by a control device able to implement the method for controlling modules for projecting pixelated light beams according to any one of the preceding features;
- a device for fusion of information is able to determine the relevance of each datum from the various sensors associated with the host vehicle, in order to transmit reliable data for aiding in decision making to the control device.
- FIG. 1 illustrates a system according to one embodiment of the invention
- FIG. 2 is a diagram illustrating the steps of a method according to the invention.
- FIG. 3 illustrates the implementation of the method according to the invention in a first driving situation
- FIG. 4 illustrates the implementation of the method according to the invention in a second driving situation
- FIG. 5 illustrates the implementation of the method according to the invention in a third driving situation
- FIG. 6 illustrates the implementation of the method according to the invention in a fourth driving situation.
- FIG. 1 shows a motor vehicle 100 comprising a system 110 comprising a set of sensors 120 , at least one device 130 for controlling modules 140 projecting light beams, said control device 130 being connected to the control unit 150 of the vehicle 100 .
- Said control device 130 comprises at least one microcontroller associated with one or more memories and a graphics processing unit.
- the motor vehicle 100 comprising such a system 110 will be referred to hereinafter as a host vehicle 100 .
- the projection module 140 is a high-resolution module, in other words one having a resolution higher than 1000 pixels. However, no restriction is attached to the technology used to produce the projection modules 140 .
- a projection module 140 may for example comprise a monolithic source.
- a monolithic source is a monolithic matrix array of electroluminescent elements arranged in at least two columns by at least two rows.
- the electroluminescent elements may be grown from a common substrate and may be electrically connected so as to be able to be activated selectively, individually or by subset of electroluminescent elements.
- the substrate may be made predominantly of semiconductor material.
- the substrate may comprise one or more further materials, for example non-semiconductor materials (metals and insulators).
- Each electroluminescent element or group of electroluminescent elements may thus form a luminous pixel and is able to emit light when its or their material is supplied with electricity.
- the configuration of such a monolithic matrix array makes it possible to arrange selectively activatable pixels very close to each other, in comparison with conventional light-emitting diodes that are intended to be soldered onto printed circuit boards.
- the monolithic matrix array may comprise electroluminescent elements a main dimension of elongation of which, specifically the height, is substantially perpendicular to a common substrate, this height being equal to one micrometer.
- the one or more monolithic matrix arrays may be coupled to the control device 130 so as to control the generation and/or the projection of a pixelated light beam by the projection module 140 .
- the control device 130 is thus able to individually control the light emission of each pixel of a matrix array.
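The individual, per-pixel control described above can be pictured as a boolean activation mask over the matrix array. The sketch below is illustrative only, not the patent's implementation; the function names and the NumPy mask representation are assumptions.

```python
import numpy as np

def make_beam_mask(rows, cols, lit_pixels):
    """On/off activation mask for a rows x cols pixelated source:
    True = emitter driven, False = emitter dark."""
    mask = np.zeros((rows, cols), dtype=bool)
    for r, c in lit_pixels:
        mask[r, c] = True
    return mask

def carve_dark_zone(mask, r0, r1, c0, c1):
    """Switch off a rectangular block of pixels, e.g. to keep light
    off a given region of the scene, and return the new mask."""
    out = mask.copy()
    out[r0:r1, c0:c1] = False
    return out
```

Because each pixel is addressed independently, shaping the beam reduces to editing this mask before it is sent to the projection module.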
- the projection module 140 may comprise a light source coupled to a matrix array of mirrors.
- the pixelated light source may be formed by the assembly of at least one light source formed of at least one light-emitting diode emitting light and a matrix array of optoelectronic elements, for example a matrix array of micromirrors, also known by the acronym DMD, for “digital micromirror device”, which directs the light rays originating from the light source by reflection toward an optical projection element.
- an optical collection element may make it possible to collect the rays from at least one light source in order to concentrate them and to direct them toward the surface of the matrix array of micromirrors.
- Each micromirror is able to pivot between two fixed positions, a first position in which the light rays are reflected toward the optical projection element, and a second position in which the light rays are reflected in a direction other than the optical projection element.
- the two fixed positions are oriented in the same way for all of the micromirrors and form, with respect to a support reference plane of the matrix array of micromirrors, an angle characteristic of the matrix array of micromirrors, defined in the specifications thereof. Such an angle is generally less than 20° and may usually have a value of about 12°.
- each micromirror reflecting a portion of the light rays incident on the matrix array of micromirrors forms an elementary emitter of the pixelated light source, the actuation and control of the change in position of the mirrors making it possible to selectively activate this elementary emitter in order to emit or not to emit an elementary light beam.
- the light beam projection module may be formed by a laser scanning system in which a laser source emits a laser beam toward scanning means that are configured so as to scan, with the laser beam, the surface of a wavelength element converter, which surface is imaged by the optical projection element.
- the scanning of the beam may be brought about by the scanning means at a speed high enough that the human eye does not perceive its movement in the projected image.
- the scanning means may be a mobile micromirror for scanning the surface of the wavelength converter element through reflection of the laser beam.
- the micromirrors mentioned as scanning means are, for example, of MEMS (microelectromechanical system) type.
- the invention is not limited to such a scanning means, and may use other types of scanning means, such as a series of mirrors arranged on a rotary element, the rotation of the element causing the transmission surface to be scanned by the laser beam.
- the light source may be a matrix array and comprise at least one segment of light elements, such as light-emitting diodes or a surface portion of a monolithic light source.
- FIG. 2 illustrates the steps of a method implemented by the one or more sensors and by the control device 130 .
- In a step 200, the method begins, for example when the host vehicle is started up or when the high-beam or low-beam function is activated.
- the set of sensors 120 of the host vehicle is able to collect a set of data.
- at least one of the sensors is configured so as to collect the data required to model the profile of the road.
- the data collected and the accuracy of this data depend on the nature of the one or more sensors, whether it is a camera, a radar, or a lidar.
- Known modeling methods may be applied with a view to estimating the profile of the road, according to the images and/or data acquired by a camera, and/or a radar, and/or a lidar.
- To facilitate understanding of the method and system according to the invention, only one camera 121 will be shown, and its operation and its interaction with all of the other elements of the system 110 will be described below. It should however be noted that this camera 121 is shown schematically in FIGS. 3 to 6 as being located at the height of the central rear-view mirror of the vehicle. A totally different location for the camera and/or for the other means for acquiring data and/or images relating to the road 160 stretching in front of the host vehicle may also be envisaged. A different location for these means will, however, require a person skilled in the art to determine the various parameters and constants described below in this new frame of reference.
- the present invention more specifically provides, in a step 202 , for determining a polynomial function modeling the profile of the edge of the road.
- the modeling of the profile of the edge of the road in the form of a polynomial makes it possible to represent the profiles of the edge of the road more or less accurately according to the degree of the polynomial.
- When the camera 121 has acquired an image of the road 160 stretching in front of the host vehicle 100, it transmits, to the control device 130, the x and y coordinates of the profile of the right edge 161, left edge 163 and center 162 of the road 160.
- the control device 130 determines the distance Ai between the virtual projection of the axis Ac of the camera 121 on the plane Pr of the road 160 and respectively the right edge 161 , left edge 163 and the center of the road 160 .
- the parameters of the polynomial may vary dynamically.
- the parameters are for example updated by the camera, the radar, or the lidar, at a given frequency or upon detection of a variation in the profile of the road.
- the invention provides for the use of a third-degree polynomial function, thus providing an optimized trade-off between complexity and accuracy.
- the road profiles in a field of view FOV of a camera 121 are, generally speaking, rarely more complex than a succession of two turns, and the use of polynomial functions of a degree higher than or equal to four would lead to substantial computing times in the data processing unit of the control device.
- the present invention is in no way restricted to the use of a polynomial function for estimating the profile of the road edge. It extends to any other type of function, for example trigonometric, logarithmic, exponential, etc.
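The road-edge modeling of step 202 can be sketched as a least-squares fit of a third-degree polynomial to the edge coordinates transmitted by the camera. This is a sketch only: `numpy.polyfit` stands in for whatever estimator the control device actually uses, and the function names are assumptions.

```python
import numpy as np

def fit_road_edge(xs, ys, degree=3):
    """Fit y = a3*x^3 + a2*x^2 + a1*x + a0 to the (x, y) edge samples;
    degree 3 is the document's suggested complexity/accuracy trade-off
    (road profiles rarely exceed a succession of two turns)."""
    return np.polyfit(xs, ys, degree)

def edge_offset_at(coeffs, x):
    """Lateral position of the modeled road edge at longitudinal distance x."""
    return float(np.polyval(coeffs, x))
```

The same fit can be rerun at the camera's update frequency, or whenever a variation in the road profile is detected, so the polynomial parameters vary dynamically as described above.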
- the control device 130 determines, in a step 203 , a starting point Pd and an end point Pa of a zone ZPd, ZPg, ZP for the projection of patterns 170 .
- the projection distance DP is defined as being the distance between the proximal point Pd and the distal point Pa of the projection zone ZPd, ZPg, ZP.
- Pd is a parameter predefined by default by the maker of the host vehicle 100, but it may also be modified by the driver or the operator using said host vehicle.
- Pa is a parameter that reaches its maximum value when no obstacle is detected by the data/image acquisition means.
- a safety margin MS is then predefined by the control device 130 so as to prevent patterns 170 from being projected onto the obstacle 180 .
- the means for acquiring data and/or images relating to the road 160 stretching in front of the host vehicle 100 are able to determine the type of obstacles 180 . No fewer than six categories of obstacles 180 are referenced. Thus, 0 corresponds to an unclassified object, 1 corresponds to an unknown object of small size, 2 corresponds to an unknown object of large size, 3 corresponds to a pedestrian, 4 corresponds to a bicycle, 5 corresponds to a motor car, and 6 corresponds to a truck.
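The obstacle typing and the safety-margin behaviour can be sketched as follows. The enum mirrors the categories listed above; `clip_projection_end` is an assumed reading of how the distal point Pa interacts with the safety margin MS, and the 2.0 m default margin is invented for illustration.

```python
from enum import IntEnum

class ObstacleType(IntEnum):
    """Obstacle categories as referenced by the acquisition means."""
    UNCLASSIFIED = 0
    UNKNOWN_SMALL = 1
    UNKNOWN_LARGE = 2
    PEDESTRIAN = 3
    BICYCLE = 4
    CAR = 5
    TRUCK = 6

def clip_projection_end(pa_max, obstacle_distance=None, margin_ms=2.0):
    """Pa takes its maximum value when no obstacle is detected; otherwise
    the projection zone stops the safety margin MS short of the obstacle,
    so no pattern is projected onto it."""
    if obstacle_distance is None:
        return pa_max
    return min(pa_max, obstacle_distance - margin_ms)
```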
- the control device 130 determines, in a step 204 , a distance Dc by default between the virtual projection onto the plane Pr of the road of the axis Ac of the camera 121 and the projection zones ZPd, ZPg of patterns 170 , respectively for a right projection zone ZPd and a left projection zone ZPg.
- The virtual projection onto the plane of the road Pr of the virtual axis Ac of the camera 121 appears as an axis of symmetry between the projection zones ZPd and ZPg.
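As a sketch of step 204, the right and left zones can be laid out symmetrically about the virtual projection of the camera axis, offset laterally by the default distance Dc. The dictionary representation and names are assumptions made for illustration.

```python
def projection_zones(pd, pa, dc):
    """Right (ZPd) and left (ZPg) pattern zones: each runs from the
    proximal point Pd to the distal point Pa, offset +Dc / -Dc from the
    virtual projection of the camera axis Ac, which therefore acts as
    their axis of symmetry."""
    zpd = {"lateral_offset": +dc, "start": pd, "end": pa}
    zpg = {"lateral_offset": -dc, "start": pd, "end": pa}
    return zpd, zpg
```

The projection distance DP then falls out as `end - start` of either zone.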
- the list of patterns 170 able to be projected in the projection zone ZPd, ZPg, ZP is not exhaustive, and it may be defined by the maker of the host vehicle 100 and/or updated by the driver or the operator according to their needs.
- a circle, square, triangle, chevron, or a continuous or broken line may be projected.
- the control device 130 determines the width Lm of the pattern 170 . This value is defined by default by the maker of the host vehicle 100 but it is parameterizable by the driver or the operator using said host vehicle 100 .
- the method according to the invention is able to dynamically increase the width of the patterns 170 projected farthest away onto the road in order to correct the effect of perspective.
- the distance Dm between each projected pattern 170 may also be parameterizable by the driver or the operator using said host vehicle 100 , so as to provide better visual comfort.
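One plausible way to realize the dynamic widening and the spacing Dm described above (the document does not fix a formula, so both functions below are assumptions) is to scale the drawn width with distance so that the apparent width stays roughly constant:

```python
def drawn_width(lm_ref, d, d_ref):
    """Width to draw a pattern at distance d so it appears as wide as a
    reference pattern of width lm_ref drawn at distance d_ref; for a
    road-plane pattern seen from the driver's eye, apparent width falls
    off roughly as 1/distance."""
    return lm_ref * d / d_ref

def pattern_positions(pd, pa, dm):
    """Longitudinal positions of patterns spaced Dm apart in [Pd, Pa]."""
    positions, x = [], pd
    while x <= pa:
        positions.append(x)
        x += dm
    return positions
```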
- the control device 130 comprises a step 206 for allowing self-calibration of the projection modules 140 so that the projection of a pattern 170 by the right projection module 141 and the left projection module 143 , respectively, is symmetrical with respect to the virtual axis of camera 121 projected onto the road 160 .
- This step of self-calibration of the projection modules 140 is also able to mechanically and/or digitally configure said projection modules 140 so that the projection of a pattern 171 , 173 by the right module 141 and the left module 143 , respectively, allows the two patterns 170 to be superposed in order to form one single pattern 172 .
- the beam relating to the low-beam function is split with the juxtaposition of a lower portion called the “flat-beam” base beam and an upper portion called the “kink” which is intended to illuminate the road 160 while avoiding dazzling other users.
- the beam relating to the “high-beam” function is split with the superposition of the “flat-beam” base beam and a “head-beam” central portion with a restricted and more intense base.
- the patterns 170 are intended to be projected with a beam from the low beam or the high beam.
- the control device 130, associated with a set of sensors 120 for determining the pitch 122 of the host vehicle 100 and taking into account the altitude and/or roll of the projection module 140, is configured so as to compensate the mechanical and/or digital calibration of the projection modules 140 so that the projection of the patterns 170 remains stable and comfortable for the driver and/or the operator using the host vehicle 100.
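A toy geometric sketch of the pitch compensation, assuming a flat road, small angles, and a known lamp height; the real calibration also involves altitude and roll, which are omitted here, and the sign convention is an assumption.

```python
import math

def aiming_angle(target_distance, lamp_height, pitch=0.0):
    """Downward aiming angle (rad, body frame) that keeps the pattern at
    target_distance on a flat road: the static geometric angle plus a
    correction equal to the body pitch (positive pitch = nose up, so the
    module must aim further down by the same amount)."""
    return math.atan2(lamp_height, target_distance) + pitch
```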
- the control device is able to compensate the light intensity according to the distance of projection of the patterns and the “flat-beam” base beam.
- the control device is able to determine whether the outline of the vehicle is able to pass between two obstacles 180.
- the control device 130, associated with the data/image acquisition means, is able to project an obstacle 180 avoidance strategy (see FIG. 6).
- the control device 130, associated with the data/image acquisition means, is able to project a trajectory for the host vehicle 100 when the lane narrows in a works area.
- the control device 130, associated with the "GPS" navigation system 123 of the vehicle, is able to project a change of trajectory for the host vehicle 100.
- the control device, associated with the means for acquiring data/images and/or for detecting line crossing, is able to project a trajectory assist so that the host vehicle 100 stops crossing the marking lines on the road and follows a stable trajectory.
- the control device, associated with the data/image acquisition means, is able to project virtual markings onto the road when they have disappeared or are not visible.
- a device 180 for fusion of information is able to determine the relevance of each datum from the various sensors associated with the host vehicle 100 , in order to transmit, to the control unit 150 of the vehicle 100 and consequently to the control device 130 , reliable data for aiding in decision making.
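The relevance weighting performed by the fusion device can be sketched as a confidence-weighted average over one scalar quantity (e.g. the distance to an obstacle reported by camera, radar and lidar). The threshold, the (value, relevance) representation and the function name are invented for illustration.

```python
def fuse_measurements(measurements, min_relevance=0.2):
    """Fuse one scalar quantity reported by several sensors as
    (value, relevance) pairs: entries below the relevance threshold are
    discarded, the rest are averaged weighted by relevance."""
    kept = [(v, w) for v, w in measurements if w >= min_relevance]
    if not kept:
        return None  # no reliable datum to pass on for decision making
    total_w = sum(w for _, w in kept)
    return sum(v * w for v, w in kept) / total_w
```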
- the host vehicle 100 is able to be completely self-driving, so as to require no driver in order to follow a predetermined trajectory.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1859087A FR3086901B1 | 2018-10-01 | 2018-10-01 | Method for controlling modules for projecting pixelated light beams for a vehicle |
FR1859087 | 2018-10-01 | ||
PCT/EP2019/076482 WO2020070078A1 | 2018-10-01 | 2019-09-30 | Method for controlling modules for projecting pixelated light beams for a vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
US20220118901A1 (en) | 2022-04-21 |
US11993201B2 (en) | 2024-05-28 |
Family
ID=67441147
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/281,859 Active 2040-12-10 US11993201B2 (en) | 2018-10-01 | 2019-09-30 | Method for controlling modules for projecting pixelated light beams for a vehicle |
Country Status (7)
Country | Link |
---|---|
US (1) | US11993201B2 (en) |
EP (1) | EP3860878A1 (fr) |
JP (1) | JP7515468B2 (ja) |
KR (1) | KR102709526B1 (ko) |
CN (1) | CN112805180B (zh) |
FR (1) | FR3086901B1 (fr) |
WO (1) | WO2020070078A1 (fr) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102020007645A1 (de) * | 2020-04-03 | 2021-10-07 | Daimler Ag | Method for calibrating a lidar sensor |
DE102022107700A1 (de) * | 2022-03-31 | 2023-10-05 | HELLA GmbH & Co. KGaA | Method for operating a light-based driver assistance system of a motor vehicle |
CN117074046B (zh) * | 2023-10-12 | 2024-01-02 | 中汽研汽车检验中心(昆明)有限公司 | Method and apparatus for vehicle laboratory emission testing in a plateau environment |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009143413A (ja) | 2007-12-14 | 2009-07-02 | Toyota Motor Corp | Driving support device |
JP2012247369A (ja) | 2011-05-30 | 2012-12-13 | Honda Motor Co Ltd | Vehicle projection device |
DE102011119923A1 (de) | 2011-11-28 | 2013-05-29 | Son Hao Vu | Lighting system |
US20130173232A1 * (en) | 2010-04-20 | 2013-07-04 | Conti Temic Microelectronic Gmbh | Method for determining the course of the road for a motor vehicle |
DE202013006071U1 (de) | 2013-07-05 | 2013-09-12 | Stephan Kaut | Projected light grids from vehicles |
DE102015201764A1 (de) | 2015-02-02 | 2016-08-04 | Volkswagen Aktiengesellschaft | Method and driver assistance system for generating a light distribution by a vehicle for outputting a driving instruction |
DE102015201766A1 (de) | 2015-02-02 | 2016-08-04 | Volkswagen Aktiengesellschaft | Method for generating a light distribution for outputting a driving instruction for a first vehicle |
DE102016006919A1 (de) | 2016-06-07 | 2017-02-09 | Daimler Ag | Method for operating a vehicle |
EP3147821A1 (fr) | 2015-09-28 | 2017-03-29 | Valeo Vision | Lighting system and method |
CN107161076A (zh) | 2016-03-07 | 2017-09-15 | Toyota Motor Corp | Vehicle lighting system |
CN107369336A (zh) | 2016-05-02 | 2017-11-21 | Ford Global Technologies | Intuitive haptic alerts |
FR3055979A1 (fr) | 2016-09-15 | 2018-03-16 | Valeo Vision | Pixelated light beam characteristics |
US20180086254A1 * (en) | 2016-09-29 | 2018-03-29 | Valeo Vision | Illumination system for an automotive vehicle |
CN107878300A (zh) | 2016-09-29 | 2018-04-06 | Valeo Vision | Method for projecting an image by a projection system of a motor vehicle, and associated projection system |
DE102016223650A1 (de) | 2016-11-29 | 2018-05-30 | Continental Automotive Gmbh | Lighting system for a motor vehicle and associated method |
CN108216242A (zh) | 2016-12-14 | 2018-06-29 | Hyundai Motor Co | Apparatus and method for controlling narrow-road driving of a vehicle |
FR3062217A1 * (fr) | 2017-01-20 | 2018-07-27 | Valeo Vision | Butt-joining of pixelated light sources |
WO2018162219A1 (de) | 2017-03-09 | 2018-09-13 | Bayerische Motoren Werke Aktiengesellschaft | Motor vehicle having a lighting module for generating symbols |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102848969B (zh) * | 2012-09-07 | 2015-09-02 | 上海小糸车灯有限公司 | Rotary electromagnetic actuator and high/low-beam switching device for a vehicle lamp |
DE102014007914A1 (de) * | 2014-05-27 | 2015-12-03 | Elektrobit Automotive Gmbh | Graphical rendering of roads and routes using hardware tessellation |
CN104700071B (zh) * | 2015-01-16 | 2018-04-27 | 北京工业大学 | Method for extracting road contours from panoramic images |
FR3056775B1 (fr) * | 2016-09-29 | 2021-08-20 | Valeo Vision | Method for projecting images by a projection system of a motor vehicle, and associated projection system |
2018
- 2018-10-01 FR FR1859087A patent/FR3086901B1/fr active Active

2019
- 2019-09-30 CN CN201980065120.0A patent/CN112805180B/zh active Active
- 2019-09-30 WO PCT/EP2019/076482 patent/WO2020070078A1/fr unknown
- 2019-09-30 JP JP2021517983A patent/JP7515468B2/ja active Active
- 2019-09-30 EP EP19782969.0A patent/EP3860878A1/fr active Pending
- 2019-09-30 US US17/281,859 patent/US11993201B2/en active Active
- 2019-09-30 KR KR1020217009600A patent/KR102709526B1/ko active IP Right Grant
Non-Patent Citations (3)
Title |
---|
Combined Chinese Office Action and Search Report issued Dec. 12, 2023, in corresponding Chinese Patent Application No. 201980065120.0 (with English Translation of Category of Cited Documents), 10 pages. |
International Search Report dated Nov. 7, 2019 in PCT/EP2019/076482 filed on Sep. 30, 2019, 3 pages. |
Japanese Office Action dated Jul. 28, 2023 in Japanese Application No. 2021-517983 (with English Translation), 8 pages. |
Also Published As
Publication number | Publication date |
---|---|
FR3086901A1 (fr) | 2020-04-10 |
US20220118901A1 (en) | 2022-04-21 |
JP7515468B2 (ja) | 2024-07-12 |
CN112805180B (zh) | 2024-08-06 |
FR3086901B1 (fr) | 2020-11-13 |
JP2022502782A (ja) | 2022-01-11 |
KR20210065116A (ko) | 2021-06-03 |
WO2020070078A1 (fr) | 2020-04-09 |
EP3860878A1 (fr) | 2021-08-11 |
KR102709526B1 (ko) | 2024-09-25 |
CN112805180A (zh) | 2021-05-14 |
Similar Documents
Publication | Title |
---|---|
US12077187B2 (en) | Sensing system and vehicle |
CN108859933B (zh) | Vehicle bulb and vehicle |
CN109997057B (zh) | Lidar system and method |
US10369922B2 (en) | Vehicle headlight device |
US11993201B2 (en) | Method for controlling modules for projecting pixelated light beams for a vehicle |
US11454539B2 (en) | Vehicle lamp |
US20240317132A1 (en) | Automatic Light Alignment |
US11897383B2 (en) | Method for controlling a motor vehicle lighting system |
US20230042933A1 (en) | Method for controlling a motor vehicle lighting system |
US12097799B2 (en) | Vehicle headlight |
US20230311818A1 (en) | Sensing system and vehicle |
US12044373B2 (en) | Lights with microlens arrays |
CN116867676A (zh) | Motor vehicle lighting system provided with a light module capable of emitting a pixelated illumination beam |
JP2024102522A (ja) | Vehicle-mounted detector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
2021-03-26 | AS | Assignment | Owner name: VALEO VISION, FRANCE; ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: EL IDRISSI, HAFID; REEL/FRAME: 066273/0779 |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | PATENTED CASE |