CN117545982A - Vehicle with automatic headlight alignment - Google Patents

Vehicle with automatic headlight alignment

Info

Publication number
CN117545982A
Authority
CN
China
Prior art keywords
headlight
vehicle
sensor
illumination
dimensional
Prior art date
Legal status
Pending
Application number
CN202280044558.2A
Other languages
Chinese (zh)
Inventor
C·P·查尔德
C·马祖尔
K·R·斯蒂尔
M·A·祖尔彻
M·B·曼恩博格
R·J·加罗内
唐效峰
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Priority date
Filing date
Publication date
Priority claimed from U.S. patent application Ser. No. 17/721,146 (US12017577B2)
Application filed by Apple Inc filed Critical Apple Inc
Publication of CN117545982A

Landscapes

  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

A vehicle may have a lamp, such as a headlight. The lamp may be moved using a positioner. Control circuitry in the vehicle may monitor the environment surrounding the vehicle using sensor circuitry. The sensor circuitry may include one or more sensors, such as a lidar sensor, a radar sensor, an image sensor, and/or other sensors, to measure the shape of a surface in front of the vehicle and the position of the surface relative to the vehicle. These sensors and/or other sensors in the sensor circuitry may also measure headlight illumination of the surface. Based on the known shape of the surface in front of the vehicle and the distance between the surface and the vehicle, the control circuitry can predict where on the surface the headlight should be aimed. By comparing the predicted headlight illumination of the surface with the measured headlight illumination of the surface, the vehicle can determine how to move the headlight with the positioner to align the headlight.

Description

Vehicle with automatic headlight alignment
This patent application claims priority from U.S. patent application Ser. No. 17/721,146, filed April 14, 2022, U.S. provisional patent application Ser. No. 63/298,365, filed January 11, 2022, and U.S. provisional patent application Ser. No. 63/216,780, filed June 30, 2021, each of which is hereby incorporated by reference in its entirety.
Technical Field
The present disclosure relates generally to systems such as vehicles, and more particularly to vehicles having lights.
Background
Automobiles and other vehicles have lights such as headlights. To accommodate different driving conditions, headlights are sometimes provided with adjustable settings, such as low-beam and high-beam settings. Some headlights can turn during operation to accommodate curves in the road.
Disclosure of Invention
A vehicle may have a lamp, such as a headlight. Sensor circuitry in the vehicle may be used to measure the shape and position of a surface in front of the vehicle. The sensor circuitry may also be used to measure how the headlight illuminates the surface when light from the headlight is projected onto the surface. For example, the sensor circuitry may measure the position on the surface at which the headlight is aimed and may measure the pattern of light from the headlight on the surface as the headlight illumination is projected onto the surface. Light intensity measurements from an image sensor or other sensor may be used to obtain peak headlight intensity locations, to locate the edges of the illumination pattern, and to determine other illumination characteristics.
Information about the three-dimensional shape of the surface in front of the vehicle can be used to predict where the headlight should be aimed, and thus the pattern of illumination that the headlight should produce on the surface when the headlight is aligned with respect to the vehicle. By comparing predicted headlight illumination intensities on the surface with measured headlight illumination intensities on the surface, the vehicle can determine how to move the headlight with the positioner to align the headlight. If desired, information about the three-dimensional shape of the surface in front of the vehicle can be obtained from a database. For example, a three-dimensional map of the environment may be stored in a navigation database. Information from satellite navigation system sensors and/or other navigation sensors may be used to determine the vehicle's position. The corresponding three-dimensional surface shape information may then be retrieved from the database using the known vehicle position.
Drawings
Fig. 1 is a top view of an exemplary vehicle according to an embodiment.
Fig. 2 is a cross-sectional side view of an exemplary adjustable headlight according to an embodiment.
Fig. 3 is a perspective view of an exemplary scene with a target being illuminated by a headlight according to an embodiment.
Fig. 4 is a graph showing how headlight performance may be monitored by measuring headlight illumination intensity as a function of position on an illuminated surface, according to an embodiment.
Fig. 5 is a cross-sectional side view of an exemplary headlight having a plurality of independently adjustable elements, according to an embodiment.
Fig. 6 is a diagram showing how illumination from the headlight of fig. 5 may be measured, according to an embodiment.
Fig. 7 is a cross-sectional side view of an exemplary vehicle having a headlight and sensor circuit according to an embodiment.
Fig. 8 is a flowchart of exemplary operations involving the use of a vehicle with a headlight, according to an embodiment.
Detailed Description
Systems such as vehicles or other systems may have components that emit light, such as headlights and other lights. Headlights may be used to illuminate roads and other objects in the vicinity of the vehicle. The illumination provided by the headlights allows vehicle occupants to see objects and facilitates the operation of sensors during the night or under other dim ambient lighting conditions. For example, headlight illumination at visible and/or infrared wavelengths may be used to provide illumination for image sensors used by an autonomous driving system or driver assistance system.
The illumination emitted by the headlights in the vehicle may be adjustable. For example, the headlights may have adjustable components that allow the headlights to operate in high-beam and low-beam modes and to turn left and right (e.g., to accommodate curves in the road). If desired, the headlights can also be adjusted to calibrate them. In this way, unintentional misalignment of the headlights over time can be corrected.
To help ensure that the headlights are properly aligned and thus emit light beams in the desired directions, vehicle sensors, such as three-dimensional sensors, may collect information about objects within range of the headlights. For example, lidar sensors may be used to map the three-dimensional shapes of the road and of objects on the road in front of the vehicle. An image sensor in the vehicle may measure the pattern of headlight illumination falling on the road and objects. Measurement of the headlight illumination reveals the direction in which the headlight is pointing. By comparing the expected illumination (e.g., the expected headlight illumination direction) with the measured illumination (e.g., the measured headlight illumination direction), a change in headlight performance may be detected and corrective action taken. As an example, if it is determined that the headlight is pointing 5° too high, a positioner coupled to the headlight may be directed to automatically tilt the headlight 5° downward to compensate for this measured misalignment. In this way, the headlights may be continually adjusted during use of the vehicle to ensure that the headlights operate as desired. The headlights may also be adjusted based on measured and predicted changes in vehicle orientation relative to the road and other measured and predicted conditions.
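To illustrate the compare-and-correct step just described, the following Python sketch computes a corrective positioner move from an expected and a measured beam direction. It is a minimal sketch: the BeamDirection type, the sign conventions, and the correction function are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class BeamDirection:
    pitch_deg: float  # positive means aimed above nominal
    yaw_deg: float    # positive means aimed right of nominal

def correction(expected: BeamDirection, measured: BeamDirection) -> BeamDirection:
    """Positioner move that cancels the measured aim error."""
    return BeamDirection(expected.pitch_deg - measured.pitch_deg,
                         expected.yaw_deg - measured.yaw_deg)

# Example from the passage: headlight found pointing 5 degrees too high,
# so the positioner should tilt it 5 degrees downward.
print(correction(BeamDirection(0.0, 0.0), BeamDirection(5.0, 0.0)))
# -> BeamDirection(pitch_deg=-5.0, yaw_deg=0.0)
```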
FIG. 1 is a top view of a portion of an exemplary vehicle. In the example of fig. 1, the vehicle 10 is of a type that may carry passengers (e.g., an automobile, truck, or other automotive vehicle). Configurations in which vehicle 10 is a robot (e.g., an autonomous robot) or other vehicle that does not carry human passengers may also be used. Vehicles such as automobiles may sometimes be described herein as examples. As shown in fig. 1, the vehicle 10 may be operated on a road such as road 14. An object such as object 26 may be located on road 14 or on or near other structures in the vicinity of vehicle 10.
The vehicle 10 may be driven manually (e.g., by a human driver), may be operated via remote control, and/or may be operated autonomously (e.g., by an autonomous driving system or other autonomous propulsion system). Autonomous driving systems and/or driver assistance systems in the vehicle 10 may perform autobraking, steering, and/or other operations using vehicle sensors such as lidar, radar, visible and/or infrared cameras (e.g., two-and/or three-dimensional cameras), proximity (distance) sensors, and/or other sensors to help avoid pedestrians, inanimate objects, and/or other external structures, such as the example obstacle 26 on the road 14.
The vehicle 10 may include a body, such as body 12. The body 12 may include vehicle structures, such as body panels formed of metal and/or other materials, and may include doors, a hood, a trunk, fenders, a chassis on which wheels are mounted, a roof, and the like. Windows may be formed in the body 12 (e.g., in doors 18, on the sides of the body 12, on the roof of the vehicle 10, and/or in other portions of the vehicle 10). The windows, doors 18, and other portions of the body 12 may isolate the interior of the vehicle 10 from the external environment surrounding the vehicle 10. The doors 18 may be opened and closed to allow people to enter and exit the vehicle 10. Seats and other structures may be formed in the interior of the body 12.
The vehicle 10 may have automotive lighting, such as one or more headlights (sometimes referred to as headlamps), driving lights, fog lights, daytime running lights, turn signals, brake lights, and/or other lights. As shown in fig. 1, for example, the vehicle 10 may have a light such as light 16. In general, lights 16 may be mounted on the front F of the vehicle 10, on the rear R of the vehicle 10, on the left and/or right sides W of the vehicle 10, and/or on other portions of the body 12. In an exemplary configuration, which may sometimes be described herein as an example, the light 16 is a headlight mounted to the front F of the body 12. As an example, there may be left and right headlights 16 located respectively on the left and right of the vehicle 10 to provide illumination 20 in the forward direction (e.g., in the +X direction along which the vehicle 10 moves when traveling forward in the example of fig. 1). By illuminating exterior surfaces 28 in front of the vehicle 10, such as surfaces of the roadway 14 and the object 26, with the headlights 16, the occupants of the vehicle 10 can see the surfaces 28 even in dim ambient lighting conditions (e.g., at night). Providing illumination on the surfaces 28 may also support the operation of sensors in the vehicle 10, such as image sensors and other sensors that use light.
The vehicle 10 may have a component 24. The components 24 may include propulsion and steering systems (e.g., a manually adjustable steering system and/or an autonomous steering system having wheels coupled to the body 12, steering controls, one or more motors for driving the wheels, etc.) and other vehicle systems. The component 24 may include control circuitry and/or input-output devices. The control circuitry in component 24 may be configured to run autonomous driving applications, navigation applications (e.g., applications for displaying maps on a display), and software for controlling vehicle temperature control devices, lighting, media playback, window lift, door operation, sensor operation, and/or other vehicle operations. For example, the control system may form part of an autonomous driving system that uses data, such as sensor data, to automatically drive the vehicle 10 on a roadway, such as roadway 14. The control circuitry may include processing circuitry and memory, and may be configured to perform operations in the vehicle 10 using hardware (e.g., dedicated hardware or circuitry), firmware, and/or software. Software code and other data for performing operations in the vehicle 10 are stored on a non-transitory computer readable storage medium (e.g., a tangible computer readable storage medium) in the control circuit. Software code may sometimes be referred to as software, data, program instructions, computer instructions, or code. The non-transitory computer-readable storage medium may include non-volatile memory, such as non-volatile random access memory, one or more hard disk drives (e.g., magnetic disk drives or solid state drives), one or more removable flash drives or other removable media, or other memory. Software stored on the non-transitory computer readable storage medium may be executed on the processing circuitry of component 24. The processing circuitry may include an application specific integrated circuit with processing circuitry, one or more microprocessors, a Central Processing Unit (CPU), or other processing circuitry.
Input-output devices of component 24 may include displays, sensors, buttons, light emitting diodes and other light emitting devices, haptic devices, speakers, and/or other devices for collecting environmental measurements, information regarding vehicle operation, and/or user inputs, as well as for providing outputs. The sensors in component 24 may include ambient light sensors, touch sensors, force sensors, proximity sensors, optical sensors such as cameras operating at visible, infrared, and/or ultraviolet wavelengths (e.g., fisheye cameras, two-dimensional cameras, three-dimensional cameras, and/or other cameras), capacitive sensors, resistive sensors, ultrasonic sensors (e.g., ultrasonic distance sensors), microphones, radio frequency sensors such as radar sensors, lidar (light detection and ranging) sensors, door opening/closing sensors, seat pressure sensors and other vehicle occupant sensors, window sensors, position sensors for monitoring position, orientation, and movement, speedometer, satellite positioning system sensors, and/or other sensors. Output devices in component 24 may be used to provide haptic output, audio output, visual output (e.g., display content, light, etc.), and/or other suitable output to vehicle occupants and others.
The three-dimensional sensors in the components 24 may be formed from pairs of two-dimensional image sensors (e.g., binocular camera pairs forming three-dimensional cameras) that together function as a stereoscopic depth sensor. An image sensor system that emits structured light (e.g., arrays of dots, lines, grids, and/or other structured light patterns at infrared and/or visible wavelengths) and captures images (e.g., two-dimensional images) for analysis may also be used to form a three-dimensional sensor. The captured images show how the structured light pattern is distorted by the three-dimensional surface illuminated by the structured light pattern. By analyzing the distortion of the structured light, the three-dimensional shape of the surface can be reconstructed. If desired, the three-dimensional sensors of the vehicle 10 may include one or more time-of-flight sensors. For example, time-of-flight measurements may be made using light (e.g., lidar sensor measurements) and radio frequency signals (e.g., three-dimensional radar).
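As a numeric illustration of the time-of-flight principle mentioned above, the short sketch below converts a measured round-trip time into a range. It assumes an idealized sensor and is illustrative only.

```python
C_M_PER_S = 299_792_458.0  # speed of light

def tof_distance_m(round_trip_s: float) -> float:
    """Range to a surface from a measured round-trip time of flight."""
    return C_M_PER_S * round_trip_s / 2.0

# A round trip of roughly 66.7 ns corresponds to a surface about 10 m ahead.
print(f"{tof_distance_m(66.7e-9):.2f} m")  # ~10.00 m
```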
During operation, the control circuitry of component 24 may collect information from sensors and/or other input-output devices, such as lidar data, camera data (e.g., two-dimensional images), radar data, and/or other sensor data. For example, three-dimensional image data may be captured using a three-dimensional image sensor. Two-dimensional images (e.g., headlight illumination images on one or more exterior surfaces, such as exterior surface 28 of fig. 1) may also be collected.
A vehicle occupant or other user of the vehicle 10 may provide user input to the control circuitry of the vehicle 10. Cameras, touch sensors, physical controls, and other input devices may be used to collect user input. The remote data source may provide database information to the control circuitry of the component 24 by wireless communication with the vehicle 10. Display screens, speakers, and other output devices may be used to provide content to the user, such as interactive on-screen menu options and audio. A user may interact with this interactive content by providing touch input to a touch sensor in the display and/or by providing user input with other input devices. If desired, the control circuitry of the vehicle 10 may use sensor data, user input, information from a remote database, and/or other information (e.g., information regarding nearby obstacles in the road and/or other environment surrounding the vehicle 10) to provide driver assistance information to the driver and/or to autonomously drive the vehicle 10.
The component 24 may include a forward sensor circuit, as shown by the forward sensor 24F of fig. 1. The forward sensor circuit may include one or more sensors facing a surface in front of the vehicle 10 (e.g., one or more sensors oriented in the +x direction of fig. 1 to detect structures in front of the vehicle 10, such as the obstacle 26 and the surface 28 of the roadway 14). The sensors 24F and/or other sensors in the vehicle 10 may include lidar, radar, visible and/or infrared cameras, and/or other sensors. For example, sensor 24F may include a two-dimensional image sensor and/or a three-dimensional image sensor that operate using structured light, binocular vision, time of flight (e.g., lidar or radar), and/or other three-dimensional imaging arrangements. Sensor 24F may include a three-dimensional sensor that measures the three-dimensional shape of surface 28 and optionally the headlight illumination pattern of headlight 16 on surface 28. If desired, a two-dimensional image sensor may be used to measure the headlight illumination pattern on the surface 28 (e.g., the forward sensor circuitry of the vehicle 10 may use a three-dimensional sensor and a two-dimensional sensor to measure the surface shape and headlight illumination intensity, respectively, or both sensors may be used to gather information about the surface shape and/or the surface illumination).
To ensure that the surface 28 is illuminated well enough to be visible to a user in the vehicle 10 and to visible-light image sensors among the sensors 24F, the headlight 16 may produce visible illumination. To help ensure that an optional infrared image sensor in the forward sensors 24F receives sufficient reflected infrared light from illuminated structures in front of the vehicle 10, the headlight 16 may, if desired, also produce infrared illumination. The forward sensor circuitry of the vehicle 10 used to measure headlight illumination may be sensitive to visible light and, if desired, to infrared light.
To correct for misalignment of the headlights 16 over time (e.g., due to drift in the mounting structures of the headlights 16, changes in vehicle suspension components, etc.), the control circuitry of the vehicle 10 may dynamically control the positioners in the headlights 16 based on sensor measurements (e.g., based on differences between the expected pattern of headlight illumination and the measured illumination pattern). For example, if a headlight 16 is pointed too high, the positioner may be used to tilt the headlight 16 downward so that the headlight 16 is aimed correctly. In this manner, the headlights 16 may automatically compensate for misalignment and may remain aligned during operation of the vehicle 10.
Fig. 2 is a cross-sectional side view of an exemplary headlight, showing how the headlight may be mounted to the vehicle body 12. The body 12 may have a cavity that receives the headlight 16, the headlight 16 may be attached to an exterior surface of the body 12, and/or the headlight 16 may be otherwise supported by the body 12. As shown in fig. 2, the headlight 16 may include a headlight housing 30 and one or more lenses or other optical components, such as headlight lens 32. The housing 30 may include support structures and shell structures for supporting the components of the headlight 16. These structures may facilitate mounting the headlight 16 to the vehicle body 12. The housing 30 may include polymers, metals, carbon-fiber and other fiber composites, glass, ceramics, other materials, and/or combinations of these materials. The lens 32 may include polymer, glass, transparent ceramic, and/or other materials that are transparent to visible light and infrared light (e.g., near-infrared light). The headlight 16 includes a light source, such as light source 40, that emits light 20. Light 20 may include visible light (e.g., light of 400 nm to 750 nm) and, if desired, infrared light (e.g., near-infrared light at one or more wavelengths from 800 nm to 2500 nm or other suitable infrared light). The lens 32 may be formed from one or more lens elements and may be used to help collimate the light 20 and direct the light 20 from the headlight 16 in a desired direction (e.g., to produce an illumination beam in the +X direction).
The light source 40 may include one or more light emitting devices such as light emitting diodes, lasers, lamps, or other components that emit light. Optical elements such as reflectors, lenses, diffusers, tinting elements, filters, adjustable shutters for adjusting the output of the headlight 16 between low and high beam illumination patterns, and/or other optical components may be included in the headlight 16 (e.g., such optical elements may be included in the housing 30). The individually adjustable light emitting diodes and electrically adjustable components, such as adjustable shutters and/or other adjustable optical components associated with the headlights 16, may be adjusted by the control circuitry of the vehicle 10 to adjust the direction of the light 20 and the shape of the area covered by the light 20 (e.g., adjust the light 20 to produce a desired low or high beam illumination pattern and/or other illumination pattern, steer the light 20, etc.).
A positioner, such as positioner 44, may be used to adjust the position, and thus the angular orientation, of the headlight 16 relative to the vehicle body 12. The positioner 44 may include one or more electrically adjustable actuators, such as actuators 42, and may include optional manually adjustable positioning components (e.g., threaded members that may be rotated with a manual or motorized screwdriver to adjust the position of the headlight 16). The actuators 42 may include one or more motors, solenoids, and/or other actuators. In response to commands from the control circuitry of the vehicle 10, the positioner formed by the actuators 42 may be used to translate the headlight 16 along the X, Y, and/or Z axes and/or other axes, and/or may be used to rotate the headlight 16 about the X, Y, and/or Z axes and/or other axes. As one example, the actuators 42 may tilt the headlight 16 upward and downward relative to structures in front of the vehicle 10 by rotating the headlight 16 about the Y-axis of fig. 2, and may rotate the headlight 16 leftward and rightward about the Z-axis of fig. 2. If desired, the positioner of the headlight 16 can be used to make other types of positional adjustments (e.g., rotation about the X-axis, translation and/or rotation relative to another axis, etc.). Arrangements in which a positioner such as positioner 44 of fig. 2, formed from one or more actuators 42 in the vehicle 10, tilts the headlights 16 up and down and rotates the headlights 16 right and left are described herein as an example.
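The effect of these two rotations on the beam direction can be sketched with ordinary rotation matrices. The axis sign conventions below are assumptions made for illustration; they are not specified by the text.

```python
import numpy as np

def rot_y(deg: float) -> np.ndarray:
    """Right-handed rotation about the Y-axis (pitch)."""
    a = np.radians(deg)
    return np.array([[np.cos(a), 0.0, np.sin(a)],
                     [0.0, 1.0, 0.0],
                     [-np.sin(a), 0.0, np.cos(a)]])

def rot_z(deg: float) -> np.ndarray:
    """Right-handed rotation about the Z-axis (yaw)."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a), np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

beam = np.array([1.0, 0.0, 0.0])    # nominal beam along +X (forward)
# With these conventions, +2 deg about Y tips a +X beam toward -Z (down),
# and +3 deg about Z swings it toward +Y.
aimed = rot_z(3.0) @ rot_y(2.0) @ beam
print(np.round(aimed, 4))           # approx [0.998  0.0523 -0.0349]
```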
During operation, the vehicle 10 may adjust the headlights 16 to accommodate different driving conditions. One or more adjustable shutters, adjustable light sources, and/or other adjustable components in the headlights 16 may be controlled by the control circuitry of the vehicle 10. If desired, high beams or low beams may be selected based on user input and/or based on oncoming vehicles detected using one or more sensors. As another example, when it is determined (from a steering system sensor, a position sensor, a lidar sensor, etc.) that the road on which the vehicle 10 is traveling begins to curve to the left, the headlights 16 may be automatically turned to the left using the positioners to help ensure that the light 20 illuminates the road as desired. The headlights 16 may also be turned on and off and/or otherwise adjusted based on measured ambient lighting conditions, weather, and other factors.
The positions of the headlights 16 may also be adjusted for calibration purposes. For example, to address the risk that the headlights 16 may become misaligned over time, the vehicle 10 may monitor the alignment of the headlights 16. As an example, the vehicle 10 may use forward sensor circuitry to map the structures in front of the vehicle 10 and to measure the illumination patterns on those structures. Based on these measurements, the control circuitry of the vehicle 10 may determine which corrective action (if any) to take. For example, the vehicle 10 may determine how the positioner 44 should reposition the headlights 16 to correct for a detected change in headlight alignment.
To map the structures in front of the vehicle 10, the vehicle 10 may use a three-dimensional sensor to collect three-dimensional images of the structures. The three-dimensional sensor may be a lidar sensor, a radar sensor, a stereo camera, a structured light sensor, or another three-dimensional image sensor that can collect three-dimensional images. As an example, consider the scenario of fig. 3. In the example of fig. 3, the vehicle 10 is traveling on road 14 (e.g., a public road, roadway, etc.). The three-dimensional sensor among the forward sensors 24F faces forward in the +X direction. The surface 28 is associated with the portion of the roadway 14 in front of the vehicle 10 and with the object 26, and is in the field of view of the three-dimensional sensor. Thus, the three-dimensional sensor may capture a three-dimensional image of the surface 28 to determine the shape of the road 14 (e.g., its position in three dimensions) and the shape of the object 26 (e.g., its position in three dimensions). The captured shape information includes information about the distance between the vehicle 10 and the surface 28. Items such as the road 14 and the object 26 may receive illumination from the headlights 16 and therefore may sometimes be referred to as target objects or targets.
The object 26 associated with the surface 28 may be a test object having a set of predetermined registration marks 50 (sometimes referred to as fiducials, optical targets, or alignment marks), or may be any other object (e.g., an everyday object such as a wall, garage door, vehicle, or other structure). As an example, the object 26 may be an external object that contains detectable surface markings 54 (e.g., visually distinct marks or other features that allow the three-dimensional sensor to sense the shape and appearance of the surface 28). The presence of the marks 50 and/or other markings 54 may help the vehicle 10 accurately measure the location of the surface 28. For example, the alignment marks 50 may be separated from each other by known distances, so analysis of an image containing the marks 50 may help determine the distance of the object 26 from the vehicle 10 and may help determine the angular orientation of the object 26 relative to the vehicle 10. In a three-dimensional sensor based on stereoscopic image sensors, the presence of the marks 50 and/or markings 54 may facilitate the construction of a three-dimensional image from a pair of two-dimensional stereoscopic images. If desired, sensor data from multiple sources in the forward sensor circuitry of the vehicle 10 may be combined to further enhance the three-dimensional surface shape measurement. As an example, three-dimensional image data from a lidar sensor may be combined with three-dimensional data from a stereo camera, three-dimensional radar data, and data from a two-dimensional sensor.
Based on the three-dimensional image of the surface 28 captured using the three-dimensional image sensor, the vehicle 10 may determine an expected projection of the headlight beam (illumination 20) from the headlight 16 to the surface 28. A two-dimensional image sensor or other sensor in sensor 24F may measure the actual pattern of illumination 20 projected onto surface 28 so that the actual and expected projected patterns may be compared to identify differences.
As an example, consider the case in which the object 26 is a flat surface 10 meters in front of the vehicle 10 and oriented perpendicular to the vehicle 10. Using the three-dimensional image of the surface 28, the vehicle 10 may determine the position and orientation of the object 26 (e.g., 10 m in front of the vehicle 10) and may determine the inclination and/or other characteristics of the road 14. As an example, the three-dimensional image of the road 14 may show that the road 14 is flat and level. Based on the known shape of the surface 28 (e.g., the known location of the surface of the object 26 relative to the vehicle 10 and the road 14), the vehicle 10 (e.g., the control circuitry of the components 24) may determine the location of the headlights 16 relative to the surface 28 and thereby predict the locations of the left and right headlight illumination center points 52 that will be produced on the surface 28 of the object 26 by the left and right headlights 16, respectively, of the vehicle 10. If desired, headlight operation may be characterized by making other headlight illumination intensity measurements (e.g., measurements that identify the edges of the headlight beam, or other headlight illumination measurements that determine the headlight illumination direction).
The headlight 16 may deviate from perfect alignment due to vibration, normal aging of the mounting components of the headlight 16, and/or other changes in the vehicle 10 over time. As an example, without intervention, the aim of the left and right headlights of the vehicle 10 may slowly drift higher than nominal. Knowing the distance of the object 26 from the headlights 16 and the nominal (correct) orientation of the headlights 16, the vehicle 10 can predict the correct position of the headlight aiming point 52. By capturing an image of the projected output of the headlight 16, the actual orientation of the headlight 16 (e.g., the actual direction in which the headlight 16 is pointing) may be measured and compared with the expected orientation of the headlight 16 when fully aligned (e.g., the expected direction in which the headlight 16 should be pointing). For example, an image sensor in the vehicle 10 may capture an image of the surface 28 while the surface 28 is under illumination from the headlights 16. The pattern of light 20 projected onto the surface 28 (e.g., the object 26 and the road 14) may show that the aiming point of the headlight 16 on the surface of the object 26 is 10 cm higher than expected (e.g., in this example, point 52 may be 10 cm too high). Because the shape of the surface 28 is known and the distance from the headlight 16 to the surface of the object 26 is known, the vehicle 10 can convert the 10 cm vertical offset of the measured point 52 into an angular error and determine that the headlight 16 is pointed approximately 0.6° too high (as an example). Based on this determination, the positioner 44 may be directed to tilt the headlight 16 downward by 0.6° to compensate for the measured angular misalignment. This aligns the headlight 16 so that it points at the intended location and so that the point 52 on the object 26 coincides with its intended location. In this way, the overall illumination pattern produced when the light 20 illuminates the surface 28 will be as desired.
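The offset-to-angle conversion in this example is simple trigonometry, sketched below. The function name and interface are illustrative assumptions, not part of the patent.

```python
import math

def misalignment_deg(spot_offset_m: float, target_range_m: float) -> float:
    """Angular aim error implied by a beam-spot offset at a known range."""
    return math.degrees(math.atan2(spot_offset_m, target_range_m))

# A spot 10 cm too high on a target 10 m away implies roughly 0.6 degrees
# of up-tilt; the positioner would tilt the headlight down by this amount.
print(f"{misalignment_deg(0.10, 10.0):.2f} deg")  # 0.57 deg
```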
In monitoring headlight performance, the vehicle 10 may measure the peak intensity of the headlight illumination 20, may measure the edges of the illumination 20 (e.g., the boundaries of the illumination pattern), and/or may measure other headlight performance parameters to characterize the output of the headlight 16. The measured headlight performance parameter or parameters may then be compared with the corresponding predicted headlight performance parameters.
As an example, consider the headlight output shown in the graph of fig. 4. In the example of fig. 4, the headlight output intensity I has been plotted as a function of distance (e.g., distance across the surface 28 parallel to the X-axis or Y-axis of fig. 3). The solid curve 60 corresponds to the expected output of the headlight 16 when the headlight 16 is properly aligned (e.g., based on predictions made using the measured shape of the surface 28 and the known nominal operating characteristics of the headlight 16 when aligned). The dashed curve 62 corresponds to the measured output of the headlight 16 (e.g., output measured by capturing an image of the surface 28 while the surface is illuminated by the light 20). To determine the degree to which measured and expected performance differ, the vehicle 10 may determine the location of the intensity peak of each curve, may determine the locations of the edges of each curve, and/or may otherwise measure the intensity and location of the light output from the headlight 16.
As shown in fig. 4, for example, the expected intensity curve 60 has an expected intensity peak 64, while the measured curve 62 has a measured intensity peak 66 that is offset from the peak 64 by a distance DP. The vehicle 10 may compare points 64 and 66 to determine the value of DP, and/or the vehicle 10 may collect information about the expected and measured headlight intensity patterns by comparing edge positions (see, e.g., points 68, which correspond to the locations of the edges of the headlight illumination pattern where the expected intensity 60 has fallen to the intensity threshold ITH, and points 70, which correspond to the measured locations of those edges where the measured intensity 62 has fallen to the intensity threshold ITH). Using the illumination pattern edges, peaks, and/or other illumination pattern characteristics, predicted and measured headlight information (e.g., curves 60 and 62) may be compared by the vehicle 10 to determine the amount by which the positioner 44 should be adjusted to align the headlight 16. The headlights 16 may be aligned together (e.g., measurements may be made while both the left and right headlights are illuminated) or may be aligned separately (e.g., by making a first measurement while the left headlight, but not the right headlight, is illuminated, and by making a second measurement while the right headlight, but not the left headlight, is illuminated).
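One possible way to extract the peak offset DP and the edge locations from sampled intensity profiles is sketched below. The synthetic profiles and the threshold value stand in for curves 60 and 62 and for ITH; all data here are made up for illustration.

```python
import numpy as np

def peak_and_edges(x, intensity, ith):
    """Return (peak position, (left edge, right edge)) for one profile."""
    peak_x = x[np.argmax(intensity)]
    above = np.where(intensity >= ith)[0]   # samples above the edge threshold
    return peak_x, (x[above[0]], x[above[-1]])

x = np.linspace(-2.0, 2.0, 401)             # position across surface 28, m
expected = np.exp(-(x + 0.0) ** 2)          # stand-in for aligned curve 60
measured = 0.95 * np.exp(-(x - 0.25) ** 2)  # stand-in for shifted curve 62
ith = 0.5                                   # stand-in for threshold ITH

pe, (el, er) = peak_and_edges(x, expected, ith)
pm, (ml, mr) = peak_and_edges(x, measured, ith)
print(f"peak offset DP = {pm - pe:.2f} m; "
      f"edge shifts: {ml - el:+.2f} m, {mr - er:+.2f} m")
```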
If desired, the headlight 16 may include multiple individually adjustable headlight elements. As shown in fig. 5, for example, the headlight 16 may have multiple headlight elements 72, each of which is individually adjustable. The elements 72 may have individually adjustable light sources (e.g., each element 72 may correspond to an individual light-emitting diode), and/or the elements 72 may have individually adjustable shutters or other light-adjusting devices. To improve the accuracy of the headlight output characterization measurements, one or more of the elements 72 may be used to generate illumination while the remaining elements 72 generate no illumination. By cycling through each element 72 (or group of elements), a different corresponding output intensity measurement may be obtained for each element 72 (or group of elements). As an example, consider the case in which there are three separate light-emitting diodes in the headlight 16 (e.g., the elements 72 correspond to separately adjustable light sources). To determine whether the headlight 16 needs to be realigned, each of the three light-emitting diodes may be turned on in sequence while a corresponding image of the surface 28 under the resulting illumination is captured. In this way, more detailed headlight illumination measurements can be made than if all elements 72 were turned on simultaneously.
Fig. 6 shows how this type of approach may produce multiple partially activated headlight output curves, each corresponding to the activation of a separate corresponding element 72. For each element 72, the vehicle 10 may generate a corresponding expected output curve 74, and a corresponding actual output intensity (curve 76) may be measured. By collecting headlight performance data with higher-granularity measurements such as these, headlight performance can be measured more accurately than when all elements 72 are activated together. After characterizing each individual element 72 (e.g., by measuring how far each expected curve 74 is offset relative to the corresponding measured curve 76), any misalignment of the headlight may be accurately determined. The positioner 44 may then be used to move the headlight 16 (e.g., to adjust the angular orientation of the headlight 16), and/or the relative intensity of each element 72 may be adjusted, to align the headlight 16 and help ensure that the headlight 16 provides illumination in the desired pattern.
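One way to organize such element-by-element measurements is sketched below. The HeadlightElement class, the measurement callback, and the sample values are hypothetical stand-ins for the hardware interfaces the text implies.

```python
from dataclasses import dataclass

@dataclass
class HeadlightElement:
    name: str
    lit: bool = False

def characterize(elements, measure_peak, expected_peak):
    """Cycle elements on one at a time; return per-element aim offsets (m)."""
    offsets = {}
    for e in elements:
        for other in elements:
            other.lit = other is e      # exactly one element produces light
        offsets[e.name] = measure_peak(e) - expected_peak[e.name]
    return offsets

# Toy stand-ins: pretend each element's measured beam peak lands about 0.1 m
# higher on surface 28 than predicted, as a uniform misalignment would cause.
elems = [HeadlightElement(f"led{i}") for i in range(3)]
expected = {e.name: 0.5 * i for i, e in enumerate(elems)}
print(characterize(elems, lambda e: expected[e.name] + 0.1, expected))
```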
The vehicle 10 may measure the surface 28 and the headlight illumination projected on the surface 28 when parked adjacent to a calibration target (e.g., a screen or other object having registration markers 50), when parked adjacent to a wall, garage door, or other structure, or during normal operation while traveling on a roadway (e.g., when the vehicle 10 is traveling autonomously or manually through traffic).
Depending on operating conditions, the vehicle 10 may tilt or otherwise change its orientation relative to the road 14. As shown in fig. 7, for example, the vehicle 10 may lean forward while decelerating. Such tilting may be detected by the three-dimensional sensor among the forward sensors 24F and, if desired, by a sensor such as sensor 24T (e.g., a suspension displacement sensor that senses the extent to which the wheels 78 protrude from the vehicle body 12, to determine the orientation of the vehicle body 12 relative to the road 14). By measuring the orientation of the vehicle 10 relative to the road 14, the expected location of the headlight illumination on the surface 28 can be determined. For example, if it is determined that the vehicle 10 is leaning downward, the expected location of point 52 of fig. 3 will be lower than if it were determined that the vehicle 10 is leaning upward. Thus, sensor information such as vehicle suspension sensor information and/or other tilt sensor information may be taken into account when predicting the position of the headlight output on the target.
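A sketch of how body pitch shifts the predicted spot location follows. The mounting height, ranges, and flat vertical target are assumptions chosen for illustration.

```python
import math

def expected_spot_height_m(mount_height_m: float, beam_pitch_deg: float,
                           body_pitch_deg: float, target_range_m: float) -> float:
    """Predicted beam-spot height on a vertical target at the given range."""
    pitch = math.radians(beam_pitch_deg + body_pitch_deg)
    return mount_height_m + target_range_m * math.tan(pitch)

# Braking pitches the body 1 degree nose-down, lowering the expected spot:
print(f"{expected_spot_height_m(0.7, 0.0, 0.0, 10.0):.2f} m")   # 0.70 m, level
print(f"{expected_spot_height_m(0.7, 0.0, -1.0, 10.0):.2f} m")  # ~0.53 m, nose-down
```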
If desired, the positioner 44 may be controlled, one or more light sources and/or light-modulating components may be controlled (see, e.g., elements 72 of fig. 5), and/or other adjustable components associated with the headlights 16 may be controlled to adjust the illumination 20 (e.g., while the vehicle 10 is driven). These adjustments may be made based on sensor measurements that indicate vehicle tilt, road characteristics such as the presence or predicted presence of a speed bump in the road 14 (see, e.g., bump 14B), weather (e.g., whether rain or other precipitation is present), ambient lighting conditions, predicted or detected turns in the road 14, geographic vehicle location, and/or other conditions of the vehicle 10 while parked, traveling, etc. If desired, the vehicle 10 may have a sensor such as sensor 24I. The sensor 24I may be an inertial measurement unit containing, for example, a compass, accelerometer, and/or gyroscope, and may be used to measure the orientation of the body 12, the forward sensors 24F, and/or the headlights 16 relative to gravity.
As an example, consider a scenario in which the control circuitry of the vehicle 10 uses sensors or other data sources to determine that the vehicle 10 is beginning to turn left along the road 14. The vehicle 10 may obtain information about a left-hand curve in the road 14 from a map database or other external database, from lidar measurements or other forward-facing sensor measurements, from an inertial measurement unit, from steering system components (e.g., a steering position sensor), and/or from other sources. In response to detecting a current or impending left turn, the vehicle 10 may use the positioner 44 to turn the headlight 16 to the left. This helps ensure that the illumination 20 remains on the road 14. As another example, if an upcoming bump such as bump 14B is detected, the vehicle 10 may automatically adjust the position of the headlights 16 as the vehicle 10 passes over bump 14B, to help maintain the desired direction of the headlight illumination 20 (e.g., to help ensure that the headlight illumination 20 remains straight ahead even when the vehicle 10 tilts as the wheels 78 move over bump 14B). If desired, the headlights 16 may support low-beam and high-beam modes. The vehicle 10 may switch between modes based on sensor data from sensors in the vehicle 10, such as a rain sensor (e.g., a moisture sensor), an ambient light sensor, an oncoming-headlight sensor, a traffic sensor, and/or other sensors. During automatic alignment operations, headlight movements such as movements intended to accommodate turns in the road may be taken into account. For example, if the headlights 16 are turned to the left due to the presence of a left-hand curve in the road 14, the vehicle 10 will expect the headlight illumination 20 on the surface 28 to move to the left accordingly, and this information may therefore be taken into account when measuring the headlight output to evaluate headlight alignment.
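A compact sketch of this kind of condition-based adjustment appears below. The Positioner and Headlight interfaces and the sign convention for yaw are illustrative assumptions, not interfaces from the patent.

```python
class Positioner:
    def set_yaw_deg(self, deg: float) -> None:
        print(f"positioner yaw -> {deg:+.1f} deg")

class Headlight:
    def set_mode(self, mode: str) -> None:
        print(f"beam mode -> {mode}")

def adjust_for_conditions(curve_deg: float, oncoming: bool,
                          positioner: Positioner, headlight: Headlight) -> None:
    """Steer the beam into a detected curve; drop to low beam for traffic."""
    positioner.set_yaw_deg(curve_deg)   # here, positive = turn beam left
    headlight.set_mode("low" if oncoming else "high")

adjust_for_conditions(+8.0, oncoming=False,
                      positioner=Positioner(), headlight=Headlight())
```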
Exemplary operations involved in using the vehicle 10 are shown in fig. 8.
During the operations of block 80, the headlights 16 may be used to illuminate the object 26 and the roadway 14 (e.g., the surface 28 of fig. 3). The left and right headlights 16 may be illuminated simultaneously or individually. In headlight configurations in which each headlight has multiple adjustable elements, such as the elements 72 of fig. 5, these elements may, if desired, be adjusted individually during the operations of block 80 (e.g., to provide information during headlight characterization about the different contributions these elements make to the different portions of the headlight illumination provided by the headlight 16).
During operation of block 82, a three-dimensional sensor in forward sensor 24F may be used to capture an image of surface 28 (e.g., a three-dimensional image may be captured). The presence of registration marks 50 and/or other detectable features such as marks 54 on the surface of the target, such as the surface of object 26, may facilitate the capture of satisfactory three-dimensional image data from the target. In addition to obtaining a three-dimensional map (shape) of the surface 28, the vehicle 10 may also capture images of headlight illumination present on the surface 28 from the headlight 16. For example, a visible light image and/or an infrared image from a three-dimensional image sensor, a separate two-dimensional image sensor, or other sensor may be captured that shows the location of peak intensities of the headlight illumination and/or shows other headlight illumination features (e.g., the location of edges of the headlight illumination pattern).
If desired, information about the three-dimensional shape of the surface in front of the vehicle may be obtained from local (in-vehicle) and/or remote navigation system databases, in addition to or instead of being measured with the vehicle's sensors. For example, a three-dimensional map of the environment may be stored in a navigation database used for driver assistance functions and/or autonomous driving functions. Information from navigation system sensors (e.g., Global Positioning System circuitry and/or other satellite navigation system circuitry, inertial measurement units, lidar, image recognition systems, and/or other navigation sensors) may be used to determine the vehicle's position (location and orientation). The vehicle position information obtained from the navigation system sensors in this way may be used to retrieve corresponding three-dimensional surface shape information (e.g., the three-dimensional shape of the surface at the determined vehicle position) from the database.
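The database path might be organized as sketched below, with a position fix used as a lookup key. The quantization scheme, keys, and stored values are hypothetical; a real navigation database would be far richer.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VehiclePosition:
    lat_deg: float
    lon_deg: float
    heading_deg: float

# Toy stand-in for a navigation database keyed by quantized position.
SURFACE_DB = {
    (37.33, -122.01): "flat, level roadway; vertical wall 10 m ahead",
}

def surface_shape_ahead(pos: VehiclePosition):
    key = (round(pos.lat_deg, 2), round(pos.lon_deg, 2))
    return SURFACE_DB.get(key)  # None if no stored shape for this position

print(surface_shape_ahead(VehiclePosition(37.3312, -122.0098, 92.0)))
```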
After measuring the shape of the surface 28 and/or otherwise determining the shape of the surface 28 (e.g., by obtaining information from a database), and after measuring the pattern of headlight illumination 20 on the surface 28, the vehicle 10 may, during the operations of block 84, determine the expected pattern of headlight illumination on the surface 28 (e.g., the expected peak intensity location of the headlight output, the expected locations of the headlight beam edges, and other characteristics associated with the direction in which the headlight illumination is expected to be directed). The expected headlight illumination pattern is determined based on the known shape of the surface 28 (e.g., the position of the surface 28 relative to the vehicle 10 in three dimensions) and the known nominal performance characteristics of the headlight 16 (e.g., the known size and shape of the beam emitted by each headlight). During block 84, the vehicle 10 compares the measured headlight illumination pattern produced by the headlight 16 on the surface 28 with the expected headlight illumination information.
If the expected and measured illumination patterns (center positions, edge positions, etc.) do not match, corrective action may be taken to align the headlights 16 based on the comparison. For example, during the operations of block 86, the control circuitry of the vehicle 10 may direct the positioner 44 to tilt the headlight 16 downward by 3° in response to detecting an undesired upward tilt of 3°. As indicated by line 88, the automatic alignment operations of fig. 8 may be performed repeatedly (e.g., whenever the vehicle 10 is parked, periodically according to a schedule, whenever a satisfactory surface 28 is available in front of the vehicle 10, in response to a user input command, and/or in response to a determination that other headlight calibration criteria have been satisfied).
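The overall loop of fig. 8 can be condensed into the following simulation sketch, in which a stand-in vehicle with a hidden 3° up-tilt is measured and corrected. Every class and value here is a hypothetical placeholder for the operations described in blocks 80-86, not an implementation from the patent.

```python
class SimulatedVehicle:
    """Stand-in vehicle with a hidden 3-degree headlight up-tilt."""
    def __init__(self):
        self.actual_tilt_deg = 3.0
    def illuminate(self):                 # block 80: turn headlight on
        pass
    def capture(self):                    # block 82: surface shape + pattern
        return {"range_m": 10.0}, {"tilt_deg": self.actual_tilt_deg}
    def predict(self, surface):           # block 84: expected pattern
        return {"tilt_deg": 0.0}          # an aligned headlight aims level
    def tilt_headlight(self, deg):        # block 86: corrective positioner move
        self.actual_tilt_deg += deg

v = SimulatedVehicle()
v.illuminate()
surface, measured = v.capture()
error = measured["tilt_deg"] - v.predict(surface)["tilt_deg"]
v.tilt_headlight(-error)                  # detected 3 deg up -> tilt down 3 deg
print(f"residual misalignment: {v.actual_tilt_deg:.1f} deg")  # 0.0 deg
```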
Although sometimes described in the context of headlights, any suitable light in the vehicle 10 (e.g., fog lights, tail lights, parking lights, auxiliary side lights, etc.) may be aligned using the approach of fig. 8. In addition to performing headlight alignment operations, the control circuitry of the vehicle 10 may, if desired, use sensor measurements to calibrate actuators such as the positioner 44. As an example, when the vehicle 10 is parked, the positioner 44 may be calibrated by directing the positioner 44 to move while corresponding sensor measurements are made to assess the accuracy of those movements. Examples of sensors that may be used to measure actuator performance so that compensating calibration operations can be performed include a light sensor (e.g., an image sensor that measures whether the light output from the headlight 16 moves by 4.5° when the positioner 44 is directed to move by 4.5°) and an inertial measurement unit (e.g., an inertial measurement unit coupled to the positioner 44 that measures the angular movement of the positioner 44 during calibration). Calibrating the positioner 44 while the vehicle 10 is stationary (parked) enables the vehicle 10 to more accurately perform open-loop control of the aim of the positioner 44 while traveling, based on navigation system information (inertial measurement unit data and satellite navigation system data) and other data.
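A sketch of this parked-calibration idea follows: command a move, measure the realized motion, and derive a gain used later for open-loop aiming. The numbers and the linear-gain model are illustrative assumptions.

```python
def positioner_gain(commanded_deg: float, measured_deg: float) -> float:
    """Correction factor mapping a desired motion to the command achieving it."""
    return commanded_deg / measured_deg

gain = positioner_gain(4.5, 4.2)   # sensors saw 4.2 deg for a 4.5 deg command
desired_deg = 3.0
print(f"command {desired_deg * gain:.2f} deg to realize {desired_deg:.1f} deg")
```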
According to an embodiment, there is provided a vehicle including: a vehicle body; a headlight supported by the vehicle body, the headlight configured to generate headlight illumination; a control circuit configured to detect a difference between an expected direction of the headlight illumination and a measured direction of the headlight illumination; and an electrically adjustable positioner configured to align the headlight in response to the detected difference.
According to another embodiment, a vehicle includes a forward sensor circuit on a body configured to capture a three-dimensional image of a surface in front of the body, and configured to measure headlight illumination from a headlight when illuminating the surface, the control circuit configured to determine an expected direction of the headlight illumination using the three-dimensional image and the measured headlight illumination on the surface.
According to another embodiment, the forward sensor circuit includes a two-dimensional image sensor configured to measure headlight illumination.
According to another embodiment, the forward sensor circuit includes a lidar sensor that captures three-dimensional images.
According to another embodiment, the forward sensor circuit comprises a three-dimensional sensor configured to capture a three-dimensional image, and the three-dimensional sensor comprises a three-dimensional sensor selected from the group consisting of: radar sensors, stereo cameras, and structured light sensors.
According to another embodiment, a vehicle includes a sensor configured to measure headlight illumination when a control circuit adjusts the headlight to change the headlight illumination.
According to another embodiment, the headlight comprises a plurality of light sources, and the sensor is configured to measure the headlight illumination when the control circuit adjusts the plurality of light sources to produce the respective different amounts of light.
According to another embodiment, the headlight is operable in a low beam mode and a high beam mode, and the sensor is configured to measure headlight illumination when the control circuit causes the headlight to change between operating in the low beam mode and the high beam mode.
According to another embodiment, the vehicle includes a sensor configured to detect a tilt of the body relative to a road, and the control circuit is configured to adjust the electrically adjustable positioner based on the detected tilt.
According to another embodiment, the control circuit is configured to adjust the electrically adjustable positioner based on information including information selected from the group consisting of: weather information, vehicle location information, and road information.
According to another embodiment, a vehicle comprises: a navigation system circuit configured to determine a vehicle position, the control circuit configured to use the determined vehicle position to retrieve a three-dimensional surface shape corresponding to a surface in front of the vehicle body from a database; and an image sensor configured to measure headlight illumination from the headlight as the headlight irradiates the surface, the control circuitry configured to determine an expected direction of the headlight illumination using the three-dimensional surface shape and the measured headlight illumination on the surface.
According to another embodiment, the control circuit is further configured to calibrate the electrically adjustable positioner when the vehicle body is parked.
According to one embodiment, there is provided a vehicle including: a vehicle body; a headlight configured to generate headlight illumination on a surface in front of the vehicle body; sensor circuitry configured to obtain a surface measurement of the surface and configured to obtain a headlight illumination measurement of the headlight illumination on the surface; an electrically adjustable positioner configured to move the headlight relative to the vehicle body; and a control circuit configured to adjust the electrically adjustable positioner to align the headlight based on the surface measurement and the headlight illumination measurement.
According to another embodiment, the sensor circuit comprises a three-dimensional sensor and the surface measurement comprises a three-dimensional surface shape collected by the three-dimensional sensor.
According to another embodiment, the three-dimensional sensor comprises a lidar sensor.
According to another embodiment, a three-dimensional sensor includes a stereo sensor having a pair of cameras.
According to another embodiment, the three-dimensional sensor comprises an optical sensor.
According to another embodiment, the three-dimensional sensor comprises a radar sensor.
According to another embodiment, the control circuit is configured to determine an expected position of the headlight illumination on the surface using the surface measurement, and to compare the expected position with a measured position of the headlight illumination on the surface obtained from the headlight illumination measurement.
According to another embodiment, the control circuit is configured to adjust the electrically adjustable positioner to align the headlight based on the comparison of the expected position and the measured position.
According to an embodiment, there is provided a vehicle including: a vehicle body; a headlight; a positioner configured to move the headlight relative to the vehicle body; a three-dimensional sensor configured to measure a surface of a target in front of the vehicle body; an image sensor configured to measure a position on the surface at which the headlight is pointed; and a control circuit configured to use the measured surface to determine a predicted position on the target at which the headlight is expected to be aimed when the headlight is aligned with respect to the vehicle body, to compare the measured position with the predicted position, and to move the headlight using the positioner based on the comparison.
According to another embodiment, the three-dimensional sensor is configured to measure the shape of the surface in three dimensions and the distance between the surface and the vehicle body.
The foregoing is merely exemplary and various modifications may be made to the embodiments described. The foregoing embodiments may be implemented independently or may be implemented in any combination.

Claims (22)

1. A vehicle, comprising:
a vehicle body;
a headlight supported by the vehicle body, the headlight configured to generate headlight illumination;
a control circuit configured to detect a difference between an expected direction of the headlight illumination and a measured direction of the headlight illumination; and
an electrically adjustable positioner configured to align the headlight in response to the detected difference.
2. The vehicle of claim 1, the vehicle further comprising:
a forward sensor circuit on the body configured to capture a three-dimensional image of a surface in front of the body, and configured to measure the headlight illumination from the headlight when the headlight illuminates the surface, wherein the control circuit is configured to determine the expected direction of the headlight illumination using the three-dimensional image and the measured headlight illumination on the surface.
3. The vehicle of claim 2, wherein the forward sensor circuit comprises a two-dimensional image sensor configured to measure the headlight illumination.
4. The vehicle of claim 2, wherein the forward sensor circuit comprises a lidar sensor capturing the three-dimensional image.
5. The vehicle of claim 2, wherein the forward sensor circuit comprises a three-dimensional sensor configured to capture the three-dimensional image, and wherein the three-dimensional sensor comprises a three-dimensional sensor selected from the group consisting of: radar sensors, stereo cameras, and structured light sensors.
6. The vehicle of claim 1, further comprising a sensor configured to measure the headlight illumination when the control circuit adjusts the headlight to change the headlight illumination.
7. The vehicle of claim 6, wherein the headlight comprises a plurality of light sources, and wherein the sensor is configured to measure the headlight illumination when the control circuit adjusts the plurality of light sources to produce respective different amounts of light.
8. The vehicle of claim 6, wherein the headlight is operable in a low beam mode and a high beam mode, and wherein the sensor is configured to measure the headlight illumination when the control circuit causes the headlight to change between operating in the low beam mode and operating in the high beam mode.
9. The vehicle of claim 1, further comprising a sensor configured to detect a tilt of the body relative to a road, wherein the control circuit is configured to adjust the electrically adjustable positioner based on the detected tilt.
10. The vehicle of claim 1, wherein the control circuit is configured to adjust the electrically adjustable positioner based on information, wherein the information comprises information selected from the group consisting of: weather information, vehicle location information, and road information.
11. The vehicle of claim 1, the vehicle further comprising:
a navigation system circuit configured to determine a vehicle position, wherein the control circuit is configured to use the determined vehicle position to retrieve, from a database, a three-dimensional surface shape corresponding to a surface in front of the vehicle body; and
an image sensor configured to measure the headlight illumination from the headlight when the headlight illuminates the surface, wherein the control circuit is configured to determine the expected direction of the headlight illumination using the three-dimensional surface shape and the measured headlight illumination on the surface.
12. The vehicle of claim 1, wherein the control circuit is further configured to calibrate the electrically adjustable positioner when the vehicle is parked.
13. A vehicle, comprising:
a vehicle body;
a headlight configured to generate headlight illumination on a surface in front of the vehicle body;
a sensor circuit configured to obtain a surface measurement of the surface and configured to obtain a headlight illumination measurement of the headlight illumination on the surface;
an electrically adjustable positioner configured to move the headlight relative to the vehicle body; and
a control circuit configured to adjust the electrically adjustable positioner to align the headlight based on the surface measurement and the headlight illumination measurement.
14. The vehicle of claim 13, wherein the sensor circuit comprises a three-dimensional sensor, and wherein the surface measurement comprises a three-dimensional surface shape collected by the three-dimensional sensor.
15. The vehicle of claim 14, wherein the three-dimensional sensor comprises a lidar sensor.
16. The vehicle of claim 14, wherein the three-dimensional sensor comprises a stereo sensor having a pair of cameras.
17. The vehicle of claim 14, wherein the three-dimensional sensor comprises an optical sensor.
18. The vehicle of claim 14, wherein the three-dimensional sensor comprises a radar sensor.
19. The vehicle of claim 13, wherein the control circuit is configured to determine an expected location of the headlight illumination on the surface using the surface measurement, and to compare the expected location to a measured location of the headlight illumination on the surface obtained from the headlight illumination measurement.
20. The vehicle of claim 19, wherein the control circuit is configured to adjust the electrically adjustable positioner to align the headlight based on the comparison of the expected location and the measured location.
21. A vehicle, comprising:
a vehicle body;
a headlight;
a positioner configured to move the headlight relative to the vehicle body;
a three-dimensional sensor configured to measure a surface of an object in front of the vehicle body;
an image sensor configured to measure a location on the surface at which the headlight is pointed; and
a control circuit configured to:
use the measured surface to determine a predicted location on the object at which the headlight is expected to be aimed when the headlight is aligned with respect to the vehicle body;
compare the measured location with the predicted location; and
move the headlight using the positioner based on the comparison.
22. The vehicle of claim 21, wherein the three-dimensional sensor is configured to measure a shape of the surface in three dimensions and a distance between the surface and the vehicle body.
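As a hedged illustration of claims 21 and 22 only: one plausible way to obtain the measured aim point is to back-project the brightest pixel of the image sensor onto the surface measured by the three-dimensional sensor. The pinhole-camera model and all parameter names below (fx, fy, cx, cy) are assumptions, not material from this application.

    # Minimal sketch, assuming a calibrated pinhole camera whose depth
    # map comes from the three-dimensional sensor. Illustrative only.
    import numpy as np

    def pixel_to_ray(u, v, fx, fy, cx, cy):
        """Unit viewing ray in camera coordinates for pixel (u, v)."""
        ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        return ray / np.linalg.norm(ray)

    def measured_aim_point(image, depth_map, fx, fy, cx, cy):
        """3-D point where the headlight intensity peaks: take the
        brightest pixel, then scale its viewing ray so its z component
        matches the depth reported for that pixel."""
        v, u = np.unravel_index(np.argmax(image), image.shape)
        ray = pixel_to_ray(u, v, fx, fy, cx, cy)
        return (depth_map[v, u] / ray[2]) * ray

    def pitch_yaw_error(predicted, measured):
        """Angular (pitch, yaw) difference between predicted and
        measured aim points, both in the camera frame."""
        def angles(p):
            x, y, z = p
            return np.array([np.arctan2(y, z), np.arctan2(x, z)])
        return angles(measured) - angles(predicted)

Driving this pitch/yaw error toward zero with the positioner is one way to read "move the headlight using the positioner based on the comparison."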
CN202280044558.2A (priority date 2021-06-30; filing date 2022-06-15): Vehicle with automatic headlight alignment. Status: Pending. Publication number: CN117545982A.

Applications Claiming Priority (5)

Application Number                  Priority Date  Filing Date  Title
US63/216,780                        2021-06-30
US63/298,365                        2022-01-11
US17/721,146 (US12017577B2)         2021-06-30     2022-04-14   Vehicles with automatic headlight alignment
US17/721,146                        2022-04-14
PCT/US2022/033669 (WO2023278158A1)  2021-06-30     2022-06-15   Vehicles with automatic headlight alignment

Publications (1)

Publication Number: CN117545982A

Family

Family ID: 89794382

Family Applications (1)

Application Number: CN202280044558.2A (Pending)
Priority Date: 2021-06-30
Filing Date: 2022-06-15
Title: Vehicle with automatic headlight alignment

Country Status (1)

Country: CN
Link: CN117545982A


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination