CN118269807A - Car lamp control method, car-mounted lighting device and car


Info

Publication number: CN118269807A
Application number: CN202211733073.2A
Authority: CN (China)
Legal status: Pending
Original/current assignee: Huawei Technologies Co Ltd
Inventors: 罗宇哲, 段军克, 黄旭
Other languages: Chinese (zh)
Prior art keywords: vehicle, brightness, information, target object, light source
Classification landscape: Lighting Device Outwards From Vehicle And Optical Signal

Abstract

The embodiment of the application discloses a car lamp control method, a car-mounted lighting device, and a car, which are used to improve the driving safety of the car under glare conditions. The method of the embodiment of the application comprises the following steps: acquiring a first brightness of an illumination area in front of a vehicle and a second brightness of a light source of a target object, where the light source of the target object is a light source outside the vehicle; obtaining a glare value from the first brightness and the second brightness, the glare value indicating the degree to which the second brightness is greater than the first brightness; and, if the glare value is greater than a first threshold, enhancing the brightness of the low beam of the vehicle.

Description

Car lamp control method, car-mounted lighting device and car
Technical Field
The embodiment of the application relates to the field of lighting control, in particular to a car lamp control method, a car-mounted lighting device and a car.
Background
The high beam of a vehicle is used to illuminate distant objects, so its light is concentrated and its brightness is high. In driving scenes such as night, overcast days, and heavy rain, the visibility of the driving environment is low. If an oncoming vehicle turns on its high beam, glare may occur in the field of view of the host vehicle's driver due to the irradiation of the oncoming vehicle's high beam.
If glare occurs in the driver's field of view, the driver cannot see objects in front that are darker (darker relative to the oncoming vehicle's high beam), such as lane lines, road blocks, and vehicles ahead, increasing the risk of traffic accidents.
Disclosure of Invention
The embodiment of the application provides a car lamp control method, a car-mounted lighting device and a car, which are used for improving the driving safety of the car under the condition of glare.
In a first aspect, an embodiment of the present application provides a vehicle lamp control method, which may be applied to a vehicle-mounted lighting device. The method comprises the following steps: the in-vehicle lighting device acquires a first luminance of an illumination area in front of the vehicle and a second luminance of the light source of a target object, where the light source of the target object is a light source outside the vehicle. The in-vehicle lighting device then obtains, from the first luminance and the second luminance, a glare value indicating the degree to which the second luminance is greater than the first luminance. If the glare value is greater than a first threshold, the in-vehicle lighting device enhances the brightness of the low beam of the vehicle. Here, the illumination area is the illumination area of the low beam of the vehicle.
In the embodiment of the application, the vehicle-mounted lighting device determines the glare value from the first brightness and the second brightness in front of the vehicle, and thereby determines whether glare is present in the driver's field of view. If glare has occurred, the brightness of the dipped headlight is enhanced, reducing the brightness contrast in the driver's field of view (raising the brightness of the illumination area reduces the brightness contrast between the illumination area and the light source of the target object), so that the driver can see the illumination area clearly and the probability of traffic accidents is reduced.
In the embodiment of the application, the first threshold is determined experimentally and is the minimum glare value at which the human eye perceives glare. A strong light source causes glare; when the glare value is greater than the first threshold, the human eye is stimulated by the strong light source to the extent that it perceives glare. Since the perceptibility of light differs between the eyes of different people, first thresholds of different sizes can be determined. Optionally, the first threshold may be a ratio between the second luminance and the first luminance; for example, the first threshold (ratio) may be ≥ 1, e.g., 1, 1.1, 1.2, or a ratio of larger magnitude (e.g., tens, hundreds, thousands, etc., the present application not being limited thereto). The first threshold may also be a difference between the second luminance and the first luminance; for example, the first threshold (difference) may be ≥ 10 dB, e.g., 10 dB, 20 dB, or 30 dB, which is not limited by the present application.
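As a concrete illustration, the two optional glare-value definitions and the first-threshold decision might be sketched as follows in Python; the threshold value, and taking 10·log10 of the luminance ratio as the dB difference, are assumptions for illustration, since the patent leaves the thresholds to experiment:

```python
import math

def glare_value_ratio(second_luminance: float, first_luminance: float) -> float:
    """Glare value as the ratio of the target light source's luminance to
    the illumination area's luminance."""
    return second_luminance / max(first_luminance, 1e-6)  # guard divide-by-zero

def glare_value_db(second_luminance: float, first_luminance: float) -> float:
    """Glare value as a logarithmic (dB) difference between the two luminances."""
    return 10.0 * math.log10(max(second_luminance, 1e-6) / max(first_luminance, 1e-6))

FIRST_THRESHOLD_DB = 10.0  # assumed example; the patent determines it experimentally

def should_enhance_low_beam(first_luminance: float, second_luminance: float) -> bool:
    """Enhance the low beam when the glare value exceeds the first threshold."""
    return glare_value_db(second_luminance, first_luminance) > FIRST_THRESHOLD_DB
```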
In an alternative implementation, the second brightness may be acquired as follows: the vehicle-mounted lighting device acquires luminance information in front of the vehicle and determines a target object from the luminance information, where the brightness of the light source of the target object is greater than or equal to a second threshold. The in-vehicle lighting device then obtains the second luminance of the light source of the target object from the luminance information.
In the embodiment of the application, the light source of the target object is determined by comparing the luminance information against a brightness threshold. This judgment is simple and direct, requires no complex processing, and can therefore save computing power and improve efficiency. Moreover, the second threshold serves as the brightness basis for judging that glare may occur, and the subsequent brightness determination, glare-value acquisition, and so on are performed only when glare is deemed possible based on the second threshold. Determining brightness, acquiring glare values, etc. can thus be avoided in scenes where glare is unlikely (e.g., where the brightness of all objects in front of the vehicle is low, such as below the second threshold), saving computing power.
In the embodiment of the application, the second threshold is determined experimentally and is the lowest brightness of a strong light source at which the human eye perceives glare. A strong light source with a brightness greater than or equal to the second threshold may produce glare in the human eye. Since the brightness perceptible to the human eye differs between driving scenarios, second thresholds of different sizes can be determined. Optionally, the second threshold may be ≥ 10 dB; for example, the second threshold may be 10 dB, 20 dB, or 30 dB, which is not limited by the present application.
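A minimal sketch of this thresholding step, assuming the luminance information arrives as a two-dimensional array of per-pixel luminance values; the threshold value and the use of the regional peak as the second luminance are illustrative assumptions:

```python
import numpy as np

SECOND_THRESHOLD = 10.0  # assumed value and units; the text leaves it to experiment

def target_light_source_mask(luminance_map: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels bright enough to be a glare-causing light source."""
    return luminance_map >= SECOND_THRESHOLD

def second_luminance(luminance_map: np.ndarray) -> float:
    """Peak luminance inside the candidate region, taken here as the second
    brightness; 0.0 means no candidate, so no glare processing is needed."""
    mask = target_light_source_mask(luminance_map)
    return float(luminance_map[mask].max()) if mask.any() else 0.0
```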
In an alternative implementation, the second brightness may be acquired as follows: the vehicle-mounted lighting device acquires luminance information and graphic information in front of the vehicle and determines a target object from both, where the brightness of the light source of the target object is greater than or equal to the second threshold and the shape of the target object matches the shape characteristics of a vehicle. The in-vehicle lighting device can then acquire the second brightness of the light source of the target object from the luminance information.
In the embodiment of the present application, an object whose brightness meets the brightness requirement and whose shape matches the shape characteristics of the target object (for example, the shape characteristics of a vehicle or, optionally, of a searchlight, etc., which is not limited in the present application) is determined to be the target object. Objects that do not match the preset target-object types can be filtered out, avoiding misjudgment of the target object. Target objects of the preset type can be identified quickly, improving the efficiency of blind-area brightness compensation (i.e., enhancing the brightness of the low beam) when glare occurs.
In an alternative implementation, the target object comprises an oncoming vehicle, and the light source of the target object comprises a lamp of that oncoming vehicle. The vehicle-mounted lighting device may acquire the second brightness of the light source of the target object as follows: the in-vehicle lighting device acquires graphic information, color information, and luminance information in front of the vehicle; determines a target object from the graphic information and the color information, where the shape of the target object matches the shape characteristics of a vehicle and the light beam emitted by the light source of the target object is white; and determines the second brightness of the light source of the target object from the luminance information.
In the embodiment of the application, the target object is determined by the shape characteristics of a vehicle together with the characteristics of a white light beam, so that vehicles traveling in the same direction, stand-alone lamps, and the like are prevented from being misjudged as the lamps of oncoming vehicles (the light source of the target object), improving the accuracy of target-object judgment. In addition, identifying specific types of objects from graphic information and color information is a relatively mature technology, so efficient and highly accurate identification can be achieved.
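For illustration, the white-beam and vehicle-shape tests might look like the following sketch; the whiteness tolerance and the bounding-box aspect-ratio test standing in for the patent's shape-feature check are both hypothetical:

```python
import numpy as np

def beam_is_white(rgb_region: np.ndarray, tol: float = 0.15) -> bool:
    """Treat a beam as white when its mean R, G and B channels are nearly equal."""
    mean = rgb_region.reshape(-1, 3).mean(axis=0)
    mean = mean / (mean.max() + 1e-6)          # normalise so tol is scale-free
    return float(mean.max() - mean.min()) <= tol

def looks_like_vehicle(bbox_w: float, bbox_h: float) -> bool:
    """Hypothetical stand-in for the shape-feature check: a crude
    bounding-box aspect-ratio test (vehicles read wider than tall)."""
    return 1.0 <= bbox_w / max(bbox_h, 1e-6) <= 4.0

def is_oncoming_headlamp(rgb_region: np.ndarray, bbox_w: float, bbox_h: float) -> bool:
    """Target object per this implementation: vehicle-like shape, white beam."""
    return looks_like_vehicle(bbox_w, bbox_h) and beam_is_white(rgb_region)
```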
In an alternative implementation, the vehicle-mounted lighting device may obtain the second brightness of the light source of the target object as follows. Graphic information, color information, and luminance information in front of the vehicle are acquired, and the target object is determined from the graphic information and the color information. The light beam emitted by the light source of the target object is white, the duration for which the beam is lit is greater than or equal to a first duration, and the geometry of the beam satisfies at least one of the following: the included angle between the beam and the horizontal plane is less than or equal to a first angle; or the included angle between the beam and the vertical plane is less than or equal to a second angle, where the vertical plane is parallel to the forward direction of the vehicle. The second luminance of the light source of the target object is then determined from the luminance information.
In the embodiment of the application, the more stable the brightness of the driving environment (i.e., the ambient brightness of the driver's field of view), the better for driving safety. Compensating the blind area for a light source that is lit only briefly (i.e., an unstable light source) would cause the ambient brightness of the driver's field of view to change frequently, which is detrimental to driving safety. The embodiment of the application therefore defines the first duration: if the beam is lit for a duration greater than or equal to the first duration, the light source emitting the beam is considered a stable light source, and the blind area of the glare it causes is then light-compensated. Optionally, the first duration may be greater than or equal to 100 ms; for example, the first duration may be 100 ms, 120 ms, or 150 ms, which is not limited by the present application.
In the embodiment of the application, light sources lit for too short a time can be excluded from being the light source of the target object by judging the lit duration of the white beam they emit, avoiding the frequent changes in ambient brightness that blind-area compensation for such sources would cause. If the included angle between the white beam emitted by a light source and the vehicle is large, the beam may not cause glare, so light sources outside the range where glare can occur can be excluded by judging the angle of the white beam, preventing misjudgment when no glare occurs and improving the accuracy of glare judgment.
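A sketch of the stability and angle filter just described; the first duration mirrors the 100 ms example in the text, while the two angle limits are assumed placeholder values:

```python
FIRST_DURATION_MS = 100.0  # the text's example lower bound for a stable source
FIRST_ANGLE_DEG = 5.0      # assumed placeholder for the horizontal-plane limit
SECOND_ANGLE_DEG = 5.0     # assumed placeholder for the vertical-plane limit

def is_stable_glare_source(lit_duration_ms: float,
                           angle_to_horizontal_deg: float,
                           angle_to_vertical_deg: float) -> bool:
    """Keep only beams that are lit long enough and aimed near the host vehicle."""
    if lit_duration_ms < FIRST_DURATION_MS:
        return False  # flickering source: compensating would destabilise the view
    # The text requires at least one of the two angle conditions to hold.
    return (abs(angle_to_horizontal_deg) <= FIRST_ANGLE_DEG
            or abs(angle_to_vertical_deg) <= SECOND_ANGLE_DEG)
```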
In an alternative implementation, the in-vehicle lighting device may obtain the second brightness of the light source of the target object as follows: acquiring graphic information and luminance information in front of the vehicle, and determining the target object from the graphic information and the luminance information. The brightness of the light source of the target object is greater than or equal to the second threshold, and the duration for which the beam emitted by the light source of the target object is lit is greater than or equal to the first duration. The second luminance of the light source of the target object is then obtained from the luminance information.
In the embodiment of the application, the light source of the target object is judged by the brightness threshold and the lit duration. This judgment is simple and fast and can improve the efficiency of determining the target object, thereby improving the efficiency of blind-area light compensation when glare occurs.
In an alternative implementation, the vehicle-mounted lighting device may acquire image information in front of the vehicle through the camera. Wherein, the graphic information, the color information and the brightness information are all extracted from the image information.
In an alternative implementation, the vehicle-mounted lighting device can acquire image information in front of the vehicle through the camera, and acquire brightness information in front of the vehicle through the brightness sensor. Wherein the graphic information and the color information are extracted from the image information.
In an alternative implementation, the image information includes a plurality of images having different exposure durations. The vehicle-mounted lighting device can acquire the shape characteristics of the target object by the following modes: and the vehicle-mounted lighting device fuses the plurality of images with different exposure time lengths to obtain a fused image. Then, the in-vehicle lighting device acquires the shape feature of the target object from the graphic information of the fused image.
In the embodiment of the application, images with different exposure durations can provide different kinds of information (for example, an image with a short exposure duration can preserve more accurate brightness information for bright sources, while an image with a long exposure duration can provide more accurate shape features of darker objects). By fusing images with different exposure durations, the resulting fused image contains more accurate information (e.g., more accurate brightness information and more accurate shape features), and the target object can be identified more efficiently and accurately from the fused image.
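One way to realize the multi-exposure fusion, assuming OpenCV is available, is exposure fusion in the style of Mertens, which needs neither exposure times nor a camera response curve; this is an illustrative choice, not the patent's prescribed method:

```python
import cv2
import numpy as np

def fuse_exposures(images):
    """Fuse a list of 8-bit frames taken with different exposure durations
    into one frame that keeps bright-source detail and dark-area shape."""
    merge = cv2.createMergeMertens()
    fused = merge.process(images)        # float32 result, roughly in [0, 1]
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)
```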
In the embodiment of the application, the glare value reflects the degree of brightness contrast between different areas as perceived by the human eye, and thereby reflects how dazzling the scene appears to the eye. In an alternative implementation, the glare value may comprise the ratio between the second luminance and the first luminance, or the glare value may comprise the difference between the second luminance and the first luminance.
In an alternative implementation, the onboard lighting device may enhance the brightness of the low beam by projecting a supplemental light beam toward a target illumination area within the illumination area, where the area of the target illumination area is positively correlated with the target distance, the target distance being the distance between the vehicle and the target object.
In the embodiment of the application, the region of increased brightness is confined to the target illumination area in front of the vehicle. Compared with enhancing the brightness of the entire illumination area of the dipped headlight, this concentrates the driver's vision on the target illumination area (which lies ahead of the vehicle, is the place the driver must attend to while driving, and is relevant to driving safety), reducing the probability of traffic accidents.
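As an illustration, the positive correlation between the target distance and the size of the target illumination area could be as simple as the following sketch, in which the clamping limits are assumed values:

```python
def target_area_length_m(target_distance_m: float,
                         min_len_m: float = 10.0,
                         max_len_m: float = 150.0) -> float:
    """Length of the fill-lit strip grows with the distance to the oncoming
    vehicle and shrinks as it approaches (positive correlation)."""
    return max(min_len_m, min(target_distance_m, max_len_m))
```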
In an alternative implementation, the left and right boundaries of the supplemental light beam have a sharpness, in horizontal cross-section, of greater than or equal to 0.05.
In the embodiment of the present application, the driver's attention is drawn by a target illumination area with a sharp boundary (i.e., a sharpness of 0.05 or more, as described above). In a glare scene, the prompt of the sharp boundary lets the driver quickly locate the region on which to concentrate his or her line of sight (i.e., the target illumination area, the region relevant to driving safety), improving driving safety.
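The patent does not define how sharpness is computed beyond the 0.05 figure; as one plausible reading, the sketch below measures the steepest log-illuminance gradient along a horizontal cross-section, analogous to the cut-off gradient used in headlamp metrology:

```python
import numpy as np

def max_boundary_sharpness(illuminance: np.ndarray) -> float:
    """illuminance: samples along a horizontal cross-section of the beam,
    spaced 0.1 degrees apart; returns the steepest log-gradient found."""
    log_e = np.log10(np.maximum(illuminance, 1e-9))
    return float(np.abs(np.diff(log_e)).max())

# Under this reading, a boundary meets the criterion when the value >= 0.05.
```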
In an alternative implementation, when the supplemental light beam is projected onto a vertical wall, the included angle α between the tangent and the horizontal at any point on the left and right boundaries of the illuminated area satisfies 20° ≤ α ≤ 70°.
In an alternative implementation, the horizontal angle h between the supplemental light beam and the horizontal plane satisfies -10° ≤ h ≤ -1°.
In an alternative implementation, the vertical angle v between the supplemental light beam and the vertical plane satisfies -10° ≤ v ≤ 10°.
In a second aspect, an embodiment of the present application provides a vehicle. The vehicle includes a sensor, a processing unit, and a low beam. The sensor is used to acquire information in front of the vehicle. The processing unit is used to acquire, from the information in front of the vehicle, a first brightness of an illumination area in front of the vehicle and a second brightness of the light source of a target object, where the light source of the target object is a light source outside the vehicle. The processing unit is further configured to obtain a glare value from the first luminance and the second luminance, the glare value indicating the degree to which the second luminance is greater than the first luminance. If the processing unit determines that the glare value is greater than the first threshold, it may control the low beam to enhance its brightness.
Alternatively, when it determines that the glare value is greater than the first threshold, the processing unit may send a control instruction to the low beam instructing it to enhance its brightness. The processing unit may send this control instruction directly to the low beam, or may send it to a lamp control unit so that the lamp control unit controls the low beam to enhance its brightness.
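The two control paths (direct, or via the lamp control unit) might be organized as in this sketch; the class and field names are illustrative and not taken from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class EnhanceLowBeam:
    """Instruction telling the low beam to raise its brightness."""
    brightness_level: float                                   # assumed 0.0..1.0
    target_region: Optional[Tuple[float, float, float, float]] = None

class LampControlUnit:
    """Stands in for the headlamp control module (HCM)."""
    def execute(self, cmd: EnhanceLowBeam) -> None:
        print(f"HCM: low-beam brightness -> {cmd.brightness_level:.2f}")

class ProcessingUnit:
    def __init__(self, hcm: LampControlUnit) -> None:
        self.hcm = hcm

    def on_glare_check(self, glare_value: float, first_threshold: float) -> None:
        if glare_value > first_threshold:
            # Indirect path: the instruction goes through the lamp control unit.
            self.hcm.execute(EnhanceLowBeam(brightness_level=1.0))
```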
The advantageous effects of the second aspect are referred to in the first aspect and are not described here in detail.
Alternatively, the processing unit may be a main control unit on the vehicle, or may be a processing chip for controlling the lamps of the vehicle, which is not limited in the present application.
Alternatively, the processing unit may be integrated on the vehicle lamp, and the sensor may be integrated on the vehicle lamp, which is not limited in the present application.
In an alternative implementation, the processing unit is specifically configured to: luminance information in front of the vehicle is acquired. And determining a target object according to the brightness information, wherein the brightness of the light source of the target object is greater than or equal to a second threshold value. And obtaining the second brightness of the light source of the target object according to the brightness information.
In an alternative implementation, the processing unit is specifically configured to: luminance information and graphic information in front of the vehicle are acquired. And determining a target object according to the brightness information and the graphic information, wherein the brightness of a light source of the target object is greater than or equal to a second threshold value, and the shape of the target object meets the shape characteristics of the vehicle. And obtaining the second brightness of the light source of the target object according to the brightness information.
In an alternative implementation, the processing unit is specifically configured to: graphic information, color information, and luminance information in front of the vehicle are acquired. And determining a target object according to the graphic information and the color information, wherein the shape of the target object meets the shape characteristics of the vehicle, and the light beam emitted by the light source of the target object is white. And determining the second brightness of the light source of the target object according to the brightness information.
In an alternative implementation, the processing unit is specifically configured to: acquire graphic information, color information, and luminance information in front of the vehicle; determine a target object from the graphic information and the color information; and determine the second brightness of the light source of the target object from the luminance information. The light beam emitted by the light source of the target object is white, the duration for which the beam is lit is greater than or equal to the first duration, and the geometry of the beam satisfies at least one of the following: the included angle between the beam and the horizontal plane is less than or equal to the first angle; or the included angle between the beam and the vertical plane is less than or equal to the second angle, where the vertical plane is parallel to the forward direction of the vehicle.
In an alternative implementation, the processing unit is specifically configured to: acquire graphic information and luminance information in front of the vehicle; determine the target object from the graphic information and the luminance information, where the brightness of the light source of the target object is greater than or equal to the second threshold and the duration for which the beam emitted by the light source of the target object is lit is greater than or equal to the first duration; and obtain the second brightness of the light source of the target object from the luminance information.
In an alternative implementation, the sensor includes a camera for acquiring image information in front of the vehicle. The processing unit is used for extracting graphic information, color information and brightness information from the image information.
In an alternative implementation, the sensor includes a camera for acquiring image information in front of the vehicle and a brightness sensor for acquiring brightness information. The processing unit is used for extracting graphic information and color information from the image information.
In an alternative implementation, the image information includes a plurality of images having different exposure durations. The processing unit is used for: and fusing the plurality of images with different exposure time lengths to obtain a fused image. And acquiring the shape characteristics of the target object from the graphic information of the fusion image.
In an alternative implementation, the glare value comprises a ratio between the second luminance and the first luminance; or the glare value comprises the difference between the second luminance and the first luminance.
In an alternative implementation, the low beam is used to project a supplemental light beam toward a target illumination area within the illumination area, where the area of the target illumination area is positively correlated with the target distance, the target distance being the distance between the vehicle and the light source of the target object.
In an alternative implementation, the left and right boundaries of the supplemental light beam have a sharpness, in horizontal cross-section, of greater than 0.05.
In an alternative implementation, when the supplemental light beam is projected onto a vertical wall, the included angle α between the tangent and the horizontal at any point on the left and right boundaries of the illuminated area satisfies 20° ≤ α ≤ 70°.
In an alternative implementation, the horizontal angle h between the supplemental light beam and the horizontal plane satisfies -10° ≤ h ≤ -1°.
In an alternative implementation, the vertical angle v between the supplemental light beam and the vertical plane satisfies -10° ≤ v ≤ 10°.
Drawings
FIG. 1 is a schematic view of vehicle lamps according to the present application;
FIG. 2 is a schematic diagram of the driver's viewing angle in a glare scene according to the present application;
FIG. 3 is a schematic structural diagram of a vehicle-mounted lighting device according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a vehicle lamp control method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the light-compensation effect of the vehicle lamp control method according to an embodiment of the present application;
FIG. 6 is another schematic flow chart of a vehicle lamp control method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a target illumination area according to an embodiment of the present application;
FIG. 8a is a schematic diagram of target illumination areas under different road conditions according to an embodiment of the present application;
FIG. 8b is a schematic diagram of a target illumination area in a curve scene according to an embodiment of the present application;
FIG. 9 is another schematic flow chart of a vehicle lamp control method according to an embodiment of the present application;
FIG. 10 is a schematic diagram of the features of oncoming-vehicle lamps according to an embodiment of the present application;
FIG. 11a is another schematic flow chart of a vehicle lamp control method according to an embodiment of the present application;
FIG. 11b is a schematic diagram of light beams emitted by an oncoming vehicle according to an embodiment of the present application;
FIG. 12 is a schematic diagram of the sharpness of the left and right edges of a supplemental light beam according to an embodiment of the present application;
FIG. 13a is a schematic view of the projection angle of a supplemental light beam according to an embodiment of the present application;
FIG. 13b is a schematic diagram of the angle between a supplemental light beam and the horizontal plane according to an embodiment of the present application;
FIG. 13c is a schematic diagram of the angle between a supplemental light beam and the vertical plane according to an embodiment of the present application;
FIG. 14 is a schematic diagram of a vehicle lamp control system according to an embodiment of the present application;
FIG. 15 is a schematic flow chart of a vehicle lamp control system according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings. As those of ordinary skill in the art will appreciate, with the development of technology and the emergence of new scenarios, the technical solutions provided by the embodiments of the application remain applicable to similar technical problems.
The terms "first", "second", and the like in the description, the claims, and the above drawings are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It is to be understood that terms so used are interchangeable under appropriate circumstances and merely distinguish objects of the same kind in the description of the embodiments. Furthermore, the terms "comprises", "comprising", "having", and any variations thereof are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. In addition, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" the following items or similar expressions means any combination of those items, including a single item or any combination of plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be single or plural.
Some terms of art appearing in the embodiments of the application are explained below:
Glare (dazzle): a visual condition in the field of view that causes visual discomfort and reduces the visibility of objects, arising from an unfavorable luminance distribution or from extreme luminance contrast in space or time.
As shown in FIG. 1, the vehicle includes a low beam and a high beam. The low beam illuminates the road surface within a relatively short range in front of the vehicle, while the high beam provides long-distance illumination; the high beam's light is therefore more concentrated and brighter than the low beam's.
In driving scenes such as night, overcast and rainy weather, and haze, the visibility of the environment in which the vehicle is located is low. If an oncoming vehicle (i.e., a vehicle whose traveling direction is opposite to that of the host vehicle) turns on its high beam, the high beam's light is concentrated and bright, so the brightness difference between it and the surroundings is large; glare is then easily produced in the field of view of the host vehicle's driver, increasing the risk of traffic accidents.
For example, in the driver's view shown in FIG. 2, glare occurs because the environment in which the vehicle is located is dim while the oncoming vehicle's high beam is bright. The glare prevents the driver from seeing the darker zebra crossing, lane lines, etc. in front of the vehicle, increasing the risk of traffic accidents.
It should be noted that, in addition to the high beam of the oncoming vehicle, the irradiation of the vehicle by other types of light sources (e.g., searchlight, etc.) may also cause glare in the driver's field of view, which is not a limitation of the present application.
To solve the above problems, embodiments of the application provide a vehicle lamp control method, a vehicle-mounted lighting device, and a vehicle. The embodiment of the application judges, via the glare value, whether glare has occurred, and enhances the brightness of the vehicle's dipped headlight when glare is present, reducing the brightness difference in the driver's field of view, alleviating the glare phenomenon, and reducing the risk of traffic accidents.
Fig. 3 is a schematic structural diagram of a vehicle-mounted lighting device according to an embodiment of the present application. As shown in fig. 3, the vehicle-mounted lighting device provided by the embodiment of the application comprises a processing unit, a vehicle lamp control unit and a vehicle lamp.
The processing unit is connected with the sensor and is used for processing data from the sensor and sending an instruction of lamp illumination to the lamp control unit. For example, instructions to enhance illumination intensity for a particular area, etc., may be included, as the application is not limited in this regard. The car light control unit is used for controlling the illumination of the car light according to the instruction.
Alternatively, the sensor may include a camera, a brightness sensor, a radar, etc.; the processing unit may comprise a microcontroller (micro control unit, MCU); the vehicle light control unit may include a headlight control module (headlamp control module, HCM); the lamp may include a high beam lighting module and a low beam lighting module, which the present application is not limited to.
Alternatively, the vehicle-mounted lighting device may also include a sensor, which is not limited in this regard by the present application.
Based on the structure of the vehicle-mounted lighting device shown in FIG. 3, the embodiment of the application provides a vehicle lamp control method. As shown in FIG. 4, the method includes:
401. A first luminance of an illumination area in front of the vehicle and a second luminance of a light source from a target object are acquired.
In the embodiment of the present application, the area that can be irradiated by the low beam of the vehicle is referred to as the illumination area. The illumination area lies close in front of the vehicle and is the area most likely to affect driving safety. Therefore, to ensure safe driving, objects in the illumination area should be clearly visible to the driver.
In the embodiment of the application, a light source of high brightness, such as the high beam of an oncoming vehicle, is called the light source of the target object; it is a light source outside the host vehicle. Because its luminance is high, when the light source of the target object shines toward the host vehicle, there may be a large luminance difference between it and other objects in front of the vehicle (for example, objects in the illumination area), producing glare in the driver's field of view. Optionally, besides the lamps of oncoming vehicles, the light source of the target object may be a searchlight or the like, which is not limited in the present application.
In the embodiment of the present application, the luminance of the illumination area is referred to as a first luminance, and the luminance of the light source of the target object is referred to as a second luminance.
Alternatively, in step 401, the acquisition of the first luminance and the second luminance may be implemented by a processing unit (e.g., MCU) in the in-vehicle lighting device. Specifically, the processing unit may acquire the first luminance and the second luminance according to luminance information, image information, and the like from the sensor. The specific determination process will be described in the following embodiments, and will not be described here.
402. A glare value is obtained from the first luminance and the second luminance, the glare value being indicative of a degree to which the second luminance is greater than the first luminance.
In the embodiment of the application, the illumination area is the area of most concern for ensuring the driving safety of the vehicle. The light source of the target object is the object with the highest brightness that may occur in the driver's field of view. The degree of difference between the first luminance (of the illumination area) and the second luminance (of the light source of the target object) may reflect the degree to which the object in the illumination area is clearly visible to the driver. When the degree of difference is large, glare may occur to the driver's field of view, resulting in the driver not being able to see the objects in the illuminated area.
Thus, embodiments of the present application define the concept of a glare value for reflecting the extent to which the second luminance (i.e. the luminance of the light source of the target object) is larger than the first luminance (i.e. the luminance of the illumination area). By the magnitude of the glare values, the degree of bright-dark contrast between different objects in the driver's field of view (i.e. in front of the vehicle) can be determined. The glare value may reflect the degree of irritation of the human eye by the strong light in the case where the human eye is irradiated with the strong light of the light source of the target object. The larger the glare value, the larger the difference in brightness between the light source of the target object and the illumination area in the driver's field of view, the more intense the human eye is stimulated by the strong light of the light source of the target object, the more easily the glare is generated.
Optionally, the glare value may comprise the ratio between the second luminance and the first luminance. Experiments show that when the ratio between the second luminance and the first luminance is greater than or equal to a certain threshold, the human eye perceives glare. The embodiment of the application calls this threshold the first threshold, i.e., the minimum glare value at which the human eye can perceive glare. Since the perceptibility of light differs between the eyes of different people, first thresholds of different sizes can be determined. Optionally, the first threshold may be ≥ 1; for example, it may be 1, 1.1, 1.2, or a ratio of larger magnitude (e.g., tens, hundreds, thousands, etc.), which is not limited by the present application.
Optionally, the glare value may also comprise the difference between the second luminance and the first luminance. Experiments show that when the difference between the second luminance and the first luminance is greater than or equal to a certain threshold, the human eye perceives glare. The embodiment of the application likewise calls this threshold the first threshold, i.e., the minimum glare value at which the human eye can perceive glare. Since the perceptibility of light differs between the eyes of different people, first thresholds of different sizes can be determined. Optionally, the first threshold may be ≥ 10 dB, for example 10 dB, 20 dB, or 30 dB, which is not limited by the present application.
Alternatively, in step 402, the determination of the glare value may be implemented by a processing unit (e.g. MCU) in the in-vehicle lighting device.
403. If the glare value is greater than the first threshold, the brightness of the low beam of the vehicle is enhanced.
If the glare value is greater than the first threshold, it is determined that glare has occurred in the field of view of the vehicle's driver. The brightness of the vehicle's dipped headlight can then be enhanced to reduce the brightness contrast between the illumination area of the dipped headlight and the light source of the target object, reducing glare so that the driver can clearly see objects in the illumination area, which lowers the probability of traffic accidents and improves driving safety.
Optionally, the brightness of the dipped headlight may be enhanced by increasing its illumination brightness, by increasing its duty cycle (e.g., where the dipped headlight is controlled by a spatial light modulator), or the like, which is not limited by the present application.
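A sketch of the duty-cycle route just mentioned, assuming the low beam is dimmed by PWM (or by a spatial light modulator acting per region) with an 8-bit duty register; the gain and the register model are assumptions:

```python
def enhance_by_duty_cycle(current_duty: int, gain: float = 1.5,
                          max_duty: int = 255) -> int:
    """Raise low-beam brightness by widening the PWM duty cycle, saturating
    at the register's maximum."""
    return min(int(current_duty * gain), max_duty)
```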
For example, as shown in FIG. 5, enhancing the brightness of the dipped headlight raises the brightness of the illumination area, improving the clarity of objects in the illumination area within the driver's field of view (such as the zebra crossing and lane lines in the figure) and thereby improving driving safety.
In the embodiment of the application, the illumination area of the vehicle's low beam may be called a blind area, meaning an area that is relevant to driving safety but has low brightness under glare conditions. Because this area easily becomes invisible to the driver when glare occurs, it is called a blind area. In the embodiment of the application, when glare is judged to have occurred, the brightness of the dipped headlight is enhanced, realizing light compensation of the blind area and thus improving driving safety.
Alternatively, the blind area in the embodiment of the present application may be all the illumination areas of the low beam of the vehicle, or may be a partial area in the illumination areas of the low beam of the vehicle, which is not limited in this aspect of the present application.
Alternatively, in step 403, the enhancement of the low beam brightness may be achieved by a processing unit (e.g., MCU), a lamp control unit (e.g., HCM) and a lamp in the in-vehicle lighting device. Specifically, the processing unit may send an instruction to the lamp control unit to increase the illumination brightness of the low beam lamp according to the glare value being larger than the first threshold value. The lamp control unit can enhance the illumination brightness of the lamp according to the instruction.
Optionally, the instruction sent by the processing unit to the vehicle lamp control unit may further include an area requiring brightness enhancement, and the vehicle lamp control unit may control the vehicle lamp to enhance the brightness of the corresponding area according to the instruction. The partial area for enhancing brightness will be specifically described in the following embodiments, and will not be described here again.
The overall idea of the vehicle lamp control method provided by the embodiment of the application has been described above; the specific implementation of each step is now described in turn.
Optionally, in step 401, the second brightness of the light source of the target object may be obtained in several ways. For example, whether the conditions for glare exist may first be determined from the luminance information in front of the vehicle, and the second luminance of the light source of the target object obtained only if glare is judged possible. Alternatively, the target object may be identified from image information, color information, and the like in front of the vehicle, and the second brightness of its light source then acquired. These are described separately below.
1. Determining the target object from the brightness information.
Optionally, if the presence of glare-producing conditions is first determined from the brightness information and the second brightness of the light source of the target object is then obtained, the flow, as shown in FIG. 6, may specifically include:
601. Luminance information in front of the vehicle is acquired.
The processing unit in the in-vehicle lighting device may acquire luminance information in front of the vehicle through the sensor. Alternatively, the processing unit may directly acquire the luminance information in front of the vehicle through the luminance sensor.
The processing unit may acquire the luminance information in front of the vehicle by other means besides the luminance sensor, which is not limited in the present application. For example, the processing unit may acquire image information in front of the vehicle through the camera, and extract luminance information from the image information.
602. Determining a target object according to the brightness information; wherein the brightness of the light source of the target object is greater than or equal to the second threshold.
Since strong brightness is required for glare to occur, in the embodiment of the present application the second threshold is used as the brightness basis for determining whether glare occurs. In the embodiment of the application, the second threshold is determined experimentally and is the lowest brightness of a strong light source at which the human eye perceives glare. A strong light source with a brightness greater than or equal to the second threshold may produce glare in the human eye. Since the brightness perceptible to the human eye differs between driving scenarios, second thresholds of different sizes can be determined. Optionally, the second threshold may be ≥ 10 dB, for example 10 dB, 20 dB, or 30 dB, which is not limited by the present application.
If the brightness of the light source of a certain object in the brightness information is greater than or equal to the second threshold value, the current scene is considered to have a condition of generating glare, and the processing unit can determine the object as a target object.
In the embodiment of the application, the second threshold is used as the brightness basis for judging that the glare occurs, and the subsequent brightness determination, the glare value determination and the like are only performed when the glare is determined to possibly occur based on the second threshold. The acquisition of brightness, glare value, etc. in a scene where glare is unlikely to occur (e.g., the brightness of all objects in front of the vehicle is low, e.g., is below the second threshold value) can be avoided, so that computational effort can be saved.
For example, during the running of the vehicle, step 601 may be performed to obtain real-time luminance information in front of the vehicle, and the subsequent steps 603 to 605 (i.e., obtaining the first luminance, the second luminance, the glare value, etc.) may be performed only when it is determined in step 602 that glare may occur, so that the power consumption of the processing unit may be reduced.
603. And acquiring the second brightness of the light source of the target object and the first brightness of the illumination area according to the brightness information.
Since the luminance information in front of the vehicle was already acquired in step 601 and the light source of the target object was determined in step 602, the second luminance of the light source of the target object can be obtained directly from the luminance information.
In embodiments of the present application, the illumination area of the dipped headlight may be determined in multiple ways, thereby determining the first brightness of the illumination area. For example, a preset area may be taken as the illumination area within the luminance information in front of the vehicle; e.g., the range corresponding to the illumination area that the host vehicle's low beam can irradiate may be delimited in the spatial coordinate system of the luminance information. The luminance of that region (i.e., the first luminance) can then be obtained directly from the luminance information.
Alternatively, the illumination area of the low beam of the vehicle may be determined according to the brightness distribution condition in the brightness information, so as to determine the first brightness of the illumination area. For example, it is possible to divide the luminance information into different areas according to the luminance value, determine that the area closest to the host vehicle and having higher luminance in the luminance information is the illumination area described above, and obtain the first luminance of the illumination area from the luminance information.
Alternatively, the preset area may be used as the illumination area from the image information in front of the vehicle. For example, the range corresponding to the illumination area that can be irradiated by the low beam of the host vehicle may be defined in the spatial coordinate system of the image information. And then obtaining the first brightness of the illumination area through the mapping relation between the image information and the brightness information.
Optionally, there may be a fixed mapping relationship between the spatial coordinate system of the image information and the spatial coordinate system of the luminance information, and according to the position coordinates of the illumination area in the image information, the position coordinates of the illumination area in the luminance information may be determined, so as to obtain the first luminance of the illumination area in the luminance information.
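The fixed mapping between the two coordinate systems could, for example, be a calibrated planar homography; in the sketch below the identity matrix stands in for the calibration result:

```python
import numpy as np

H = np.eye(3)  # placeholder for the calibrated image -> luminance-map homography

def image_to_luminance_coords(pts_img: np.ndarray) -> np.ndarray:
    """Map Nx2 pixel coordinates in the camera image into the luminance map."""
    pts = np.hstack([pts_img, np.ones((len(pts_img), 1))])
    mapped = (H @ pts.T).T
    return mapped[:, :2] / mapped[:, 2:3]

def first_luminance_of_area(luminance_map: np.ndarray,
                            area_pts_img: np.ndarray) -> float:
    """Mean luminance of the illumination area located through the mapping."""
    coords = image_to_luminance_coords(area_pts_img).round().astype(int)
    xs = np.clip(coords[:, 0], 0, luminance_map.shape[1] - 1)
    ys = np.clip(coords[:, 1], 0, luminance_map.shape[0] - 1)
    return float(luminance_map[ys, xs].mean())
```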
604. The glare value is obtained from the first luminance and the second luminance.
Step 604 refers to step 402 in the embodiment shown in fig. 4, and will not be described here.
605. If the glare value is greater than the first threshold, the brightness of the low beam of the vehicle is enhanced.
Optionally, the brightness of the vehicle's dipped headlight may be enhanced (i.e., blind-area light compensation realized) in a variety of ways. For example, the illumination brightness of the entire low beam may be enhanced, or the brightness may be enhanced in a delimited partial region in front of the vehicle, etc., which is not limited by the present application. That is, the blind area may be the entire illumination area of the low beam or a partial region within it, which is not limited by the present application.
For example, as shown in FIG. 7, a target illumination area may be delimited within the illumination range of the vehicle's low beam, and a supplemental light beam projected onto the target illumination area by the low-beam lighting module. This realizes light compensation of the blind area, reducing the brightness difference between the target illumination area and the light source of the target object so that the driver can better see objects within the target illumination area (blind area).
In the embodiment of the application, the target illumination area (blind area) is the region most likely to affect driving safety while the vehicle is moving. For example, the target illumination area may be the area on the vehicle's lane that can be illuminated by the low beam; or an area delimited on the vehicle's lane; or an area delimited on the vehicle's lane and the lanes to its left and right, etc., which is not limited in the present application.
As shown in FIG. 7, the target illumination area may be rectangular, trapezoidal, etc., which the present application does not limit. For example, as shown in diagram a of FIG. 7, the target illumination area may be a rectangle located in front of the vehicle and fitted to the lane lines of the lane in which the vehicle travels; or, as shown in diagrams b, c, and d of FIG. 7, a trapezoid located in front of the vehicle that includes part of the lane lines of the vehicle's lane (i.e., the part in front of the vehicle); or, as shown in diagram e of FIG. 7, a trapezoid whose two sides are curved.
Optionally, the target illumination area (blind area) shown in diagram a of FIG. 7 may be a rectangle in front of the vehicle; the rectangle may be 100 meters, 150 meters, 600 meters, etc. in length and 3 meters, 4 meters, 2 meters, etc. in width, which is not limited by the present application. When glare occurs, light compensation of the blind area within this range lets the driver clearly see objects in the target illumination area, reducing the probability of traffic accidents. Optionally, the length of the rectangle may be longer or shorter and the width larger or smaller, which is not limited by the present application.
Notably, the target illumination area is delimited within the illumination area of the low beam. Since glare usually occurs in scenes where the driving environment is dim, the low beam is already on before the supplemental light beam is projected onto the target illumination area; the beam the dipped headlight is already projecting onto its illumination area is referred to in the embodiment of the application as the base illumination beam. Therefore, in the embodiment of the present application, the light projected onto the target illumination area may include the base illumination beam in addition to the beam used for brightness compensation, which is not limited by the present application.
Optionally, in the embodiment of the present application, the processing unit of the vehicle-mounted lighting device may further adjust the target illumination area according to real-time road conditions, achieving good brightness compensation under different road conditions. For example, the processing unit may change the extent of the target illumination area in real time according to the distance between the host vehicle and the oncoming vehicle.
For example, as shown in fig. 8a, if the target illumination area is a quadrangle on the lane where the vehicle is located, both left and right sides of the quadrangle are parallel to the lane line on one side of the lane, respectively. The length of the left and right sides of the quadrangle can be determined according to the distance between the vehicle and the opposite vehicle. Specifically, the area size of the target illumination area and the target distance are in positive correlation. The target distance is the distance between the host vehicle and the opposite incoming vehicle.
For example, when the host vehicle runs relatively to the oncoming vehicle, the target illumination area may be as shown in fig. 8a from the whole process of long-distance relative running and gradually approaching to the meeting vehicle.
As shown in a diagram a of fig. 8a, in a case where the distance between the oncoming vehicle and the host vehicle is large, for example, in a case where the target distance between the oncoming vehicle and the host vehicle (hereinafter, for convenience of description, the distance between the oncoming vehicle and the host vehicle is referred to as a target distance) is larger than the maximum illumination distance of the host vehicle dipped headlight, the oncoming vehicle does not enter the effective distance of glare compensation, and the host vehicle does not perform blind area light.
As shown in diagram (b) of fig. 8a, the target distance gradually decreases as the two vehicles travel toward each other. Once the target distance is less than or equal to the effective distance of glare compensation (for example, the maximum illumination distance of the host vehicle's low beam), the host vehicle can perform blind-area brightness compensation, i.e., determine the target illumination area and project the supplemental light beam onto it.
As shown in diagram (c) of fig. 8a, as the two vehicles draw closer the target distance keeps shrinking, and so does the area that needs brightness compensation. The lengths of the left and right sides of the quadrangle can therefore be reduced in real time, so that the target illumination area (blind area) always covers the region between the host vehicle and the oncoming vehicle.
As shown in diagram (d) of fig. 8a, after the two vehicles have met, the high beam of the oncoming vehicle no longer shines directly into the eyes of the host vehicle's driver, and no glare arises in the driver's field of view, so blind-area illumination is no longer required and the brightness compensation is stopped (i.e., the supplemental beam is turned off).
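For ease of understanding, a minimal sketch of the meeting-process logic of diagrams (a) to (d) follows. It is illustrative only: the function name and the 150-meter range are assumptions, not limitations of the present application.

```python
LOW_BEAM_MAX_RANGE_M = 150.0   # assumed effective distance of glare compensation

def blind_area_side_length(target_distance_m, vehicles_have_met):
    """Return the length of the quadrangle's left/right sides in metres,
    or None when no blind-area brightness compensation should be performed."""
    if vehicles_have_met:
        return None                      # diagram (d): vehicles have met, stop
    if target_distance_m > LOW_BEAM_MAX_RANGE_M:
        return None                      # diagram (a): out of effective range
    # diagrams (b)/(c): the region tracks the shrinking gap between vehicles
    return target_distance_m
```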
Alternatively, if the lane in which the host vehicle is located is curved, the corresponding target illumination area (blind area) is as shown in fig. 8b.
Alternatively, the embodiment of the present application can determine the target illumination area under different road conditions by means of graphic information, which can come from sensors such as a camera or a radar.
Optionally, if the sensor is a camera, the camera may acquire image information in front of the vehicle. The processing unit can extract graphic information from the image information, determine the lane lines on the left and right of the lane in which the vehicle is located and the position of the oncoming vehicle from that graphic information, and then determine the target illumination area bounded by the left and right lane lines, the oncoming vehicle, and the host vehicle.
Alternatively, if the sensor is a radar, the radar may acquire spatial position information of objects in front of the vehicle. The processing unit can derive graphic information from the spatial position information, determine the lane lines on the left and right of the lane in which the vehicle is located and the position of the oncoming vehicle from that graphic information, and then determine the target illumination area bounded by the left and right lane lines, the oncoming vehicle, and the host vehicle.
Alternatively, the determination of the target illumination area may be achieved by a neural network. In an alternative implementation, the image information in front of the vehicle acquired by the camera may be input into a neural network, feature points of the lane lines are detected from the image information by the neural network, and the feature points are then fitted to obtain a lane line coordinate equation for the area in front of the vehicle.
The fitted lane line coordinate equation is expressed in the camera coordinate system. The processing unit may determine the target illumination area in the camera coordinate system according to this equation, convert the area's coordinates from the camera coordinate system into the vehicle-lamp coordinate system, and then instruct the lamp control unit to project the supplemental light beam onto the target illumination area.
Optionally, invalid information can be filtered out by space-time pipeline filtering: a reliable set of feature points is selected from the detected feature points, and the curve is fitted on that reliable set.
Alternatively, a second-order Bezier curve may be used to fit the feature points (or the reliable point set) to obtain the lane line coordinate equation in front of the vehicle, which is not limited in the present application.
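For illustration, a least-squares fit of a second-order Bezier curve to detected lane-line feature points might look as follows. This is a sketch under stated assumptions (at least three feature points, fixed endpoints, chord-length parameterisation); the present application does not prescribe a particular fitting procedure.

```python
import numpy as np

def fit_quadratic_bezier(points):
    """Least-squares fit of a second-order Bezier curve B(t) =
    (1-t)^2*p0 + 2(1-t)t*p1 + t^2*p2 to lane-line feature points (Nx2),
    with the endpoints fixed to the first and last point (assumes N >= 3)."""
    pts = np.asarray(points, dtype=float)
    # chord-length parameterisation mapped to t in [0, 1]
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(pts, axis=0), axis=1))]
    t = d / d[-1]
    p0, p2 = pts[0], pts[-1]
    a = 2.0 * (1.0 - t) * t                              # coefficient of p1
    rhs = pts - np.outer((1.0 - t) ** 2, p0) - np.outer(t ** 2, p2)
    p1 = (a[:, None] * rhs).sum(axis=0) / (a ** 2).sum()  # closed-form LSQ
    return p0, p1, p2
```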
2. Determining the first object as the target object when its brightness is greater than or equal to the second threshold and the first object satisfies the shape characteristics of the target object.
Alternatively, the target object whose brightness and shape satisfy the conditions may be determined according to the brightness information and the graphic information, after which the subsequent determination of brightness, glare value, and so on, and the brightness compensation, are performed. Specifically, as shown in fig. 9, the flow of the method includes:
901. Brightness information in front of the vehicle is acquired.
For step 901, refer to step 601 in the embodiment shown in fig. 6; it is not described in detail here.
902. From the brightness information, a first object whose brightness is greater than or equal to the second threshold is determined.
For this step, refer to step 602 of the embodiment shown in fig. 6. Step 902 differs from step 602 in that step 602 directly determines an object whose brightness is greater than or equal to the second threshold to be the light source of the target object, whereas step 902 only determines that object to be a first object that may be the light source of the target object; whether the first object really is the light source of the target object is then established through the subsequent recognition of the first object.
903. If the first object is determined, according to the graphic information, to satisfy the shape characteristics of a vehicle, the first object is determined to be the target object.
In the embodiment of the present application, the kind of target object may be preset (for example, an oncoming vehicle, a searchlight, and so on, which is not limited in the present application), and whether the first object satisfies the shape characteristics of that kind may be determined according to the graphic information, thereby determining whether the first object is the target object. Next, the determination of whether the first object is the target object is described using an oncoming vehicle as the preset kind of target object.
Alternatively, the processing unit may extract the graphic information in front of the vehicle from the image information provided by the camera, or from the spatial position information (of objects in front of the vehicle) provided by the radar.
Alternatively, as shown in fig. 10, the shape characteristics of a vehicle may include the shape characteristics of the vehicle's outer contour, inner contour, texture, tires, rearview mirrors, windshield, and so on; the shape characteristics of the vehicle may be determined from one or more of these.
Fig. 10 takes the shape characteristics of the front of a vehicle as an example (i.e., the shape characteristics the oncoming vehicle presents toward the host vehicle). It should be noted that any shape feature displayed on the side of the oncoming vehicle facing the host vehicle may serve as a shape characteristic of the vehicle; the present application is not limited in this respect.
Alternatively, if the preset kind of target object is a searchlight, the shape characteristics of the searchlight may include the lamp socket, the lamp holder, and the like.
Alternatively, in the embodiment of the present application, whether the first object is the target object may also be determined from information other than the graphic information in front of the vehicle, which is not limited by the present application. For example, if the kind of target object is an oncoming vehicle with its high beam on, the graphic information and the color information in front of the vehicle may be combined to determine whether the first object satisfies both the shape characteristics of a vehicle and the color characteristic (white) of the light beam emitted by its light source, thereby determining whether the first object is the target object.
For example, in the example shown in fig. 10, the color information corresponding to the graphic information may be acquired (for example, the graphic information and its corresponding color information may be extracted from the image information), and whether the first object is an oncoming vehicle (target object) with its high beam on may be determined by combining the shape characteristics and the color characteristics of the lamp. Optionally, the color characteristics of a high beam are a white light beam and a white light spot; if the beam and/or spot emitted by the first object is white, it can be determined that the first object is an oncoming vehicle (target object) with its high beam on.
Alternatively, the identification of the light source of the target object may be achieved by a neural network. In an alternative implementation, the image information acquired by the camera may be input into a neural network, which determines whether the first object is the target object. Specifically, if the sensor is a camera, the image information of the area in front of the vehicle acquired by the camera may include a plurality of images with different exposure durations. The processing unit on the vehicle-mounted lighting device can fuse these images to obtain a fused image, from which it can obtain the shape characteristics of the first object (and optionally the color characteristics of the light beam emitted by the first object's light source, etc.). Whether the first object is an oncoming vehicle (target object) is then determined according to whether the shape characteristics of the first object satisfy the shape characteristics of a vehicle (and, optionally, whether the color characteristics of the beam emitted by its light source match the white beam emitted by an oncoming vehicle's lamps).
Images of different exposure durations provide different kinds of information (for example, longer exposures can provide more accurate brightness information, while shorter exposures can provide more accurate shape features). By fusing images with different exposure durations, the resulting fused image carries more accurate information (more accurate brightness, more accurate shape features, and so on), so the target object (for example, an oncoming vehicle with its high beam on) can be identified more efficiently and accurately from the fused image.
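As an illustration, a simple exposure-weighted fusion of grayscale frames might be sketched as follows. The Gaussian "well-exposedness" weight is an assumption in the spirit of standard exposure fusion, not a method prescribed by the present application.

```python
import numpy as np

def fuse_exposures(images):
    """Blend same-sized grayscale uint8 frames of different exposure
    durations, weighting well-exposed pixels (near mid-gray) most heavily."""
    stack = np.stack([img.astype(float) / 255.0 for img in images])
    # Gaussian weight centred on mid-gray; epsilon avoids division by zero
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-6
    fused = (weights * stack).sum(axis=0) / weights.sum(axis=0)
    return (fused * 255).astype(np.uint8)
```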
Optionally, after the neural network identifies the target object, the identification result may be further passed through a space-time pipeline filter model to ensure its accuracy.
Alternatively, the neural network may be code running in the computing unit, so that the recognition of the target object is implemented locally. Optionally, the neural network may instead be a service provided by the cloud: the computing unit uploads the multiple images with different exposure durations to the cloud, the cloud-side neural network outputs the identification result, and the result is returned to the computing unit.
904. The second brightness of the light source of the target object and the first brightness of the illumination area are acquired according to the brightness information.
905. A glare value is determined from the first brightness and the second brightness, the glare value indicating the degree to which the second brightness is greater than the first brightness.
906. If the glare value is greater than the first threshold, the brightness of the low beam of the vehicle is enhanced.
For steps 904 to 906, refer to steps 603 to 605 in the embodiment shown in fig. 6; they are not described again here.
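For ease of understanding, a minimal end-to-end sketch of steps 902 to 906 is given below. It is illustrative only: the DetectedObject container, the thresholds, and the ratio form of the glare value are assumptions (a brightness difference would serve equally well), not limitations of the present application.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    source_brightness: float   # from the brightness information (step 901)
    is_vehicle_shaped: bool    # result of the graphic-information check (step 903)

def needs_blind_area_fill(objects, illumination_area_brightness,
                          second_threshold, first_threshold):
    """Steps 902-906: keep objects that are bright enough and vehicle-shaped,
    compute the glare value as a brightness ratio, compare with the threshold."""
    for obj in objects:
        if obj.source_brightness < second_threshold:    # step 902
            continue
        if not obj.is_vehicle_shaped:                   # step 903
            continue
        glare_value = obj.source_brightness / illumination_area_brightness  # 904-905
        if glare_value > first_threshold:               # step 906
            return True   # enhance the low-beam brightness
    return False
```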
3. Identifying, from the graphic information and the color information, that the first object satisfies the conditions of the target object, and determining the first object to be the target object.
Alternatively, the target object whose shape characteristics and color characteristics satisfy the conditions may be determined according to the graphic information and the color information, after which the subsequent determination of brightness, glare value, and so on, and the brightness compensation, are performed. Specifically, as shown in fig. 11a, the flow of the method includes:
1101. Graphic information, color information, and brightness information in front of the vehicle are acquired.
Alternatively, the processing unit may extract the graphic information, color information, and brightness information from the image information provided by the camera.
Alternatively, the brightness information may be obtained by a brightness sensor, and the graphic information may be derived by the processing unit from the spatial position information provided by the radar; the present application is not limited in these respects.
1102. If the first object is determined, according to the graphic information, to satisfy the shape characteristics of a vehicle, and the light beam it emits is determined, according to the color information, to be white, the first object is determined to be an oncoming vehicle with its high beam on.
The graphic information includes the shape characteristics of a plurality of objects in front of the vehicle, and the color information includes the color characteristics of those objects. In the embodiment of the present application, the kind of target object may be preset (for example, an oncoming vehicle with its high beam on, a switched-on searchlight, and so on, which is not limited in the present application), and the target object satisfying the corresponding characteristics may be determined from the plurality of objects in front of the vehicle according to the graphic information and the color information. Next, the procedure of determining the target object from a plurality of objects in front of the vehicle is described using an oncoming vehicle with its high beam on as the preset kind of target object.
The processing unit may determine the objects satisfying the shape characteristics of a vehicle based on the graphic information, and determine, based on the color information, whether the beam emitted by such an object satisfies the color characteristic of a high beam (i.e., whether it is white). If an object satisfies both conditions, it is determined to be the target object.
For a description of the shape characteristics of the vehicle, refer to the description of step 903 in the embodiment shown in fig. 9; it is not repeated here.
Since the target object is an oncoming vehicle with its high beam on, the beam emitted by the high beam is the cause of the glare. Thus, in addition to the shape characteristics of the vehicle, the graphic information may also include the shape characteristics of the light beam emitted by the first object's light source. Alternatively, as shown in fig. 10, these may be the shape characteristics of a light spot and/or light beam located near (including inside or beside) the shape characteristics of the vehicle.
Alternatively, the computing unit may determine the position coordinates of the light beam in the graphic information according to the shape characteristics of the beam emitted by the first object's light source, and determine the color of the beam through the mapping between the graphic-information coordinate system and the color-information coordinate system.
Alternatively, if the lamp of the oncoming vehicle is directed straight at the host vehicle, the beam itself cannot be seen; the shape characteristic of the light emitted by the first object is then the shape characteristic of a light spot located near (including inside or beside) the shape characteristics of the vehicle, and the present application is not limited in this respect.
Alternatively, if the preset kind of target object is a switched-on searchlight, the shape characteristics of the searchlight may include the lamp holder and the like. Since a searchlight produces glare through the beam emitted by its bulb, the graphic information may, in addition to the shape characteristics of the searchlight, include the shape characteristics of the beam emitted by the first object's light source, such as a light spot and/or beam located near (including inside or beside) the shape characteristics of the searchlight. For the judgment of the shape and color of the beam emitted by the searchlight, refer to the description of the beam emitted by a vehicle's high beam; it is not repeated here.
Alternatively, in the embodiment of the present application, whether the color, lit duration, and so on of the beam emitted by an object satisfy the characteristics of the target object may be identified from the graphic information and the color information in front of the vehicle, so as to determine the target object.
For example, the kind of target object may be set as an object that emits a white beam whose lit duration reaches a certain length and whose direction falls within a preset range; the objects satisfying these conditions are then determined to be target objects by means of the graphic information and the color information.
A strong light source is a necessary condition for glare; that is, producing glare requires the beam emitted by the light source to be bright. Since a high-brightness beam is generally white, the kind of target object is set as an object capable of emitting a white beam.
In the embodiment of the present application, the blind-area brightness compensation counteracts the glare caused by a strong light source. If the strong light source goes out, the dipped headlight can reduce its brightness and project only the base illumination beam in front of the vehicle. That is, once the glare source is extinguished, the vehicle-mounted lighting device stops projecting the supplemental light beam.
In the embodiment of the present application, the more stable the brightness of the driving environment (i.e., the ambient brightness in the driver's field of view), the better for driving safety. If blind-area brightness compensation were performed for a strong light source that stays lit only briefly (i.e., an unstable strong light source), the dipped headlight would be briefly brightened and would then stop projecting the supplemental beam as the strong light source goes out, causing the low-beam brightness to change frequently. The resulting frequent changes in the ambient brightness of the driver's field of view would be detrimental to driving safety.
Therefore, the embodiment of the present application defines a first duration: if the beam stays lit for a duration greater than or equal to the first duration, the light source emitting the beam is considered stable, and blind-area brightness compensation is performed for the glare it causes. Alternatively, the first duration may be greater than or equal to 100 ms; for example, it may be 100 ms, 120 ms, or 150 ms, which is not limited by the present application.
Whether to perform blind-area brightness compensation can therefore be decided according to how long the white beam emitted by the first object's light source stays lit. If the lit duration of the beam is greater than or equal to the first duration, the light source is considered to satisfy the duration condition for producing glare. Otherwise, if the lit duration is less than the first duration, the light source is considered only temporarily lit (for example, the oncoming vehicle flashes its high beam and turns it off immediately, or a searchlight merely sweeps across the host vehicle) and is considered not to satisfy the conditions for glare.
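A minimal duration-tracking sketch follows. The 100 ms constant matches the example value above; the class and its interface are hypothetical, not prescribed by the present application.

```python
FIRST_DURATION_S = 0.1   # example value from the text: first duration >= 100 ms

class BeamStabilityFilter:
    """Track how long a candidate light source has been continuously lit and
    report True only once it has passed the first duration (a stable source)."""
    def __init__(self):
        self.lit_since = None

    def update(self, beam_visible, now_s):
        if not beam_visible:
            self.lit_since = None        # source went out; reset the timer
            return False
        if self.lit_since is None:
            self.lit_since = now_s       # source just appeared
        return (now_s - self.lit_since) >= FIRST_DURATION_S
```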
In the embodiment of the present application, whether the light source satisfies the conditions for producing glare may also be determined from the angle between the beam emitted by the first object's light source and the driver's field of view. In an alternative implementation, the target object may be set to be an oncoming vehicle whose light source emits a white beam that stays lit for at least the first duration. Since the height difference between the light source of the oncoming vehicle (i.e., its lamp, specifically the high beam emitting white light) and the field of view of the host vehicle's driver is small, in the image information in front of the host vehicle, if the angle between the beam emitted by the first object and the horizontal plane (for example, the angle between the beam's optical axis and the horizontal plane) is less than or equal to a first angle (for example, 5°, 6°, 8°, etc.), the height of the first object can be considered to satisfy the height condition for producing glare. The optical axis of the beam may lie above or below the horizontal plane; the present application is not limited in this respect.
Alternatively, the direction of the beam may be determined through calibration points in the graphic information; specifically, from the positional relationship between the beam and the calibration points in images taken at different moments.
If the angle between the beam emitted by the first object and the vertical plane is too large, glare may not occur in the driver's field of view. The embodiment of the present application therefore sets a second angle: when the angle between the beam emitted by the first object and the vertical plane (for example, as shown in fig. 11b, the angle between the beam's optical axis and a vertical plane parallel to the forward direction of the vehicle) is less than or equal to the second angle, the lateral position of the first object can be considered to satisfy the lateral-distance condition for producing glare. Alternatively, the second angle may be about 10°, for example 12°, 11°, 10°, 9°, or 8°, which is not limited by the present application.
In the embodiment of the present application, if the target object is an oncoming vehicle, the second angle may be determined as the sum of the divergence angle of a vehicle high beam and the angle between the two vehicles (i.e., between the oncoming vehicle and the host vehicle). Since the divergence angle of a vehicle high beam is generally 4° or 5°, and an angle of 4° or 5° may exist between the oncoming vehicle and the host vehicle, the second angle is determined to be about 10°.
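The two angle conditions might be checked as sketched below. The 5° and 10° constants are the example values from the text, and treating both sides of each reference plane symmetrically via abs() is an assumption consistent with the description above.

```python
FIRST_ANGLE_DEG = 5.0    # example height condition: axis within 5 deg of horizontal
SECOND_ANGLE_DEG = 10.0  # ~ beam divergence (4-5 deg) + inter-vehicle angle (4-5 deg)

def beam_geometry_allows_glare(angle_to_horizontal_deg, angle_to_vertical_plane_deg):
    """Height and lateral-distance conditions on the incoming beam's optical
    axis; the axis may lie on either side of each reference plane."""
    return (abs(angle_to_horizontal_deg) <= FIRST_ANGLE_DEG and
            abs(angle_to_vertical_plane_deg) <= SECOND_ANGLE_DEG)
```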
In an embodiment of the present application, the computing unit may determine the shape characteristics of the beam emitted by the first object's light source according to the graphic information. Alternatively, the shape characteristics of the beam may include a light spot, a light beam, and the like, which is not limited by the present application.
The computing unit can determine the lit duration of the beam emitted by the first object's light source from graphic information acquired at different moments; the angle between the beam's optical axis and the driver's field of view from the graphic information; and the color of the beam from the color information. That is, once the computing unit determines, from the graphic information and the color information in front of the vehicle, that an object satisfies these three conditions, it determines that object to be the target object, and then determines the second brightness of the target object from the brightness information.
In the embodiment of the present application, performing blind-area brightness compensation only for glare produced by a stable strong light source makes the brightness in the driver's field of view more stable, improving driving safety; it also improves compensation efficiency and avoids the waste of computing power caused by frequently recomputing brightness, glare value, the boundaries of the target illumination area, and so on.
1103. The first brightness of the illumination area and the second brightness of the oncoming vehicle's lamps are determined according to the brightness information.
Having determined the target object (for example, the oncoming vehicle with its high beam on), the processing unit determines, according to the brightness information, the second brightness of the target object's light source (i.e., the oncoming vehicle's lamps) and the first brightness of the illumination area.
For the determination of the illumination area, refer to step 603 of the embodiment shown in fig. 6; a detailed description is omitted here.
1104. The glare value is determined from the first brightness and the second brightness.
1105. If the glare value is greater than the first threshold, the brightness of the low beam of the vehicle is enhanced.
For steps 1104 and 1105, refer to steps 604 and 605 of the embodiment shown in fig. 6; they are not described again here.
4. Determining the first object to be the target object according to the brightness information and the characteristics of the emitted light beam.
Alternatively, in the embodiment of the present application, the target object may also be determined from a plurality of objects in front of the vehicle through the graphic information and the brightness information. Specifically, if the brightness of a second object is determined to be greater than or equal to the second threshold according to the brightness information, and the duration for which the beam emitted by the second object's light source stays lit is determined to be greater than or equal to the first duration according to the image information, the processing unit may consider the second object to be the target object (for example, the lamp of an oncoming vehicle), and then perform the series of operations of determining the first brightness, the second brightness, and the glare value, and carrying out the blind-area brightness compensation.
The car lamp control method described above reduces the danger of glare through blind-area brightness compensation, which is achieved by projecting a supplemental light beam toward the target illumination area. The supplemental light beam is described next.
Alternatively, the supplemental light beam may have sharp boundaries, in order to draw the driver's attention to objects within the target illumination area. In the embodiment of the present application, a sharp boundary is a boundary of the projected beam that the human eye can clearly distinguish. In particular, the left and right boundaries of the supplemental beam projected onto the target illumination area can be made to reach a sharpness of 0.05 in a horizontal-section test, which the human eye can clearly recognize.
Alternatively, the sharpness of the left and right boundaries of the supplemental beam may take values other than 0.05, such as 0.02, 0.04, or 0.1, which is not limited by the present application.
As shown in fig. 12, when the supplemental beam is projected forward onto a vertical wall, the horizontal-section light-intensity test shows that the left boundary of the light pattern (the thicker dashed line in fig. 12) has a sharpness greater than 0.05. The embodiment of the present application does not limit the form of the left boundary: it may be a slanted line, a curve, etc., as long as the angle α between the tangent at every point on the left boundary and the horizontal line satisfies 20° ≤ α ≤ 70°. The same applies to the right boundary, which is not described again here.
In the embodiment of the present application, the sharpness test method (a horizontal-section light-intensity test may be used) and the calculation follow the national-standard specification for the dipped-headlight cut-off line. That is, as shown in fig. 12, the test is performed by projecting the supplemental beam onto a vertical wall 25 m in front of the host vehicle. Under this test method there can be only one visually observed cut-off line (the thicker dashed line in fig. 12); the horizontal portion of the cut-off line, 2.5° to the left of the vertical line (also called the vertical plane in the previous embodiments), is scanned vertically and the maximum gradient G is determined using formula 1:

G = lg E(β) − lg E(β + 0.1)   (formula 1)

where β is the vertical angle (i.e., the angle of the test point) in degrees and E is the illuminance at that angle; G satisfies 0.13 ≤ G ≤ 0.4, and the horizontal portion of the cut-off line is scanned in steps of 0.05°.
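As an illustration, the maximum gradient of formula 1 might be computed from a vertical scan as follows; this is a sketch assuming positive illuminance samples taken every 0.05° and the 0.1° offset of formula 1.

```python
import math

def max_cutoff_gradient(illuminance_lux, step_deg=0.05, offset_deg=0.1):
    """Scan illuminance samples E(beta) spaced step_deg apart and return the
    maximum gradient G = lg E(beta) - lg E(beta + offset_deg) (formula 1)."""
    k = round(offset_deg / step_deg)   # samples between beta and beta + 0.1 deg
    return max(math.log10(illuminance_lux[i]) - math.log10(illuminance_lux[i + k])
               for i in range(len(illuminance_lux) - k))
```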
In the embodiment of the present application, the sharp-boundaried target illumination area attracts the driver's attention. In a glare scene, the sharp boundary prompts the driver to quickly locate the region on which to concentrate the gaze (i.e., the target illumination area, the region relevant to driving safety), improving driving safety.
Alternatively, the angles between the supplemental beam and the horizontal and vertical planes may be limited to preset ranges. For example, as shown in fig. 13a, the horizontal angle h between the supplemental beam and the horizontal plane may be required to satisfy −10° ≤ h ≤ −1°, and the vertical angle v between the beam and the vertical plane to satisfy −10° ≤ v ≤ 10°.
The horizontal angle h between the supplemental beam emitted by the vehicle and the horizontal plane is shown in fig. 13b, where h1 is the angle between the beam's upper edge and the horizontal plane and h2 is the angle between its lower edge and the horizontal plane. Both h1 and h2 satisfy −10° ≤ h ≤ −1°, i.e., −10° ≤ h1 ≤ −1° and −10° ≤ h2 ≤ −1°.
The vertical angle v between the supplemental beam emitted by the vehicle and the vertical plane is shown in fig. 13c, where v1 is the angle between the beam's left edge and the vertical plane and v2 is the angle between its right edge and the vertical plane. Both v1 and v2 satisfy −10° ≤ v ≤ 10°, i.e., −10° ≤ v1 ≤ 10° and −10° ≤ v2 ≤ 10°.
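The angle envelope above can be checked directly on the edge rays, as in the trivial sketch below; the sign convention follows figs. 13b and 13c, and the function name is an illustrative assumption.

```python
def fill_beam_within_envelope(h1_deg, h2_deg, v1_deg, v2_deg):
    """Check the edge rays of the supplemental beam against the preset ranges:
    -10 <= h <= -1 (below the horizontal) and -10 <= v <= 10."""
    h_ok = all(-10.0 <= h <= -1.0 for h in (h1_deg, h2_deg))
    v_ok = all(-10.0 <= v <= 10.0 for v in (v1_deg, v2_deg))
    return h_ok and v_ok
```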
The vehicle-mounted lighting device provided by the embodiment of the present application is shown in fig. 3; it can be used to implement the car lamp control method of any of the embodiments shown in fig. 4 to fig. 13c.
Optionally, the embodiment of the present application further provides a car lamp control system, whose structure is shown in fig. 14. The system may include a sensor, a glare recognition controller, and a car lamp controller.
The sensor is used to obtain information in front of the vehicle and may include a glare recognition device, a vehicle-mounted camera, an ambient light sensor, and so on. The vehicle-mounted camera is the camera of the previous embodiments, and the ambient light sensor is the brightness sensor of the previous embodiments. The glare recognition device is a brightness sensor with an integrated glare recognition algorithm, i.e., the algorithm that determines the glare value from the first brightness and the second brightness.
The glare recognition controller may perform the processing unit's functions of determining the glare value and deciding whether glare occurs in the above embodiments. It may include a glare recognition module and glare processing logic: the glare recognition module determines the first brightness, the second brightness, and the glare value, and the glare processing logic decides, according to the magnitude of the glare value, whether to perform blind-area brightness compensation.
Alternatively, if the sensor is a glare recognition device, the glare processing logic may decide whether to perform blind-area brightness compensation directly from the glare value supplied by the glare recognition device, in which case the glare recognition controller need not include the glare recognition module.
The car lamp controller includes a dynamic image projection algorithm module and a car lamp control module. The dynamic projection algorithm module determines the dipped-headlight lighting modules corresponding to the target illumination area, and the car lamp control module projects the supplemental beam onto the target illumination area.
Alternatively, information may be exchanged between the glare recognition controller and the car lamp controller via a gateway.
In a glare scene, the car lamp control system shown in fig. 14 can control the car lamp to perform blind-area brightness compensation through the flow shown in fig. 15.
If the high beam of the oncoming vehicle is on and glare occurs, the host vehicle may acquire image information, brightness information, and so on through its sensors (e.g., the camera acquires image information). The glare recognition system (i.e., the glare recognition controller) may recognize that glare occurs from the information supplied by the sensors (for example, the brightness is greater than or equal to the second threshold as in the embodiment shown in fig. 6, or by the recognition methods of other embodiments; the present application is not limited in this respect). A sensor (e.g., the camera) may identify the position of the oncoming vehicle, after which the glare recognition system may calculate the ambient light level (i.e., the first brightness and second brightness described above), the glare value, and so on.
The glare recognition system recognizes the oncoming vehicle's speed, position, brightness, and so on in real time, so that the target illumination area is determined in real time according to the road conditions. After the target illumination area is determined, the glare recognition system sends an instruction to the dynamic image projection algorithm module, which determines the dipped-headlight lighting modules corresponding to the target illumination area; the car lamp control module then drives those modules to project the supplemental beam, so that the car lamp illuminates exactly where compensation is needed.
As the vehicle travels, road conditions may change at any time, so the glare recognition system may also acquire information such as the host vehicle's speed and steering angle to determine in real time where compensation is needed (i.e., the blind area, that is, the target illumination area) and how bright it should be (i.e., the brightness of the supplemental beam), dynamically correcting the compensation position and brightness.
If the glare recognition system determines, from the real-time road conditions, that the glare in the driver's field of view has ended, the projection of the supplemental beam (also called the anti-glare content) may be ended.
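A hypothetical top-level loop for the fig. 15 flow is sketched below; sensor, glare_recognizer, and lamp_controller are assumed interfaces standing in for the modules of fig. 14, and none of these names come from the present application.

```python
import time

def glare_compensation_loop(sensor, glare_recognizer, lamp_controller,
                            period_s=0.05):
    """Sense, recognise glare, re-derive the target illumination area in real
    time, and drive the low-beam modules; stop projecting when glare ends."""
    while True:
        frame = sensor.capture()                    # image / brightness info
        state = glare_recognizer.evaluate(frame)    # glare value, oncoming pose
        if state.glare_active:
            region = state.target_illumination_area # corrected in real time
            lamp_controller.project(region, state.fill_brightness)
        else:
            lamp_controller.stop_projection()       # glare ended
        time.sleep(period_s)
```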
The embodiment of the application also provides a vehicle, which comprises the vehicle-mounted lighting device and/or the vehicle lamp control system.
The embodiment of the application also provides a vehicle which comprises the sensor, the processing unit and the dipped headlight. Wherein the sensor is used for acquiring information in front of the vehicle. The processing unit is used for acquiring first brightness of an illumination area in front of the vehicle and second brightness of a light source from a target object according to information in front of the vehicle. Wherein the light source of the target object is a light source outside the vehicle. The processing unit is further configured to obtain a glare value based on the first luminance and the second luminance, the glare value being indicative of a degree to which the second luminance is greater than the first luminance. If the glare value is greater than the first threshold, the brightness of the dipped headlight is enhanced.
The specific functions of the sensor, the processing unit and the dipped headlight are referred to the description of any one of the embodiments in fig. 4 to 15, and are not repeated here.
Alternatively, the processing unit may be a master control unit on the vehicle. It may also be a separate processing unit on the vehicle, or a processing chip for controlling the car lamp (for example, a chip integrated inside the lamp), and so on; the present application is not limited in this respect.
Alternatively, the processing unit may be integrated on the vehicle lamp, so as to control the vehicle lamp, which is not limited in the present application.
Optionally, the sensor may be integrated on a vehicle lamp, so as to quickly return information (such as image information, brightness information, etc.) in front of the vehicle, and improve the response speed of glare compensation.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purposes of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in part or all of the technical solution or in part in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.

Claims (30)

1. A vehicle lamp control method, characterized by comprising:
Acquiring a first brightness of an illumination area in front of a vehicle and a second brightness of a light source from a target object; wherein the light source of the target object is a light source outside the vehicle;
Obtaining a glare value from the first luminance and the second luminance, the glare value being indicative of a degree to which the second luminance is greater than the first luminance;
And if the glare value is greater than a first threshold value, enhancing the brightness of a dipped headlight of the vehicle.
2. The method of claim 1, wherein the obtaining the second brightness of the light source from the target object comprises:
acquiring brightness information in front of the vehicle;
Determining the target object according to the brightness information, wherein the brightness of a light source of the target object is greater than or equal to a second threshold value;
and obtaining the second brightness of the light source of the target object according to the brightness information.
3. The method of claim 1, wherein the obtaining the second brightness of the light source from the target object comprises:
acquiring brightness information and graphic information in front of the vehicle;
determining the target object according to the brightness information and the graphic information, wherein the brightness of a light source of the target object is larger than or equal to a second threshold value, and the shape of the target object meets the shape characteristics of a vehicle;
and obtaining the second brightness of the light source of the target object according to the brightness information.
4. The method of claim 1, wherein the obtaining the second brightness of the light source from the target object comprises:
acquiring graphic information, color information and brightness information in front of the vehicle;
Determining the target object according to the graphic information and the color information, wherein the shape of the target object meets the shape characteristics of a vehicle, and the light beam emitted by the light source of the target object is white;
And determining the second brightness of the light source of the target object according to the brightness information.
5. The method of claim 1, wherein the obtaining the second brightness of the light source from the target object comprises:
acquiring graphic information, color information and brightness information in front of the vehicle;
Determining the target object according to the graphic information and the color information, wherein the light beam emitted by the light source of the target object is white, the duration for which the light beam is lit is greater than or equal to a first duration, and the shape of the light beam satisfies at least one of the following conditions:
the included angle between the light beam and the horizontal plane is smaller than or equal to a first angle; or,
the included angle between the light beam and the vertical plane is smaller than or equal to a second angle; wherein the vertical plane is parallel to the front direction of the vehicle;
And determining the second brightness of the light source of the target object according to the brightness information.
6. The method of claim 1, wherein the obtaining the second brightness of the light source from the target object comprises:
acquiring graphic information and brightness information in front of the vehicle;
determining the target object according to the graphic information and the brightness information, wherein the brightness of the light source of the target object is greater than or equal to a second threshold, and the duration for which the light beam emitted by the light source of the target object is lit is greater than or equal to a first duration;
and obtaining the second brightness of the light source of the target object according to the brightness information.
7. The method according to any one of claims 2 to 6, further comprising:
And acquiring image information in front of the vehicle through a camera, wherein the graphic information, the color information and the brightness information are extracted from the image information.
8. The method according to any one of claims 2 to 6, further comprising:
And acquiring image information in front of the vehicle through a camera, and acquiring the brightness information through a brightness sensor, wherein the graphic information and the color information are extracted from the image information.
9. The method according to claim 7 or 8, wherein the image information includes a plurality of images having different exposure durations;
the method for acquiring the shape characteristics of the target object comprises the following steps:
fusing the images with different exposure time lengths to obtain a fused image;
And acquiring the shape characteristics of the target object from the graphic information of the fusion image.
10. The method according to any one of claims 1 to 9, wherein the glare value comprises a ratio between the second luminance and the first luminance; or the glare value comprises a difference between the second luminance and the first luminance.
11. The method according to any one of claims 1 to 10, wherein the enhancing the brightness of the low beam of the vehicle comprises:
Projecting a complementary light beam to a target illumination area in the illumination areas, wherein the area of the target illumination area has a positive correlation with the target distance; wherein the target distance is a distance between the vehicle and a light source of the target object.
12. The method of claim 11, wherein, in a horizontal cross-section, the left and right side boundaries of the complementary light beam have a sharpness greater than 0.05.
13. The method according to claim 11 or 12, wherein, when the complementary light beam is projected onto a vertical wall surface, the angle α between the tangent at any point on the left and right side boundaries of the illuminated area and the horizontal line satisfies: 20° ≤ α ≤ 70°.
14. The method according to any one of claims 11 to 13, wherein the horizontal angle h between the complementary light beam and the horizontal plane satisfies: −10° ≤ h ≤ −1°.
15. The method according to any one of claims 11 to 14, wherein the vertical angle v between the complementary light beam and the vertical plane satisfies: −10° ≤ v ≤ 10°.
16. A vehicle comprising a sensor, a processing unit and a dipped headlight;
the sensor is used for acquiring information in front of the vehicle;
The processing unit is used for: acquiring first brightness of an illumination area in front of the vehicle and second brightness of a light source from a target object according to the information in front of the vehicle; and obtaining a glare value according to the first luminance and the second luminance, the glare value being indicative of a degree to which the second luminance is greater than the first luminance; wherein the light source of the target object is a light source outside the vehicle; and if the glare value is determined to be larger than a first threshold value, controlling the dipped headlight to enhance the brightness.
17. The vehicle according to claim 16, characterized in that the processing unit is specifically configured to:
acquiring brightness information in front of the vehicle;
Determining the target object according to the brightness information, wherein the brightness of a light source of the target object is greater than or equal to a second threshold value;
and obtaining the second brightness of the light source of the target object according to the brightness information.
18. The vehicle according to claim 16, characterized in that the processing unit is specifically configured to:
acquiring brightness information and graphic information in front of the vehicle; determining the target object according to the brightness information and the graphic information, wherein the brightness of a light source of the target object is larger than or equal to a second threshold value, and the shape of the target object meets the shape characteristics of a vehicle;
and obtaining the second brightness of the light source of the target object according to the brightness information.
19. The vehicle according to claim 16, characterized in that the processing unit is specifically configured to:
acquiring graphic information, color information and brightness information in front of the vehicle;
Determining the target object according to the graphic information and the color information, wherein the shape of the target object meets the shape characteristics of a vehicle, and the light beam emitted by the light source of the target object is white;
And determining the second brightness of the light source of the target object according to the brightness information.
20. The vehicle according to claim 16, characterized in that the processing unit is specifically configured to:
acquiring graphic information, color information and brightness information in front of the vehicle;
Determining the target object according to the graphic information and the color information, wherein the light beam emitted by the light source of the target object is white, the duration for which the light beam is lit is greater than or equal to a first duration, and the shape of the light beam satisfies at least one of the following conditions:
the included angle between the light beam and the horizontal plane is smaller than or equal to a first angle; or,
the included angle between the light beam and the vertical plane is smaller than or equal to a second angle; wherein the vertical plane is parallel to the front direction of the vehicle;
And determining the second brightness of the light source of the target object according to the brightness information.
21. The vehicle according to claim 16, characterized in that the processing unit is specifically configured to:
acquiring graphic information and brightness information in front of the vehicle;
determining the target object according to the graphic information and the brightness information, wherein the brightness of the light source of the target object is greater than or equal to a second threshold, and the duration for which the light beam emitted by the light source of the target object is lit is greater than or equal to a first duration;
and obtaining the second brightness of the light source of the target object according to the brightness information.
22. The vehicle according to any one of claims 17 to 21, characterized in that the sensor comprises a camera for acquiring image information in front of the vehicle;
The processing unit is configured to extract the graphics information, the color information, and the luminance information from the image information.
23. The vehicle according to any one of claims 17 to 21, characterized in that the sensor includes a camera for acquiring image information in front of the vehicle and a luminance sensor for acquiring the luminance information;
the processing unit is used for extracting the graphic information and the color information from the image information.
24. The vehicle according to claim 22 or 23, characterized in that the image information includes a plurality of images different in exposure time period;
The processing unit is used for:
fusing the images with different exposure time lengths to obtain a fused image;
And acquiring the shape characteristics of the target object from the graphic information of the fusion image.
25. The vehicle of any of claims 16-24, characterized in that the glare value comprises a ratio between the second luminance and the first luminance; or the glare value comprises a difference between the second luminance and the first luminance.
26. The vehicle according to any one of claims 16 to 25, characterized in that the dipped headlight is specifically intended for:
Projecting a complementary light beam to a target illumination area in the illumination areas, wherein the area of the target illumination area has a positive correlation with the target distance; wherein the target distance is a distance between the vehicle and a light source of the target object.
27. The vehicle of claim 26, wherein, in a horizontal cross-section, the left and right side boundaries of the complementary light beam have a sharpness greater than 0.05.
28. The vehicle of claim 26 or 27, wherein, when the complementary light beam is projected onto a vertical wall surface, the angle α between the tangent at any point on the left and right side boundaries of the illuminated area and the horizontal line satisfies: 20° ≤ α ≤ 70°.
29. The vehicle according to any one of claims 26 to 28, wherein the horizontal angle h between the complementary light beam and the horizontal plane satisfies: −10° ≤ h ≤ −1°.
30. The vehicle according to any one of claims 26 to 29, wherein the vertical angle v between the complementary light beam and the vertical plane satisfies: −10° ≤ v ≤ 10°.