WO2023020123A1 - Vehicle light control method, lighting system and vehicle - Google Patents

Vehicle light control method, lighting system and vehicle

Info

Publication number
WO2023020123A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driven
target light
path
light pattern
Prior art date
Application number
PCT/CN2022/101849
Other languages
English (en)
French (fr)
Inventor
刘晟君
张君豪
段军克
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP22857436.4A (published as EP4368450A1)
Publication of WO2023020123A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • B60Q1/06Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
    • B60Q1/08Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2300/00Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q2300/30Indexing codes relating to the vehicle environment
    • B60Q2300/32Road surface or travel path
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2300/00Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q2300/40Indexing codes relating to other road users or special conditions
    • B60Q2300/45Special conditions, e.g. pedestrians, road signs or potential dangers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2400/00Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50Projected symbol or information, e.g. onto the road or car body
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present application relates to the field of automatic driving, and in particular to a method for controlling vehicle lighting, a lighting system and a vehicle.
  • the vehicle can be an autonomous vehicle (a self-driving automobile), also known as an unmanned vehicle.
  • the vehicle can also be a car, truck, motorcycle, bus, lawn mower, recreational vehicle, amusement-park vehicle, trolley, golf cart, train, cart, or the like.
  • the warning lights of existing vehicles have a single function: they can only remind other road users or illuminate the driving path.
  • Embodiments of the present invention provide a control method for vehicle lighting, a lighting system and a vehicle, which are used to improve the functions realized by the vehicle lighting.
  • the first aspect of the embodiments of the present invention provides a method for controlling vehicle lights, the method comprising: acquiring driving information, the driving information including at least one of navigation information, driving assistance information, and vehicle-machine data; acquiring information of the route to be driven; acquiring a target light pattern corresponding to the driving information and the information of the route to be driven; and displaying the light beam emitted by the vehicle on the route to be driven in the target light pattern.
  • the light beam emitted by the vehicle and displayed on the path to be driven in the target light pattern helps improve the accuracy of navigation and illuminates the path to be driven, which ensures the safety of the vehicle while driving according to the navigation.
  • other people or vehicles on the path can quickly determine the position of the path the vehicle is about to drive on according to the prompt of the target light pattern, which makes it easier for them to avoid the vehicle and improves driving safety.
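The following is a minimal, hypothetical sketch of the acquire-then-display flow described in the first aspect. None of the names below (DrivingInfo, PathInfo, select_light_pattern, project) come from the patent; they illustrate one possible way the steps could be organized:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DrivingInfo:
    navigation: Optional[dict] = None     # navigation information
    adas_decision: Optional[str] = None   # driving assistance information (ADAS)
    vehicle_data: Optional[dict] = None   # vehicle-machine data, e.g. speed

@dataclass
class PathInfo:
    shape: str = "straight"               # shape of the path to be driven
    lane_width_m: float = 3.5             # width of the lane
    length_m: float = 30.0                # e.g. distance to the next intersection

@dataclass
class LightPattern:
    shape: str
    width_m: float
    length_m: float

def select_light_pattern(driving: DrivingInfo, path: PathInfo) -> LightPattern:
    # The target pattern corresponds to both the driving information and the
    # information of the route to be driven.
    return LightPattern(shape=path.shape, width_m=path.lane_width_m,
                        length_m=path.length_m)

def project(pattern: LightPattern) -> None:
    # Stand-in for the vehicle light displaying the beam on the path to be driven.
    print(f"projecting a {pattern.shape} pattern, "
          f"{pattern.width_m} m wide and {pattern.length_m} m long")

project(select_light_pattern(DrivingInfo(), PathInfo()))
```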
  • the target light pattern represents the shape and size of the light displayed on the ground, and target light patterns with different lengths, widths, or curvatures are all different target light patterns.
  • the driving assistance information is relevant information for realizing unmanned driving.
  • the driving assistance information is information from the advanced driving assistance system ADAS of the vehicle.
  • the target light pattern can be matched with the vehicle's driving assistance information and the route to be driven; based on the target light pattern, the vehicle's driving intention, emergency decisions, and vehicle driving prediction events can be accurately identified, which improves the safety of vehicle driving.
  • the acquiring of the target light pattern corresponding to the driving information and the information of the route to be driven includes: acquiring at least one first display attribute, where the at least one first display attribute corresponds to the driving information; acquiring at least one second display attribute, where the at least one second display attribute corresponds to the information of the route to be driven; and determining the light pattern having the at least one first display attribute and the at least one second display attribute as the target light pattern.
  • the target light pattern shown in this aspect can correspond to multiple display attributes, which effectively increases the number of display attributes the target light pattern possesses, increases the amount of driving information the target light pattern can indicate, and effectively increases the number of scenarios in which the target light pattern can be applied.
  • the obtaining of the at least one first display attribute includes: the navigation information includes M plane coordinates, and the route to be driven includes the i-th to the j-th plane coordinates among the M plane coordinates, where i and j are both positive integers, i is greater than or equal to 1, and j is greater than i and less than or equal to M.
  • the vehicle determines the shape of the target light pattern according to the shape of the path to be driven. For example, if the plane coordinates included in the path to be driven extend along a straight line, the target light pattern is determined to be a rectangle. For another example, if the plane coordinates included in the route to be driven extend along an arc, the vehicle determines that the target light pattern is arc-shaped.
  • the vehicle can determine the shape of the target light pattern according to the route to be driven indicated by the navigation information, and the driver can quickly and accurately determine the driving direction of the vehicle from the shape of the target light pattern, which improves the efficiency and accuracy of navigation.
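As an illustration of the shape decision above (a straight run of plane coordinates gives a rectangular pattern, a curved run gives an arc-shaped pattern), the sketch below uses an assumed chord-deviation test and tolerance; neither is taken from the patent:

```python
import math

def pattern_shape(coords, tol_m=0.2):
    """coords: list of (x, y) plane coordinates of the path to be driven."""
    if len(coords) < 3:
        return "rectangle"
    (x0, y0), (x1, y1) = coords[0], coords[-1]
    chord = math.hypot(x1 - x0, y1 - y0) or 1e-9
    # Maximum perpendicular deviation of the intermediate points from the chord.
    max_dev = max(abs((x1 - x0) * (y - y0) - (y1 - y0) * (x - x0)) / chord
                  for x, y in coords[1:-1])
    return "rectangle" if max_dev <= tol_m else "arc"

print(pattern_shape([(0, 0), (0, 10), (0, 20)]))   # rectangle (straight route)
print(pattern_shape([(0, 0), (3, 10), (10, 17)]))  # arc (curved route)
```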
  • the second display attribute may be one or more of the following:
  • the vehicle determines the second display attribute according to the size of the path to be driven. If the second display attribute is a width, the width may be equal to the width of the lane of the path to be driven. If the second display attribute is a length, it may be the length between a first position and a second position, where the first position is the current position of the vehicle and the second position is the position of the intersection closest to the vehicle included in the navigation information, or the position of the traffic light closest to the vehicle on the path to be driven. If the vehicle determines that the first display attribute is an arc, the second display attribute includes a bending direction that is consistent with the bending direction of the lane of the path to be driven.
  • when the width of the target light pattern is equal to the width of the lane of the path to be driven, it helps pedestrians or other vehicles determine the width the vehicle will occupy when it drives to the position of the target light pattern, and decide from the width of the light pattern whether avoidance is required, which improves driving safety.
  • when the length of the target light pattern is equal to the distance between the vehicle's current position and the position of the intersection closest to the vehicle, or to the distance between the vehicle's current position and the position of the traffic light closest to the vehicle, the efficiency of navigation and of indicating the driving state is improved, which improves driving safety.
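A small hypothetical sketch of how the second display attributes above (width taken from the lane, length measured from the current position to the nearest intersection or traffic light) could be computed; the function name and inputs are assumptions:

```python
import math

def second_display_attributes(current_pos, stop_pos, lane_width_m):
    """current_pos / stop_pos: (x, y) plane coordinates; returns (width, length)."""
    length_m = math.dist(current_pos, stop_pos)   # distance to intersection / traffic light
    return lane_width_m, length_m

width, length = second_display_attributes((0.0, 0.0), (0.0, 42.0), 3.5)
print(width, length)   # 3.5 42.0
```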
  • the acquiring of the at least one first display attribute includes: acquiring a driving list, where the driving list includes correspondences between different driving information and different display attributes; and acquiring the at least one first display attribute, where the at least one first display attribute is the display attribute corresponding to the driving information in the driving list.
  • obtaining the display attributes of the target light pattern from the driving list improves the speed at which the vehicle obtains the target light pattern, and, for the same driving information, the vehicle always displays the same target light pattern, which helps the driver quickly determine the current driving information from the target light pattern and improves the efficiency and accuracy with which the target light pattern indicates driving information.
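The driving list can be pictured as a simple lookup table. The sketch below is hypothetical; its entries and attribute names are assumptions, not part of the patent:

```python
# Illustrative "driving list": a lookup from driving information to first
# display attributes. Keys and attribute values are assumed for the example.
DRIVING_LIST = {
    "go_straight":     {"shape": "rectangle", "color": "white"},
    "turn":            {"shape": "arc",       "color": "white"},
    "emergency_brake": {"shape": "rectangle", "flash_hz": 4.0},
    "vehicle_failure": {"shape": "rectangle", "color": "red", "flash_hz": 2.0},
}

def first_display_attributes(driving_info: str) -> dict:
    # The same driving information always maps to the same display attributes,
    # so the same situation is always shown with the same target light pattern.
    return DRIVING_LIST.get(driving_info, {"shape": "rectangle"})

print(first_display_attributes("turn"))  # {'shape': 'arc', 'color': 'white'}
```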
  • the width of the target light pattern is greater than or equal to the width of the vehicle, and/or, the length of the target light pattern is greater than or equal to the length of the vehicle.
  • the target light pattern can indicate the area that the vehicle will occupy, which effectively improves the driving safety of the vehicle.
  • the vehicle can form a target light pattern on the path to be driven, where the width of the target light pattern is equal to the width of the vehicle, and the target light pattern can indicate the width occupied by the vehicle when it drives into the area of the target light pattern.
  • the target can determine the driving range of the vehicle based on the clear boundary of the target light pattern. If the target appears within the target light pattern, it means that the target is within the safe distance of the vehicle, and there is a high possibility of a safety accident between the vehicle and the target. If the target appears outside the target light pattern, it means that the target is outside the safe distance of the vehicle, and the possibility of a safety accident between the vehicle and the target is very small.
  • the target light pattern is consistent with the shape of the route to be driven, for example, the shape of the path to be driven is a rectangle, and the target light pattern is a rectangle.
  • the shape of the path to be driven is arc-shaped, and the target light pattern is arc-shaped.
  • the target light type is related to the size of the path to be driven.
  • the area on the path occupied by the vehicle during driving can be determined, which helps avoid traffic accidents and improves driving safety.
  • the target light pattern is also arc-shaped, and the bending direction of the path to be driven is consistent with the bending direction of the target light pattern.
  • the driving information is a driving decision from an advanced driver assistance system (ADAS) of the vehicle, and the target light pattern corresponds to the type of the driving decision.
  • the driving decision is the driving intention of the vehicle, and the driving intention includes at least one of the following: going straight, changing lanes, turning, or entering a fork.
  • the driving decision is an emergency decision, and the emergency decision includes at least one of the following: emergency braking, emergency avoidance, or vehicle failure.
  • the driving decision is a vehicle driving prediction event, and the vehicle driving prediction event includes the vehicle being in a safe state or the vehicle being in a dangerous state.
  • the target light pattern can indicate the driving decision from the vehicle ADAS, and drivers, pedestrians or other vehicles can quickly and accurately determine the vehicle's driving decision based on the target light pattern, which improves driving safety.
  • the driving information is the vehicle speed of the vehicle, and the vehicle speed is positively correlated with the length of the target light pattern, that is, the faster the vehicle speed, the longer the target light pattern, and the slower the vehicle speed, the shorter the target light pattern.
  • the speed of the vehicle can be quickly determined based on the length of the target light pattern, and the driver can determine whether to adjust the speed according to needs, and pedestrians or other vehicles can quickly determine whether to avoid, which improves driving safety.
  • the vehicle speed of the vehicle is positively correlated with the flash frequency of the light displayed on the ground in the target light pattern, that is, the faster the vehicle speed, the higher the flash frequency, and the slower the vehicle speed, the lower the flash frequency.
  • the speed of the vehicle can be quickly determined based on the flash frequency, and the driver can determine whether to adjust the speed according to needs, and pedestrians or other vehicles can quickly determine whether to avoid, which improves driving safety.
  • the vehicle speed of the vehicle is positively correlated with the brightness of the light displayed on the ground in the target light pattern, that is, the faster the vehicle speed, the brighter the light, and the slower the vehicle speed, the dimmer the light.
  • the speed of the vehicle can be quickly determined based on the brightness, the driver can determine whether to adjust the speed according to the need, and pedestrians or other vehicles can quickly determine whether to avoid, which improves the safety of driving.
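To illustrate the positive correlations above (vehicle speed vs. pattern length, flash frequency and brightness), the following hypothetical sketch maps speed to pattern parameters; the scaling constants and clamping ranges are assumptions:

```python
def speed_to_pattern(speed_kmh: float) -> dict:
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    return {
        "length_m":   clamp(0.5 * speed_kmh, 5.0, 100.0),   # faster -> longer pattern
        "flash_hz":   clamp(0.05 * speed_kmh, 0.5, 6.0),    # faster -> higher flash frequency
        "brightness": clamp(speed_kmh / 120.0, 0.2, 1.0),   # faster -> brighter light
    }

print(speed_to_pattern(30))    # short, slow flash, dim
print(speed_to_pattern(100))   # long, fast flash, bright
```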
  • the light displayed on the ground in the target light pattern may have a flash frequency when the shape of the path to be driven changes.
  • the change in the shape of the path to be driven may be: the path to be driven indicates that the vehicle switches from going straight to turning, or switches from turning to going straight, or that the vehicle is about to drive to an intersection, or that the size of the lane changes (for example, the width of the lane lines changes).
  • the light displayed on the ground may also have a flash frequency when the light emitted by the headlights of the vehicle is displayed on a zebra crossing in the target light pattern, or when obstacles such as pedestrians or other vehicles appear within the range of the target light pattern.
  • indicating the vehicle's different driving information through the flash frequency improves the efficiency of prompting driving information to drivers, pedestrians or other vehicles, and improves driving safety.
  • the driving information is the brightness of the environment where the vehicle is located.
  • the brightness of the emitted light of the car lamp can match the brightness of the environment.
  • the car lamp can provide the function of illuminating the path to be driven.
  • the driver drives according to the driving path illuminated by the target light pattern, which improves driving safety.
  • the driving information is the distance between the vehicle and the vehicle in front, and the size of the distance is related to the brightness of the light displayed on the ground in the target light pattern.
  • the driver or the vehicle in front can quickly determine the distance between the vehicle and the vehicle in front, and then accurately predict whether there is a possibility of collision between the vehicle and the vehicle in front, which improves driving safety.
  • the size of the spacing is negatively correlated with the flash frequency of the light displayed on the ground in the target light pattern, that is, the larger the spacing, the lower the flash frequency, and the smaller the spacing, the higher the flash frequency.
  • the driver or the vehicle in front can quickly determine the distance between the vehicle and the vehicle in front, and then accurately predict whether there is a possibility of collision between the vehicle and the vehicle in front, which improves driving safety.
  • the size of the distance is positively correlated with the length of the target light pattern, that is, the larger the distance, the longer the target light pattern, and the smaller the distance, the shorter the target light pattern.
  • the driver or the vehicle in front can quickly determine the distance between the vehicle and the vehicle in front, and then accurately predict whether there is a possibility of collision between the vehicle and the vehicle in front, which improves driving safety.
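The following hypothetical sketch illustrates the correlations with the gap to the vehicle in front described above (larger gap: dimmer light, lower flash frequency, longer pattern; smaller gap: the opposite); the constants and ranges are assumptions:

```python
def gap_to_pattern(gap_m: float) -> dict:
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    return {
        "brightness": clamp(1.0 - gap_m / 100.0, 0.2, 1.0),  # smaller gap -> brighter
        "flash_hz":   clamp(6.0 - gap_m / 10.0, 0.5, 6.0),   # smaller gap -> faster flash
        "length_m":   clamp(0.8 * gap_m, 3.0, 80.0),          # larger gap -> longer pattern
    }

print(gap_to_pattern(10))   # close gap: bright, fast flash, short pattern
print(gap_to_pattern(80))   # large gap: dim, slow flash, long pattern
```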
  • the driving information is that there is an object to be identified around the vehicle, and the target light pattern covers at least a target area, where the target area is the area occupied by the object to be identified.
  • the light emitted by the headlights can illuminate the object to be identified in the target area, so that the vehicle can accurately identify the type of the object to be identified, and so that the driver can judge in advance whether to avoid the object to be identified, which improves driving safety.
  • the vehicle determines the driving decision according to the type of the object to be identified.
  • when there is an object to be identified on the path to be driven by the vehicle, the vehicle can illuminate the object to be identified.
  • the vehicle can recognize the illuminated object to be identified and identify its specific type, so that the vehicle can execute the corresponding driving decision, or the driver of the vehicle can avoid the object according to the illuminated object to be identified, which improves the safety of vehicle driving in scenarios where such objects are present.
  • the target light pattern includes a first light pattern and a second light pattern; the first light pattern corresponds to the driving information, and the second light pattern at least covers the target area, where the target area is the area occupied by the object to be identified.
  • the first light pattern shown in this aspect can illuminate the path to be driven, and the second light pattern can illuminate the objects to be identified around the path to be driven; while ensuring illumination for driving, the objects around the path can also be accurately identified, which improves driving safety.
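The split into a first light pattern (following the path) and a second light pattern (covering the object's area) can be sketched as follows; the geometry is simplified to 2-D boxes and the margin value is an assumption:

```python
from dataclasses import dataclass

@dataclass
class Box:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def second_pattern_for(target_area: Box, margin_m: float = 0.3) -> Box:
    # The second light pattern at least covers the object's area; here an
    # assumed margin is added around it.
    return Box(target_area.x_min - margin_m, target_area.y_min - margin_m,
               target_area.x_max + margin_m, target_area.y_max + margin_m)

obstacle = Box(1.0, 12.0, 2.0, 13.0)   # area occupied by the object to be identified
print(second_pattern_for(obstacle))    # area the second light pattern should cover
```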
  • the information on the route to be driven may be one or more of the following:
  • the shape of the path to be driven may be that the lane line of the path to be driven is a straight lane line, or the lane line of the path to be driven is a curved lane line.
  • the size of the path to be driven can be the width and/or length of the lane line of the path to be driven.
  • if the lane line of the path to be driven is curved, the size of the path to be driven can also include the curvature and/or the bending direction of the lane line of the path to be driven.
  • when the route to be driven by the vehicle is in a multi-fork road scenario, the information of the route to be driven may also include the location of the intersection to be entered.
  • the distance between the current position of the vehicle and the nearest intersection on the path to be driven may refer to the distance between the current position of the vehicle and the intersection to be driven.
  • the target light pattern can indicate a variety of different driving information, which increases the number of scenes to which the target light pattern is applied, and improves driving efficiency and safety.
  • acquiring the information of the route to be driven includes: a camera of the vehicle photographs the route to be driven to obtain a video stream including the information of the route to be driven.
  • the vehicle's processor receives the video stream from the camera.
  • the processor extracts video frames included in the video stream.
  • the processor extracts video frames from the video stream with preset extraction speeds.
  • the processor analyzes the information of the route to be driven based on the video frame.
  • obtaining the information of the route to be driven from a video stream that includes this information ensures that the vehicle can accurately obtain the specific conditions of the route to be driven, effectively improves the accuracy of obtaining the target light pattern, and avoids deviations in the information of the route to be driven indicated by the target light pattern.
  • the faster the extraction speed, the more up-to-date the obtained information of the route to be driven; the slower the extraction speed, the more processor power consumption can be saved.
  • the vehicle can determine the extraction speed based on the specific conditions of the vehicle (such as the remaining power, the complexity of the driving road conditions).
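A hypothetical sketch of extracting frames from the camera's video stream at a preset rate, with the rate chosen from the vehicle's remaining power and the complexity of the road conditions as suggested above; the thresholds and rates are assumptions:

```python
def choose_extraction_fps(battery_pct: float, road_complexity: str) -> float:
    if road_complexity == "complex":
        return 20.0                               # fresher path info on complex roads
    return 5.0 if battery_pct < 20 else 10.0      # save power when the battery is low

def extract_frames(video_stream, stream_fps: float, extraction_fps: float):
    """video_stream: iterable of frames; keep roughly extraction_fps frames/s."""
    step = max(1, round(stream_fps / extraction_fps))
    return [frame for i, frame in enumerate(video_stream) if i % step == 0]

frames = extract_frames(range(60), stream_fps=30.0, extraction_fps=10.0)
print(len(frames))   # 20 frames kept from 2 seconds of 30 fps video
```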
  • before acquiring the driving information, the method further includes: when the vehicle satisfies a trigger condition, triggering the execution of the step of acquiring the information of the route to be driven.
  • the trigger condition is an instruction input by the driver for displaying the target light pattern on the path to be driven, or the trigger condition can be at least one of the following:
  • the current speed of the vehicle is greater than or equal to the first preset value
  • the brightness of the environment where the vehicle is currently located is less than or equal to the second preset value
  • the shape of the path to be driven of the vehicle changes, the change in vehicle speed is greater than or equal to a third preset value, or the change in ambient brightness is greater than or equal to a fourth preset value, and the like.
  • the change in the shape of the driving path can be: the path to be driven indicates that the vehicle switches from going straight to turning, or switches from turning to going straight, or that the vehicle is about to drive to an intersection, or that the size of the lane line changes (e.g., the width of the lane line changes).
  • the vehicle can determine whether to display the target light pattern on the path to be driven according to the trigger condition, which avoids the waste of power consumption caused by displaying the target light pattern in a scene where the target light pattern does not need to be displayed.
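The trigger check can be sketched as a disjunction of the conditions listed above; the preset values and dictionary keys in this hypothetical example are assumptions:

```python
def should_display_pattern(state: dict,
                           v1_kmh=30.0,          # first preset value (current speed)
                           v2_lux=50.0,          # second preset value (ambient brightness)
                           v3_kmh=10.0,          # third preset value (speed change)
                           v4_lux=30.0) -> bool: # fourth preset value (brightness change)
    return (state.get("driver_request", False)                     # driver's instruction
            or state.get("speed_kmh", 0.0) >= v1_kmh
            or state.get("ambient_lux", float("inf")) <= v2_lux
            or state.get("path_shape_changed", False)
            or abs(state.get("speed_change_kmh", 0.0)) >= v3_kmh
            or abs(state.get("ambient_change_lux", 0.0)) >= v4_lux)

print(should_display_pattern({"speed_kmh": 45.0}))      # True (speed trigger)
print(should_display_pattern({"ambient_lux": 200.0}))   # False (no trigger)
```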
  • the centerline of the target light pattern may coincide with the centerline of the path to be driven, or the offset between the centerline of the target light pattern and the centerline of the path to be driven is less than or equal to a first distance; for example, the first distance may be 0.5 meters.
  • the distance between the boundary line of the target light pattern and the lane line of the path to be driven is less than or equal to a second distance; for example, the distance between the left boundary line of the target light pattern and the left lane line of the path to be driven is less than or equal to the second distance, and the distance between the right boundary line of the target light pattern and the right lane line of the path to be driven is less than or equal to the second distance; or, when the length indicated by the second display attribute is the length between the first position and the second position, along the extending direction of the path to be driven, the upper and lower boundary lines of the target light pattern coincide with the first position and the second position respectively.
  • the target light pattern can accurately indicate the width of the lane or accurately indicate the distance between the first position and the second position, which improves driving safety.
  • when the width of the target light pattern is equal to the narrowest width of the lane of the path to be driven, the driver can, based on the illuminated path to be driven, accurately judge a sudden narrowing of the lane, which improves driving safety.
  • the vehicle forms a target light pattern on the zebra crossing of the path to be driven, and the target light pattern can illuminate the zebra crossing. Then, pedestrians walking on the zebra crossing will notice the target light pattern, which is helpful for pedestrians to avoid vehicles on the zebra crossing. And because the target light pattern can illuminate the zebra crossing, then the zebra crossing will not become the driver's blind spot, effectively avoiding the possibility of safety accidents between vehicles and pedestrians.
  • when the path to be driven is curved, the larger the curvature, the brighter the light, and the smaller the curvature, the dimmer the light.
  • the brightness when the size of the indicated lane line of the route to be driven of the vehicle changes is greater than the brightness when the size of the indicated lane line of the route to be driven of the vehicle does not change.
  • the brightness of the emitted light from the vehicle when displayed on the zebra crossing in the target light type is greater than the brightness when the emitted light from the vehicle is not displayed on the zebra crossing in the target light type.
  • when there is an obstacle within the range of the target light pattern, the brightness is greater than the brightness when there are no obstacles within the range of the target light pattern.
  • the method further includes: collecting a calibration image, where the calibration image includes the path to be driven and the target light pattern displayed on the path to be driven.
  • the vehicle judges whether the target light pattern satisfies a recalibration condition according to the calibration image. Specifically, the vehicle determines that the target light pattern meets the recalibration condition when the calibration image meets at least one of the following conditions:
  • the centerline of the target light pattern deviates from the lane centerline of the path to be driven, or the distance between the centerline of the target light pattern and the lane centerline of the path to be driven is greater than the allowed offset; or, when the width of the target light pattern should be equal to the width of the lane of the path to be driven, the boundary lines on both sides of the target light pattern deviate, along the lateral direction of the path to be driven, from the boundary lines on both sides of the lane; or, when the bending direction of the target light pattern needs to be consistent with the bending direction of the path to be driven, the bending direction of the target light pattern is inconsistent with the bending direction of the path to be driven, and the like.
  • this aspect can accurately determine whether the target light pattern is successfully displayed on the path to be driven, avoiding situations where the target light pattern is not successfully displayed on the path to be driven, and improving the accuracy with which the target light pattern indicates driving information.
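A hypothetical sketch of the recalibration check against the calibration image; the field names and tolerances below are assumptions, not values from the patent:

```python
def needs_recalibration(pattern: dict, lane: dict, max_offset_m: float = 0.5) -> bool:
    centerline_off = abs(pattern["center_y"] - lane["center_y"])
    width_mismatch = (pattern.get("match_lane_width", False)
                      and abs(pattern["width_m"] - lane["width_m"]) > 0.1)
    bend_mismatch = (pattern.get("match_bend", False)
                     and pattern["bend_dir"] != lane["bend_dir"])
    return centerline_off > max_offset_m or width_mismatch or bend_mismatch

lane = {"center_y": 0.0, "width_m": 3.5, "bend_dir": "left"}
observed = {"center_y": 0.1, "width_m": 3.5, "bend_dir": "left",
            "match_lane_width": True, "match_bend": True}
print(needs_recalibration(observed, lane))   # False: the pattern is aligned
```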
  • the second aspect of the embodiments of the present invention provides a lighting system; the lighting system includes a vehicle light module and a control unit, the control unit being connected to the vehicle light module; the control unit is configured to acquire driving information, the driving information including at least one of navigation information, driving assistance information, and vehicle-machine data, to acquire information of the route to be driven, and to acquire a target light pattern corresponding to the driving information and the information of the route to be driven; the vehicle light module is configured to display the light beam emitted by the vehicle on the path to be driven in the target light pattern.
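A structural sketch of the lighting system of the second aspect (a control unit connected to a vehicle light module); the class and method names are hypothetical:

```python
class VehicleLightModule:
    def display(self, pattern: dict) -> None:
        print(f"displaying pattern: {pattern}")   # stand-in for projecting the beam

class ControlUnit:
    def __init__(self, light_module: VehicleLightModule):
        self.light_module = light_module          # control unit connected to the module

    def update(self, driving_info: dict, path_info: dict) -> None:
        # Build a target pattern from driving info and path info, then display it.
        pattern = {"shape": path_info.get("shape", "rectangle"),
                   "width_m": path_info.get("lane_width_m", 3.5),
                   "decision": driving_info.get("adas_decision", "go_straight")}
        self.light_module.display(pattern)

ControlUnit(VehicleLightModule()).update({"adas_decision": "turn"}, {"shape": "arc"})
```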
  • the control unit is configured to: acquire at least one first display attribute, where the at least one first display attribute corresponds to the driving information; acquire at least one second display attribute, where the at least one second display attribute corresponds to the information of the route to be driven; and determine the light pattern having the at least one first display attribute and the at least one second display attribute as the target light pattern.
  • the control unit is configured to determine, according to the M plane coordinates included in the navigation information, that the route to be driven includes the i-th to the j-th plane coordinates among the M plane coordinates, where i and j are both positive integers, i is greater than or equal to 1, and j is greater than i and less than or equal to M.
  • the vehicle determines the shape of the target light pattern according to the shape of the path to be driven. For example, if the plane coordinates included in the path to be driven extend along a straight line, the target light pattern is determined to be a rectangle. For another example, if the plane coordinates included in the route to be driven extend along an arc, the control unit determines that the target light pattern is arc-shaped.
  • the second display attribute may be one or more of the following:
  • the control unit determines the second display attribute according to the size of the path to be driven. If the second display attribute is width, the width included in the second display attribute is equal to the width of the lane line of the path to be driven.
  • the second display attribute determined by the control unit is a length, and the length indicated by the second display attribute is the length between a first position and a second position, where the first position is the current position of the vehicle and the second position is the position of the intersection closest to the vehicle included in the navigation information, or the position of the traffic light closest to the vehicle on the path to be driven.
  • the control unit determines that the first display attribute is an arc, the second display attribute includes a bending direction, and the bending direction included in the second display attribute is consistent with the bending direction of the lane line of the path to be driven.
  • the control unit is configured to: obtain a driving list, where the driving list includes correspondences between different driving information and different display attributes; and obtain the at least one first display attribute, where the at least one first display attribute is the display attribute corresponding to the driving information in the driving list.
  • the width of the target light pattern is greater than or equal to the width of the vehicle, and/or, the length of the target light pattern is greater than or equal to the length of the vehicle.
  • the control unit is configured to form the target light pattern on the path to be driven, where the width of the target light pattern is greater than or equal to the width of the vehicle, and the target light pattern can indicate the width the vehicle will occupy when driving into the area of the target light pattern.
  • the target light pattern is consistent with the shape of the route to be driven, for example, the shape of the path to be driven is a rectangle, and the target light pattern is a rectangle.
  • the shape of the path to be driven is arc-shaped, and the target light pattern is arc-shaped.
  • the target light type is related to the size of the path to be driven.
  • the target light pattern is also arc-shaped, and the bending direction of the path to be driven is consistent with the bending direction of the target light pattern.
  • the driving information is a driving decision from an advanced driver assistance system (ADAS) of the vehicle, and the target light pattern corresponds to the type of the driving decision.
  • the driving decision is the driving intention of the vehicle, and the driving intention includes at least one of the following: going straight, changing lanes, turning, or entering a fork.
  • the driving decision is an emergency decision
  • the emergency decision includes at least one of the following: emergency braking, emergency avoidance, or vehicle failure.
  • the driving decision is a vehicle driving prediction event
  • the vehicle driving prediction event includes the vehicle being in a safe state or the vehicle being in a dangerous state.
  • the driving information is the speed of the vehicle, and the speed of the vehicle is positively correlated with the length of the target light pattern, that is, the faster the vehicle speed, the longer the target light pattern, and the slower the vehicle speed, the shorter the target light pattern.
  • the vehicle speed of the vehicle is positively correlated with the flash frequency of the light displayed on the ground in the target light pattern, that is, the faster the vehicle speed, the higher the flash frequency, and the slower the vehicle speed , the lower the flash frequency.
  • the vehicle speed of the vehicle is positively correlated with the brightness of the light displayed on the ground in the target light pattern, that is, the faster the vehicle speed, the brighter the light, and the slower the vehicle speed, the dimmer the light.
  • the light displayed on the ground in the target light pattern may have a flash frequency when the shape of the path to be driven changes, when the target light pattern of the vehicle is displayed on a zebra crossing, or when obstacles such as pedestrians or other vehicles appear within the range of the target light pattern of the vehicle.
  • the change in the shape of the path to be driven can be: the path to be driven indicates that the vehicle switches from going straight to turning, or switches from turning to going straight, or that the vehicle is about to drive to an intersection, or that the size of the lane line changes (such as a change in the width of the lane markings).
  • the driving information is the brightness of the environment where the vehicle is located.
  • the driving information is the distance between the vehicle and the vehicle in front, and the size of the distance is negatively correlated with the brightness of the light displayed on the ground in the target light pattern, that is, the larger the distance, the lower the brightness, and the smaller the distance, the higher the brightness.
  • the size of the distance is negatively correlated with the flash frequency of the light displayed on the ground in the target light pattern, that is, the larger the distance, the lower the flash frequency, and the smaller the distance, the higher the flash frequency.
  • the size of the distance is positively correlated with the length of the target light pattern, that is, the larger the distance, the longer the target light pattern, and the smaller the distance, the shorter the target light pattern.
  • the driving information is that there is an object to be identified around the vehicle, and the target light pattern covers at least a target area, where the target area is the area occupied by the object to be identified.
  • the control unit is configured to acquire an image to be identified that includes the object to be identified, and the control unit obtains the type of the object to be identified according to the image to be identified.
  • control unit determines the driving decision according to the type of the object to be identified.
  • the target light pattern includes a first light pattern and a second light pattern; the first light pattern corresponds to the driving information, and the second light pattern at least covers the target area, where the target area is the area occupied by the object to be identified.
  • the information on the route to be driven may be one or more of the following:
  • the shape of the path to be driven may be that the lane line of the path to be driven is a straight lane line, or the lane line of the path to be driven is a curved lane line.
  • the size of the path to be driven can be the width and/or length of the lane line of the path to be driven.
  • if the lane line of the path to be driven is curved, the size of the path to be driven can also include the curvature and/or the bending direction of the lane line of the path to be driven.
  • when the route to be driven by the vehicle is in a multi-fork road scenario, the information of the route to be driven may also include the location of the intersection to be entered.
  • the distance between the current position of the vehicle and the nearest intersection on the path to be driven may refer to the distance between the current position of the vehicle and the intersection to be driven.
  • control unit is configured to use a camera to capture the route to be driven to obtain a video stream including information about the route to be driven.
  • the control unit receives the video stream from the camera.
  • the control unit extracts video frames included in the video stream.
  • the control unit extracts video frames from the video stream with a preset extraction speed.
  • the control unit is used for analyzing the information of the route to be driven based on the video frame.
  • the faster the extraction speed, the more up-to-date the obtained information of the route to be driven; the slower the extraction speed, the more processor power consumption can be saved.
  • control unit is further configured to: trigger the execution of the step of acquiring information about the route to be driven when the trigger condition is met.
  • the trigger condition is an instruction input by the driver for displaying the target light pattern on the path to be driven, or the trigger condition can be at least one of the following:
  • the current speed of the vehicle is greater than or equal to the first preset value
  • the brightness of the environment where the vehicle is currently located is less than or equal to the second preset value
  • the shape of the path to be driven of the vehicle changes, the change in vehicle speed is greater than or equal to a third preset value, or the change in ambient brightness is greater than or equal to a fourth preset value, and the like.
  • the change in the shape of the driving path may be: the path to be driven indicates that the vehicle switches from going straight to turning, or switches from turning to going straight, or that the vehicle is about to drive to an intersection, or that the size of the lane line changes (e.g., the width of the lane line changes).
  • the centerline of the target light pattern may coincide with the centerline of the path to be driven, or the offset between the centerline of the target light pattern and the centerline of the path to be driven is less than or equal to a first distance; for example, the first distance may be 0.5 meters.
  • the distance between the boundary line of the target light pattern and the lane line of the path to be driven is less than or equal to a second distance; for example, the distance between the left boundary line of the target light pattern and the left lane line of the path to be driven is less than or equal to the second distance, and the distance between the right boundary line of the target light pattern and the right lane line of the path to be driven is less than or equal to the second distance; or, when the length indicated by the second display attribute is the length between the first position and the second position, along the extending direction of the path to be driven, the upper and lower boundary lines of the target light pattern coincide with the first position and the second position respectively.
  • when the width of the target light pattern is equal to the narrowest width of the lane of the path to be driven, the driver can, based on the illuminated path to be driven, accurately judge a sudden narrowing of the lane, which improves driving safety.
  • the vehicle light module is used to form a target light pattern on the zebra crossing of the path to be driven, so as to illuminate the zebra crossing.
  • when the path to be driven is curved, the larger the curvature, the brighter the light displayed on the ground in the target light pattern, and the smaller the curvature, the dimmer the light.
  • the brightness when the path to be driven changes is greater than the brightness when the path to be driven does not change; for example, if the path to be driven is always straight, the brightness is lower than when the path to be driven involves a turn.
  • the brightness when the size of the indicated lane line of the route to be driven of the vehicle changes is greater than the brightness when the size of the indicated lane line of the route to be driven of the vehicle does not change.
  • the brightness of the vehicle's outgoing light when it is displayed on the zebra crossing with the target light type is greater than the brightness when the vehicle's outgoing light is not displayed on the zebra crossing with the target light type.
  • when there is an obstacle within the range of the target light pattern of the vehicle, the brightness of the light emitted by the vehicle light is greater than the brightness of the light emitted by the headlights when there are no obstacles within the range of the target light pattern.
  • after the light beam emitted by the vehicle light module is displayed on the path to be driven in the target light pattern, the control unit is configured to reacquire a calibration image through the camera, where the calibration image includes the path to be driven and the target light pattern displayed on the path to be driven.
  • the control unit is configured to determine whether the target light pattern meets the recalibration condition according to the calibration image. Specifically, when the control unit determines that the calibration image meets at least one of the following conditions, it determines that the target light pattern meets the recalibration condition:
  • the control unit judges that the centerline of the target light pattern deviates from the lane centerline of the path to be driven, or that the distance between the centerline of the target light pattern and the lane centerline of the path to be driven is greater than the allowed offset; or, when the width of the target light pattern should be equal to the width of the lane of the path to be driven, the boundary lines on both sides of the target light pattern deviate, along the lateral direction of the path to be driven, from the boundary lines on both sides of the lane; or, when the bending direction of the target light pattern needs to be consistent with the bending direction of the path to be driven, the bending direction of the target light pattern is inconsistent with the bending direction of the path to be driven, and the like.
  • the third aspect of the embodiment of the present invention provides a vehicle, the vehicle includes the lighting system as shown in the second aspect above.
  • Fig. 1 is a functional block diagram of an embodiment of a vehicle provided by the present application
  • FIG. 2 is a flow chart of steps in the first embodiment of the method for controlling vehicle lighting provided by the present application
  • Figure 3a is an example diagram of the first application scenario provided by the present application.
  • Figure 3b is an example diagram of the second application scenario provided by the present application.
  • Figure 3c is an example diagram of the third application scenario provided by this application.
  • Figure 3d is an example diagram of the fourth application scenario provided by this application.
  • Figure 3e is an example diagram of the fifth application scenario provided by this application.
  • FIG. 4 is a comparison example diagram of the first application scenario provided by the present application.
  • Figure 4a shows an example diagram of the specific road conditions of the vehicle turning right shown in the existing scheme
  • Figure 4b shows an example diagram of a specific road condition of a vehicle turning right shown in this application
  • FIG. 5 is a comparison example diagram of the second application scenario provided by the present application.
  • Fig. 5a is a diagram showing an example of the lighting of the low beam of the vehicle shown in the existing scheme
  • Figure 5b is an illustration of the lighting example of the target light type provided by the present application.
  • FIG. 6 is a comparison example diagram of the third application scenario provided by the present application.
  • Fig. 6a is an example diagram of the illumination of the low beam of the vehicle when the width of the lane line shown in the existing scheme changes;
  • Fig. 6b is an example diagram of the illumination of the target light type when the width of the lane line changes provided by the present application;
  • FIG. 7 is a comparison example diagram of the fourth application scenario provided by the present application.
  • Fig. 7a is an illustration of the lighting example of the zebra crossing illuminated by the low beam of the vehicle shown in the existing scheme
  • Fig. 7b is an example diagram of illumination of a zebra crossing illuminated by the target light pattern provided by the present application.
  • Fig. 8a is an illustration of the lighting example of the vehicle low beam lighting the front of the vehicle shown in the existing scheme
  • Fig. 8b is an example diagram of lighting in front of the vehicle illuminated by the target light type provided by the present application.
  • Fig. 8c is an example diagram with a first distance between the vehicle and the vehicle in front;
  • Fig. 8d is an example diagram with a second distance between the vehicle and the vehicle in front;
  • FIG. 9 is a flow chart of steps in the second embodiment of the method for controlling vehicle lighting provided by the present application.
  • FIG. 10 is a comparison example diagram of the sixth application scenario provided by the present application.
  • Fig. 10a is an illustration of a lighting example when the vehicle parks into a space as shown in the existing scheme;
  • Fig. 10b is an example diagram of the illumination of the target light pattern provided by the present application when the vehicle parks into a space;
  • Figure 11 is a comparison example diagram of the seventh application scenario provided by this application.
  • Fig. 11a is an illustration of the lighting example of the dangerous state of the vehicle shown in the existing scheme
  • Fig. 11b is an example diagram of lighting when the target light type provided by the present application is dangerous to the vehicle;
  • Fig. 12 is a flow chart of steps in the third embodiment of the method for controlling vehicle lighting provided by the present application.
  • Fig. 13 is a comparison example diagram of the eighth application scenario provided by the present application.
  • Fig. 13a is an illustration of an example of illumination of an object to be recognized existing in the path to be driven shown in the existing scheme
  • Fig. 13b is an example diagram of the illumination of the object to be identified in the target light type provided by the present application for the object to be identified;
  • Fig. 14 is a flow chart of steps in the fourth embodiment of the method for controlling vehicle lighting provided by the present application.
  • Figure 15 is a comparison example diagram of the ninth application scenario provided by this application.
  • Fig. 15a is an illustration of an example of illumination of an object to be identified existing in front of the vehicle shown in an existing solution
  • Fig. 15b is an example diagram of the target light pattern provided by the present application for illuminating the object to be identified in front of the vehicle;
  • Fig. 16 is a structural example diagram of an embodiment of the lighting system provided by the present application.
  • FIG. 1 is a functional block diagram of an embodiment of the vehicle provided in the present application.
  • the vehicle 100 is configured in a fully or partially autonomous driving mode.
  • the vehicle 100, while in the automatic driving mode, can control itself, and can determine the current state of the vehicle and its surrounding environment through human operation, determine the likely behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the likelihood of the other vehicle performing the possible behavior, and control the vehicle 100 based on the determined information.
  • while the vehicle 100 is in the autonomous driving mode, the vehicle 100 may be set to operate without human interaction.
  • Vehicle 100 may include various systems, each of which may include multiple elements.
  • each system and element of the vehicle 100 may be interconnected by wire or wirelessly.
  • the vehicle shown in this embodiment includes a sensor system 120 that may include several sensors that sense information about the environment around the vehicle 100 .
  • the sensor system 120 may include a positioning system 121 (the positioning system may be a global positioning system (global positioning system, GPS) system, or the Beidou system or other positioning systems), an inertial measurement unit (inertial measurement unit, IMU) 122, Radar 123 , laser range finder 124 and camera 125 .
  • the sensor system 120 may also include sensors that monitor the internal systems of the vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding properties (position, shape, orientation, velocity, etc.).
  • the positioning system 121 may be used to estimate the geographic location of the vehicle 100 .
  • the IMU 122 is used to sense changes in position and orientation of the vehicle 100 based on inertial acceleration.
  • IMU 122 may be a combination accelerometer and gyroscope.
  • the radar 123 may utilize radio signals to sense objects within the surrounding environment of the vehicle 100 .
  • radar 123 may be used to sense the velocity and/or heading of an object.
  • the specific type of the radar 123 is not limited.
  • the radar 123 may be a millimeter wave radar or a laser radar.
  • the laser range finder 124 may utilize laser light to sense objects in the environment in which the vehicle 100 is located.
  • laser rangefinder 124 may include one or more laser sources, a laser scanner, and one or more detectors, among other system components.
  • Camera 125 may be used to capture multiple images of the surrounding environment of vehicle 100 .
  • the camera 125 may be a still camera, a video camera, a single/binocular camera or an infrared imager.
  • Vehicle 100 also includes an advanced driving assistance system (ADAS) 110 .
• ADAS 110 senses the surrounding environment at any time during the driving process of the vehicle, collects data, identifies, detects and tracks static and dynamic objects, and performs systematic calculation and analysis in combination with the navigation map data, so that the driver can be aware of possible dangers in advance, which effectively increases the comfort and safety of vehicle driving.
• ADAS 110 may control the vehicle through data acquired by the sensor system 120.
• ADAS 110 can also control the vehicle through vehicle-machine data, where the vehicle-machine data can be the main data on the vehicle dashboard (fuel consumption, engine speed, temperature, etc.), vehicle speed information, steering wheel angle information, or body posture data, etc.
  • ADAS 110 can control the vehicle in one or more of the following ways:
  • ADAS 110 adjusts the heading of vehicle 100 .
  • the ADAS 110 controls the operating speed of the vehicle's engine and in turn controls the speed of the vehicle 100 .
  • ADAS 110 manipulates images captured by cameras 125 to identify objects and/or features in the environment surrounding vehicle 100 .
  • the ADAS 110 may be used to map the environment, track objects, estimate the speed of objects, and the like.
• ADAS 110 determines a driving route for the vehicle 100; in some embodiments, ADAS 110 may combine data from the sensor system 120 with one or more predetermined maps to determine the driving route for the vehicle 100.
  • ADAS 110 may identify, assess, and avoid or otherwise overcome potential obstacles in the environment of vehicle 100 .
  • Peripherals 130 may include wireless communication system 131 , on-board computer 132 , microphone 133 and/or speaker 134 .
  • peripheral device 130 provides a means for a user of vehicle 100 to interact with the user interface.
  • on-board computer 132 may provide information to a user of vehicle 100 .
  • the user interface may also operate the on-board computer 132 to receive user input.
  • the onboard computer 132 can be operated through a touch screen.
  • peripheral devices 130 may provide a means for vehicle 100 to communicate with other devices located within the vehicle.
  • microphone 133 may receive audio (eg, voice commands or other audio input) from a user of vehicle 100 .
  • speaker 134 may output audio to a user of vehicle 100 .
  • the wireless communication system 131 may communicate wirelessly with one or more devices, directly or via a communication network.
  • the wireless communication system 131 may use third-generation mobile communication technology (3rd-generation, 3G) cellular communication, such as code division multiple access (code division multiple access, CDMA), global system for mobile communications (global system for mobile communications, GSM ), general packet radio service (GPRS).
  • the wireless communication system 131 may use the fourth generation mobile communication technology (the 4th generation mobile communication technology, 4G) cellular communication, such as long term evolution (long term evolution, LTE).
  • the wireless communication system 131 may also use fifth generation mobile communication technology (5th generation mobile communication technology, 5G) cellular communication.
  • the wireless communication system 131 may utilize a wireless local area network (wireless local area network, WLAN) for communication.
  • the wireless communication system 131 may communicate directly with the device using an infrared link, Bluetooth, or ZigBee.
• Wireless communication system 131 may also utilize various vehicle communication systems, for example, wireless communication system 131 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communication between vehicles and/or roadside stations.
  • Computer system 140 may control functions of vehicle 100 based on input received from various systems (eg, sensing system 120 , ADAS 110 , peripherals 130 ) and from a user interface.
  • Computer system 140 may include at least one processor 141 that executes instructions stored in a non-transitory computer-readable medium such as memory 142 .
  • Computer system 140 may also be a plurality of computing devices that control individual components or subsystems of vehicle 100 in a distributed manner.
• the processor 141 can be one or more field-programmable gate arrays (field-programmable gate array, FPGA), application specific integrated circuits (application specific integrated circuit, ASIC), systems on chip (SoC), central processing units (central processor unit, CPU), network processors (network processor, NP), digital signal processing circuits (digital signal processor, DSP), microcontrollers (micro controller unit, MCU), programmable logic devices (programmable logic device, PLD) or other integrated chips, or any combination of the above chips or processors, etc.
  • the processor 141 may be located inside the vehicle, or the processor 141 may be located away from the vehicle and communicate wirelessly with the vehicle.
  • memory 142 may contain instructions (eg, program logic) executable by processor 141 to perform various functions of vehicle 100 .
  • memory 142 may also store data such as map data, route information, the vehicle's position, direction, speed, and other vehicle data. Information stored by memory 142 may be used by vehicle 100 and computer system 140 during operation of vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
• the vehicle 100 shown in this embodiment also includes a vehicle light module 150, and the light beam emitted by the vehicle light module 150 can display a target light pattern on the path to be driven by the vehicle 100. The process of forming the target light pattern on the path to be driven will be described below.
  • the lighting module shown in this embodiment can be applied not only to vehicles, but also to driving tools such as ships, airplanes, and helicopters.
  • FIG. 2 is a flow chart of the steps of the first embodiment of the vehicle lighting control method provided by the present application.
  • Step 201 the vehicle determines that a trigger condition is met.
  • the process of executing the method shown in this embodiment will be started, so that the light beam emitted by the vehicle can display the target light pattern on the path to be driven.
• the trigger condition is met when the vehicle receives an activation command input by the driver, where the activation command is used to instruct the vehicle to display the target light pattern on the path to be driven.
  • the driver may input the activation instruction through voice input to the lighting system, touch gesture or press operation input to the vehicle cockpit screen, and the like.
• the trigger condition determined by the vehicle may be at least one of the following:
• the current speed of the vehicle is greater than or equal to a first preset value (for example, the first preset value may be 60 km/h) and the brightness of the environment where the vehicle is currently located is less than or equal to a second preset value (for example, the second preset value may be 50 lux), or the shape of the vehicle's path to be driven changes, or the variation of the vehicle's speed is greater than or equal to a third preset value, or the variation of the vehicle's ambient brightness is greater than or equal to a fourth preset value, or the remaining power of the vehicle is greater than or equal to a fifth preset value, and so on.
• the change of the shape of the path to be driven can be: the vehicle switches from the straight state to the steering state, or the vehicle switches from the steering state to the straight state, or the vehicle is about to drive to an intersection, or the size of the lane line of the path to be driven changes (e.g., the lane line width changes).
• the variation of the vehicle speed can be: the vehicle obtains the vehicle speed V1 at the time T1 and the vehicle speed V2 at the time T2, where the time T1 is the current time and the time T2 is earlier than the time T1; the variation of the vehicle speed being greater than or equal to the third preset value can then mean that the difference between V2 and V1 is greater than or equal to the third preset value, for example 10 km/h, so the trigger condition is met when the variation of the vehicle speed is greater than or equal to 10 km/h (see the sketch below).
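• The following is a minimal, non-limiting sketch of such a trigger-condition check. The threshold values and the input names (speed_kmh, ambient_lux, and so on) are illustrative assumptions and are not part of the described method.

```python
def trigger_condition_met(speed_kmh: float,
                          ambient_lux: float,
                          prev_speed_kmh: float,
                          path_shape_changed: bool,
                          activation_command_received: bool) -> bool:
    """Return True if any of the example trigger conditions is satisfied."""
    FIRST_PRESET_KMH = 60.0    # example first preset value (vehicle speed)
    SECOND_PRESET_LUX = 50.0   # example second preset value (ambient brightness)
    THIRD_PRESET_KMH = 10.0    # example third preset value (speed variation)

    if activation_command_received:                        # driver input
        return True
    if speed_kmh >= FIRST_PRESET_KMH and ambient_lux <= SECOND_PRESET_LUX:
        return True                                        # fast and dark
    if path_shape_changed:                                 # e.g. straight -> turning
        return True
    if abs(speed_kmh - prev_speed_kmh) >= THIRD_PRESET_KMH:
        return True                                        # |V1 - V2| >= 10 km/h
    return False
```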
• when the vehicle determines through step 201 that the trigger condition is met, the following steps are triggered:
  • Step 202 the vehicle acquires navigation information.
  • the vehicle shown in this embodiment acquires the navigation information according to the navigation destination input by the driver.
  • Drivers can input navigation destinations by inputting voice to the vehicle navigation system, inputting touch gestures to the cockpit screen of the vehicle navigation system, pressing buttons of the vehicle navigation system, and the like.
  • the computer system shown in FIG. 1 can obtain the navigation information from the positioning system.
  • the navigation information can be a series of plane coordinates for the vehicle to reach the navigation destination.
  • FIG. 3a is an example diagram of the first application scenario provided by the present application.
• the navigation information shown in this embodiment is a series of plane coordinates that the vehicle 300 needs to pass through sequentially during the process of driving to the destination, such as plane coordinate A (x1, y1), plane coordinate B (x2, y2), plane coordinate C (x3, y3), and so on up to plane coordinate K (xk, yk), where plane coordinate K (xk, yk) is the plane coordinate of the destination of the vehicle or a plane coordinate close to the destination of the vehicle. It can be seen that the vehicle can successfully arrive at the destination after successively passing through each plane coordinate included in the navigation information.
• the process for the vehicle to obtain the navigation information may be: when the vehicle obtains the destination to which it needs to drive, the vehicle may obtain the GPS coordinates of the vehicle's current location and the GPS coordinates of the destination; the vehicle then obtains the map data and obtains the above-mentioned navigation information according to the map data, the GPS coordinates of the vehicle's current location, and the GPS coordinates of the destination.
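• As a purely illustrative sketch of this step, the navigation information can be held as an ordered list of plane coordinates; the data structure below and the straight-line sampling used to fill it are assumptions made only for this example (a real system would derive the coordinates from the map data).

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class NavigationInfo:
    waypoints: List[Tuple[float, float]]   # (x, y) plane coordinates, in driving order

def build_navigation_info(current_xy: Tuple[float, float],
                          destination_xy: Tuple[float, float],
                          step_m: float = 5.0) -> NavigationInfo:
    """Toy stand-in for route planning: sample plane coordinates between the
    current position and the destination (a real system would use map data)."""
    (x1, y1), (x2, y2) = current_xy, destination_xy
    dist = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
    n = max(1, int(dist // step_m))
    points = [(x1 + (x2 - x1) * i / n, y1 + (y2 - y1) * i / n) for i in range(n + 1)]
    return NavigationInfo(waypoints=points)
```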
  • Step 203 the vehicle acquires information about the route to be driven.
  • the light beam emitted by the vehicle shown in this embodiment can be displayed on the path to be driven, and for this, the vehicle needs to obtain information about the path to be driven.
  • the vehicle can determine the route to be driven according to the navigation information, as shown in FIG. 3b , wherein FIG. 3b is an example diagram of the second application scenario provided by the present application.
  • the route to be driven 301 shown in this embodiment includes part or all of the plane coordinates included in the navigation information shown above.
  • This embodiment does not limit the length of the path to be driven.
• the length of the path to be driven can be 10 meters.
• the path to be driven then includes the plane coordinates, included in the navigation information, that lie within 10 meters in front of the vehicle 300.
  • the shape of the path to be driven may be that the lane line of the path to be driven is a straight lane line, or the lane line of the path to be driven is a curved lane line.
  • the lane line of the route to be driven is a curved lane line.
  • the size of the path to be driven can be the width and/or length of the lane line of the path to be driven.
• the size of the path to be driven can also be, if the lane line of the path to be driven is curved, the arc and/or the bending direction of the lane line of the path to be driven.
  • FIG. 3c is an example diagram of a third application scenario provided by the present application.
  • the information on the route to be driven may also include the location of the intersection 304 to be driven on.
  • the distance between the current position of the vehicle and the nearest intersection on the path to be driven may refer to the distance between the current position of the vehicle 302 and the intersection 304 to be driven.
  • the camera of the vehicle shoots the route to be driven to obtain a video stream including information of the route to be driven.
  • the vehicle's computer system receives the video stream from the camera.
  • the processor included in the computer system extracts video frames included in the video stream, for example, the processor can extract video frames from the video stream at a speed of 30 frames per second. It should be clear that this embodiment does not limit the speed at which the processor extracts video frames. The faster the processor extracts video frames, the more up-to-date information on the route to be driven can be obtained. And the slower the speed at which the processor extracts the video frames, the more power consumption of the processor can be saved.
  • the processor may determine the speed of extracting video frames according to the complexity of the current road conditions. For example, the more complex the current driving road conditions are (such as frequent changes in the shape of the path to be driven, such as switching from a straight state to a turning state, and there are many intersections, etc.), then the processor can extract information at a faster speed. video frame. For another example, if the current driving condition is simpler (eg, the shape of the path to be driven is relatively stable, for example, it is always going straight), the processor can extract video frames at a slower speed.
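• A minimal sketch of such an adaptive extraction rate is given below; the complexity measure and the concrete frame rates are assumptions chosen only to illustrate the trade-off between up-to-date path information and processor power consumption.

```python
def frame_extraction_fps(shape_changes_per_minute: int, intersections_ahead: int) -> int:
    """Pick how many video frames per second to analyse for the path to be driven."""
    complexity = shape_changes_per_minute + 2 * intersections_ahead
    if complexity >= 6:
        return 30   # complex road conditions: analyse frames at the full rate
    if complexity >= 2:
        return 15   # moderate road conditions
    return 5        # mostly straight road: save processor power
```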
• After the processor extracts the video frame, it can analyze the information of the route to be driven based on the video frame.
  • This embodiment does not limit the analysis method adopted by the processor.
• the analysis method can be: an object recognition algorithm, a structure from motion (SFM) algorithm, video tracking or artificial intelligence (AI), etc.
  • Step 204 the vehicle acquires the target light pattern.
  • Step 205 the light beam emitted by the vehicle is displayed on the path to be driven in the target light pattern.
  • Steps 204 to 205 are described in a unified manner as follows:
• after the vehicle acquires the navigation information and the information of the route to be driven, it can acquire the target light pattern corresponding to the navigation information and the information of the route to be driven. Specifically, the vehicle may acquire one or more first display attributes corresponding to the navigation information. The vehicle then acquires one or more second display attributes corresponding to the information of the route to be driven. Then, the vehicle determines that the light pattern having both the first display attribute and the second display attribute is the target light pattern.
• After the vehicle acquires the target light pattern, the vehicle emits a beam according to the target light pattern to ensure that the beam emitted by the vehicle can be displayed on the path to be driven with the target light pattern.
  • This embodiment does not limit the size of the distance between the target light pattern displayed on the path to be driven and the vehicle, as long as the driver in the vehicle can clearly see the target light pattern displayed in front of the vehicle, for example , the distance between the target light pattern and the vehicle may be 10 meters.
• the vehicle determines the first display attribute according to each plane coordinate included in the navigation information. Specifically, the navigation information includes M plane coordinates, and the route to be driven acquired by the vehicle includes the i-th plane coordinate to the j-th plane coordinate among the M coordinates, where i and j are both positive integers, i is greater than or equal to 1, and j is greater than i and less than or equal to M.
  • the M plane coordinates may be plane coordinate A, plane coordinate B, plane coordinate C, and so on.
• the vehicle determines, according to the navigation information, that the plane coordinates included between the vehicle's current position and the intersection 304 to which the vehicle is to travel are the M plane coordinates (plane coordinate A to plane coordinate M, as shown in Figure 3c); it can be seen that the vehicle, by travelling through the M plane coordinates in sequence, can travel to the intersection 304 closest to the vehicle.
  • the vehicle determines the first display attribute according to the M plane coordinates.
  • the specific determination process can be referred to the following first driving list shown in Table 1.
• the first driving list establishes correspondences between different extension directions of the M plane coordinates and different first display attributes.
  • FIG. 3d is an example diagram of the fourth application scenario provided by the present application. If the M plane coordinates extend along the straight line, then the vehicle determines that the first display attribute is a rectangle. Also as shown in combination with Table 1 and FIG. 3b, if the M plane coordinates extend along the arc direction, then the vehicle determines that the first display attribute is arc.
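• The following sketch illustrates a Table 1 style decision, assuming the extension direction of the M plane coordinates is classified by how far intermediate coordinates deviate from the chord between the first and last coordinate; the tolerance value is an assumption for illustration only.

```python
from typing import List, Tuple

def first_display_attribute(waypoints: List[Tuple[float, float]],
                            straightness_tolerance_m: float = 0.5) -> str:
    """Return 'rectangle' if the M plane coordinates extend along a straight
    line, otherwise 'arc' (compare Table 1)."""
    (x0, y0), (xn, yn) = waypoints[0], waypoints[-1]
    chord = ((xn - x0) ** 2 + (yn - y0) ** 2) ** 0.5 or 1e-9
    if len(waypoints) <= 2:
        return "rectangle"
    # maximum perpendicular deviation of intermediate coordinates from the chord
    max_deviation = max(
        abs((xn - x0) * (y0 - y) - (x0 - x) * (yn - y0)) / chord
        for (x, y) in waypoints[1:-1]
    )
    return "rectangle" if max_deviation <= straightness_tolerance_m else "arc"
```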
  • the vehicle determines the second display attribute according to the route to be driven, and the second display attribute can be one or more of the following:
  • the vehicle determines the second display attribute according to the size of the path to be driven. If the second display attribute is width, the width included in the second display attribute is equal to the width of the lane line of the path to be driven.
• the description of the relationship between the width included in the second display attribute and the width of the lane line of the path to be driven is an optional example; for another example, the width included in the second display attribute may also be smaller than the width of the lane line, or the width included in the second display attribute may also be greater than the width of the lane line, etc., which is not specifically limited.
  • the second display attribute determined by the vehicle is length.
  • the length shown by the second display attribute may be the length between the first position and the second position, wherein the first position is the current position of the vehicle, and the second position is the position of the intersection closest to the vehicle included in the navigation information .
• the second position may also be the position of the traffic light closest to the vehicle on the path to be driven collected by the vehicle.
• the second display attribute includes a bending direction. Specifically, as shown in Figure 3b, the vehicle determines the bending direction included in the second display attribute according to the arc of the lane line 301 of the path to be driven, so as to ensure that the bending direction included in the second display attribute is the same as the bending direction of the lane line of the path to be driven.
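• A compact sketch of how the second display attributes could be gathered from the information of the path to be driven is shown below; the field names and data classes are assumptions used only for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PathInfo:
    lane_width_m: float
    distance_to_intersection_m: float
    bending_direction: Optional[str]   # "left", "right" or None for a straight path

@dataclass
class SecondDisplayAttributes:
    width_m: float
    length_m: float
    bending_direction: Optional[str]

def second_display_attributes(path: PathInfo) -> SecondDisplayAttributes:
    """Collect the second display attributes from the information of the path."""
    return SecondDisplayAttributes(
        width_m=path.lane_width_m,                 # equal to the lane line width
        length_m=path.distance_to_intersection_m,  # first position to second position
        bending_direction=path.bending_direction,  # same as the lane line bending
    )
```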
  • the vehicle determines that the light type having both the first display attribute and the second display attribute is the target light type.
• For example, if the first display attribute is a rectangle and the second display attribute is the width of the lane line of the path to be driven (for example, the width of the lane line is 3.5 meters), then the shape of the target light pattern is a rectangle and the width of the rectangle is 3.5 meters.
  • FIG. 3e is an example diagram of a fifth application scenario provided by the present application.
• If the first display attribute is arc-shaped and the bending direction included in the second display attribute is consistent with the bending direction of the path to be driven, then the target light pattern having the first display attribute and the second display attribute is arc-shaped, and its bending direction is consistent with the bending direction of the path to be driven 321.
  • the centerline of the target light pattern may coincide with the centerline of the path to be driven, or the offset between the centerline of the target light pattern and the centerline of the path to be driven may be less than or It is equal to the first distance, and it can be known that the display method of the target light pattern can ensure that the target light pattern can be accurately displayed on the path to be driven.
• If the width included in the second display attribute is equal to the width of the lane line of the path to be driven, then, along the lateral direction of the path to be driven, the boundary lines on both sides of the target light pattern coincide with the lane lines of the path to be driven, or the offset between the boundary lines on both sides of the target light pattern and the lane lines of the path to be driven is less than or equal to the second distance; or, when the length shown in the second display attribute is the length between the first position and the second position, along the extension direction of the path to be driven, the upper and lower boundary lines of the target light pattern coincide with the first position and the second position respectively.
  • the target light pattern When the target light pattern is displayed on the path to be driven, the target light pattern can indicate the area occupied by the vehicle during driving. From the target light pattern 310 shown in FIG. 3 d , it can be known that the vehicle will drive to the lane position occupied by the target light pattern 310 . As can be seen from the target light pattern 321 in FIG. 3 e , the vehicle will drive to the lane position occupied by the target light pattern 321 . It can be seen that, through the light beam emitted by the vehicle, the target light pattern displayed on the path to be driven helps to improve the accuracy of navigation, and can realize the illumination of the path to be driven, ensuring the safety of the vehicle in the process of driving according to the navigation. For example, as shown in FIG. 4 , FIG. 4 is a comparison example diagram of an application scenario provided by the present application.
  • Fig. 4a shows an example diagram of the specific road conditions of the vehicle turning right shown in the existing scheme.
• according to the vehicle navigation, the vehicle 401 determines that it needs to turn right at the next intersection, that is, the intersection 402, but the vehicle 401 has not yet turned right at this intersection.
  • the light emitted by the vehicle can only illuminate the limited area in front of the vehicle 401, and cannot illuminate the intersection 402 where the vehicle is waiting to drive.
  • FIG. 4b is an example diagram of a specific road condition of a vehicle turning right shown in this application.
  • the target light pattern 403 can be determined.
• the target light pattern 403 can illuminate the intersection 402 along the path to be driven of the vehicle 401, so as to ensure the safety of the user driving the vehicle 401 through the intersection 402 when turning.
  • FIG. 5 is a comparison example diagram of the second application scenario provided by the present application.
  • Fig. 5a is a diagram showing an example of illumination of a low beam headlight of a vehicle shown in an existing solution.
• the vehicle 501 of the existing scheme is in a scene with low ambient brightness (such as at night, on a cloudy day or on a rainy day), and the illumination range of the light emitted by the low beam of the vehicle 501 is relatively small; in the scene shown in Figure 5a, the light emitted by the low beam of the vehicle 501 can only illuminate within 25 meters in front of the vehicle 501.
  • FIG. 5 b is an illustration of an illumination example of the target light type provided in the present application.
• the target light pattern 503 displayed on the path to be driven in front of the vehicle 502 has a length of 20 meters and a width of 3.5 meters, and its shape is a rectangle.
• the target light pattern 503 shown in this embodiment is formed by the light beam emitted by the vehicle 502 directly irradiating the path to be driven, that is, the brightness of the target light pattern 503 is the brightness with which the path to be driven is illuminated. Since the target light pattern 503 has illuminated the path to be driven, the driver can drive according to the area illuminated by the target light pattern 503, which improves driving safety. Moreover, other people or vehicles on the path can quickly determine the location where the vehicle 502 is about to drive according to the prompt of the target light pattern 503, which facilitates the avoidance by other people or vehicles on the path and improves driving safety.
  • FIG. 6 is a comparison example diagram of the third application scenario provided by the present application.
  • FIG. 6a is an example diagram of illumination of the low beam headlight of the vehicle when the width of the lane line shown in the existing solution changes.
  • the width of the lane line on the path ahead of the vehicle 601 suddenly narrows, that is, the width 602 of the lane line is greater than the width 603 of the lane line. If the driver cannot accurately determine the change in the width of the lane line, it is easy to cause driving danger.
  • FIG. 6b is an example diagram of the illumination of the target light type when the width of the lane line changes provided in the present application.
  • the target light pattern 605 formed in front of the vehicle 604 can accurately illuminate the path ahead, and the width of the target light pattern 605 can be equal to the narrowest width of the lane line.
  • the driver can accurately judge the sudden narrowing of the lane line, which improves the driving safety.
  • FIG. 7 is a comparison example diagram of the fourth application scenario provided by the present application.
  • FIG. 7a is an example diagram of illumination of a zebra crossing illuminated by a low beam light of a vehicle shown in an existing solution.
  • pedestrians on the zebra crossing 701 should not cross the zebra crossing under the instruction of the red light, and the vehicle 702 should cross the zebra crossing under the instruction of the green light.
• If pedestrians with weak safety awareness continue to cross the zebra crossing against the red light and the vehicle 702 fails to avoid them, a safety accident will occur.
  • FIG. 7b is an illumination example diagram of a zebra crossing illuminated by a target light type provided in the present application.
  • the vehicle 703 can form a target light pattern 704 on the zebra crossing of the path to be driven, and the target light pattern 704 can illuminate the zebra crossing.
  • pedestrians walking on the zebra crossing will notice the target light pattern 704, which is helpful for pedestrians to avoid vehicles 703 on the zebra crossing.
  • the target light pattern 704 can illuminate the zebra crossing, the zebra crossing will not become a driver's blind spot, effectively avoiding the possibility of a safety accident between the vehicle 703 and pedestrians.
  • FIG. 8a is an example diagram of the lighting example of the vehicle low beam lighting the front of the vehicle shown in the existing solution.
• the target 802 can be another vehicle, a non-motorized vehicle, or a pedestrian. It is uncertain whether the vehicle 801 will collide with the target 802 while the vehicle 801 is running, that is, whether the target 802 is located outside the safe distance of the vehicle 801.
  • FIG. 8b is an example diagram of illumination in front of the vehicle provided by the target light type in the present application.
  • the vehicle 803 can form a target light pattern 804 on the path to be driven, the width of the target light pattern 804 is equal to the width of the vehicle 803, and the target light pattern 804 can indicate when the vehicle 803 drives into the area of the target light pattern 804, the occupied width.
  • the target 805 can determine the driving range of the vehicle 803 based on the clear boundary of the target light pattern 804 .
• If the target 805 appears within the target light pattern 804, it means that the target 805 is within the safe distance of the vehicle 803, and there is a high possibility of a safety accident between the vehicle 803 and the target 805. If the target 805 appears outside the target light pattern 804, it means that the target 805 is outside the safe distance of the vehicle 803, and the possibility of a safety accident between the vehicle 803 and the target 805 is very small.
  • the target light type in this embodiment is an optional example without limitation.
• the target light pattern can also be related to data from the vehicle (such as the vehicle speed). Then, pedestrians or vehicles around the vehicle can determine the vehicle speed based on the target light pattern.
  • the length of the target light pattern is positively correlated with the vehicle speed, that is, the faster the vehicle speed, the longer the length of the target light pattern, and the slower the vehicle speed, the shorter the length of the target light pattern.
  • the vehicle can store the corresponding list of vehicle speed and target light type as shown in Table 2 below.
• Table 2:
Vehicle speed | Length of the target light pattern
Greater than 120 km/h | Between 60 meters and 80 meters
Between 120 km/h and 80 km/h | Between 50 meters and 60 meters
Between 80 km/h and 60 km/h | Between 40 meters and 50 meters
Between 60 km/h and 40 km/h | Between 20 meters and 40 meters
Between 40 km/h and 20 km/h | Between 10 meters and 20 meters
Between 20 km/h and 0 km/h | Between 0 meters and 10 meters
  • the vehicle may determine that the length of the corresponding target light pattern is 45 meters. For another example, if the vehicle determines that the vehicle speed is greater than 120 km/h, the vehicle may determine that the length of the corresponding target light pattern is 80 meters. It should be clear that the description of the correspondence between the vehicle speed and the length of the target light pattern in this embodiment is only an example and not limited, as long as the speed of the vehicle speed can be determined based on the length of the target light pattern.
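• A sketch of the Table 2 correspondence is shown below; it only mirrors the ranges listed above, and returning a range (from which any value, such as 45 meters or 80 meters, may be picked) is an illustrative choice rather than part of the described method.

```python
SPEED_TO_LENGTH = [     # (speed lower bound in km/h, (min length, max length) in meters)
    (120, (60, 80)),
    (80,  (50, 60)),
    (60,  (40, 50)),
    (40,  (20, 40)),
    (20,  (10, 20)),
    (0,   (0, 10)),
]

def target_light_length_range(speed_kmh: float):
    """Return the (min, max) length range of the target light pattern for a speed.
    Any value inside the range may be used, e.g. 45 m for a speed between
    60 km/h and 80 km/h, or 80 m for a speed above 120 km/h."""
    for lower_bound, length_range in SPEED_TO_LENGTH:
        if speed_kmh > lower_bound or lower_bound == 0:
            return length_range
    return (0, 10)
```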
  • the length of the target light pattern shown in this embodiment can also be in a dynamic relationship with the vehicle speed. Specifically, the vehicle obtains the vehicle speed, and obtains the target light pattern corresponding to the vehicle speed according to the following formula 1:
• the vehicle shown in this embodiment can periodically substitute the current vehicle speed of the vehicle into Formula 1, so as to obtain the length of the target light pattern corresponding to the current vehicle speed.
• Alternatively, the vehicle obtains the vehicle speed V1 at the time T1 and the vehicle speed V2 at the time T2, where the time T1 is the current time and the time T2 is earlier than the time T1; the variation of the vehicle speed being greater than or equal to the preset value can mean that the difference between V2 and V1 is greater than or equal to the preset value, for example 10 km/h. It can be seen that when the variation of the vehicle speed is greater than or equal to 10 km/h, the length of the target light pattern is obtained according to the above Formula 1, as sketched below.
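• The change-triggered update can be sketched as follows. Formula 1 itself is not reproduced here; the linear function used below is only a placeholder standing in for it, and the 10 km/h threshold follows the example above.

```python
class LightLengthUpdater:
    """Recompute the length of the target light pattern only when the vehicle
    speed has changed by at least the preset value since the last update."""
    PRESET_DELTA_KMH = 10.0

    def __init__(self, initial_speed_kmh: float):
        self.last_speed = initial_speed_kmh
        self.length_m = self.formula_1(initial_speed_kmh)

    @staticmethod
    def formula_1(speed_kmh: float) -> float:
        # Placeholder only; the real Formula 1 is defined in the description.
        return 0.6 * speed_kmh

    def on_speed_sample(self, speed_kmh: float) -> float:
        if abs(speed_kmh - self.last_speed) >= self.PRESET_DELTA_KMH:
            self.last_speed = speed_kmh
            self.length_m = self.formula_1(speed_kmh)
        return self.length_m
```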
  • the light displayed on the ground in the target light type shown in this embodiment may also have a certain flash frequency, and the flash frequency may refer to at least one example shown below:
  • the speed of the vehicle is positively correlated with the flash frequency, that is, the faster the vehicle speed, the higher the flash frequency, and the slower the vehicle speed, the lower the flash frequency.
  • the shape of the path to be driven changes, the light displayed on the ground in the target light pattern has a flash frequency.
• the change of the shape of the path to be driven may be: the path to be driven of the vehicle is used to indicate that the vehicle switches from the straight state to the turning state, or the path to be driven of the vehicle is used to indicate that the vehicle switches from the turning state to the straight state, or the path to be driven of the vehicle is used to indicate that the vehicle is about to drive to an intersection, or the path to be driven of the vehicle is used to indicate that the size of the lane line changes (e.g., the width of the lane line changes).
• the light displayed on the ground in the target light pattern may also have a flash frequency when the target light pattern of the vehicle is displayed on the zebra crossing, or when obstacles such as pedestrians or other vehicles appear within the range of the target light pattern.
• the flash frequency is negatively correlated with the ambient brightness of the vehicle, that is, the darker the ambient brightness, the higher the flash frequency, and the brighter the ambient brightness, the lower the flash frequency.
  • the brightness of the target light type shown in this embodiment may refer to at least one example shown below:
  • the speed of the vehicle is positively correlated with the brightness of the light displayed on the ground in the target light pattern, that is, the faster the vehicle speed, the brighter the brightness, and the slower the vehicle speed, the darker the brightness.
• if the shape of the path to be driven is curved, then the larger the arc, the brighter the brightness, and the smaller the arc, the darker the brightness.
  • the brightness of the light emitted by the vehicle is matched with the brightness of the environment, so as to ensure that the driver's eyes will not be excessively irritated while reminding the driver.
  • the brightness when the path to be driven is changed is greater than the brightness when the path to be driven is not changed, that is, if the path to be driven is always in a straight state, the brightness of the light emitted by the vehicle light is smaller than that of the vehicle light when the path to be driven is turned. The brightness of the outgoing light.
  • the brightness of the emitted light of the vehicle lamp when the size of the indicated lane line of the route to be driven of the vehicle changes is greater than the brightness of the emitted light of the vehicle lamp when the size of the indicated lane line of the route to be driven of the vehicle does not change.
  • the brightness of the light emitted by the car lights when the target light pattern is displayed on the zebra crossing is greater than the brightness of the light emitted by the car lights when the target light pattern is not displayed on the zebra crossing.
• when there is an obstacle within the range of the target light pattern of the vehicle, the brightness of the light emitted by the vehicle light is greater than the brightness of the light emitted by the vehicle light when there is no obstacle within the range of the target light pattern of the vehicle.
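• The sketch below combines several of the flash-frequency and brightness examples listed above into one function. Only the directions of the correlations follow the description; the base values, gains and units are assumptions for illustration.

```python
def light_carpet_dynamics(speed_kmh: float,
                          ambient_lux: float,
                          path_shape_changed: bool,
                          on_zebra_crossing: bool,
                          obstacle_in_pattern: bool):
    """Return an illustrative (flash frequency in Hz, relative brightness)."""
    # flash frequency: grows with vehicle speed, falls with ambient brightness,
    # and is only applied in the situations listed in the description
    flash_hz = 0.02 * speed_kmh + 2.0 / max(ambient_lux, 1.0)
    if not (path_shape_changed or on_zebra_crossing or obstacle_in_pattern):
        flash_hz = 0.0                      # steady light during plain driving

    # brightness: grows with vehicle speed, matched to the ambient brightness,
    # and raised for shape changes, zebra crossings and obstacles
    brightness = 1.0 + 0.01 * speed_kmh + 0.002 * ambient_lux
    for condition in (path_shape_changed, on_zebra_crossing, obstacle_in_pattern):
        if condition:
            brightness += 0.5
    return flash_hz, brightness
```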
  • the target light pattern shown in this embodiment is also related to the distance between the vehicle and the vehicle in front.
  • the vehicle in front is located directly in front of the vehicle or on the side. Take the vehicle directly in front of the vehicle as an example.
  • the distance between the vehicle 811 and the vehicle 812 in front is L1.
• the distance between the vehicle 811 and the vehicle 812 in front is L2, and L1 is smaller than L2.
  • the target light pattern displayed by the vehicle 811 shown in this example is located on the path between the vehicle 811 and the vehicle 812 ahead, that is, in the example shown in FIG. 8c, the target light pattern 813 is located on the path between the vehicle 811 and the vehicle 812 ahead.
  • the target light pattern 814 is located on the path between the vehicle 811 and the vehicle 812 ahead.
  • the length of the target light pattern shown in this embodiment is positively correlated with the distance between the vehicle and the vehicle in front, that is, the larger the distance between the vehicle and the vehicle in front, the longer the length of the target light pattern.
• From FIG. 8c and FIG. 8d, it can be seen that when the distance L2 between the vehicle 811 and the vehicle 812 in front shown in FIG. 8d is larger than the distance L1 between the vehicle 811 and the vehicle 812 in front shown in FIG. 8c, the length of the target light pattern 814 shown in FIG. 8d is greater than the length of the target light pattern 813 shown in FIG. 8c. It should be understood that when the distance between the vehicle and the vehicle in front is sufficiently large, for example, when the distance reaches 150 meters or more, the length of the target light pattern remains unchanged.
• this embodiment takes the positive correlation between the length of the target light pattern and the distance between the vehicle and the vehicle in front as an example for illustration.
• the distance between the vehicle and the vehicle in front can also be negatively correlated with the brightness of the light displayed on the ground in the target light pattern, that is, the shorter the distance between the vehicle and the vehicle in front, the brighter the brightness, and the longer the distance between the vehicle and the vehicle in front, the darker the brightness.
  • the distance between the vehicle and the vehicle in front is negatively correlated with the flash frequency of the light displayed on the ground in the target light pattern, that is, the shorter the distance between the vehicle and the vehicle in front, the higher the flash frequency, and the distance between the vehicle and the front The longer the distance between vehicles, the lower the flash frequency.
• Of course, the brightness can also be kept constant, and the flash frequency can also be kept constant, or the light may not flash.
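• A minimal sketch of these relationships to the vehicle in front is given below; the coefficients and the 150 m saturation distance are illustrative assumptions, and only the directions of the correlations follow the description.

```python
def pattern_from_front_gap(gap_m: float):
    """Illustrative (length, brightness, flash frequency) as a function of the
    distance to the vehicle in front."""
    SATURATION_GAP_M = 150.0
    effective_gap = min(gap_m, SATURATION_GAP_M)

    length_m = 0.4 * effective_gap            # positively correlated, then constant
    brightness = 1.0 + 2.0 / max(gap_m, 1.0)  # negatively correlated with the gap
    flash_hz = 5.0 / max(gap_m, 1.0)          # negatively correlated with the gap
    return length_m, brightness, flash_hz
```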
  • Step 206 if the vehicle judges that the target light pattern meets the recalibration condition, return to step 203 .
  • Step 206 shown in this embodiment is an optional step.
  • the target light pattern displayed by the vehicle on the path to be driven can be calibrated.
• If the target light pattern meets the recalibration condition, the method of step 203 is executed again to realize the recalibration of the target light pattern, that is, to reacquire the target light pattern. If the target light pattern does not meet the recalibration condition, it means that the target light pattern is accurate and there is no need to reacquire the target light pattern; the vehicle can then drive along the above-mentioned path to be driven and obtain the target light pattern that needs to be displayed on the next path to be driven.
• In order to judge whether the target light pattern meets the recalibration condition, the vehicle needs to perform the following specific process:
  • the vehicle re-acquires a calibration image through the vehicle's camera, and the calibration image includes the determined path to be driven and the target light pattern displayed on the path to be driven.
  • the vehicle judges whether the target light type meets the recalibration condition according to the calibration image. Specifically, the vehicle determines that the target light type meets the recalibration condition when the vehicle determines that the calibration image meets at least one of the following conditions:
• the vehicle judges that the center line of the target light pattern deviates from the center line of the lane of the path to be driven; or, along the lateral direction of the path to be driven, when the width of the target light pattern needs to be equal to the width of the lane line of the path to be driven, the boundary lines on both sides of the target light pattern deviate from the boundary lines on both sides of the lane line; or, when the bending direction of the target light pattern needs to be consistent with the bending direction of the path to be driven, the bending direction of the target light pattern is inconsistent with the bending direction of the path to be driven; and so on.
• the description in this embodiment of the process by which the vehicle judges, according to the calibration image, whether the target light pattern satisfies the recalibration condition is an optional example and is not limited, as long as the vehicle can accurately determine whether the target light pattern can play an auxiliary role in navigating the driver and improve driving safety.
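• As a non-limiting sketch, the recalibration decision could be coded as below; the measurement fields and the numeric values of the first and second distances are assumptions used only for illustration.

```python
from dataclasses import dataclass

@dataclass
class CalibrationMeasurement:
    centerline_offset_m: float    # pattern centre line vs lane centre line
    boundary_offset_m: float      # pattern side boundary vs lane line
    bending_matches_path: bool    # pattern bending direction vs path bending direction

def needs_recalibration(measurement: CalibrationMeasurement,
                        first_distance_m: float = 0.2,
                        second_distance_m: float = 0.2) -> bool:
    """True if the target light pattern deviates from the path to be driven."""
    if measurement.centerline_offset_m > first_distance_m:
        return True
    if measurement.boundary_offset_m > second_distance_m:
        return True
    if not measurement.bending_matches_path:
        return True
    return False
```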
  • the light beam emitted by the vehicle and the target light pattern displayed on the driving path can be matched with the vehicle's navigation information, so as to improve driving safety according to the target light pattern.
  • the target light type is related to the navigation information, that is, the target light type will also change with the navigation information.
  • the target light pattern shown in this embodiment is related to the driving assistance information, that is, as the driving assistance information changes, the target light pattern will also change accordingly.
• the specific execution process is shown in Figure 9, where Figure 9 is a flow chart of the steps of the second embodiment of the method for controlling vehicle lights provided in this application.
  • Step 901 the vehicle determines that a trigger condition is met.
• For the execution process of step 901 shown in this embodiment, please refer to step 201 shown in the first embodiment for details, and details are not repeated here.
  • Step 902 the vehicle acquires driving assistance information.
  • the driving assistance information shown in this embodiment is relevant information for realizing unmanned driving.
  • the driving assistance information is the information from the ADAS of the vehicle.
  • the ADAS for the specific description of the ADAS, please refer to the relevant description in FIG. 1 , and details are not repeated here.
  • Step 903 the vehicle obtains the information of the route to be driven.
• For the execution process of step 903 shown in this embodiment, please refer to step 203 shown in Embodiment 1, and details are not repeated here.
  • Step 904 the vehicle acquires the target light pattern.
  • Step 905 the light beam emitted by the vehicle is displayed on the path to be driven in the target light pattern.
  • Steps 904 to 905 are described in a unified manner as follows:
• after the vehicle acquires the driving assistance information and the information of the route to be driven, it can acquire the target light pattern corresponding to the driving assistance information and the information of the route to be driven. Specifically, the vehicle may acquire one or more first display attributes corresponding to the driving assistance information. The vehicle obtains one or more second display attributes corresponding to the information of the route to be driven, and then the vehicle determines that the light pattern having both the first display attribute and the second display attribute is the target light pattern. For the description of the second display attribute, please refer to Embodiment 1; details are not repeated here.
  • the vehicle emits a light beam according to the target light pattern to ensure that the light beam emitted by the vehicle can be displayed on the path to be driven with the target light pattern.
  • the target light pattern displayed on the path to be driven in this embodiment, please refer to Embodiment 1, and details are not repeated here.
  • the driving assistance information shown in this embodiment is the driving decision from the ADAS, and the vehicle can determine different first display attributes according to different driving decisions of the ADAS.
  • the vehicle determines the corresponding relationship between different ADAS driving decisions and different first display attributes according to the second driving list.
• This example takes the case where the driving decision is the driving intention of the vehicle as an example.
  • the driving intent shown in this example is the direction the vehicle is about to travel.
  • the second driving list shown in this example can be referred to in Table 3 below.
  • the second driving list shown in Table 3 creates correspondences between different driving intentions and different first display attributes.
• the description of the content of the second driving list in this embodiment is an optional example and is not limited, as long as the first display attribute of the light displayed on the ground in the target light pattern (which may also be referred to as a light carpet) changes with the change of each driving intention determined by the ADAS.
  • the number of first display attributes corresponding to each driving intention is not limited.
  • the driving intention "go straight" shown in Table 3 corresponds to a first display attribute (that is, the light carpet is rectangular), and the driving intention "change lane” corresponds to two first display attributes (that is, the light carpet has a flash frequency and the brightness) is just an example.
  • the light carpet corresponding to the driving intention is superimposed with multiple first display attributes.
• For example, when the driving intention is to change lanes, the computer system obtains, according to Table 3, the corresponding first display attributes: the light carpet has a flash frequency and the brightness of the light carpet is increased. It can be seen that during the process of the vehicle changing lanes, the light carpet, through its first display attributes, prompts surrounding vehicles or pedestrians that the vehicle is about to change lanes.
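• The Table 3 style lookup can be sketched as follows; the attribute encoding and the dictionary entries are assumptions that only mirror the examples in the text (going straight: rectangular carpet; changing lanes: flashing, brighter carpet; steering: arc-shaped carpet).

```python
SECOND_DRIVING_LIST = {
    "go_straight": {"shape": "rectangle"},
    "change_lane": {"flashes": True, "brightness": "increased"},
    "steer":       {"shape": "arc"},
}

def first_attributes_for_intention(driving_intention: str) -> dict:
    """Return the first display attributes corresponding to one driving intention."""
    return SECOND_DRIVING_LIST.get(driving_intention, {})
```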
• FIG. 10 is a comparison example diagram of the sixth application scenario provided by the present application.
• FIG. 10a is an example diagram of the lighting, shown in an existing solution, when a vehicle parks and enters a garage.
  • the vehicle 1001 shown in the existing solution is in an automatic driving state, and the vehicle 1001 needs to turn to park and enter the garage.
  • the person 1002 cannot determine the driving intention of the vehicle 1001, resulting in a dangerous situation.
  • FIG. 10b is an example diagram of the light carpet provided in this application for lighting a vehicle parking and entering a garage.
  • ADAS indicates that the driving intention is to turn right to park and enter the garage.
• based on the first display attribute corresponding to steering (the light carpet is arc-shaped, as shown in Table 3) and the second display attribute determined by the vehicle, the vehicle can display an arc-shaped light carpet 1004 on the path to be driven by the vehicle 1003.
• Based on the bending direction of the light carpet 1004, the person 1005 can accurately judge the driving intention of the vehicle 1003, avoid appearing in the area occupied by the light carpet 1004, and avoid the possibility of a safety accident between the person 1005 and the vehicle 1003.
  • the vehicle determines the corresponding relationship between different ADAS driving decisions and different first display attributes according to the third driving list.
• This example takes the case where the driving decision is an emergency decision of the vehicle as an example, where the emergency decision is, for example, a decision made in emergency situations such as accidents or vehicle breakdowns.
  • the third driving list shown in this example can be referred to in Table 4 below.
  • the third driving list shown in Table 4 creates correspondences between different emergency decisions and different first display attributes.
  • the number of first display attributes corresponding to each emergency decision is not limited.
• the fact that the emergency avoidance shown in Table 4 corresponds to one first display attribute (that is, the light carpet has the second flash frequency) and the emergency braking corresponds to three first display attributes (that is, the light carpet has the first flash frequency, the shape of the light carpet changes, and the light carpet has the first brightness) is only exemplary.
• when an emergency decision such as emergency braking corresponds to multiple first display attributes, the light carpet corresponding to the emergency decision is displayed with the multiple first display attributes superimposed.
  • This embodiment does not limit the specific sizes of the first flash frequency and the second flash frequency shown in Table 4.
• the light carpet does not have a flash frequency when the vehicle is running normally (such as going straight or turning); when the vehicle is in emergency braking or emergency avoidance, the light carpet has a flash frequency.
  • the specific brightness of the first brightness and the second brightness shown in Table 4 is not limited, for example, the first brightness and the second brightness may both be greater than the brightness of the light carpet during normal driving of the vehicle.
• This embodiment does not limit the specific way in which the shape of the light carpet shown in Table 4 changes, as long as the change of the shape of the light carpet can remind pedestrians or other vehicles around the vehicle that the vehicle is currently in the "emergency braking" state. For example, the change of the shape of the light carpet may be that the length of the light carpet is shortened, or the width of the light carpet is widened, etc., which is not specifically limited.
• For example, when the emergency decision is emergency braking, the computer system obtains, according to Table 4, the corresponding first display attributes: the light carpet has the first flash frequency, the shape of the light carpet changes, and the light carpet has the first brightness. It can be seen that during the emergency braking process of the vehicle, the light carpet, through its first display attributes, reminds the surrounding pedestrians or vehicles that the vehicle is about to brake suddenly.
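• The Table 4 style lookup with attribute superposition can be sketched as below; the concrete flash frequencies and brightness levels are left symbolic, and the entry names are assumptions that only follow the examples in the text.

```python
THIRD_DRIVING_LIST = {
    "emergency_braking": [
        {"flash": "first_flash_frequency"},
        {"shape": "changed"},                 # e.g. carpet shortened or widened
        {"brightness": "first_brightness"},
    ],
    "emergency_avoidance": [
        {"flash": "second_flash_frequency"},
    ],
}

def carpet_attributes_for_emergency(decision: str) -> dict:
    """Superimpose all first display attributes listed for an emergency decision."""
    merged: dict = {}
    for attribute in THIRD_DRIVING_LIST.get(decision, []):
        merged.update(attribute)
    return merged
```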
  • the vehicle determines the corresponding relationship between different ADAS driving decisions and different first display attributes according to the fourth driving list.
• This example takes the case where the driving decision is a vehicle driving prediction event as an example, where the vehicle driving prediction event is the ADAS's prejudgment of events that may happen to the vehicle; for example, the vehicle driving prediction event may be that the vehicle is in a safe state, or that the vehicle is in a dangerous state, that is, the vehicle may have a safety accident, etc.
  • the fourth driving list shown in this example can be referred to in Table 5 below.
  • the fourth driving list shown in Table 5 establishes correspondences between different vehicle driving prediction events and different first display attributes.
• the description of the content of the fourth driving list in this embodiment is an optional example and is not limited, as long as the first display attribute of the light carpet changes with the change of each vehicle driving prediction event determined by the ADAS.
  • the quantity of the first display attributes corresponding to each predicted vehicle driving event is not limited.
• the fact that the predicted event that the vehicle is in a dangerous state corresponds to two first display attributes and the predicted event that the vehicle is in a safe state corresponds to one first display attribute is only an example.
  • the light carpet corresponding to the vehicle driving prediction event is superimposed with multiple first display attributes.
• the specific sizes of the third brightness and the fourth brightness shown in Table 5 are not limited. In order to remind the vehicles and pedestrians around the vehicle that the vehicle is about to be in a dangerous state, the fourth brightness of the light carpet when the vehicle is in the dangerous state is greater than the third brightness of the light carpet when the vehicle is in the safe state.
  • This embodiment does not limit the specific magnitude of the third flashing frequency.
• the light carpet does not have a flash frequency when the vehicle is running normally (such as going straight or turning); under the predicted event that the vehicle is in a dangerous state, the light carpet has the third flash frequency.
• the description in this embodiment of the specific types of the first display attributes corresponding to different vehicle driving prediction events is an optional example and is not limited; for example, the vehicle driving prediction event can also correspond to the form of the light carpet, the way the light carpet changes, etc., which are not specifically limited.
  • FIG. 11 is a comparison example diagram of the seventh application scenario provided by the present application.
  • FIG. 11a is an example diagram of lighting in a dangerous state of a vehicle shown in an existing solution.
  • the vehicle 1100 is about to drive to the intersection.
  • the ADAS of the vehicle 1100 determines that there is a target vehicle 1101 on the right side of the intersection and is about to enter the intersection.
• the ADAS of the vehicle 1100 judges whether the vehicle is in a safe state, where the safe state means that when the vehicle 1100 drives to the intersection, it will not collide with the target vehicle 1101.
• To this end, the ADAS of the vehicle 1100 determines that the vehicle 1100 is in a safe state according to the vehicle speed of the vehicle 1100, the vehicle speed of the target vehicle 1101, and the distance between the vehicle 1100 and the target vehicle 1101. Based on what is shown in Table 5, it can be seen that the light carpet of the vehicle 1100 has the third brightness at this time, and for the second display attribute of the light carpet reference can be made to the description of the second display attribute shown in Embodiment 1, which will not be repeated; for example, the length of the light carpet is positively correlated with the speed of the vehicle 1100, and the width of the light carpet is consistent with the width of the lane of the path to be driven.
  • the ADAS of the vehicle 1100 will detect the speed of the vehicle 1100 , the speed of the target vehicle 1101 and the distance between the vehicle 1100 and the target vehicle 1101 in real time.
• when the vehicle 1100 continues to approach the intersection, if the ADAS of the vehicle 1100 detects that at least one of the vehicle 1100 and the target vehicle 1101 is in an acceleration state, then the ADAS of the vehicle 1100 determines, according to the speed of the vehicle 1100, the speed of the target vehicle 1101, and the distance between the vehicle 1100 and the target vehicle 1101, that the vehicle 1100 is in a dangerous state, that is, the vehicle 1100 and the target vehicle 1101 may collide at the intersection. The vehicle 1100 could then adopt the emergency braking decision shown in Example 2 above, but the emergency braking may cause shock to the driver and to pedestrians or vehicles around the vehicle 1100.
• Figure 11b is an example diagram of the lighting of the light carpet provided in this application when the vehicle is in a dangerous state.
• the first display attributes of the light carpet displayed by the vehicle 1100 on the path to be driven are that it has the fourth brightness and the third flash frequency, for example, the fourth brightness is greater than the third brightness by 10 lux (lx).
• Step 906 if the vehicle judges that the light carpet meets the recalibration condition, return to step 902.
• For the description of the specific execution process of step 905 shown in this embodiment, please refer to the description of the execution process of step 205 shown in the first embodiment, and details are not repeated here.
• the light carpet shown in this embodiment can be matched with the vehicle's driving assistance information and the route to be driven; based on the light carpet, the vehicle's driving intention, emergency decisions and vehicle driving prediction events can be accurately identified, which improves the driving safety of the vehicle.
  • the function of the light carpet is to help the driver to navigate.
  • the function of the light carpet is to improve the driving safety in the process of unmanned driving.
  • the function of the light carpet shown in this embodiment is to accurately identify the object to be recognized in front of the vehicle. This embodiment describes, with reference to FIG. 12, the process of forming a target light pattern from the light beam emitted by the vehicle, where FIG. 12 is a flow chart of the third embodiment of the control method for vehicle lighting provided by this application.
  • Step 1201 the vehicle determines that a trigger condition is met.
  • Step 1202 the vehicle acquires navigation information.
  • for the execution process of step 1201 to step 1202 shown in this embodiment, please refer to step 201 to step 202 shown in Embodiment 1, and details are not repeated here.
  • Step 1203 the vehicle determines that the object to be recognized satisfies a preset condition.
  • FIG. 13 is a comparison example diagram of an eighth application scenario provided by the present application.
  • Fig. 13a is an illustration of an example of illumination of an object to be recognized existing on a path to be driven as shown in an existing solution.
  • if the vehicle 1300 is driving at night, the lighting range of the vehicle's headlights is limited. For example, if there is an object to be identified 1301 outside the illumination range of the low beam of the vehicle 1300, neither the driver nor the ADAS of the vehicle 1300 can accurately identify the object to be identified.
  • the ADAS of the vehicle recognizes that there is an unknown object to be identified in front of the vehicle 1300 based on the sensing system of the vehicle (such as an infrared imager or a laser radar included in the vehicle).
  • the type of the object to be identified may be an obstacle, a pedestrian, or another object that can affect the driving safety of the vehicle 1300. At this time, because the ambient brightness around the vehicle is very low, the camera of the vehicle cannot recognize the specific type of the object to be recognized.
  • the navigation information is a series of plane coordinates for the vehicle to reach the navigation destination.
  • the preset condition shown in this embodiment is that the ADAS determines that the plane coordinates of the object to be identified are close to the series of plane coordinates included in the navigation information. It can be seen that when the object to be identified satisfies the preset condition, a safety accident caused by hitting the object to be identified can easily occur while the vehicle drives according to the navigation information.
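The preset condition is that the plane coordinates of the object to be identified are close to the series of plane coordinates included in the navigation information. One way to test this, sketched below under the assumption that "close" means within a fixed lateral distance of the polyline formed by the navigation coordinates, is to take the minimum point-to-segment distance; the 2 m threshold is an assumption, not a value from the patent.

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to segment a-b (all 2-D plane coordinates)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def meets_preset_condition(obj_xy, nav_points, threshold_m=2.0):
    """True if the object's plane coordinates lie within threshold_m of the
    polyline formed by the navigation plane coordinates."""
    return any(point_segment_distance(obj_xy, nav_points[i], nav_points[i + 1]) <= threshold_m
               for i in range(len(nav_points) - 1))

nav = [(0.0, 0.0), (0.0, 20.0), (5.0, 40.0)]      # a short stretch of navigation coordinates
print(meets_preset_condition((0.8, 12.0), nav))   # True: the object sits on the path ahead
```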
  • Step 1204 the vehicle obtains the information of the route to be driven.
  • for the execution process of step 1204 described in this embodiment, please refer to step 203 shown in Embodiment 1, and details are not repeated here.
  • Step 1205 the vehicle acquires the target light pattern.
  • Step 1206 the light beam emitted by the vehicle is displayed on the path to be driven in the target light pattern.
  • the light displayed on the ground in the target light pattern may be called a light carpet.
  • Steps 1205 to 1206 are described in a unified manner as follows:
  • the condition satisfied by the light carpet shown in this embodiment is that the light carpet covers the plane coordinates of the object to be identified. As shown in FIG. 13b, FIG. 13b is an example of the illumination, by the light carpet provided in this application, of the object to be identified that exists on the path to be driven.
  • the light carpet 1302 displayed by the vehicle 1300 on the path to be driven covers at least a target area 1303 , and the target area 1303 is the area occupied by the object to be recognized. It can be seen that the light carpet 1302 can illuminate the target area 1303 .
  • the target area 1303 shown in this embodiment may be located in the central area of the light carpet 1302. It should be clear that the present embodiment does not limit the relative positional relationship between the target area 1303 and the light carpet 1302, as long as the light carpet 1302 at least covers the target area 1303.
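One way to satisfy the covering condition is to start from the light carpet footprint that would normally be projected on the path to be driven and, where necessary, enlarge it so that it also contains the target area occupied by the object. The sketch below models both the carpet and the target area as axis-aligned rectangles in ground coordinates; this only illustrates the covering relation and says nothing about the actual projection optics.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def covers(self, other: "Rect") -> bool:
        return (self.x_min <= other.x_min and self.y_min <= other.y_min and
                self.x_max >= other.x_max and self.y_max >= other.y_max)

def expand_to_cover(carpet: Rect, target_area: Rect) -> Rect:
    """Return a light carpet footprint that covers at least the target area 1303."""
    if carpet.covers(target_area):
        return carpet
    return Rect(min(carpet.x_min, target_area.x_min), min(carpet.y_min, target_area.y_min),
                max(carpet.x_max, target_area.x_max), max(carpet.y_max, target_area.y_max))

carpet = Rect(-1.75, 0.0, 1.75, 20.0)    # 3.5 m wide, 20 m long footprint ahead of the vehicle
target = Rect(0.5, 22.0, 1.5, 23.0)      # area occupied by the object to be identified
print(expand_to_cover(carpet, target))    # the carpet is lengthened so the object is illuminated
```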
  • Step 1207 the vehicle collects images to be recognized.
  • when the light carpet of the vehicle covers the target area, the light carpet can illuminate the object to be identified located in the target area.
  • the vehicle can photograph the light carpet again with its camera; because the brightness of the light carpet is sufficient, the vehicle can capture a clear image to be recognized that includes the light carpet.
  • Step 1208 the vehicle obtains the type of the object to be recognized according to the image to be recognized.
  • the vehicle can identify the type of the object to be identified included in the image to be identified according to the object identification algorithm, SFM algorithm, video tracking or AI. That is, it is identified that the type of the object to be identified is a pedestrian, a vehicle, or an obstacle, and the specific size of the object to be identified is also optionally identified. For example, if the object to be recognized is a pedestrian, the vehicle can also recognize the height of the pedestrian based on the image to be recognized. For another example, if the object to be recognized is an obstacle, the vehicle can also recognize the size of the obstacle based on the image to be recognized.
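The text names object recognition algorithms, the SFM algorithm, video tracking, and AI in general rather than a specific model, so the sketch below only shows where such a recognizer would sit in the pipeline; `classify_crop` is a stand-in for whichever recognizer the vehicle actually uses, and the label set and size estimate are illustrative.

```python
import numpy as np

LABELS = ("pedestrian", "vehicle", "obstacle")

def classify_crop(crop: np.ndarray) -> str:
    """Placeholder for the real recognizer (object recognition / SFM / tracking / AI)."""
    # Hypothetical stand-in: a real system would run a trained model here.
    return LABELS[int(crop.mean()) % len(LABELS)]

def recognize_illuminated_object(image: np.ndarray, target_box) -> dict:
    """Crop the illuminated target area out of the image to be recognized and
    return the object's type, plus a rough size estimate taken from the box."""
    x0, y0, x1, y1 = target_box
    crop = image[y0:y1, x0:x1]
    obj_type = classify_crop(crop)
    return {"type": obj_type, "box_px": target_box, "approx_height_px": y1 - y0}

frame = np.random.randint(0, 255, size=(720, 1280), dtype=np.uint8)  # stand-in camera frame
print(recognize_illuminated_object(frame, (600, 300, 700, 520)))
```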
  • Step 1209 the vehicle determines a driving decision according to the type of the object to be identified.
  • the ADAS of the vehicle can determine the driving decision based on the type of the object to be recognized.
  • for a specific description of the driving decision, please refer to Embodiment 2; details are not repeated here. It can be seen that, based on the driving decision made by the ADAS according to the type of the object to be identified, the vehicle can avoid the object to be identified on the path to be driven.
  • the way of avoidance includes but is not limited to steering, changing lanes, emergency braking, etc.
  • the vehicle may also prompt the driver by means of voice or the like.
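How the avoidance manoeuvre is chosen from the object type is left open in the text (steering, lane change, emergency braking, a voice prompt, and so on). A toy decision table in that spirit, with thresholds and mappings chosen purely for illustration:

```python
def driving_decision(obj_type: str, distance_m: float) -> list:
    """Hypothetical mapping from the recognized object type to avoidance actions."""
    actions = ["voice_prompt"]                       # always tell the driver
    if obj_type == "pedestrian":
        actions.append("emergency_brake" if distance_m < 15.0 else "slow_down")
    elif obj_type == "vehicle":
        actions.append("change_lane" if distance_m < 30.0 else "keep_distance")
    elif obj_type == "obstacle":
        actions.append("steer_around" if distance_m >= 10.0 else "emergency_brake")
    return actions

print(driving_decision("pedestrian", 12.0))   # ['voice_prompt', 'emergency_brake']
```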
  • it should be clear that this embodiment takes the case where the object to be identified is located on the path to be driven as an example.
  • in other examples, the object to be identified can be located in any area around the vehicle. After the ADAS of the vehicle detects the plane coordinates of the object to be identified, the light beam can be emitted according to those plane coordinates, so as to ensure that the light carpet formed by the light beam can illuminate the object to be identified. The vehicle can then recognize the specific type of the illuminated object.
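When the object can be anywhere around the vehicle, the light beam has to be steered toward the object's plane coordinates so that the resulting light carpet illuminates it. A minimal sketch of that geometry, assuming coordinates expressed in the vehicle frame (x forward, y to the left) and ignoring the optics of the actual lamp:

```python
import math

def beam_heading_and_range(obj_xy, vehicle_xy=(0.0, 0.0), vehicle_yaw_rad=0.0):
    """Heading (relative to the vehicle's longitudinal axis) and ground range
    at which the lamp should project light so the carpet reaches the object."""
    dx = obj_xy[0] - vehicle_xy[0]
    dy = obj_xy[1] - vehicle_xy[1]
    heading = math.atan2(dy, dx) - vehicle_yaw_rad     # radians, positive to the left
    return heading, math.hypot(dx, dy)

heading, rng = beam_heading_and_range((12.0, -4.0))    # object ahead and to the right
print(f"steer beam {math.degrees(heading):.1f} deg, reach {rng:.1f} m")
```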
  • using the method shown in this embodiment, when there is an object to be identified on the vehicle's path to be driven, the vehicle can illuminate the object to be identified through the light carpet formed by the emitted light beam.
  • the vehicle can recognize the illuminated object to be identified and identify its specific type, so that the vehicle can execute the corresponding driving decision, or the driver of the vehicle can steer the vehicle to avoid the illuminated object. This improves driving safety in scenarios where there is an object to be identified in front of the vehicle.
  • Embodiment 3 takes the case where the object to be identified exists on the path to be driven in front of the vehicle as an example. This embodiment considers an object to be identified at any position around the vehicle, such as directly in front of the vehicle, diagonally in front of it, on its right side, on its left side, or behind it. The function of the light carpet shown in this embodiment is to accurately identify objects to be identified that exist around the vehicle.
  • This embodiment describes the process of forming a light carpet from the light beam emitted by the vehicle as shown in FIG. 14, wherein FIG. 14 is a flow chart of the fourth embodiment of the control method for vehicle lighting provided by the present application.
  • Step 1401 the vehicle determines that a trigger condition is met.
  • Step 1402 the vehicle acquires driving information.
  • the driving information shown in this embodiment can be the navigation information or vehicle-machine data shown in Embodiment 1, or the driving assistance information shown in Embodiment 2.
  • for specific descriptions of the navigation information, the vehicle-machine data, and the driving assistance information, please refer to Embodiment 1 or Embodiment 2; details are not repeated here.
  • Step 1403 the vehicle obtains the information of the route to be driven.
  • Step 1404 the vehicle obtains the first light pattern.
  • Step 1405 the first light beam emitted by the vehicle is displayed on the path to be driven in a first light pattern.
  • the vehicle shown in this embodiment obtains the first light pattern through steps 1401 to 1405. For the process of determining the first light pattern, please refer to the description of the process of obtaining the light carpet shown in Embodiment 1 or Embodiment 2; details are not repeated here.
  • Step 1406 the vehicle acquires the plane coordinates of the object to be identified.
  • FIG. 15 is a comparison example diagram of the ninth application scenario provided by the present application.
  • Fig. 15a is an illustration of an example of illumination of an object to be recognized existing in front of a vehicle shown in an existing solution. If the vehicle 1500 is driving at night, the lighting range of the vehicle's headlights is limited. For example, if there is an object to be identified 1501 outside the illumination range of the low beam of the vehicle 1500, neither the driver nor the ADAS of the vehicle 1500 can accurately identify the object to be identified 1501.
  • Fig. 15b is an example diagram of the illumination of the object to be recognized existing in front of the vehicle by the light carpet provided in the present application.
  • the vehicle 1500 already displays the first light pattern 1502 on the path to be driven.
  • however, the illumination range of the first light pattern 1502 is limited.
  • for example, the object to be identified 1501 is located outside the illumination range of the first light pattern 1502, and neither the driver nor the ADAS of the vehicle 1500 can accurately identify the object to be identified 1501.
  • it can be seen that the vehicle's ADAS has detected, based on the vehicle's sensing system (such as the infrared imager or lidar included in the vehicle), an object to be identified 1501 of unknown type in front of the vehicle 1500. The object to be identified 1501 may be an obstacle, a pedestrian, or another object that can affect the driving safety of the vehicle 1500. At this time, because the ambient brightness around the vehicle is very low, the camera of the vehicle cannot recognize the object 1501. To this end, the ADAS of the vehicle shown in this embodiment acquires the plane coordinates of the object to be identified 1501.
  • Step 1407 the vehicle obtains the second light pattern.
  • Step 1408 the second light beam emitted by the vehicle is displayed around the vehicle in a second light pattern.
  • the second light pattern shown in this embodiment satisfies the condition that the second light pattern covers the plane coordinates of the object to be identified. As shown in Figure 15b, the second light pattern 1503 displayed by the vehicle 1500 at least covers the target area, which is the area occupied by the object to be identified. It can be seen that the second light pattern 1503 can illuminate the target area.
  • the target area shown in this embodiment may be located in the central area of the second light pattern 1503. It should be clear that this embodiment does not limit the relative positional relationship between the target area and the second light pattern 1503, as long as the second light pattern 1503 at least covers the target area.
  • the light carpet displayed by the vehicle shown in this embodiment includes the above-mentioned first light pattern for illuminating the path to be driven and the second light pattern for illuminating the object to be recognized. That is, the light carpet shown in this embodiment is formed by superimposing the first light pattern and the second light pattern.
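Since the light carpet of this embodiment is the superposition of the first light pattern (which illuminates the path to be driven) and the second light pattern (which covers the target area), the combined footprint can be treated as the union of the two regions. The rectangular representation below is an assumption used only to illustrate that union:

```python
def point_in_rect(p, rect):
    (x, y), (x0, y0, x1, y1) = p, rect
    return x0 <= x <= x1 and y0 <= y <= y1

def light_carpet_covers(p, first_pattern_rect, second_pattern_rect):
    """The superimposed light carpet illuminates p if either pattern does."""
    return point_in_rect(p, first_pattern_rect) or point_in_rect(p, second_pattern_rect)

first = (-1.75, 0.0, 1.75, 25.0)      # first light pattern 1502 along the path to be driven
second = (3.0, 18.0, 6.0, 21.0)       # second light pattern 1503 over the object's target area
print(light_carpet_covers((4.0, 19.0), first, second))   # True: the object is illuminated
print(light_carpet_covers((4.0, 5.0), first, second))    # False: outside both patterns
```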
  • Step 1409 the vehicle collects images to be recognized.
  • Step 1410 the vehicle acquires the type of the object to be recognized according to the image to be recognized.
  • Step 1411 the vehicle determines a driving decision according to the type of the object to be recognized.
  • for the specific execution process of step 1409 to step 1411 shown in this embodiment, please refer to the description of the execution process of step 1207 to step 1209 shown in Embodiment 3, and details are not repeated here.
  • This embodiment provides a lighting system.
  • the light beam emitted by the lighting system can display a light carpet on the path to be driven.
  • for the specific description of the light carpet, please refer to any one of Embodiments 1 to 4 above; the details are not limited in this embodiment.
  • FIG. 16 is a structural example diagram of an embodiment of the lighting system provided by the present application.
  • the lighting system 1600 shown in this embodiment includes a vehicle light module 1601 and a control unit 1602 connected to the vehicle light module 1601 .
  • the control unit 1602 is used to obtain driving information, where the driving information includes navigation information and/or driving assistance information; the control unit 1602 is also used to obtain information on the path to be driven; and the control unit 1602 is also used to obtain a target light pattern corresponding to the driving information and the information on the path to be driven.
  • the vehicle light module 1601 is used to emit light beams, and the light beams are displayed on the path to be driven in the target light pattern.
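Functionally, the control unit 1602 gathers the driving information and the information on the path to be driven, derives the target light pattern, and hands it to the vehicle light module 1601 for projection. The interface sketch below mirrors that division of labour; the method names, the pattern fields, and the brightness rule are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetLightPattern:
    shape: str                 # e.g. "rectangle" or "arc"
    length_m: float
    width_m: float
    brightness_lx: float
    flash_hz: Optional[float] = None

class ControlUnit:                      # plays the role of control unit 1602
    def __init__(self, get_driving_info, get_path_info):
        self._get_driving_info = get_driving_info
        self._get_path_info = get_path_info

    def target_light_pattern(self) -> TargetLightPattern:
        driving = self._get_driving_info()    # navigation and/or driving assistance info
        path = self._get_path_info()          # information on the path to be driven
        return TargetLightPattern(shape=path["shape"], length_m=path["length_m"],
                                  width_m=path["lane_width_m"],
                                  brightness_lx=60.0 if driving["speed_kmh"] > 80 else 40.0)

class VehicleLightModule:               # plays the role of vehicle light module 1601
    def emit(self, pattern: TargetLightPattern) -> None:
        print(f"projecting {pattern.shape} carpet: {pattern.length_m} m x {pattern.width_m} m "
              f"at {pattern.brightness_lx} lx")

cu = ControlUnit(lambda: {"speed_kmh": 90},
                 lambda: {"shape": "rectangle", "length_m": 20.0, "lane_width_m": 3.5})
VehicleLightModule().emit(cu.target_light_pattern())
```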
  • Example 1 In the example shown in FIG. 1, an independent vehicle light module 150 is provided in front of the vehicle 100, and the independently provided vehicle light module is only used to emit a light beam with the target light pattern.
  • the vehicle light module 150 includes a vehicle light module 1601 and a control unit 1602 connected with the vehicle light module 1601.
  • Example 2 An independent vehicle light module 150 is provided in front of the vehicle 100, and the independently provided vehicle light module is only used to emit a light beam with the target light pattern.
  • the vehicle light module 150 includes a vehicle light module 1601 , and the computer system 140 of the vehicle includes a control unit 1602 .
  • Example 3 The vehicle has a left headlamp, which includes a vehicle light module 1601 and a control unit 1602 connected to the vehicle light module 1601; or, the left headlamp includes a vehicle light module 1601, and the vehicle's computer system 140 includes a control unit 1602.
  • Example 4 The vehicle has a right headlamp, which includes a vehicle light module 1601 and a control unit 1602 connected to the vehicle light module 1601; or, the right headlamp includes a vehicle light module 1601, and the vehicle's computer system 140 includes a control unit 1602.
  • Example 5 The vehicle light module 1601 shown in this example includes a first sub-light module and a second sub-light module; the first sub-light module emits a first sub-beam, and the second sub-light module emits a second sub-beam. The target light pattern shown in this example includes the light pattern displayed by the first sub-beam on the path to be driven and the light pattern displayed by the second sub-beam on the path to be driven.
  • the first sub-light module is located inside the right headlight, and the second sub-light module is located inside the left headlight.
  • optionally, the control unit 1602 is located in the computer system 140; or, the control unit 1602 is located in the right headlight; or, the control unit 1602 is located in the left headlight; or, the control unit 1602 includes a first sub-control unit and a second sub-control unit, which are respectively located in any two of the right headlamp, the left headlamp, and the computer system 140 of the vehicle.
  • optionally, the vehicle light module 1601 includes a first sub-light module and a second sub-light module respectively located in the left fog lamp and the right fog lamp of the vehicle; for a specific description, refer to Example 5, and details are not repeated here.
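In Example 5 the target light pattern is produced jointly by the two sub-light modules, each projecting part of the pattern. One simple, assumed way to split the work is to halve the pattern along its centreline:

```python
def split_between_sub_modules(pattern_rect):
    """Divide a rectangular target light pattern between the left and right
    sub-light modules; each tuple is (x_min, y_min, x_max, y_max) on the ground."""
    x0, y0, x1, y1 = pattern_rect
    x_mid = (x0 + x1) / 2.0
    left_half = (x0, y0, x_mid, y1)      # projected by the sub-module in the left headlight
    right_half = (x_mid, y0, x1, y1)     # projected by the sub-module in the right headlight
    return {"left_sub_module": left_half, "right_sub_module": right_half}

print(split_between_sub_modules((-1.75, 0.0, 1.75, 20.0)))
```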
  • the present application also includes a vehicle including a lighting system as shown in FIG. 16 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided are a vehicle light control method, a lighting system, and a vehicle, which are used to extend the functions implemented by vehicle lights. The method includes: obtaining driving information, the driving information including at least one of navigation information, driving assistance information, and vehicle-machine data; obtaining information on a path to be driven; obtaining a target light pattern (310, 321, ..., 814) corresponding to the driving information and the information on the path to be driven; and displaying a light beam emitted by a vehicle (100, 300, ..., 1500) on the path to be driven in the target light pattern.

Description

一种车辆灯光的控制方法、灯光系统以及车辆
本申请要求于2021年8月16日提交中国国家知识产权局、申请号为202110939124.6、申请名称为“一种车辆灯光的控制方法、灯光系统以及车辆”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及自动驾驶领域,尤其涉及一种车辆灯光的控制方法、灯光系统以及车辆。
背景技术
车辆均具有提示灯,提示灯能够实现对应的提醒功能,车辆可为自动驾驶车辆(autonomous vehicles;self-piloting automobile)又称无人驾驶车辆,该车辆还可为轿车、卡车、摩托车、公共车辆、割草机、娱乐车、游乐场车辆、电车、高尔夫球车、火车、或手推车等。
现有的车辆的提示灯,如前照灯、尾灯、转向灯等,实现的提醒功能单一,仅能够实现提醒或照亮行车路径的功能。
发明内容
本发明实施例提供了一种车辆灯光的控制方法、灯光系统以及车辆,其用于提高车辆灯光所实现的功能。
本发明实施例第一方面提供了一种车辆灯光的控制方法,所述方法包括:获取行车信息,所述行车信息包括导航信息、驾驶辅助信息以及车机数据中的至少一种;获取待行车路径的信息;获取与所述行车信息和所述待行车路径的信息对应的目标光型;将车辆出射的光束以所述目标光型,显示于所述待行车路径上。
可见,在行车信息为导航信息的情况下,通过车辆所出射的光束,在待行车路径上所显示的目标光型,有助于提高导航的准确性,而且能够实现对待行车路径的照明,保证了车辆按照导航行车过程中的安全。而且路径上的其他人员或车辆,根据目标光型的提示,能够迅速确定该车辆即将行驶的位置,便于路径上其他人员或车辆的避让,提高了驾驶安全。应理解,目标光型表示的是显示在地面上的光的形状和尺寸,具备不同长度、或宽度或曲度等的目标光型都是彼此不同的目标光型。
在行车信息为驾驶辅助信息的情况下,该驾驶辅助信息为用于实现无人驾驶的相关信息。具体地,该驾驶辅助信息为来自车辆的高级驾驶辅助系统ADAS的信息。而且目标光型能够与车辆的驾驶辅助信息和待行车路径匹配,基于该目标光型,能够准确的识别车辆的行驶意图、紧急决策、以及车辆的行驶预判事件等,提高了车辆驾驶的安全。
基于第一方面,一种可选地实现方式中,所述获取与所述行车信息和所述待行车路径的信息对应的目标光型包括:获取至少一个第一显示属性,所述至少一个第一显示属性与所述行车信息对应;获取至少一个第二显示属性,所述至少一个第二显示属性与所述待行车路径的信息对应;确定具有所述至少一个第一显示属性和所述至少一个第二显示属性的光型为所述目标光型。
可见,本方面所示所示与目标光型能够对应多个显示属性,有效地提高了目标光型所 具有的显示属性的数量,提高了目标光型所能够提示的行车信息的数量,有效地提高了目标光型所能够应用的场景的数量。
基于第一方面,一种可选地实现方式中,所述获取至少一个第一显示属性包括:车辆根据导航信息所包括的M个平面坐标,所述待行车路径包括M个平面坐标中的第i个平面坐标至第j个平面坐标,其中,i和j均为正整数,且i大于或等于1,j大于i且小于或等于M。车辆根据待行车路径的形态确定目标光型的形状,例如,若待行车路径所包括的多个平面坐标沿直线方向延伸,那么,确定目标光型呈矩形。又如,若确定待行车路径所包括的多个平面坐标沿弧形方向延伸,那么,车辆确定目标光型呈弧形。
可见,采用本方面所示,车辆能够根据导航信息所指示的待行车路径确定目标光型的形态,驾驶人员根据目标光型的形态能够快速且准确的确定车辆的行驶方向,提高了导航的效率和准确性。
基于第一方面,一种可选地实现方式中,该第二显示属性可为如下所示的一项或多项:
车辆根据待行车路径的尺寸确定第二显示属性,如第二显示属性为宽度,该第二显示属性所包括的宽度可以等于待行车路径的车道线的宽度。如第二显示属性为长度,该第二显示属性可为第一位置和第二位置之间的长度,其中,第一位置为车辆当前位置,第二位置为导航信息所包括的最靠近车辆的路口的位置,或该第二位置可为待行车路径上,最靠近车辆的红绿灯的位置。车辆确定第一显示属性为弧形,该第二显示属性包括弯曲方向,第二显示属性所包括的弯曲方向和待行车路径的车道线的弯曲方向一致。
可见,在目标光型的宽度等于待行车路径的车道线的宽度的情况下,有助于行人或其他车辆确定车辆行车至目标光型位置处,所占用的宽度,行人或其他车辆可根据目标光型的宽度确定是否需要避让,提高了行车的安全。在目标光型的长度等于车辆当前位置和最靠近车辆的路口的位置之间的距离,或目标光型的长度等于车辆当前位置和最靠近车辆的红绿灯的位置之间的距离的情况下,提高了导航的效率以及提醒了行车状态,提高了行车安全。
基于第一方面,一种可选地实现方式中,所述获取至少一个第一显示属性包括:获取行车列表,所述行车列表包括不同的行车信息和不同的显示属性的对应关系;获取所述至少一个第一显示属性,所述至少一个第一显示属性为所述行车列表中,与所述行车信息对应的显示属性。
可见,基于该行车列表获取目标光型对应的显示属性,提高了车辆获取目标光型的速度,而且针对同一行车信息,车辆始终显示相同的目标光型,有助于驾驶人员根据目标光型迅速确定当前的行车信息,提高了目标光型提示行车信息的效率和准确性。
基于第一方面,一种可选地实现方式中,所述目标光型的宽度大于或等于所述车辆的宽度,和/或,所述目标光型的长度大于或等于所述车辆的长度。
可知,该目标光型能够指示车辆即将占用的区域,有效地提高了车辆驾驶的安全。
基于第一方面,一种可选地实现方式中,车辆能够在待行车路径上形成目标光型,该目标光型的宽度等于车辆的宽度,该目标光型能够指示出车辆行车至目标光型的区域内时,所占用的宽度。
可知,目标能够基于目标光型清晰的边界,确定车辆的行车范围。若目标出现在目标光型之内,说明目标在车辆的安全距离之内,那么,车辆与目标之间出现安全事故的可能性很大。若目标出现在目标光型之外,说明目标在车辆的安全距离之外,那么,车辆与目标之间出现安全事故的可能性很小。
基于第一方面,一种可选地实现方式中,所述目标光型与所述待行车路径的形态一致,例如,待行车路径的形态呈矩形,目标光型呈矩形。又如,待行车路径的形态呈弧形,目标光型呈弧形。
可见,在所述目标光型与所述待行车路径的形态一致的情况下,有助于提高导航的效率以及准确性,避免了驾驶人员出现导航误判的情况。
基于第一方面,一种可选地实现方式中,所述目标光型与所述待行车路径的尺寸相关。
可见,基于目标光型的尺寸,能够确定车辆行车过程中所占用的路径上的区域,有助于避免出现交通事故的情况,提高了行车安全。
基于第一方面,一种可选地实现方式中,若所述待行车路径呈弧形,所述目标光型也呈弧形,且所述待行车路径的弯曲方向和所述目标光型的弯曲方向一致。
可见,在车辆需要转向的场景下,基于具有弯曲方向的目标光型,提示驾驶人员、行人或其他车辆,该车辆即将转向而且能够指示车辆具体转向的方向,有助于避免出现交通事故的情况,提高了行车安全。
基于第一方面,一种可选地实现方式中,所述行车信息为来自所述车辆的高级驾驶辅助系统ADAS的行车决策,所述目标光型与所述行车决策的类型对应。例如,行车决策为车辆的行驶意图,行驶意图包括如下所示的至少一项:直行、变道、转向或入岔路口等。又如,行车决策为紧急决策,紧急决策包括如下所示的至少一项:紧急刹车、紧急避险或车辆出现故障。又如,行车决策为车辆行驶预判事件,该车辆行驶预判事件包括车辆处于安全状态或车辆处于危险状态。
可见,目标光型能够指示来自车辆ADAS的行车决策,驾驶人员,行人或其他车辆能够基于该目标光型,快速且准确地确定该车辆的行车决策,提高了行车安全。
基于第一方面,一种可选地实现方式中,所述行车信息为所述车辆的车速,所述车辆的车速与所述目标光型的长度正相关关系,即车速越快,目标光型的长度越长,车速越慢,目标光型的长度越短。
可见,基于目标光型的长度能够快速的确定车辆的车速,驾驶人员可根据需要确定是否对车速进行调节,行人或其他车辆可迅速确定是否需要避让,提高了行车的安全。
基于第一方面,一种可选地实现方式中,所述车辆的车速与以所述目标光型显示在地面上的光的闪光频率正相关关系,即车速越快,闪光频率越高,车速越慢,闪光频率越低。
可见,基于闪光频率能够快速的确定车辆的车速,驾驶人员可根据需要确定是否对车速进行调节,行人或其他车辆可迅速确定是否需要避让,提高了行车的安全。
基于第一方面,一种可选地实现方式中,所述车辆的车速与以所述目标光型显示在地面上的光的亮度呈正相关关系,即车速越快,亮度越亮,车速越慢,亮度越暗。
可见,基于亮度能够快速的确定车辆的车速,驾驶人员可根据需要确定是否对车速进 行调节,行人或其他车辆可迅速确定是否需要避让,提高了行车的安全。
基于第一方面,一种可选地实现方式中,在下述一项或多项情况下,以所述目标光型显示在地面上的光具有闪光频率:
待行车路径的形态出现改变,那么,以所述目标光型显示在地面上的光可具有闪光频率,该行车路径的形态出现改变可为:车辆的待行车路径用于指示车辆由直行状态切换为转向状态,或,车辆的待行车路径用于指示由转向状态切换为直行状态,或车辆的待行车路径用于指示车辆即将行车到交叉的路口处,或车辆的待行车路径用于指示车道线的尺寸出现改变(如车道线的宽度出现改变)。或,车辆的车灯发出的光以目标光型显示于斑马线上。或,目标光型的覆盖范围内出现障碍物(如行人、其他车辆等)。
可见,通过闪光频率指示多种车辆不同的行车信息,提高了向驾驶人员、行人或其他车辆提示行车信息的效率,提高了行车安全。
基于第一方面,一种可选地实现方式中,所述行车信息为所述车辆所位于的环境亮度。
可见,在本申请实施例中,车灯出射光亮度能够与环境亮度匹配,例如,因环境亮度低(如阴天、雨天、黑夜)的情况下,车灯能够提供照亮待行车路径的功能,驾驶人员根据目标光型所照亮的行车路径行车,提高了行车安全。
基于第一方面,一种可选地实现方式中,所述行车信息为所述车辆与前方车辆之间的间距,所述间距的大小与以所述目标光型显示在地面上的光的亮度呈负相关关系,即间距越大,那么亮度越低,间距越小,那么亮度越亮。
可见,基于亮度,驾驶人员或前方车辆能够迅速的确定车辆与前方车辆之间的间距,进而准确的预判车辆与前方车辆是否有出现相撞的可能,提高了行车的安全。
基于第一方面,一种可选地实现方式中,所述间距的大小与以所述目标光型显示在地面上的光的闪光频率呈负相关关系,即间距越大,闪光频率越低,间距越小,闪光频率越高。
可见,基于闪光频率,驾驶人员或前方车辆能够迅速的确定车辆与前方车辆之间的间距,进而准确的预判车辆与前方车辆是否有出现相撞的可能,提高了行车的安全。
基于第一方面,一种可选地实现方式中,所述间距的大小与所述目标光型的长度呈正相关关系,即间距越大,目标光型的长度越长,间距越小,目标光型的长度越短。
可见,基于目标光型的长度,驾驶人员或前方车辆能够迅速的确定车辆与前方车辆之间的间距,进而准确的预判车辆与前方车辆是否有出现相撞的可能,提高了行车的安全。
基于第一方面,一种可选地实现方式中,所述行车信息为所述车辆周围存在待识别对象,所述目标光型至少覆盖目标区域,所述目标区域为所述待识别对象所占的区域。
可见,车灯出射光能够照亮位于目标区域内的待识别对象,以使车辆能够准确地识别待识别对象的类型,而且,驾驶人员基于被照亮的待识别对象,有助于驾驶人员预判是否对该待识别对象进行避让,提高了行车安全。
基于第一方面,一种可选地实现方式中,车辆根据待识别对象的类型确定行车决策。
可见,在车辆的待行车路径上存在待识别对象的情况下,车辆可照亮该待识别对象。车辆能够识别到被照亮的待识别对象识别出具体的类型,以便于车辆执行对应的行车决策 或车辆的驾驶人员根据被照亮的待识别对象驾驶车辆避让等,提高了车辆前方存在待识别对象的场景下,车辆驾驶的安全。
基于第一方面,一种可选地实现方式中,目标光型包括第一光型和第二光型,该第一光型与行车信息对应,该第二光型至少覆盖目标区域,所述目标区域为所述待识别对象所占的区域。
可见,本方面所示的第一光型能够照亮待行车路径,第二光型能够位于对待行车路径周围的待识别对象照亮,在保证车辆行车的同时,还能够准确的确定路径周围的行车安全。
基于第一方面,一种可选地实现方式中,待行车路径的信息可为如下所示的一项或多项:
待行车路径的形态、待行车路径的尺寸、待行车路径所包括的路口的位置、待行车路径上的红绿灯的情况、待行车路径上车辆当前位置与最近的路口之间的距离。其中,待行车路径的形态可为待行车路径的车道线是直行车道线,或待行车路径的车道线是弯曲车道线。待行车路径的尺寸可为待行车路径的车道线的宽度和/或长度等,又如,待行车路径的尺寸还可为,若待行车路径的车道线是弯曲的,那么该待行车路径的车道线的尺寸还可为待行车路径的车道线的弧度和/或弯曲方向。若车辆的待行车路径位于多岔路场景下,那么,该待行车路径的信息还可包括待行车的路口的位置。待行车路径上车辆当前位置与最近的路口之间的距离可指,车辆的当前位置与待行车的路口之间的距离。
可见,目标光型能够指示多种不同的待行车信息,提高了目标光型所应用的场景的数量,提高了行车的效率和安全。
基于第一方面,一种可选地实现方式中,获取待行车路径的信息包括:车辆的相机对待行车路径进行拍摄以获取包括待行车路径的信息的视频流。车辆的处理器接收来自相机的视频流。处理器提取视频流所包括的视频帧。处理器通过预设提取速度从视频流中提取视频帧。处理器基于视频帧分析出待行车路径的信息。
可见,车辆基于包括待行车路径的信息的视频流获取待行车路径的信息的方式,保证了车辆能够准确地获取到待行车路径的具体情况,有效地提高了获取目标光型的准确性,避免目标光型所提示的待行车路径的信息出现偏差的可能。
基于第一方面,一种可选地实现方式中,该提取速度越快,越能够获取待行车路径上最新的信息。而提取速度越慢,越能够节省处理器的功耗。
可见,车辆能够基于车辆的具体情况(如剩余电量,行车的路况的复杂程度)确定提取速度。
基于第一方面,一种可选地实现方式中,所述获取行车信息之前,所述方法还包括:车辆在满足触发条件的情况下,触发执行获取待行车路径的信息的步骤。该触发条件为驾驶人员输入的用于在待行车路径显示目标光型的指令,或,该触发条件可为如下所示的至少一项:
车辆的当前速度大于或等于第一预设值,车辆当前所位于的环境亮度小于或等于第二预设值,或车辆的待行车路径的形态出现改变,车速的变化量大于或等于第三预设值,或所述环境亮度的变化量大于或等于第四预设值等。其中,该行车路径的形态出现改变可为: 车辆的待行车路径用于指示车辆由直行方向切换为转向状态,或,车辆的待行车路径用于指示由转向状态切换为直行状态,或车辆的待行车路径用于指示车辆即将行车到交叉的路口处,或车辆的待行车路径用于指示车道线的尺寸出现改变(如车道线的宽度出现改变)。
可见,车辆能够根据触发条件确定是否在待行车路径上显示目标光型,避免了在无需显示目标光型的场景下,显示该目标光型所带来的功耗的浪费。
基于第一方面,一种可选地实现方式中,沿待行车路径的延伸方向,该目标光型的中心线可与待行车路径的中心线重合,或,该目标光型的中心线可与待行车路径的中心线之间的偏移量小于或等于第一间距,例如,该第一间距可为0.5米。
可见,通过该目标光型的该显示方式,以保证目标光型能够准确地显示在待行车路径上。
基于第一方面,一种可选地实现方式中,沿所述待行车路径的横向方向,所述目标光型的边界线与所述待行车路径的车道线之间的间距小于或等于第二间距,例如,所述目标光型的左侧边界线与所述待行车路径的左侧车道线之间的间距小于或等于第二间距,又如,所述目标光型的右侧边界线与所述待行车路径的右侧车道线之间的间距小于或等于第二间距,或,在该第二显示属性所示的长度为第一位置和第二位置之间的长度的情况下,沿待行车路径的延伸方向,目标光型的上下两个边界线分别与第一位置和第二位置重合。
可见,该目标光型能够准确地指示车道线的宽度或准确地指示第一位置和第二位置之间的间距,提高了行车的安全。
基于第一方面,一种可选地实现方式中,目标光型的宽度等于待行车路径的车道线所具有的最窄的宽度,那么,驾驶人员基于被照亮的待行车路径,驾驶人员能够准确的判断出车道线突然收窄的情况,提高了驾驶的安全性。
基于第一方面,一种可选地实现方式中,车辆在待行车路径的斑马线上形成目标光型,该目标光型能够照亮斑马线。那么,行人在斑马线上行走会注意到该目标光型,有助于行人在斑马线上对车辆的避让。而且因目标光型能够照亮斑马线,那么,斑马线就不会成为驾驶人员的视线盲区,有效地避免了车辆和行人之间出现安全事故的可能。
基于第一方面,一种可选地实现方式中,待行车路径的形态出现改变的程度越大,那么,以目标光型显示在地面上的光的亮度越亮,待行车路径的形态出现改变的程度越小,那么,亮度越暗,如,待行车路径的结构呈弧度,该弧度越大,亮度越亮,该弧度越小,亮度越暗。
可见,基于该亮度能够快速的确定待行车路径的形态出现改变的情况,以保证驾驶人员行车过程中的安全。
基于第一方面,一种可选地实现方式中,车辆的待行车路径指示车道线的尺寸出现改变时的亮度大于待行车路径指示车道线的尺寸未出现改变时的亮度。
可见,基于该亮度能够快速的确定待行车路径的尺寸出现改变的情况,以保证驾驶人员行车过程中的安全。
基于第一方面,一种可选地实现方式中,车辆出射光以目标光型显示于斑马线上时的亮度大于车辆出射光以目标光型未显示于斑马线上时的亮度。
可见,基于该目标光型,有助于行人注意到该车辆即将行驶入斑马线上的情况,避免车辆与行车之间出现安全事故。
基于第一方面,一种可选地实现方式中,车辆的目标光型的范围内出现障碍物(如行人、其他车辆等)时,目标光型范围内的光亮度大于车辆的目标光型的范围内未出现障碍物时的亮度。
可见,基于该亮度,有助于驾驶人员注意到车辆前方出现障碍物的情况,提高了行车的安全。
基于第一方面,一种可选地实现方式中,所述将车辆出射的光束以所述目标光型,显示于所述待行车路径上之后,所述方法还包括:车辆通过车辆的相机重新采集校准图像,该校准图像包括待行车路径以及在该待行车路径上显示的目标光型。车辆根据校准图像判断目标光型是否满足重校准条件,具体地,车辆确定该校准图像满足下述所示的至少一个条件的情况下,确定该目标光型满足重校准条件:
车辆判断目标光型的中心线与待行车路径的车道中心线出现偏移、或车辆判断目标光型的中心线与待行车路径的车道中心线之间的距离大于偏移量,或,在目标光型的宽度等于待行车路径的车道线的宽度的情况下,沿待行车路径的横向方向,目标光型两侧的边界线与车道线两侧的边界线出现偏移、或在目标光型的弯曲方向与待行车路径的弯曲方向需要一致的情况下,目标光型的弯曲方向与待行车路径的弯曲方向不一致等。
可见,本方面基于该重校准条件,能够准确地确定目标光型是否成功地显示于待行车路径上,避免目标光型未成功地显示于待行车路径上的情况,提高了目标光型指示行车信息的准确性。
本发明实施例第二方面提供了一种灯光系统,所述灯光系统包括车灯模块和控制单元,所述控制单元与所述车灯模块连接,所述控制单元用于,获取行车信息,所述行车信息包括导航信息、驾驶辅助信息以及车机数据中的至少一种,还用于获取待行车路径的信息;还用于获取与所述行车信息和所述待行车路径的信息对应的目标光型;所述车灯模块用于将车辆出射的光束以所述目标光型,显示于所述待行车路径上。
本方面所示的有益效果的说明,请参见第一方面所示,不做赘述。
基于第二方面,一种可选地实现方式中,控制单元用于:获取至少一个第一显示属性,所述至少一个第一显示属性与所述行车信息对应;获取至少一个第二显示属性,所述至少一个第二显示属性与所述待行车路径的信息对应;确定具有所述至少一个第一显示属性和所述至少一个第二显示属性的光型为所述目标光型。
基于第二方面,一种可选地实现方式中,控制单元用于:根据导航信息所包括的M个平面坐标,所述待行车路径包括M个平面坐标中的第i个平面坐标至第j个平面坐标,其中,i和j均为正整数,且i大于或等于1,j大于i且小于或等于M。车辆根据待行车路径的形态确定目标光型的形状,例如,若待行车路径所包括的多个平面坐标沿直线方向延伸,那么,确定目标光型呈矩形。又如,若确定待行车路径所包括的多个平面坐标沿弧形方向延伸,那么,控制单元确定目标光型呈弧形。
基于第二方面,一种可选地实现方式中,该第二显示属性可为如下所示的一项或多项:
控制单元根据待行车路径的尺寸确定第二显示属性,如第二显示属性为宽度,该第二显示属性所包括的宽度等于待行车路径的车道线的宽度。控制单元所确定的第二显示属性为长度,如该第二显示属性所示的长度为第一位置和第二位置之间的长度,其中,第一位置为车辆当前位置,第二位置为导航信息所包括的最靠近车辆的路口的位置,该第二位置可为待行车路径上,最靠近车辆的红绿灯的位置。控制单元确定第一显示属性为弧形,该第二显示属性包括弯曲方向,第二显示属性所包括的弯曲方向和待行车路径的车道线的弯曲方向一致。
基于第二方面,一种可选地实现方式中,所述控制单元用于:获取行车列表,所述行车列表包括不同的行车信息和不同的显示属性的对应关系;获取所述至少一个第一显示属性,所述至少一个第一显示属性为所述行车列表中,与所述行车信息对应的显示属性。
基于第二方面,一种可选地实现方式中,所述目标光型的宽度大于或等于所述车辆的宽度,和/或,所述目标光型的长度大于或等于所述车辆的长度。
基于第二方面,一种可选地实现方式中,控制单元用于在待行车路径上形成目标光型,该目标光型的宽度大于或等于车辆的宽度,该目标光型能够指示出车辆行车至目标光型的区域内时,所占用的宽度。
基于第二方面,一种可选地实现方式中,所述目标光型与所述待行车路径的形态一致,例如,待行车路径的形态呈矩形,目标光型呈矩形。又如,待行车路径的形态呈弧形,目标光型呈弧形。
基于第二方面,一种可选地实现方式中,所述目标光型与所述待行车路径的尺寸相关。
基于第二方面,一种可选地实现方式中,若所述待行车路径呈弧形,所述目标光型也呈弧形,且所述待行车路径的弯曲方向和所述目标光型的弯曲方向一致。
基于第二方面,一种可选地实现方式中,所述行车信息为来自所述车辆的高级驾驶辅助系统ADAS的行车决策,所述目标光型与所述行车决策的类型对应。
基于第二方面,一种可选地实现方式中,行车决策为车辆的行驶意图,行驶意图包括如下所示的至少一项:直行、变道、转向或入岔路口等。
基于第二方面,一种可选地实现方式中,行车决策为紧急决策,紧急决策包括如下所示的至少一项:紧急刹车、紧急避险或车辆出现故障。
基于第二方面,一种可选地实现方式中,行车决策为车辆行驶预判事件,该车辆行驶预判事件包括车辆处于安全状态或车辆处于危险状态。
基于第二方面,一种可选地实现方式中,所述行车信息为所述车辆的车速,所述车辆的车速与所述目标光型的长度正相关关系,即车速越快,目标光型的长度越长,车速越慢,目标光型的长度越短。
基于第二方面,一种可选地实现方式中,所述车辆的车速与以目标光型显示在地面上的光的闪光频率正相关关系,即车速越快,闪光频率越高,车速越慢,闪光频率越低。
基于第二方面,一种可选地实现方式中,所述车辆的车速与以目标光型显示在地面上的光的亮度正相关关系,即车速越快,亮度越亮,车速越慢,亮度越暗。
基于第二方面,一种可选地实现方式中,在下述一项或多项情况下,以目标光型显示 在地面上的光具有闪光频率:
待行车路径的形态出现改变,那么,以目标光型显示在地面上的光具有闪光频率,该行车路径的形态出现改变可为:车辆的待行车路径用于指示车辆由直行状态切换为转向状态,或,车辆的待行车路径用于指示由转向状态切换为直行状态,或车辆的待行车路径用于指示车辆即将行车到交叉的路口处,或车辆的待行车路径用于指示车道线的尺寸出现改变(如车道线的宽度出现改变)。
或,车辆的目标光型显示于斑马线上。
或,车辆的目标光型的范围内出现障碍物(如行人、其他车辆等)。
基于第二方面,一种可选地实现方式中,所述行车信息为所述车辆所位于的环境亮度。
基于第二方面,一种可选地实现方式中,所述行车信息为所述车辆与前方车辆之间的间距,所述间距的大小与以目标光型显示在地面上的光的亮度呈负相关关系,即间距越大,亮度越低,间距越小,亮度越高。
基于第二方面,一种可选地实现方式中,所述间距的大小与以目标光型显示在地面上的光的闪光频率呈负相关关系,即间距越大,闪光频率越低,间距越小,闪光频率越高。
基于第二方面,一种可选地实现方式中,所述间距的大小与所述目标光型的长度呈正相关关系,即间距越大,目标光型的长度越长,间距越小,目标光型的长度越短。
基于第二方面,一种可选地实现方式中,所述行车信息为所述车辆周围存在待识别对象,所述目标光型至少覆盖目标区域,所述目标区域为所述待识别对象所占的区域。
基于第二方面,一种可选地实现方式中,控制单元用于采集包括待识别对象的待识别图像,控制单元根据待识别图像获取待识别对象的类型。
基于第二方面,一种可选地实现方式中,控制单元根据待识别对象的类型确定行车决策。
基于第二方面,一种可选地实现方式中,目标光型包括第一光型和第二光型,该第一光型与行车信息对应,该第二光型至少覆盖目标区域,所述目标区域为所述待识别对象所占的区域。
基于第二方面,一种可选地实现方式中,待行车路径的信息可为如下所示的一项或多项:
待行车路径的形态、待行车路径的尺寸、待行车路径所包括的路口的位置、待行车路径上的红绿灯的情况、待行车路径上车辆当前位置与最近的路口之间的距离。其中,待行车路径的形态可为待行车路径的车道线是直行车道线,或待行车路径的车道线是弯曲车道线。待行车路径的尺寸可为待行车路径的车道线的宽度和/或长度等,又如,待行车路径的尺寸还可为,若待行车路径的车道线是弯曲的,那么该待行车路径的车道线的尺寸还可为待行车路径的车道线的弧度和/或弯曲方向。若车辆的待行车路径位于多岔路场景下,那么,该待行车路径的信息还可包括待行车的路口的位置。待行车路径上车辆当前位置与最近的路口之间的距离可指,车辆的当前位置与待行车的路口之间的距离。
基于第二方面,一种可选地实现方式中,控制单元用于通过相机对待行车路径进行拍摄以获取包括待行车路径的信息的视频流。控制单元接收来自相机的视频流。控制单元提 取视频流所包括的视频帧。控制单元通过预设提取速度从视频流中提取视频帧。控制单元用于基于视频帧分析出待行车路径的信息。
基于第二方面,一种可选地实现方式中,该提取速度越快,越能够获取待行车路径上最新的信息。而提取速度越慢,越能够节省处理器的功耗。
基于第二方面,一种可选地实现方式中,所述控制单元还用于:在满足触发条件的情况下,触发执行获取待行车路径的信息的步骤。该触发条件为驾驶人员输入的用于在待行车路径显示目标光型的指令,或,该触发条件可为如下所示的至少一项:
车辆的当前速度大于或等于第一预设值,车辆当前所位于的环境亮度小于或等于第二预设值,或车辆的待行车路径的形态出现改变,车速的变化量大于或等于第三预设值,或所述环境亮度的变化量大于或等于第四预设值等。其中,该行车路径的形态出现改变可为:车辆的待行车路径用于指示车辆由直行方向切换为转向状态,或,车辆的待行车路径用于指示由转向状态切换为直行状态,或车辆的待行车路径用于指示车辆即将行车到交叉的路口处,或车辆的待行车路径用于指示车道线的尺寸出现改变(如车道线的宽度出现改变)。
基于第二方面,一种可选地实现方式中,沿待行车路径的延伸方向,该目标光型的中心线可与待行车路径的中心线重合,或,该目标光型的中心线可与待行车路径的中心线之间的偏移量小于或等于第一间距,例如,该第一间距可为0.5米。
基于第二方面,一种可选地实现方式中,沿所述待行车路径的横向方向,所述目标光型的边界线与所述待行车路径的车道线之间的间距小于或等于第二间距,例如,所述目标光型的左侧边界线与所述待行车路径的左侧车道线之间的间距小于或等于第二间距,又如,所述目标光型的右侧边界线与所述待行车路径的右侧车道线之间的间距小于或等于第二间距,或,在该第二显示属性所示的长度为第一位置和第二位置之间的长度的情况下,沿待行车路径的延伸方向,目标光型的上下两个边界线分别与第一位置和第二位置重合。
基于第二方面,一种可选地实现方式中,目标光型的宽度等于待行车路径的车道线所具有的最窄的宽度,那么,驾驶人员基于被照亮的待行车路径,驾驶人员能够准确的判断出车道线突然收窄的情况,提高了驾驶的安全性。
基于第二方面,一种可选地实现方式中,车灯模块用于在待行车路径的斑马线上形成目标光型,从而照亮斑马线。
基于第二方面,一种可选地实现方式中,待行车路径的形态出现改变的程度越大,那么,以目标光型显示在地面上的光的亮度越亮,待行车路径的形态出现改变的程度越小,那么,以目标光型显示在地面上的光的亮度越暗,如,待行车路径的结构呈弧度,该弧度越大,亮度越高,该弧度越小,亮度越低。
基于第二方面,一种可选地实现方式中,待行车路径出现改变时的亮度大于待行车路径未出现改变时的亮度,例如若待行车路径一直处于直行的状态时的亮度小于待行车路径出现转向时的亮度。
基于第二方面,一种可选地实现方式中,车辆的待行车路径指示车道线的尺寸出现改变时的亮度大于待行车路径指示车道线的尺寸未出现改变时的亮度。
基于第二方面,一种可选地实现方式中,车辆出射光以目标光型显示于斑马线上时的 亮度大于车辆出射光以目标光型未显示于斑马线上时的亮度。
基于第二方面,一种可选地实现方式中,车辆的目标光型的范围内出现障碍物(如行人、其他车辆等)时,车灯出射光的亮度大于车辆的目标光型的范围内未出现障碍物时车灯出射光的亮度。
基于第二方面,一种可选地实现方式中,将车灯模块出射的光束以所述目标光型,显示于所述待行车路径上,控制单元用于通过相机重新采集校准图像,该校准图像包括待行车路径以及在该待行车路径上显示的目标光型。控制单元用于根据校准图像判断目标光型是否满足重校准条件,具体地,控制单元确定该校准图像满足下述所示的至少一个条件的情况下,确定该目标光型满足重校准条件:
控制单元用于判断目标光型的中心线与待行车路径的车道中心线出现偏移、或控制单元用于判断目标光型的中心线与待行车路径的车道中心线之间的距离大于偏移量,或,在目标光型的宽度等于待行车路径的车道线的宽度的情况下,沿待行车路径的横向方向,目标光型两侧的边界线与车道线两侧的边界线出现偏移、或在目标光型的弯曲方向与待行车路径的弯曲方向需要一致的情况下,目标光型的弯曲方向与待行车路径的弯曲方向不一致等。
本发明实施例第三方面提供了一种一种车辆,该车辆包括如上述第二方面所示的灯光系统。
附图说明
图1为本申请所提供的车辆的一种实施例功能框图;
图2为本申请所提供的车辆灯光的控制方法的第一种实施例步骤流程图;
图3a为本申请所提供的第一种应用场景示例图;
图3b为本申请所提供的第二种应用场景示例图;
图3c为本申请所提供的第三种应用场景示例图;
图3d为本申请所提供的第四种应用场景示例图;
图3e为本申请所提供的第五种应用场景示例图;
图4为本申请所提供的第一种应用场景对比示例图;
图4a所示为已有方案所示的车辆右转的具体路况示例图;
图4b所示为本申请所示的车辆右转的具体路况示例图;
图5为本申请所提供的第二种应用场景对比示例图;
图5a为已有方案所示的车辆近光灯的照明示例图;
图5b为本申请所提供的目标光型的照明示例图;
图6为本申请所提供的第三种应用场景对比示例图;
图6a为已有方案所示的车道线的宽度变化时车辆近光灯的照明示例图;
图6b为本申请所提供的车道线的宽度变化时目标光型的照明示例图;
图7为本申请所提供的第四种应用场景对比示例图;
图7a为已有方案所示的车辆近光灯照亮斑马线的照明示例图;
图7b为本申请所提供的目标光型照亮斑马线的照明示例图;
图8a为已有方案所示的车辆近光灯照亮车辆前方的照明示例图;
图8b为本申请所提供的目标光型照亮车辆前方的照明示例图;
图8c为车辆与前方车辆之间具有第一间距的示例图;
图8d为车辆与前方车辆之间具有第二间距的示例图;
图9为本申请所提供的车辆灯光的控制方法的第二种实施例步骤流程图;
图10为本申请所提供的第六种应用场景对比示例图;
图10a为已有方案所示的车辆停车入库的照明示例图;
图10b为本申请所提供的目标光型对车辆停车入库的照明示例图;
图11为本申请所提供的第七种应用场景对比示例图;
图11a为已有方案所示的车辆出现危险状态的照明示例图;
图11b为本申请所提供的目标光型对车辆出现危险状态时的照明示例图;
图12为本申请所提供的车辆灯光的控制方法的第三种实施例步骤流程图;
图13为本申请所提供的第八种应用场景对比示例图;
图13a为已有方案所示的待行车路径存在的待识别对象的照明示例图;
图13b为本申请所提供的目标光型对待行车路径存在的待识别对象的照明示例图;
图14为本申请所提供的车辆灯光的控制方法的第四种实施例步骤流程图;
图15为本申请所提供的第九种应用场景对比示例图;
图15a为已有方案所示的车辆前方存在的待识别对象的照明示例图;
图15b为本申请所提供的目标光型对车辆前方存在的待识别对象的照明示例图;
图16为本申请所提供的灯光系统的一种实施例结构示例图。
具体实施方式
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
以下首先对本申请所应用的车辆进行说明,参见图1所示,其中,图1为本申请所提供的车辆的一种实施例功能框图。在一个实施例中,将车辆100配置为完全或部分地自动驾驶模式。例如,车辆100可以在处于自动驾驶模式中的同时控制自身,并且可通过人为操作来确定车辆及其周边环境的当前状态,确定周边环境中的至少一个其他车辆的可能行为,并确定该其他车辆执行可能行为的可能性相对应的置信水平,基于所确定的信息来控制车辆100。在车辆100处于自动驾驶模式中时,可以将车辆100置为在没有和人交互的情况下操作。车辆100可包括各种系统,每个系统可包括多个元件。另外,车辆100的每个系统和元件可以通过有线或者无线互连。
本实施例所示的车辆包括传感器系统120,传感器系统120可包括感测关于车辆100周边的环境的信息的若干个传感器。例如,传感器系统120可包括定位系统121(定位系 统可以是全球定位系统(global positioning system,GPS)系统,也可以是北斗系统或者其他定位系统)、惯性测量单元(inertial measurement unit,IMU)122、雷达123、激光测距仪124以及相机125。传感器系统120还可包括被监视车辆100的内部系统的传感器(例如,车内空气质量监测器、燃油量表、机油温度表等)。来自这些传感器中的一个或多个的传感器数据可用于检测对象及其相应特性(位置、形状、方向、速度等)。这种检测和识别是自主车辆100的安全操作的关键功能。定位系统121可用于估计车辆100的地理位置。IMU122用于基于惯性加速度来感测车辆100的位置和朝向变化。在一个实施例中,IMU122可以是加速度计和陀螺仪的组合。雷达123可利用无线电信号来感测车辆100的周边环境内的物体。在一些实施例中,除了感测物体以外,雷达123还可用于感测物体的速度和/或前进方向。本实施例对雷达123的具体类型不做限定,例如,雷达123可为毫米波雷达或激光雷达等。激光测距仪124可利用激光来感测车辆100所位于的环境中的物体。在一些实施例中,激光测距仪124可包括一个或多个激光源、激光扫描器以及一个或多个检测器,以及其他系统组件。相机125可用于捕捉车辆100的周边环境的多个图像。相机125可以是静态相机、视频相机、单\双目摄像头或红外成像仪。
车辆100还包括高级驾驶辅助系统(advanced driving assistance system,ADAS)110。ADAS110在车辆行车过程中随时来感应周围的环境,收集数据,进行静态、动态物体的辨识、侦测与追踪,并结合导航地图数据,进行系统的运算与分析,从而预先让驾驶者察觉到可能发生的危险,有效增加车辆驾驶的舒适性和安全性。例如,ADAS110可通过传感系统120获取的数据控制车辆。又如,ADAS110可通过车机数据控制车辆,其中,车机数据可为车辆仪表盘上的主要数据(油耗、发动机转速、温度等)、车速信息、方向盘转角信息,或车身姿态数据等。
ADAS110控制车辆的方式可为下述所示的一项或多项:
ADAS110调整车辆100的前进方向。ADAS110控制车辆的引擎的操作速度并进而控制车辆100的速度。ADAS110操作由相机125捕捉的图像,以便识别车辆100周边环境中的物体和/或特征。在一些实施例中,ADAS110可以用于为环境绘制地图、跟踪物体、估计物体的速度等等。ADAS110确定车辆100的行车路线,在一些实施例中,ADAS110可结合来自传感系统120的一个或多个预定地图数据以为车辆100确定行车路线。ADAS110可识别、评估和避免或者以其他方式越过车辆100的环境中的潜在障碍物。
车辆100通过外围设备130与外部传感器、其他车辆、其他计算机系统或用户之间进行交互。外围设备130可包括无线通信系统131、车载电脑132、麦克风133和/或扬声器134。
在一些实施例中,外围设备130提供车辆100的用户与用户接口交互的手段。例如,车载电脑132可向车辆100的用户提供信息。用户接口还可操作车载电脑132来接收用户的输入。车载电脑132可以通过触摸屏进行操作。在其他情况中,外围设备130可提供用于车辆100与位于车内的其它设备通信的手段。例如,麦克风133可从车辆100的用户接收音频(例如,语音命令或其他音频输入)。类似地,扬声器134可向车辆100的用户输出音频。
无线通信系统131可以直接地或者经由通信网络来与一个或多个设备无线通信。例如,无线通信系统131可使用第三代移动通信技术(3rd-generation,3G)蜂窝通信,例如码分多址(code division multiple access,CDMA)、全球移动通信系统(global system for mobile communications,GSM)、通用分组无线服务技术(general packet radio service,GPRS)。无线通信系统131可使用第四代移动通信技术(the 4th generation mobile communication technology,4G)蜂窝通信,例如长期演进(long term evolution,LTE)。无线通信系统131还可使用第五代移动通信技术(5th generation mobile communication technology,5G)蜂窝通信。无线通信系统131可利用无线局域网(wireless local area network,WLAN)通信。在一些实施例中,无线通信系统131可利用红外链路、蓝牙或紫蜂协议(ZigBee)与设备直接通信。无线通信系统131还可利用各种车辆通信系统,例如,无线通信系统131可包括一个或多个专用短程通信(dedicated short range communications,DSRC)设备,这些设备可包括车辆和/或路边台站之间的公共和/或私有数据通信。
车辆100的部分或所有功能受计算机系统140控制。计算机系统140可基于从各种系统(例如,传感系统120、ADAS110、外围设备130)以及从用户接口接收的输入来控制车辆100的功能。计算机系统140可包括至少一个处理器141,处理器141执行存储在例如存储器142这样的非暂态计算机可读介质中的指令。计算机系统140还可以是采用分布式方式控制车辆100的个体组件或子系统的多个计算设备。
本实施例对处理器141的类型不做限定,例如,该处理器141可为一个或多个现场可编程门阵列(field-programmable gate array,FPGA)、专用集成芯片(application specific integrated circuit,ASIC)、系统芯片(system on chip,SoC)、中央处理器(central processor unit,CPU)、网络处理器(network processor,NP)、数字信号处理电路(digital signal processor,DSP)、微控制器(micro controller unit,MCU),可编程控制器(programmable logic device,PLD)或其它集成芯片,或者上述芯片或者处理器的任意组合等。其中,处理器141可位于车辆内部,或处理器141可以位于远离该车辆并且与该车辆进行无线通信。
在一些实施例中,存储器142可包含指令(例如,程序逻辑),指令可被处理器141执行来执行车辆100的各种功能。除了指令以外,存储器142还可存储数据,例如地图数据、路线信息,车辆的位置、方向、速度以及其它的车辆数据。存储器142所存储的信息可在车辆100在自主、半自主和/或手动模式中操作期间被车辆100和计算机系统140使用。
本实施例所示的车辆100还包括车灯模块150,该车灯模块150所出射的光束,能够在车辆100的待行车路径上显示出目标光型,以下结合各个实施例对车辆出射的光束,在待行车路径上形成目标光型的过程进行说明。本实施例所示的灯光模块不仅可应用至车辆上,还可应用至船、飞机、直升飞机等驾驶工具上。
实施例一
本实施例结合图2所示对车辆100在待行车路径上显示目标光型的过程进行说明,其中,图2为本申请所提供的车辆灯光的控制方法的第一种实施例步骤流程图。
步骤201、车辆确定满足触发条件。
本实施例中,在车辆确定满足该触发条件的情况下,那么,会启动执行本实施例所示的方法的过程,以使得车辆出射的光束能够在待行车路径上显示目标光型。
可选地,在车辆接收到驾驶人员通过输入开启指令的方式确定满足该触发条件,例如,车辆接收到驾驶人员输入的启动指令,该启动指令用于在待行车路径显示目标光型的指令。可选地,驾驶人员可通过对灯光系统输入的语音、对车辆座舱屏幕输入的触摸手势或按压操作等方式,输入该启动指令。
又如,车辆可确定触发条件可为如下所示的至少一项:
车辆的当前速度大于或等于第一预设值(例如,该第一预设值可为60千米/时),车辆当前所位于的环境亮度小于或等于第二预设值(例如,该第二预设值可为50勒克斯),或车辆的待行车路径的形态出现改变,或车辆的车速的变化量大于或等于第三预设值,或车辆的环境亮度的变化量大于或等于第四预设值,或车辆的剩余电量大于或等于第五预设值等。
其中,该行车路径的形态出现改变可为:车辆由直行方向切换为转向状态,或,车辆由转向状态切换为直行状态,或车辆即将行车到交叉的路口处,或车辆所行车的车道线的尺寸出现改变(如车道线的宽度出现改变)。
车辆的车速的变化量可为:车辆在时刻T1获取到车辆的车速为V1,而车辆在时刻T2获取到的车速为V2,其中,时刻T1为当前时刻,时刻T2早于时刻T1,该车辆的车速的变化量大于或等于第三预设值可为:V2和V1之间的差大于或等于该第三预设值,例如该预设值可为10千米/时,可知,在车辆的车速的变化量大于或等于10千米/时的情况下,满足触发条件。
对车辆的环境亮度的变化量大于或等于第四预设值的说明,请参见上述对车辆的车速的变化量大于或等于第三预设值的说明,具体不做赘述。
本实施例中,车辆经由步骤201确定满足触发条件的情况下,触发执行下述步骤:
步骤202、车辆获取导航信息。
本实施例所示的车辆根据驾驶人员输入的需要到达的导航目的地获取该导航信息。驾驶人员可通过对车载导航系统输入语音、对车载导航系统的座舱屏幕输入触摸手势、按压车载导航系统的按钮等方式,输入导航目的地。
本实施例所示可为如图1所示的计算机系统获取来自定位系统的导航信息。该导航信息可为车辆到达导航目的地的一系列平面坐标。例如图3a所示,其中,图3a为本申请所提供的第一种应用场景示例图。本实施例所示的导航信息为车辆300行车至目的地的过程中,需要依次经过的一系列平面坐标,如平面坐标A(x1,y1),平面坐标B(x2,y2)、平面坐标c(x3,y3),依次类推,平面坐标K(xk,yk),其中,平面坐标K(xk,yk)为车辆行车的目的地的平面坐标或靠近车辆行车的目的地的平面坐标。可知,车辆依次经过导航信息所包括的各个平面坐标,能够成功到达目的地。
车辆获取导航信息的过程可为:车辆获取到需要行车的目的地的情况下,车辆可获取车辆当前所在位置的GPS坐标以及目的地的GPS坐标。车辆获取地图数据,进而根据地图 数据、车辆当前所在位置的GPS坐标以及目的地的GPS坐标,获取上述所示的导航信息。
步骤203、车辆获取待行车路径的信息。
本实施例所示的车辆所出射的光束能够显示在待行车路径上,为此,车辆需要获取待行车路径的信息。具体地,车辆能够根据导航信息,确定待行车路径,如图3b所示可知,其中,图3b为本申请所提供的第二种应用场景示例图。本实施例所示的待行车路径301包括上述所示的导航信息所包括的部分或全部平面坐标。本实施例对待行车路径的长度不做限定,例如,该待行车路径的长度可为10米,此种示例下说明该待行车路径包括导航信息所包括的位于车辆300前方10米之内的平面坐标。
本实施例所示的待行车路径的信息可为如下所示的一项或多项:
待行车路径的形态、待行车路径的尺寸、待行车路径所包括的路口的位置、待行车路径上的红绿灯的情况、待行车路径上车辆当前位置与最近的路口之间的距离。其中,待行车路径的形态可为待行车路径的车道线是直行车道线,或待行车路径的车道线是弯曲车道线。
例如图3b所示,车辆300向目的地行车的过程中,待行车路径的车道线为弯曲的车道线。待行车路径的尺寸可为待行车路径的车道线的宽度和/或长度等,又如,待行车路径的尺寸还可为,若待行车路径的车道线是弯曲的,那么该待行车路径的车道线的尺寸还可为待行车路径的车道线的弧度和/或弯曲方向。
又如图3c所示的示例,其中,图3c为本申请所提供的第三种应用场景示例图。若车辆302的待行车路径位于多岔路场景下,那么,该待行车路径的信息还可包括待行车的路口304的位置。待行车路径上车辆当前位置与最近的路口之间的距离可指,车辆302的当前位置与待行车的路口304之间的距离。
以下对本实施例所示的车辆获取待行车路径的信息的过程进行说明:
如图1所示的车辆的相机对待行车路径进行拍摄以获取包括待行车路径的信息的视频流。对相机的具体说明请参见图1所示,具体不做赘述。车辆的计算机系统接收来自相机的视频流。计算机系统所包括的处理器提取视频流所包括的视频帧,例如,处理器可通过30帧/秒的速度从视频流中提取视频帧。需明确地是,本实施例对处理器提取视频帧的速度的大小不做限定,处理器提取视频帧的速度越快,越能够获取待行车路径上最新的信息。而处理器提取视频帧的速度越慢,越能够节省处理器的功耗。
在具体应用中,处理器可根据当前路况的复杂程度确定提取视频帧的速度。例如,当前行车的路况越复杂(如待行车路径的形态变化频繁,具体例如从直行的状态切换至转向的状态,交叉的路口比较多等情况),那么,处理器可通过较快的速度提取视频帧。又如,若当前行车路况越简单(如待行车路径的形态比较稳定,具体例如,一直处于直行的状态),那么处理器可通过较慢的速度提取视频帧。
处理器提取到视频帧后,可基于视频帧分析出待行车路径的信息,本实施例对处理器所采用的分析方式不做限定,例如,该分析方式可为:物体识别算法、运动中恢复结构(structure from motion,SFM)算法、视频跟踪或人工智能(artificial intelligence,AI)等。
步骤204、车辆获取目标光型。
步骤205、车辆出射的光束,以目标光型显示于待行车路径上。
以下对步骤204至步骤205进行统一说明:
本实施例中,在车辆获取到导航信息以及待行车路径的信息之后,即可获取与导航信息以及行车路径的信息对应的目标光型。具体地,车辆可获取与导航信息对应的一个或多个第一显示属性。车辆再获取与待行车路径的信息对应的一个或多个第二显示属性。那么,车辆确定同时具有该第一显示属性和第二显示属性的光型为目标光型。
车辆获取到目标光型后,车辆根据目标光型出射光束,以保证车辆出射的光束,能够以目标光型显示于待行车路径上。本实施例对显示在待行车路径上的该目标光型与车辆之间的间距的大小不做限定,只要车辆内的驾驶人员能够清楚的看到显示于车辆前方的目标光型即可,例如,该目标光型和该车辆之间的间距可为10米。
以下对本实施例所示的目标光型进行说明:
本实施例中,车辆根据导航信息所包括的各个平面坐标确定第一显示属性。具体地,车辆获取待行车路径,该待行车路径包括M个坐标中的第i个平面坐标至第j个平面坐标,其中,i和j均为正整数,且i大于或等于1,j大于i且小于或等于M。
如图3b所示,该M个平面坐标可为平面坐标A,平面坐标B平面坐标C,依次类推。又如图3c所示,车辆根据导航信息,确定车辆当前位置与车辆待行驶的路口304之间的所包括的平面坐标均为该M个平面坐标(如图3c所示的平面坐标A至平面坐标M),可知车辆依次经由该M个平面坐标行驶,能够行驶至最靠近车辆的路口304。车辆根据M个平面坐标确定第一显示属性。具体确定过程可参见下述如表1所示的第一行车列表,该第一行车列表所示建立了M个平面坐标不同的延伸方向和不同的第一显示属性的对应关系。
表1
M个平面坐标不同的延伸方向 第一显示属性
M个平面坐标沿直线方向延伸 矩形
M个平面坐标沿弧形方向延伸 弧形
可知,结合表1和图3d所示,其中,图3d为本申请所提供的第四种应用场景示例图。若M个平面坐标沿直线方向延伸,那么,车辆确定第一显示属性为矩形。又如结合表1和图3b所示,若M个平面坐标沿弧形方向延伸,那么,车辆确定第一显示属性为弧形。
车辆再根据待行车路径确定第二显示属性,该第二显示属性可为如下所示的一项或多项:
例如,车辆根据待行车路径的尺寸确定第二显示属性。如第二显示属性为宽度,该第二显示属性所包括的宽度等于待行车路径的车道线的宽度。本实施例对第二显示属性所包括的宽度和待行车路径的车道线的宽度之间的关系的说明为可选地示例,又如,该第二显示属性所包括的宽度也可小于车道线的宽度,又如,该第二显示属性所包括的宽度也可大于车道线的宽度车道线的宽度等,具体不做限定。
又如,车辆所确定的第二显示属性为长度。如该第二显示属性所示的长度可为第一位置和第二位置之间的长度,其中,第一位置为车辆当前位置,第二位置为导航信息所包括 的最靠近车辆的路口的位置。该第二位置可为车辆采集到的待行车路径上,最靠近车辆的红绿灯的位置。
又如,若车辆确定第一显示属性为弧形,该第二显示属性包括弯曲方向。具体地,如图3b所示,车辆根据待行车路径的车道线301的弧度,确定第二显示属性所包括的弯曲方向,以保证第二显示属性所包括的弯曲方向和待行车路径的车道线的弯曲方向一致。
在车辆已确定第一显示属性和第二显示属性的情况下,车辆确定同时具有第一显示属性和第二显示属性的光型为目标光型。如图3d所示,在第一显示属性为矩形,且第二显示属性为待行车路径的车道线的宽度(如车道线的宽度3.5米),可知,目标光型的形状为矩形,且该矩形的宽度为3.5米。
如图3e所示,图3e为本申请所提供的第五种应用场景示例图。在第一显示属性为弧形,且该第二显示属性所包括的弯曲方向与该待行车路径的弯曲方向一致,可知,具有该第二显示属性的目标光型呈弧形,且弯曲方向与待行车路径321的弯曲方向一致。
以下对目标光型显示在待行车路径上的方式进行可选地说明:
沿待行车路径的延伸方向,该目标光型的中心线可与待行车路径的中心线重合,或,该目标光型的中心线可与待行车路径的中心线之间的偏移量小于或等于第一间距,可知,通过该目标光型的该显示方式,以保证目标光型能够准确地显示在待行车路径上。或,在该第二显示属性所包括的宽度等于待行车路径的车道线的宽度的情况下,那么,沿待行车路径的横向方向,目标光型两侧的边界线与待行车路径的车道线的边界线重合,或目标光型两侧的边界线与待行车路径的车道线的之间的偏移量小于或等于第二间距,或,在该第二显示属性所示的长度为第一位置和第二位置之间的长度的情况下,沿待行车路径的延伸方向,目标光型的上下两个边界线分别与第一位置和第二位置重合。
在目标光型显示在待行车路径上,该目标光型能够指示该车辆在行驶的过程中,所占的区域。如图3d所示的目标光型310可知,该车辆会行驶至该目标光型310所占的车道位置处。又如图3e的目标光型321可知,该车辆会行驶至该目标光型321所占的车道位置处。可见,通过车辆所出射的光束,在待行车路径上所显示的目标光型,有助于提高导航的准确性,而且能够实现对待行车路径的照明,保证了车辆按照导航行车过程中的安全。例如图4所示,图4为本申请所提供的一种应用场景对比示例图。
其中,车辆由该导航地图所示可知,车辆需要在下个路口右转。图4a所示为已有方案所示的车辆右转的具体路况示例图,如图4a所示可知,车辆401根据车载导航虽然确定需要在下个路口,即路口402右转,但是,车辆401此时还处于直行的状态,车辆所出射的灯光(如车辆的近光灯所出射的灯光),仅能够照亮车辆401前方有限的区域,无法照亮车辆待行车的路口402。
而本实施例所示的方法,如图4b所示,图4b所示为本申请所示的车辆右转的具体路况示例图。在确定出待行车路径为路口402右转的情况下,即可确定出目标光型403,对确定目标光型403的过程,可参见上述所述,不做赘述。可知,该目标光型403能够根据车辆401的待行车路径照亮路口402,以保证用户驾驶车辆401经由路口402转向行驶的安全。
又如图5所示,其中,图5为本申请所提供的第二种应用场景对比示例图。图5a所示的示例中,图5a为已有方案所示的车辆近光灯的照明示例图。已有方案的车辆501在环境亮度较低的场景中(如夜间、阴天、雨天)等,车辆501的近光灯所出射的灯光照亮范围比较小,如图5a所示的场景,车辆501的近光灯所出射的灯光仅能够实现在车辆501的前方25米之内实现照亮。
而采用本实施例所示的方法,参见图5b所示,其中,图5b为本申请所提供的目标光型的照明示例图。若在车辆502的前方的待行车路径上所显示的目标光型503的长度以20米,宽度为3.5米,形状为矩形为例。可知,本实施例所示的目标光型503是车辆502出射的直接照射在待行车路径上的光束以形成,该目标光型503的亮度大于已有方案所示的近光灯所出射的灯光照亮路径的亮度。由于目标光型503已照亮待行车路径,那么,驾驶人员按照目标光型503照亮的区域进行驾驶,提高了驾驶的安全性。而且路径上的其他人员或车辆,根据目标光型503的提示,能够迅速确定该车辆503即将行驶的位置,便于路径上其他人员或车辆的避让,提高了驾驶安全。
又如图6所示,图6为本申请所提供的第三种应用场景对比示例图。在车辆行车的过程中,经常会遇到车道线的宽度出现变化的场景。如图6a的已有方案所示,其中,图6a为已有方案所示的车道线的宽度变化时车辆近光灯的照明示例图。车辆601的前方路径的车道线的宽度存在突然收窄的情况,即车道线的宽度602大于车道线的宽度603。若驾驶人员无法准确的确定车道线的宽度的变化的情况下,容易出现驾驶危险。
而采用本实施例所示的方法,参加图6b所示,其中,图6b为本申请所提供的车道线的宽度变化时目标光型的照明示例图。车辆604的前方所形成的目标光型605能够准确的照亮前方的路径,而且目标光型605的宽度可等于车道线所具有的最窄的宽度,那么,驾驶人员基于被目标光型照亮的待行车路径,驾驶人员能够准确的判断出车道线突然收窄的情况,提高了驾驶的安全性。
又如图7所示,其中,图7为本申请所提供的第四种应用场景对比示例图。具体如图7a所示,其中,图7a为已有方案所示的车辆近光灯照亮斑马线的照明示例图。在车辆702行驶至斑马线701的过程中,在斑马线701上的行人应该受红灯的指示下不穿过斑马线,而车辆702应该受绿灯的指示穿过斑马线。但是,由于行人安全意识淡薄,在红灯的指示下继续穿过斑马线,若车辆702对行人避让不及,会出现安全事故。
而如图7b所示的采用本实施例所示的方法,其中,图7b为本申请所提供的目标光型照亮斑马线的照明示例图。车辆703能够在待行车路径的斑马线上形成目标光型704,该目标光型704能够照亮斑马线。那么,行人在斑马线上行走会注意到该目标光型704,有助于行人在斑马线上对车辆703的避让。而且因目标光型704能够照亮斑马线,那么,斑马线就不会成为驾驶人员的视线盲区,有效地避免了车辆703和行人之间出现安全事故的可能。
又如图8a所示,其中,图8a为已有方案所示的车辆近光灯照亮车辆前方的照明示例图。在车辆801行驶的过程中,若车辆801的前方出现目标802,该目标802可为其他任意车辆、或非机动车辆、或行人等,在车辆801的行驶过程中,车辆801以及车辆801前 方的目标802,均不确定车辆801行驶过程中,是否会与目标802出现碰撞,即目标802是否位于车辆801的安全距离之外。
而如图8b所示的本实施例所示的方法,其中,图8b为本申请所提供的目标光型照亮车辆前方的照明示例图。车辆803能够在待行车路径上形成目标光型804,该目标光型804的宽度等于车辆803的宽度,该目标光型804能够指示出车辆803行车至目标光型804的区域内时,所占用的宽度。可知,目标805能够基于目标光型804清晰的边界,确定车辆803的行车范围。若目标805出现在目标光型804之内,说明目标805在车辆804的安全距离之内,那么,车辆804与目标805之间出现安全事故的可能性很大。若目标805出现在目标光型804之外,说明目标805在车辆804的安全距离之外,那么,车辆805与目标805之间出现安全事故的可能性很小。
需明确地是,本实施例对目标光型的说明为可选地示例,不做限定,在其他示例中,目标光型还可与来自车机数据(例如车速)相关,那么,对于车辆周围的行人或车辆等,能够基于该目标光型确定车速。
具体地,目标光型长度与车速呈正相关关系,即车速越快,那么,目标光型的长度越长,车速越慢,那么,目标光型的长度越短。车辆可存储如下表2所示的车速与目标光型的对应列表。
表2
车速 目标光型的长度
大于120千米/时 60米至80米之间
120千米/时至80千米/时之间 50米至60米之间
80千米/时至60千米/时之间 40米至50米之间
60千米/时至40千米/时之间 20米至40米之间
40千米/时至20千米/时之间 10米至20米之间
20千米/时至0千米/时之间 0米至10米之间
例如,若车辆确定车速为70千米/时,那么车辆可确定对应的目标光型的长度为45米。又如,若车辆确定车速为大于120千米/时,那么车辆可确定对应的目标光型的长度为80米。需明确地是,本实施例对车速和目标光型的长度的对应关系的说明仅为一种示例,不做限定,只要能够基于目标光型的长度,确定车速的快慢即可。
本实施例所示的目标光型的长度还可与车速呈动态的关系,具体地,车辆获取到车速,并根据下述所示的公式1获取与车速对应的目标光型:
公式1:目标光型的长度L=50+[(120-当前车速)/40]*10
可知,在车辆将当前车速带入至公式1中,即可获取到对应的目标光型的长度。需明确地是,本实施例所示的公式1的说明为可选地示例,不做限定,只要车辆基于该公式1能够创建不同的车速与不同的目标光型的长度之间的线性关系即可。
可选地,本实施例所示的车辆可周期性的将车辆的当前车速带入至公式1中,获取车辆在确定车速的变化量大于或等于预设值的情况下,将车辆的当前车速带入至公式1等。具体地,车辆在时刻T1获取到车辆的车速为V1,而车辆在时刻T2获取到的车速为V2,其 中,时刻T1为当前时刻,时刻T2早于时刻T1,该车辆的车速的变化量大于或等于预设值可为:V2和V1之间的差大于或等于该预设值,例如该预设值可为10千米/时,可知,在车辆的车速的变化量大于或等于10千米/时的情况下,根据上述公式1所示的获取目标光型的长度。
可选地,本实施例所示的以目标光型显示在地面上的光还可具有一定的闪光频率,闪光频率可参见下述所示的至少一项示例:
例如,车辆的车速与闪光频率呈正相关关系,即车速越快,闪光频率越高,车速越慢,闪光频率越低。又如,待行车路径的形态出现改变,那么,以目标光型显示在地面上的光具有闪光频率。
该行车路径的形态出现改变可为:车辆的待行车路径用于指示车辆由直行状态切换为转向状态,或,车辆的待行车路径用于指示由转向状态切换为直行状态,或车辆的待行车路径用于指示车辆即将行车到交叉的路口处,或车辆的待行车路径用于指示车道线的尺寸出现改变(如车道线的宽度出现改变)。又如,车辆的目标光型显示于斑马线上。又如,车辆的目标光型的范围内出现障碍物(如行人、其他车辆等)。又如,车辆的闪光频率与环境亮度呈负相关关系,即环境亮度越暗,闪光频率越高,环境亮度越亮,闪光频率越低。
可选地,本实施例所示的目标光型所具有的亮度,可参见下述所示的至少一项示例:
例如,车辆的车速与以目标光型显示在地面上的光的亮度呈正相关关系,即车速越快,亮度越亮,车速越慢,亮度越暗。又如,待行车路径的形态出现改变的程度越大,那么,亮度越亮,待行车路径的形态出现改变的程度越小,那么,亮度越暗,如,待行车路径的结构呈弧度,且该弧度越大,亮度越亮,该弧度越小,亮度越暗。又如,车辆出射光的亮度和环境亮度匹配,以保证能够起到对驾驶人员提醒的作用的同时,还不会过度的刺激驾驶人员的眼睛。又如,待行车路径出现改变时的亮度大于待行车路径未出现改变时的亮度,即若待行车路径一直处于直行的状态时,车灯出射光的亮度小于待行车路径出现转向时的车灯出射光的亮度。又如,车辆的待行车路径指示车道线的尺寸出现改变时的车灯出射光的亮度大于待行车路径指示车道线的尺寸未出现改变时的车灯出射光的亮度。又如,车灯出射光以目标光型显示于斑马线上时的亮度大于车灯出射光以目标光型未显示于斑马线上时的亮度。又如,车辆的目标光型的范围内出现障碍物(如行人、其他车辆等)时,车灯出射光的亮度大于车辆的目标光型的范围内未出现障碍物时的车灯出射光的亮度。
可选地,本实施例所示的目标光型还与车辆与前方车辆之间的间距相关,该前方车辆位于该车辆的正前方或侧前方等位置,结合图8c和图8d所示,以前方车辆位于车辆正前方为例,例如图8c所示,车辆811与前方车辆812之间的间距为L1,在图8d中,车辆811与前方车辆812之间的间距为L2,且L1小于L1。本示例所示的车辆811所显示的目标光型位于车辆811和前方车辆812之间的路径上,即图8c所示的示例中,目标光型813位于车辆811和前方车辆812之间的路径上,图8d所示的示例中,目标光型814位于车辆811和前方车辆812之间的路径上。
本实施例所示的目标光型的长度与车辆和前方车辆之间的间距呈正相关关系,即,车辆与前方车辆之间的间距越大,那么,目标光型的长度越长,对比于图8c和图8d所示可 知,在图8d所示的车辆811与前方车辆812之间的间距L2大于图8c所示的车辆811与前方车辆812之间的间距L1的情况下,那么,图8d所示的目标光型814的长度大于图8c所示的目标光型813的长度。应理解,当车辆与前方车辆之间的间距足够大时,例如,间距达到150米及以上,则目标光型的长度将保持不变。
需明确地是,本实施例以目标光型的长度与车辆和前方车辆之间的间距呈正相关关系为例进行示例性说明,在其他示例中,车辆和前方车辆之间的间距还可与以目标光型显示在地面上的光的亮度呈负相关关系,即车辆和前方车辆之间的间距越短,亮度越亮,车辆和前方车辆之间的间距越长,亮度越暗。又如,车辆和前方车辆之间的间距与以目标光型显示在地面上的光的闪光频率呈负相关关系,即车辆和前方车辆之间的间距越短,闪光频率越高,车辆和前方车辆之间的间距越长,闪光频率越低。同理,当车辆与前方车辆之间的间距足够大,例如,间距达到150米及以上,则亮度可以保持不变,闪光频率也可以保持不变或者不闪光。
步骤206、若车辆判断目标光型满足重校准条件,则返回执行步骤203。
本实施例所示的步骤206为可选执行的步骤,通过执行步骤206能够对车辆在待行车路径上所显示的目标光型进行校准,在目标光型不准确的情况下,车辆需要通过返回执行步骤203的方式,以实现对目标光型的重新校准,即重新获取该目标光型。而在目标光型不满足该重校准条件的情况下,说明该目标光型是准确的,无需重新获取该目标光型,那么车辆可在行驶至上述所示的待行车路径上的情况下,获取需要在下一个待行车路径上所显示的目标光型。
为实现车辆判断目标光型是否满足重校准条件的目的,则车辆需要执行下述具体过程:
首先,车辆通过车辆的相机重新采集校准图像,该校准图像包括上述已确定的待行车路径以及在该待行车路径上显示的目标光型。
其次,车辆根据校准图像判断目标光型是否满足重校准条件,具体地,车辆确定该校准图像满足下述所示的至少一个条件的情况下,确定该目标光型满足重校准条件:
车辆判断目标光型的中心线与待行车路径的车道中心线出现偏移、沿待行车路径的横向方向,在目标光型的宽度需要等于待行车路径的车道线的宽度的情况下,目标光型两侧的边界线与车道线两侧的边界线出现偏移、或在目标光型的弯曲方向与待行车路径的弯曲方向需要一致的情况下,目标光型的弯曲方向与待行车路径的弯曲方向不一致等。
需明确地是,本实施例对车辆根据校准图像判断目标光型是否满足重校准条件的过程的说明为可选地示例,不做限定,只要车辆能够基于校准图像准确地确定目标光型是否能够起到对驾驶人员导航的辅助作用,且提高驾驶安全的作用即可。
可见,采用本实施例所示的方法,车辆出射的光束,在待行车路径上所显示的目标光型,能够与车辆的导航信息匹配,以便于根据目标光型提高驾驶的安全。
实施例二
在实施例一中,目标光型与导航信息相关,即随着导航信息的不同,那么,目标光型也会随之改变。而本实施例所示的目标光型与驾驶辅助信息相关,即随着驾驶辅助信息的 改变,那么,目标光型也会随之改变,具体执行过程结合图9所示,其中,图9为本申请所提供的车辆灯光的控制方法的第二种实施例步骤流程图。
步骤901、车辆确定满足触发条件。
本实施例所示的步骤901的执行过程,请详见实施例一所示的步骤201所示,具体不做赘述。
步骤902、车辆获取驾驶辅助信息。
本实施例所示的驾驶辅助信息为用于实现无人驾驶的相关信息。本实施例以该驾驶辅助信息为来自车辆的ADAS的信息,该ADAS的具体说明,请详见图1的相关说明,具体不做赘述。
步骤903、车辆获取待行车路径的信息。
本实施例所示的步骤902的具体说明,请参见实施例一所示的步骤203所示,具体不做赘述。
步骤904、车辆获取目标光型。
步骤905、车辆出射的光束,以目标光型显示于待行车路径上。
以下对步骤904至步骤905进行统一说明:
本实施例中,在车辆获取到驾驶辅助信息以及待行车路径的信息之后,即可获取与驾驶辅助信息以及行车路径的信息对应的目标光型。具体地,车辆可获取与驾驶辅助信息对应的一个或多个第一显示属性。车辆再获取与待行车路径的信息对应的一个或多个第二显示属性,那么,车辆确定同时具有该第一显示属性和第二显示属性的光型为目标光型,对第二显示属性的说明,请参见实施例一所示,具体不做赘述。
车辆根据目标光型出射光束,以保证车辆出射的光束,能够以目标光型显示于待行车路径上。本实施例目标光型显示在待行车路径上的方式的说明,请参见实施例一所示,具体不做赘述。对显示在待行车路径上的该目标光型与车辆之间的间距的大小不做限定,只要车辆内的驾驶人员能够清楚的看到显示于车辆前方的目标光型即可,例如,该目标光型和该车辆之间的间距可为10米。
本实施例所示的驾驶辅助信息为来自ADAS的行车决策,车辆能够根据不同的ADAS的行车决策,确定不同的第一显示属性。以下结合具体示例说明,不同的ADAS的行车决策所对应的不同的第一显示属性进行说明:
示例1
车辆根据第二行车列表确定与不同的ADAS的行车决策与不同的第一显示属性的对应关系,本示例以行车决策为车辆的行驶意图为例。本示例所示的行驶意图为车辆即将行驶的方向。本示例所示的该第二行车列表可参见下述表3所示,如表3所示的第二行车列表创建了不同的行驶意图与不同的第一显示属性的对应关系。
需明确地是,本实施例对第二行车列表的内容的说明,为可选地示例,不做限定,只要随着由ADAS确定的各行驶意图的改变,以目标光型显示于地面的光(也可称之为光毯)所具有的第一显示属性出现改变即可。
表3
Figure PCTCN2022101849-appb-000001
需明确地是,本实施例对各行驶意图所对应的第一显示属性的数量不做限定。例如表3所示的行驶意图“直行”对应一个第一显示属性(即光毯呈矩形),而行驶意图“变道”对应两个第一显示属性(即光毯具有闪光频率以及提高光毯的亮度)仅为一种示例性,在具体应用中,若一个行驶意图对应多个第一显示属性,那么该行驶意图对应的光毯叠加了多个第一显示属性。
例如,基于表3所示的第二行车列表所示可知,若ADAS指示车辆的计算机系统,ADAS的行车决策为变道,那么该计算机系统根据表3获取到对应的第一显示属性为光毯具有闪光频率以及提高光毯的亮度。可知,在车辆变道的过程中,该光毯通过其具有的第一显示属性,提示周围车辆或行人,该车辆即将变道。
又如,基于表3所示的第二行车列表,若ADAS指示车辆的计算机系统,ADAS的行车决策为转向,那么该计算机系统根据表3获取到对应的第一显示属性为光毯呈弧形,该弧形的弯曲方向和弧度可具体根据第二显示属性获取,对如何根据待行车路径确定呈弧形的光毯的弯曲方向和弧度的过程的说明,请参见实施例一所示,具体不做赘述,为更好地理解,以下结合图10所示,其中,图10为本申请所提供的第六种应用场景对比示例图。如图10a所示的示例中,图10a为已有方案所示的车辆停车入库的照明示例图。已有方案所示的车辆1001处于自动驾驶状态,该车辆1001需要转向以停车入库。在车辆1001周围存在人员1002或其他车辆的场景下,该人员1002无法确定该车辆1001的行驶意图,导致容易出现危险的情况。
而采用本实施例所示的方法,如图10b所示,图10b为本申请所提供的光毯对车辆停车入库的照明示例图。ADAS指示行驶意图为向右转向以停车入库,那么,车辆基于转向对应的第一显示属性(如表3所示的光毯呈弧形),还基于已采集到的待行车路径确定第二显示属性,能够在车辆1003的待行车路径上显示呈弧形的光毯1004。基于该光毯1004的弯曲方向,人员1005能够准确地判断出车辆1003的行驶意图,避免人员1005出现在光毯1004所占的区域内,避免了人员1005与车辆1003之间出现安全事故的可能。
示例2
车辆根据第三行车列表确定不同的ADAS的行车决策与不同的第一显示属性的对应关系,本示例以行车决策为车辆的紧急决策为例,其中,该紧急决策可为车辆紧急刹车、紧急避险、车辆出现故障等。本示例所示的该第三行车列表可参见下述表4所示,如表4所示的第三行车列表创建了不同的紧急决策与不同的第一显示属性的对应关系。
需明确地是,本实施例对第三行车列表的内容的说明,为可选地示例,不做限定,只 要随着由ADAS确定的各紧急决策的改变,光毯所具有的第一显示属性出现改变即可。
表4
Figure PCTCN2022101849-appb-000002
需明确地是,本实施例对各紧急决策所对应的第一显示属性的数量不做限定。例如表3所示的紧急避险对应一个第一显示属性(即光毯具有第二闪光频率),而紧急刹车对应三个第一显示属性(即光毯具有第一闪光频率、光毯的形状变化以及光毯具有第一亮度)仅为一种示例性,在具体应用中,若一个紧急刹车对应多个第一显示属性,那么该紧急决策对应的光毯叠加了多个第一显示属性。
本实施例对表4所示的第一闪光频率和第二闪光频率的具体大小不做限定,例如,车辆在正常行驶的过程中(如直行、转向)等,光毯不具有闪光频率。而在车辆处于紧急刹车或紧急避险的情况下,该光毯具有闪光频率。
表4所示的第一亮度和第二亮度的具体亮度不做限定,例如,该第一亮度以及第二亮度,可均为大于车辆在正常行驶的过程中光毯的亮度。
本实施例对表4所示的光毯的形状变化的具体变化方式不做限定,只要该光毯的形状的变化能够提醒车辆周围的行人或其他车辆,车辆当前处于“紧急刹车”的状态下即可,例如,该光毯的形状的变化可为该光毯的长度变短,或光毯的宽度变宽等,具体不做限定。
例如,基于表4所示的第三行车列表所示可知,若ADAS指示车辆的计算机系统,ADAS的紧急决策为紧急刹车,那么该计算机系统根据表4获取对应的第一显示属性为光毯具有第一闪光频率、光毯的形状变化以及光毯具有第一亮度。可知,在车辆紧急刹车的过程中,该光毯通过其具有的上述第一显示属性,提示周围的行人或车辆,该车辆即将紧急刹车。
示例3
车辆根据第四行车列表确定不同的ADAS的行车决策与不同的第一显示属性的对应关系,本示例以行车决策为车辆行驶预判事件为例,其中,该车辆行驶预判事件为ADAS对车辆可能出现的事件的预判,例如,该车辆行驶预判事件为车辆处于安全状态、车辆处于危险状态,即车辆可能会出现安全事故等。本示例所示的该第四行车列表可参见下述表5所示,如表5所示的第四行车列表创建了不同的车辆行驶预判事件与不同的第一显示属性的对应关系。
需明确地是,本实施例对第四行车列表的内容的说明,为可选地示例,不做限定,只要随着由ADAS确定的各车辆行驶预判事件的改变,光毯所具有的第一显示属性出现改变即可。
表5
Figure PCTCN2022101849-appb-000003
需明确地是,本实施例对各车辆行驶预判事件所对应的第一显示属性的数量不做限定。例如表5所示的车辆处于危险状态的预判事件对应两个第一显示属性,而车辆处于安全状态的预判事件对应一个第一显示属性仅为一种示例性,在具体应用中,若一个车辆行驶预判事件对应多个第一显示属性,那么该车辆行驶预判事件对应的光毯叠加了多个第一显示属性。
本实施例对表5所示的第三亮度和第四亮度的具体大小不做限定,为起到提醒车辆周围的车辆以及行人,该车辆即将处于危险状态的预判事件,那么,车辆处于危险状态时的光毯的第四亮度大于车辆处于安全状态时的光毯的第三亮度。
本实施例对第三闪光频率的具体大小不做限定,例如,车辆在正常行驶的过程中(如直行、转向)等,光毯不具有闪光频率。而在车辆处于危险状态的预判事件下,该光毯具有该第三闪光频率。
本实施例对不同的车辆行驶预判事件所对应的第一显示属性的具体类型的说明为可选地示例,不做限定,例如,在其他示例中,车辆行驶预判事件还可与光毯的形态、变化方式等对应,具体不做限定。
为更好地理解,以下结合图11所示的示例进行说明。其中,图11为本申请所提供的第七种应用场景对比示例图。
具体如图11a所示可知,图11a为已有方案所示的车辆出现危险状态的照明示例图。车辆1100即将行车至路口处。车辆1100的ADAS确定路口的右侧有目标车辆1101也即将进入路口。车辆1100的ADAS判断车辆是否处于安全状态,其中,安全状态是指,在车辆1100行车至路口处时,不会与目标车辆1101相撞,为此,车辆1100的ADAS根据车辆1100的车速、目标车辆1101的车速,以及车辆1100和目标车辆1101之间的间距确定车辆1100处于安全状态。基于表5所示可知,此时车辆1100的光毯具有第三亮度,该光毯的第二显示属性可参见实施例一所示的对第二显示属性的说明,具体不做赘述,例如,本示例下在车辆1100处于安全状态下的光毯的长度与车辆1100的车速处于正相关关系,且该光毯所具有的宽度与待行车路径的车道的宽度一致。
车辆1100的ADAS会实时检测车辆1100的车速,目标车辆1101的车速以及车辆1100和目标车辆1101之间的距离。
如图11a所示,在车辆1100继续靠近路口的情况下,若车辆1100的ADAS检测到车辆1100以及目标车辆1101中的至少一个,处于加速状态。那么,车辆1100的ADAS根据车辆1100的车速、目标车辆1101的车速,以及车辆1100和目标车辆1101之间的间距确定车辆1100处于危险状态,即车辆1100和目标车辆1101有可能在路口处出现碰撞,那么,车辆1100可采用上述示例2所示的紧急刹车紧急决策,但是,紧急刹车有可能会对驾驶人 员、车辆1100周围的行人或车辆造成惊吓。
为此,本示例可在车辆1100即将处于危险状态的预判事件的情况下,通过光毯提示驾驶人员、车辆周围的行人或车辆,如图11b所示,图11b为本申请所提供的光毯对车辆出现危险状态时的照明示例图。在车辆1100确定车辆处于危险状态下,那么,车辆1100显示在待行车路径上的光毯的第一显示属性为具有第四亮度以及具有第三闪光频率,例如,第四亮度比第三亮度增加10勒克斯(lx)。基于显示在待行车路径上的光毯1102,能够起到对目标车辆提醒的作用,以避免车辆1100和目标车辆1101在路口处出现相撞的情况的出现。
步骤906、若车辆判断光毯满足重校准条件,则返回执行步骤902。
对本实施例所示的步骤905的具体执行过程的说明,请参见实施例一所示的步骤205的执行过程的说明,具体不做赘述。
本实施例所示的光毯能够与车辆的驾驶辅助信息和待行车路径匹配,基于该光毯,能够准确的识别车辆的行驶意图、紧急决策、以及车辆的行驶预判事件等,提高了车辆驾驶的安全。
实施例三
在实施例一中,光毯的作用在于帮助驾驶人员导航。在实施例二中,光毯的作用在于提高无人驾驶的过程中的驾驶安全。而本实施例所示的光毯的作用在于,实现对车辆前方的待识别对象的准确识别。本实施例结合图12所示对车辆出射的光束,形成目标光型的过程进行说明,其中,图12为本申请所提供的车辆灯光的控制方法的第三种实施例步骤流程图。
步骤1201、车辆确定满足触发条件。
步骤1202、车辆获取导航信息。
本实施例所示的步骤1201至步骤1202的执行过程,请参加实施例一所示的步骤201至步骤202所示,具体不做赘述。
步骤1203、车辆确定待识别对象满足预设条件。
为更好地理解,以下结合图13所示对具体的应用场景进行说明,其中,图13为本申请所提供的第八种应用场景对比示例图。
如图13a所示,图13a为已有方案所示的待行车路径存在的待识别对象的照明示例图。若车辆1300在夜间行驶,那么,车辆的车灯照明范围有限。例如,若在车辆1300的近光灯的照明范围之外,存在待识别对象1301,那么驾驶人员或车辆1300的ADAS均无法准确的识别该待识别对象。例如,车辆的ADAS基于车辆的传感系统(例如车辆所包括的红外成像仪或激光雷达等)识别出了车辆1300前方存在类型未知的待识别对象,该待识别对象的类型可能为障碍物、行人等,能够对车辆1300的行车安全造成影响的对象。此时,因车辆的环境亮度很低,车辆的相机无法识别到该待识别对象的具体类型。
由实施例一所示可知,导航信息为车辆到达导航目的地的一系列平面坐标。本实施例所示的预设条件为ADAS确定待识别对象的平面坐标靠近导航信息所包括的一系列平面坐 标,可知,在待识别对象满足该预设条件的情况下,车辆按照导航信息行车的过程,容易因撞到该待识别对象出现安全事故。
步骤1204、车辆获取待行车路径的信息。
本实施例所述的步骤1204的执行过程,请参见实施例一所示的步骤203所示,具体不做赘述。
步骤1205、车辆获取目标光型。
步骤1206、车辆出射的光束,以目标光型显示于待行车路径上。在本申请实施例中,以目标光型显示在地面上的光可以称之为光毯。
以下对步骤1205至步骤1206进行统一说明:
本实施例对光毯的长度、宽度、弯曲方向、闪光频率以及亮度等说明,请参见上述实施例,具体在本实施例中不做限定。本实施例所示的光毯所满足的条件为,该光毯覆盖待识别对象的平面坐标,可知,如图13b所示,其中,图13b为本申请所提供的光毯对待行车路径存在的待识别对象的照明示例图。
车辆1300在待行车路径上所显示的光毯1302至少覆盖目标区域1303,所述目标区域1303为所述待识别对象所占的区域。可知,光毯1302能够照亮该目标区域1303。
本实施例所示的目标区域1303可位于光毯1302的中央区域。需明确地是,本实施例对光毯1303和光毯1302之间的相对位置关系不做限定,只要光毯1302至少覆盖目标区域1303即可。
步骤1207、车辆采集待识别图像。
本实施例中,在车辆的光毯覆盖目标区域的情况下,那么光毯能够照亮位于该目标区域的待识别对象。车辆可基于车辆的相机再次对光毯进行拍摄,因光毯的亮度足够,那么车辆基于相机能够拍摄到包括光毯的清晰的待识别图像。
步骤1208、车辆根据待识别图像获取待识别对象的类型。
车辆可根据物体识别算法、SFM算法、视频跟踪或AI等识别出待识别图像中,所包括的待识别对象的类型。即识别出待识别对象的类型是行人、车辆或障碍物等类型,还可选地识别出待识别对象的具体尺寸。例如,若待识别对象是行人,车辆还能够基于该待识别图像识别出该行人的身高。又如,若待识别对象是障碍物,车辆还能够基于该待识别图像识别出该障碍物的尺寸等。
步骤1209、车辆根据待识别对象的类型确定行车决策。
本实施例中,在车辆识别出待识别对象的类型的情况下,车辆的ADAS能够基于该待识别对象的类型确定行车决策,对行车决策的具体说明可参见实施例二所示,具体不做赘述。可知,车辆基于ADAS根据待识别对象的类型的行车决策,实现车辆在待行车路径上对待识别对象的避让,避让的方式包括但不限于转向,换道,紧急刹车等。
可选地,在车辆识别出待识别对象的类型的情况下,车辆也可通过语音等方式提示驾驶人员。
需明确地是,本实施例以待识别对象位于待行车路径上为例进行示例性说明,在其他示例中,该待识别对象可位于车辆周围的任意区域,车辆的ADAS检测到待识别对象的平面 坐标后,即可根据该待识别对象的平面坐标出射光束,以保证该光束所行车的光毯能够照亮待识别对象。车辆即可识别出被照亮的待识别对象的具体类型。
采用本实施例所示的方法,在车辆的待行车路径上存在待识别对象的情况下,车辆可通过出射的光束所形成的光毯照亮该待识别对象。车辆能够识别到被照亮的待识别对象识别出具体的类型,以便于车辆执行对应的行车决策或车辆的驾驶人员根据被照亮的待识别对象驾驶车辆避让等,提高了车辆前方存在待识别对象的场景下,车辆驾驶的安全。
实施例四
实施例三所示,以待识别对象存在于车辆前方的待行车路径上为例,本实施例以待识别对象存在于车辆周围的任意位置,如待识别对象存在于车辆的正前方,侧前方,车辆的右侧方,车辆的左侧方或车辆的后侧方等,本实施例所示的光毯的作用在于,实现对存在于车辆周围的待识别对象的准确识别。本实施例结合图14所示对车辆出射的光束,形成光毯的过程进行说明,其中,图14为本申请所提供的车辆灯光的控制方法的第四种实施例步骤流程图。
步骤1401、车辆确定满足触发条件。
步骤1402、车辆获取行车信息。
本实施例所示的行车信息可为实施例一所示的导航信息或车机数据,也可为实施例二所示的驾驶辅助信息,具体说明请参见实施例一或实施例二所示,具体不做赘述。
步骤1403、车辆获取待行车路径的信息。
步骤1404、车辆获取第一光型。
步骤1405、车辆出射的第一光束,以第一光型显示于待行车路径上。
本实施例所示的车辆经由步骤1401至步骤1405以获取第一光型,对确定第一光型的过程的说明,请参见实施例一或实施例二所示的获取光毯的过程的说明,具体不做赘述。
步骤1406、车辆获取待识别对象的平面坐标。
为更好地理解,以下结合如图15所示的应用场景进行说明,其中,图15为本申请所提供的第九种应用场景对比示例图。
如图15a所示,图15a为已有方案所示的车辆前方存在的待识别对象的照明示例图。若车辆1500在夜间行驶,但是,车辆的车灯照明范围有限。例如,若在车辆1500的近光灯的照明范围之外,存在待识别对象1501,那么驾驶人员或车辆1500的ADAS均无法准备的识别该待识别对象1501。
如图15b所示,图15b为本申请所提供的光毯对车辆前方存在的待识别对象的照明示例图。车辆1500已在待行车路径上显示第一光型1502。但是,该第一光型1502的照明范围有限。例如,待识别对象1501位于第一光型1502的照明范围之外,驾驶人员或车辆1500的ADAS均无法准备的识别该待识别对象1501。
可见,车辆的ADAS基于车辆的传感系统(例如车辆所包括的红外成像仪或激光雷达等)识别出了车辆1500前方存在类型未知的待识别对象1501,该待识别对象1501的类型可能为障碍物、行人等,能够对车辆1500的行车安全造成影响的对象。此时,因车辆的环境亮 度很低,车辆的相机无法识别到该待识别对象1501。为此,本实施例所示的车辆的ADAS获取该待识别对象1501的平面坐标。
步骤1407、车辆获取第二光型。
步骤1408、车辆出射的第二光束,以第二光型显示于车辆周围。
本实施例所示的第二光型的长度、宽度、弯曲方向、闪光频率以及亮度等说明,请参见上述实施例对光毯的说明,具体在本实施例中不做限定。本实施例所示的第二光型所满足的条件为,该第二光型覆盖待识别对象的平面坐标,可知,如图15b所示,车辆1500所显示的第二光型1503至少覆盖目标区域,所述目标区域为所述待识别对象所占的区域。可知,第二光型1503能够照亮该目标区域。
本实施例所示的目标区域可位于第二光型1503的中央区域。需明确地是,本实施例对目标区域和第二光型1503之间的相对位置关系不做限定,只要第二光型1503至少覆盖目标区域即可。
可知,本实施例所示的车辆所显示的光毯,包括上述所示的用于照亮待行车路径的第一光型以及包括用于照亮待识别对象的第二光型。即本实施例所示的光毯由第一光型和第二光型叠加而成。
步骤1409、车辆采集待识别图像。
步骤1410、车辆根据待识别图像获取待识别对象的类型。
步骤1411、车辆根据待识别对象的类型确定行车决策。
本实施例所示的步骤1409至步骤1411的具体执行过程,请参见实施例三所示的步骤1207至步骤1208的执行过程的说明,具体不做赘述。
实施例五
本实施例提供了一种灯光系统,该灯光系统所出射的光束,能够在待行车路径上显示光毯,其中,该光毯的具体说明,请参见上述实施例一至实施例四任一实施例所示,具体在本实施例中不做限定。
如图16所示,其中,图16为本申请所提供的灯光系统的一种实施例结构示例图。
本实施例所示的灯光系统1600包括车灯模块1601以及与该车灯模块1601连接的控制单元1602。
所述控制单元1602用于,获取行车信息,所述行车信息包括导航信息和/或驾驶辅助信息,所述控制单元1602还用于获取待行车路径的信息;所述控制单元1602还用于获取与所述行车信息和所述待行车路径的信息对应的目标光型。
所述车灯模块1601用于出射光束,且该光束以所述目标光型,显示于所述待行车路径上。
以下对本实施例所示的灯光系统1600在车辆内部具体的位置的几种可选示例进行说明:
示例1:如图1所示的示例为例,该车辆100的前方设置独立的车灯模块150,该独立设置的车灯模块仅用于出射具有目标光型的光束。该车灯模块150包括车灯模块1601以及 与该车灯模块1601连接的控制单元1602。
示例2:车辆100的前方设置独立的车灯模块150,该独立设置的车灯模块仅用于出射具有目标光型的光束。该车灯模块150包括车灯模块1601,而车辆的计算机系统140包括控制单元1602。
示例3:车辆具有左侧前照灯,该左侧前照灯包括车灯模块1601以及与该车灯模块1601连接的控制单元1602,或者,该左侧前照灯包括车灯模块1601,而车辆的计算机系统140包括控制单元1602。
示例4:车辆具有右侧前照灯,该右侧前照灯包括车灯模块1601以及与该车灯模块1601连接的控制单元1602,或者,该右侧前照灯包括车灯模块1601,而车辆的计算机系统140包括控制单元1602。
示例5:本示例所示的车灯模块1601包括第一子车灯模块和第二子车灯模块,该第一子车灯模块出射第一子光束,该第二子车灯模块出射第二子光束,本示例所示的目标光型包括所述第一子光束在待行车路径上所显示的光型和第二子光束在待行车路径上所显示的光型。第一子车灯模块位于右侧前照灯内部,第二子车灯模块位于左侧前照灯内部。
可选地,该控制单元1602位于计算机系统140内,或,控制单元1602位于右侧前照灯内,或,控制单元1602位于左侧前照灯内,或,控制单元1602包括第一子控制单元和第二子控制单元,所述第一子控制单元和第二子控制单元分别位于右侧前照灯、左侧前照灯或车辆的计算机系统140中任意两个内。
可选地,车灯模块1601包括第一子车灯模块和第二子车灯模块分别位于车辆的左侧雾灯和右侧雾灯内,具体说明,请参见示例5所示,具体不做赘述。
本申请还包括一种车辆,该车辆包括如图16所示的灯光系统。
以上所述,以上实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的精神和范围。

Claims (19)

  1. 一种车辆灯光的控制方法,其特征在于,所述方法包括:
    获取行车信息,所述行车信息包括导航信息、驾驶辅助信息以及车机数据中的至少一种;
    获取待行车路径的信息;
    获取与所述行车信息和所述待行车路径的信息对应的目标光型;
    将车辆出射的光束以所述目标光型,显示于所述待行车路径上。
  2. 根据权利要求1所述的方法,其特征在于,所述获取与所述行车信息和所述待行车路径的信息对应的目标光型包括:
    获取至少一个第一显示属性,所述至少一个第一显示属性与所述行车信息对应;
    获取至少一个第二显示属性,所述至少一个第二显示属性与所述待行车路径的信息对应;
    确定具有所述至少一个第一显示属性和所述至少一个第二显示属性的光型为所述目标光型。
  3. 根据权利要求2所述的方法,其特征在于,所述获取至少一个第一显示属性包括:
    获取行车列表,所述行车列表包括不同的行车信息和不同的显示属性的对应关系;
    获取所述至少一个第一显示属性,所述至少一个第一显示属性为所述行车列表中,与所述行车信息对应的显示属性。
  4. 根据权利要求1至3任一项所述的方法,其特征在于,所述目标光型的宽度等于所述车辆的宽度,和/或,所述目标光型的长度大于或等于所述车辆的长度。
  5. 根据权利要求1至4任一项所述的方法,其特征在于,所述目标光型与所述待行车路径的形态一致,和/或,所述目标光型的宽度大于或等于所述待行车路径的宽度。
  6. 根据权利要求1至5任一项所述的方法,其特征在于,若所述待行车路径呈弧形,所述目标光型也呈弧形,且所述待行车路径的弯曲方向和所述目标光型的弯曲方向一致。
  7. 根据权利要求1至6任一项所述的方法,其特征在于,所述行车信息为来自所述车辆的高级驾驶辅助系统ADAS的行车决策,所述目标光型与所述行车决策的类型对应。
  8. 根据权利要求1至6任一项所述的方法,其特征在于,所述行车信息为所述车辆的车速,所述车辆的车速与所述目标光型的长度,以所述目标光型显示在地面上的光的闪光频率,或以所述目标光型显示在地面上的光的亮度中的至少一项呈正相关关系。
  9. 根据权利要求1至6任一项所述的方法,其特征在于,所述行车信息为所述车辆所位于的环境亮度。
  10. 根据权利要求1至6任一项所述的方法,其特征在于,所述行车信息为所述车辆与前方车辆之间的间距,所述间距的大小与以所述目标光型显示在地面上的光的亮度、或以所述目标光型显示在地面上的光的闪光频率中的至少一项呈负相关关系。
  11. 根据权利要求1至6任一项所述的方法,其特征在于,所述行车信息为所述车辆周围存在待识别对象,所述目标光型至少覆盖目标区域,所述目标区域为所述待识别对象所占的区域。
  12. 根据权利要求1至11任一项所述的方法,其特征在于,所述待行车路径的中心线与所述目标光型的中心线之间的间距小于或等于第一间距。
  13. 根据权利要求1至12任一项所述的方法,其特征在于,沿所述待行车路径的横向方向,所述目标光型的边界线与所述待行车路径的车道线之间的间距小于或等于第二间距。
  14. 根据权利要求1至13任一项所述的方法,其特征在于,所述获取行车信息之前,所述方法还包括:
    确定满足触发条件,所述触发条件为以下所示的至少一项:
    已获取用于显示所述目标光型的指令,车速大于或等于第一预设值,环境亮度小于或等于第二预设值,车速的变化量大于或等于第三预设值,所述环境亮度的变化量大于或等于第四预设值,或所述待行车路径的形态出现改变。
  15. 一种灯光系统,其特征在于,所述灯光系统包括车灯模块和控制单元,所述控制单元与所述车灯模块连接,所述控制单元用于,获取行车信息,所述行车信息包括导航信息、驾驶辅助信息以及车机数据中的至少一种,还用于获取待行车路径的信息;还用于获取与所述行车信息和所述待行车路径的信息对应的目标光型;所述车灯模块用于将车辆出射的光束以所述目标光型,显示于所述待行车路径上。
  16. 根据权利要求15所述的灯光系统,其特征在于,所述控制单元用于:
    获取至少一个第一显示属性,所述至少一个第一显示属性与所述行车信息对应;
    获取至少一个第二显示属性,所述至少一个第二显示属性与所述待行车路径的信息对应;
    确定具有所述至少一个第一显示属性和所述至少一个第二显示属性的光型为所述目标光型。
  17. 根据权利要求15或16所述的灯光系统,其特征在于,所述行车信息为来自所述车辆的高级驾驶辅助系统ADAS的行车决策,所述目标光型与所述行车决策的类型对应。
  18. 根据权利要求15至17任一项所述的灯光系统,其特征在于,所述行车信息为所述车辆周围存在待识别对象,所述目标光型至少覆盖目标区域,所述目标区域为所述待识别对象所占的区域。
  19. 一种车辆,其特征在于,所述车辆包括如权利要求15至18任一项所述的灯光系统。
PCT/CN2022/101849 2021-08-16 2022-06-28 一种车辆灯光的控制方法、灯光系统以及车辆 WO2023020123A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22857436.4A EP4368450A1 (en) 2021-08-16 2022-06-28 Vehicle light control method, lighting system, and vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110939124.6A CN115891815A (zh) 2021-08-16 2021-08-16 一种车辆灯光的控制方法、灯光系统以及车辆
CN202110939124.6 2021-08-16

Publications (1)

Publication Number Publication Date
WO2023020123A1 true WO2023020123A1 (zh) 2023-02-23

Family

ID=85176214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/101849 WO2023020123A1 (zh) 2021-08-16 2022-06-28 Vehicle light control method, lighting system, and vehicle

Country Status (3)

Country Link
EP (1) EP4368450A1 (zh)
CN (2) CN115891815A (zh)
WO (1) WO2023020123A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6343869B1 (en) * 1996-12-18 2002-02-05 Koito Manufacturing Co., Ltd. Light unit for vehicle
CN106114345A * 2016-08-02 2016-11-16 Changzhou Xingyu Automotive Lighting Systems Co., Ltd. Intelligent light adjustment system based on image processing and adjustment method thereof
CN108216009A * 2016-12-22 2018-06-29 柯美汽车零部件(上海)有限公司 Multi-information-fusion adaptive headlamp system and control method thereof
CN108973842A * 2018-06-14 2018-12-11 吉林省瑞中科技有限公司 Automobile driving assistance lamp
CN109466424A * 2018-10-15 2019-03-15 Zhejiang Geely Automobile Research Institute Co., Ltd. Intelligent high beam control system and intelligent control method
GB2579024A (en) * 2018-11-14 2020-06-10 Jaguar Land Rover Ltd Vehicle control system and method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4161584B2 (ja) * 2002-02-07 2008-10-08 Toyota Motor Corporation Safety device for moving body
JP4720764B2 (ja) * 2006-11-16 2011-07-13 Denso Corporation Headlamp control device
CN104842860B (zh) * 2015-05-20 2018-06-19 Zhejiang Geely Automobile Research Institute Co., Ltd. Driving path indication method and system applied to an intelligent driving vehicle
WO2018092710A1 (ja) * 2016-11-18 2018-05-24 Panasonic Intellectual Property Management Co., Ltd. Notification device, autonomous vehicle, notification method, program, non-transitory recording medium, and notification system
US10717384B2 * 2017-10-25 2020-07-21 Pony Ai Inc. System and method for projecting trajectory path of an autonomous vehicle onto a road surface
JP7282543B2 (ja) * 2019-02-19 2023-05-29 Nissan Motor Co., Ltd. Vehicle travel route display method and vehicle travel route display device
CN111439195A (zh) * 2019-07-15 2020-07-24 Great Wall Motor Company Limited Method for projecting a pattern by using a vehicle lamp, vehicle lamp system, and vehicle

Also Published As

Publication number Publication date
CN115571044A (zh) 2023-01-06
EP4368450A1 (en) 2024-05-15
CN115891815A (zh) 2023-04-04

Similar Documents

Publication Publication Date Title
US11097660B2 (en) Driver assistance apparatus and control method for the same
US9970615B1 (en) Light-based vehicle-device communications
US10232713B2 (en) Lamp for a vehicle
US11854212B2 (en) Traffic light detection system for vehicle
CN110356402B (zh) 车辆控制装置、车辆控制方法及存储介质
US10479274B2 (en) Vehicle and control method for the same
JP6368958B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム
US10391931B2 (en) System and method for providing enhanced passenger use of an autonomous vehicle
US10824148B2 (en) Operating an autonomous vehicle according to road user reaction modeling with occlusions
JP6680136B2 (ja) 車外表示処理装置及び車外表示システム
KR101943809B1 (ko) 차량의 통지 장치
US10796580B2 (en) Vehicular image projection
US20200211379A1 (en) Roundabout assist
CN110271544A (zh) 车辆控制装置、车辆控制方法及存储介质
KR20210095757A (ko) 복수의 센서들을 이용하여 자율 주행을 수행하는 차량 및 그의 동작 방법
US11747815B2 (en) Limiting function of a vehicle control device related to defective image
US20240020988A1 (en) Traffic light detection and classification for autonomous driving vehicles
US11267397B2 (en) Autonomous driving vehicle information presentation apparatus
US20220176987A1 (en) Trajectory limiting for autonomous vehicles
US20230391250A1 (en) Adaptive illumination system for an autonomous vehicle
WO2023020123A1 (zh) Vehicle light control method, lighting system, and vehicle
WO2022202256A1 (ja) Vehicle control device and vehicle control method
WO2023020115A1 (zh) Vehicle lamp module, lighting system, and vehicle
US11865967B2 (en) Adaptive illumination system for an autonomous vehicle
US20210171065A1 (en) Autonomous driving vehicle information presentation apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22857436
    Country of ref document: EP
    Kind code of ref document: A1
WWE Wipo information: entry into national phase
    Ref document number: 2022857436
    Country of ref document: EP
ENP Entry into the national phase
    Ref document number: 2022857436
    Country of ref document: EP
    Effective date: 20240207
NENP Non-entry into the national phase
    Ref country code: DE