CN116572837A - Information display control method and device, electronic equipment and storage medium - Google Patents

Information display control method and device, electronic equipment and storage medium

Info

Publication number
CN116572837A
CN116572837A (application CN202310468653.1A)
Authority
CN
China
Prior art keywords
information
driving
navigation data
intensity level
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310468653.1A
Other languages
Chinese (zh)
Inventor
李畅
张涛
张波
韩雨青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Zejing Automobile Electronic Co ltd
Original Assignee
Jiangsu Zejing Automobile Electronic Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Zejing Automobile Electronic Co ltd filed Critical Jiangsu Zejing Automobile Electronic Co ltd
Priority to CN202310468653.1A
Publication of CN116572837A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/001 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles integrated in the windows, e.g. Fresnel lenses
    • B60K35/23
    • B60K35/28
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K37/00 Dashboards
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • B60R16/0231 Circuits relating to the driving or the functioning of the vehicle
    • B60R16/0232 Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60K2360/166
    • B60W2050/146 Display means
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The disclosure relates to the technical field of driving assistance, and in particular to an information display control method and device, electronic equipment, and a storage medium. The method includes: collecting current driving monitoring information, where the driving monitoring information includes at least one of driving environment information and driving state information; determining an information reminding intensity level according to the driving monitoring information; and controlling the display style of navigation data corresponding to an AR-HUD device according to the information reminding intensity level. With this scheme, the display effect of AR-HUD-based navigation data can adapt to changes in the driving environment, so that the way the AR-HUD presents navigation data better matches the driving scene, interference with the driver is reduced, and interaction with the driver is enhanced.

Description

Information display control method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of driving assistance, and in particular relates to an information display control method and device, electronic equipment and a storage medium.
Background
Existing HUD (Head-Up Display) devices include the AR-HUD (Augmented Reality Head-Up Display), which is based on augmented reality technology. An AR-HUD projects data such as navigation information and driving information onto the front windshield through an optical system; the light is reflected by the windshield into the driver's eyes for imaging, producing an augmented reality effect on the front windshield that is combined with the real scene. When an AR-HUD displays navigation information, it can show content such as the driving direction, recommended lane, and driving path. However, the display effect of the navigation icons is relatively fixed and uniform, and the navigation icons may be superimposed on pedestrians and vehicles on the real road.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure provides an information display control method and device, electronic equipment and a storage medium, which can realize the self-adaptive change of the display effect of navigation data.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an information display control method including:
collecting current driving monitoring information; wherein the driving monitoring information comprises at least one of driving environment information and driving state information;
and determining an information reminding intensity level according to the driving monitoring information, and controlling the display style of navigation data corresponding to the AR-HUD equipment according to the information reminding intensity level.
In some exemplary embodiments, when the current driving monitoring information is collected, the method further includes:
determining the current driving scene type according to the navigation data and the current position information;
And when the current driving scene type is the target driving scene type, controlling the display style of navigation data corresponding to the AR-HUD equipment according to the determined information reminding intensity level.
In some exemplary embodiments, the determining an information reminding intensity level according to the driving monitoring information and controlling a display style of navigation data corresponding to the AR-HUD device according to the information reminding intensity level includes:
when the driving monitoring information is judged to meet a preset judgment condition, determining a corresponding information reminding intensity level and a navigation data display strategy corresponding to the information reminding intensity level;
and executing the navigation data display strategy to control the display style of the navigation data corresponding to the AR-HUD equipment.
In some exemplary embodiments, the method further comprises:
defining a driving risk level based on driving monitoring information;
and configuring the corresponding information reminding intensity level and the navigation data display strategy corresponding to the information reminding intensity level for each driving risk level in advance.
In some exemplary embodiments, when the current driving monitoring information is collected, the method further includes:
And determining a corresponding driving risk level according to the driving monitoring information, and determining a corresponding information reminding intensity level according to the driving risk level.
In some exemplary embodiments, each type of information in the driving monitoring information is preconfigured with a corresponding priority;
the method further comprises the steps of:
when a plurality of types of driving monitoring information are acquired, determining the driving monitoring information with the highest priority as driving monitoring information to be executed, and determining the information reminding intensity level according to the driving monitoring information to be executed.
In some exemplary embodiments, the driving environment information includes: at least one of vehicle information, pedestrian information, and indication signal information within a preset distance range; the driving state information includes: at least one of alarm information, vehicle speed information, and driving control information.
In some exemplary embodiments, the navigation data display strategy includes: any one or more of the number, color, size, arrangement, and display position of the navigation icons.
According to a second aspect of the present disclosure, there is provided an information display control apparatus including:
The data acquisition module is used for acquiring current driving monitoring information; wherein the driving monitoring information comprises at least one of driving environment information and driving state information;
and the display style control module is used for determining an information reminding intensity level according to the driving monitoring information and controlling the display style of navigation data corresponding to the AR-HUD equipment according to the information reminding intensity level.
According to a third aspect of the present disclosure, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described information display control method.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to implement the above-described information display control method via execution of the executable instructions.
According to the information display control method provided by the embodiments of the disclosure, corresponding information reminding intensity levels are configured in advance for different kinds of driving monitoring information. Driving monitoring information can therefore be collected in real time while the vehicle is driving, the information reminding intensity level appropriate to the current driving environment can be determined from the driving environment information and/or the driving state information, and the display style of the navigation data can be adjusted adaptively according to that level. As a result, the display effect of AR-HUD-based navigation data changes adaptively with the driving environment, the way the AR-HUD presents navigation data better matches the driving scene, interference with the driver is reduced, and interaction with the driver is enhanced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
Fig. 1 schematically illustrates a schematic diagram of an information display control method in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a schematic diagram of a method of controlling a display style of navigation data in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates a schematic diagram of a display style of a navigation icon under an initial state policy in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates an effect diagram of a display style of a navigation icon under a second navigation data display policy based on alert information in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates an effect diagram of a display style of a navigation icon of a fourth navigation data display strategy based on a yellow signal indicator in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates an effect diagram of a display style of a navigation icon of a fourth navigation data display strategy based on a red signal indicator in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates an effect diagram of a display style of a third navigation data display policy navigation icon based on pedestrian information in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a schematic view of a sector-shaped monitoring area in an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a schematic view of a front vehicle monitoring area in an exemplary embodiment of the present disclosure;
FIG. 10 schematically illustrates a schematic diagram of a display style of a navigation icon under an initial state strategy when turning in an exemplary embodiment of the present disclosure;
FIG. 11 schematically illustrates a schematic view of a display style of a reduced pitch navigation icon while turning in an exemplary embodiment of the present disclosure;
FIG. 12 schematically illustrates a schematic view of a display style of a changed navigation icon while turning in an exemplary embodiment of the present disclosure;
FIG. 13 schematically illustrates a schematic display principle of an AR-HUD in an exemplary embodiment of the present disclosure;
fig. 14 schematically illustrates a composition diagram of an information display control apparatus in an exemplary embodiment of the present disclosure;
Fig. 15 schematically illustrates a composition diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
To address the defects and shortcomings of the prior art described above, this example embodiment provides an information display control method. It can be applied to an AR-HUD device on a vehicle, allows the reminding intensity of the navigation information that the AR-HUD device displays to the driver to adapt to different driving scenes, and reduces interference with the driver. Referring to fig. 1, the information display control method may include:
step S11, current driving monitoring information is collected; wherein the driving monitoring information comprises at least one of driving environment information and driving state information;
and step S12, determining an information reminding intensity level according to the driving monitoring information, and controlling the display style of navigation data corresponding to the AR-HUD device according to the information reminding intensity level.
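The two-step flow of steps S11 and S12 can be sketched as a single control iteration. This is a minimal illustration only; the function names and the level-to-policy mapping are hypothetical and not part of the patent:

```python
def control_display(collect_info, determine_level, policy_by_level, apply_policy):
    """One iteration of the two-step method: collect driving monitoring
    information (S11), map it to an information reminding intensity level,
    then apply the corresponding navigation data display strategy (S12)."""
    info = collect_info()                  # step S11: sensors / ADAS / navigation
    level = determine_level(info)          # step S12, part 1: intensity level
    apply_policy(policy_by_level[level])   # step S12, part 2: restyle AR-HUD icons
```

In a real system `collect_info` would read the vehicle's sensor bus and `apply_policy` would re-render the AR-HUD navigation icons; here they are left as injected callables so the control flow stays visible.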
According to the information display control method provided by this example embodiment, corresponding information reminding intensity levels are configured in advance for different kinds of driving monitoring information. Driving monitoring information can therefore be collected in real time while the vehicle is driving, the information reminding intensity level appropriate to the current driving environment can be determined from the driving environment information and/or the driving state information, and the display style of the navigation data can be adjusted adaptively according to that level. As a result, the display effect of AR-HUD-based navigation data changes adaptively with the driving environment, the way the AR-HUD presents navigation data better matches the driving scene, interference with the driver is reduced, and interaction with the driver is enhanced.
Next, each step of the information display control method in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
In step S11, current driving monitoring information is collected; the driving monitoring information comprises at least one of driving environment information and driving state information.
In this example embodiment, the driving monitoring information may refer to data collected in real time during driving through various types of sensors mounted on the vehicle, an intelligent vehicle system, or an intelligent terminal device. The driving monitoring information includes driving environment information and/or driving state information. While the vehicle is driving, content such as navigation information and speed information may be displayed using the AR-HUD device, and the mounted sensors can be used to collect the driving environment information and the driving state information in real time.
In this example embodiment, the driving environment information includes: at least one of vehicle information, pedestrian information, and indication signal information within a preset distance range; the driving state information includes: at least one of alarm information, vehicle speed information, and driving control information.
Specifically, obtaining the vehicle information within the preset range may involve establishing a coordinate system with the host vehicle as the origin, identifying vehicles within a certain distance range in a specified direction, and calculating the distance between each other vehicle and the host vehicle. For example, surrounding objects and vehicles may be identified by ADAS (advanced driver assistance system) sensors such as lidar, millimeter-wave radar, or cameras. The ADAS controller can calculate the relative position (xv, yv) of each surrounding vehicle; from this relative position, the relative distance between the host vehicle and that vehicle can be calculated.
The pedestrian information may be obtained by identifying pedestrians within a certain distance range in a specified direction, in a coordinate system established with the host vehicle as the origin. For example, surrounding pedestrians may be recognized by lidar, millimeter-wave radar, or a camera within a circular or sector-shaped monitoring range that takes the vehicle as its origin and a certain length as its radius. The relative position (xv, yv) of each surrounding pedestrian can be calculated by the ADAS controller, and the relative distance between that pedestrian and the host vehicle can then be calculated from the relative position. For example, referring to fig. 8, when the vehicle turns left, a 270° sector area on the left of the host vehicle may be established as the specified monitoring range; when the vehicle turns right, a 270° sector area on the right of the host vehicle may be used.
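The sector-shaped monitoring check described above (deciding whether a pedestrian's relative position (xv, yv) falls inside a sector of a given radius) might look like the following sketch. The default radius and sector angle echo the 7 m and 270° values used in this description, while the facing direction and the coordinate convention (x forward, y to the left) are illustrative assumptions:

```python
import math

def pedestrian_in_sector(xv, yv, radius=7.0, sector_deg=270.0, facing_deg=90.0):
    """Return True if a pedestrian at relative position (xv, yv) lies inside
    a sector monitoring area centred on the host vehicle.

    radius      -- sector radius in metres (7 m in the example above)
    sector_deg  -- full opening angle of the sector (270 deg in fig. 8)
    facing_deg  -- direction the sector faces; 90 deg here means "to the left"
    """
    dist = math.hypot(xv, yv)
    if dist > radius:
        return False
    # Bearing of the pedestrian relative to the host vehicle, in degrees.
    bearing = math.degrees(math.atan2(yv, xv)) % 360.0
    # Angular offset from the sector's facing direction, folded into [-180, 180].
    offset = (bearing - facing_deg + 180.0) % 360.0 - 180.0
    return abs(offset) <= sector_deg / 2.0
```

A point-in-sector test like this is what the judgment condition for pedestrian information (discussed further below) reduces to once the ADAS controller has supplied (xv, yv).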
The indication signal information may be the traffic light information on the road the vehicle is currently driving on. For example, the color of the traffic light may be obtained by recognizing it in images captured by a camera; alternatively, it may be obtained from real-time navigation data.
The alarm information may be an ADAS alarm event, such as a Forward Collision Warning (FCW), a Blind-Spot Detection (BSD) warning, or a Pedestrian Collision Warning (PCW). The alarm information can be obtained through the ADAS.
The vehicle speed information may be the current running speed of the vehicle. For example, it may be acquired through a speed sensor of the intelligent terminal device or of the vehicle. The driving control information may be a braking action of the vehicle and can be obtained through a brake sensor of the vehicle.
In step S12, an information reminding intensity level is determined according to the driving monitoring information, and a display style of navigation data corresponding to the AR-HUD device is controlled according to the information reminding intensity level.
In this example embodiment, referring to fig. 2, the step S12 may include:
step S21, when judging that the driving monitoring information meets a preset judging condition, determining a corresponding information reminding intensity level and a navigation data display strategy corresponding to the information reminding intensity level;
And S22, executing the navigation data display strategy to control the display style of the navigation data corresponding to the AR-HUD equipment.
Specifically, a corresponding information reminding intensity level can be configured in advance for each type of driving monitoring information, and a corresponding navigation data display strategy can be configured for each information reminding intensity level. The navigation data display strategy may include: any one or more of the number, color, size, arrangement, and display position of the navigation icons. The navigation data display strategy in the normal navigation state can be preset as the initial state strategy. For example, the display style of the navigation icons under the initial state strategy may include an initial number (for example, four) of consecutive navigation arrows at a certain spacing, where every navigation arrow has the same color, for example green; the display effect is shown in fig. 3. This display style may be configured as the initial display style.
For example, a first information alert intensity level may be configured for the alert information based on the ADAS system, and the corresponding first navigation data display policy may include: the number of the navigation arrows is 1, the color is red, and the display gesture of the navigation arrows is adjusted to avoid overlapping with the display position of the alarm information. The display effect is shown in fig. 4.
In addition, a second information alert intensity level may be configured for the vehicle information, and the corresponding second navigation data display policy may include: the number of navigation arrows is 1, and the color is orange. A third information alert intensity level may be configured for pedestrian information, and the corresponding third navigation data display policy may include: the number of navigation arrows is 1, and the color is yellow. The display effect is shown in fig. 7.
The indication signal information may be configured with a fourth information reminding intensity level, and the corresponding fourth navigation data display policy includes: in the yellow light, the number of navigation arrows is 2, the color is the initial color, and the display effect is shown in fig. 5; in the case of red light, the number of navigation arrows is 1, the color is the initial color, and the display effect is shown in fig. 6. Alternatively, in some exemplary embodiments, the navigation data display policy may further include downsizing the navigation arrow.
For the vehicle speed information and the driving control information, a fifth information reminding intensity level can be configured; correspondingly, a fifth navigation data display strategy can be configured for the fifth information reminding intensity level.
Of course, in some exemplary embodiments of the present disclosure, the vehicle information, the pedestrian information may also be configured to the same information alert intensity level. The same navigation data display strategy can also be configured for different information reminding intensity levels.
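The per-level configuration described above can be sketched as a small lookup table. The class name, field names, and level keys here are illustrative assumptions; the arrow counts and colors follow the examples in the text:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NavDisplayPolicy:
    arrow_count: int
    color: str                   # "initial" keeps the default arrow color
    avoid_overlap: bool = False  # re-pose arrows away from warning graphics

# Initial state strategy: four consecutive green arrows (fig. 3).
INITIAL_POLICY = NavDisplayPolicy(arrow_count=4, color="green")

# One display strategy per information reminding intensity level,
# mirroring the first-to-fourth level examples in the description.
POLICY_BY_LEVEL = {
    "level1_alarm": NavDisplayPolicy(1, "red", avoid_overlap=True),
    "level2_vehicle": NavDisplayPolicy(1, "orange"),
    "level3_pedestrian": NavDisplayPolicy(1, "yellow"),
    "level4_yellow_light": NavDisplayPolicy(2, "initial"),
    "level4_red_light": NavDisplayPolicy(1, "initial"),
}
```

A frozen dataclass keeps each strategy immutable, so the same policy object can be shared by several intensity levels, which matches the note that different levels may reuse one display strategy.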
Specifically, for each type of driving monitoring information, a corresponding parameter threshold may be preset for determining whether the collected current parameter meets the preset judgment condition. After a current parameter value corresponding to each type of driving monitoring information is obtained from the driving monitoring information collected in real time, the current parameter value can be compared with the corresponding parameter threshold. If the collected current parameter value falls within the corresponding parameter threshold range, the preset judgment condition is judged to be met, and the corresponding information reminding intensity level and navigation data display strategy are then determined.
For example, for the pedestrian information described above, the parameter threshold may be configured as a distance of 7 m between the pedestrian and the vehicle. Referring to fig. 8, a circular or sector area with a radius of 7 m may be established in a coordinate system with the vehicle as the origin; if the calculated relative position of a pedestrian lies within the sector range, the preset judgment condition is judged to be met; if no pedestrian's position lies within the sector range, the preset judgment condition is judged not to be met.
For the vehicle information described above, the parameter threshold may be configured as a fixed distance between the preceding vehicle and the host vehicle; alternatively, corresponding distance thresholds may be configured for different speed conditions, for example a distance threshold of V/1.8, where V is the vehicle speed. For example, as shown in fig. 9, the distance between the host vehicle and other vehicles within 30 m ahead may be identified; the distance judgment may, for example, be performed for vehicles in the same lane as the host vehicle or in adjacent lanes. The corresponding distance threshold Xv is calculated from the currently collected vehicle speed. If the distance between the host vehicle and the vehicle ahead is greater than Xv, the preset condition is judged not to be met; if the distance is less than or equal to Xv, the preset condition is judged to be met, and the information reminding intensity level corresponding to the vehicle information is executed.
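The speed-dependent following-distance check (threshold Xv = V/1.8) can be written as a one-line predicate. The function name is hypothetical, and the units are an assumption for illustration: with V in km/h, V/1.8 conveniently yields a threshold in metres (for example, 54 km/h gives Xv = 30 m, matching the 30 m monitoring range mentioned above):

```python
def front_vehicle_condition_met(distance_m, speed_kmh):
    """Judge the preset condition for vehicle information: the gap to the
    vehicle ahead is at or below the speed-dependent threshold Xv = V / 1.8."""
    xv = speed_kmh / 1.8          # distance threshold for the current speed
    return distance_m <= xv
```

Returning True here would trigger the information reminding intensity level configured for vehicle information; returning False leaves the current display strategy in place.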
For the alarm information, the preset judgment condition may be judged to be met when the ADAS system issues ADAS alarm information. For the indication signal information, the preset judgment condition may be judged to be met when a yellow or red light is recognized; different information reminding intensity levels and different navigation data display strategies can be configured for the yellow light and the red light. For the vehicle speed information, a corresponding speed threshold can be configured, such as 3 km/h, 5 km/h, or another value; when the collected current speed is lower than the speed threshold, the preset judgment condition is judged to be met. For the driving control information, the preset judgment condition may be judged to be met when a braking action of the driver is recognized. In some exemplary embodiments, the speed information and the braking action information may be judged in combination.
In this exemplary embodiment, in step S22, after the information reminding intensity level corresponding to the type parameter and the navigation data display strategy to be executed are determined according to the type of the driving monitoring information, the current navigation data display strategy may be switched to the navigation data display strategy to be executed.
For example, in the current navigation state under normal driving, when the vehicle turns, the navigation data displayed by the AR-HUD device follows the initial-state strategy, and the display effect of the current navigation icon is shown in fig. 3. If the currently acquired driving monitoring information is ADAS warning information, and the preset judgment condition is judged to be satisfied according to it, the corresponding first information reminding intensity level and first navigation data display strategy to be executed are determined, and the display style of the navigation icon may be changed from that shown in fig. 3 to that shown in fig. 4.
In this example embodiment, each type of information in the driving monitoring information is preconfigured with a corresponding priority. The method further comprises the steps of: when a plurality of types of driving monitoring information are acquired, determining the driving monitoring information with the highest priority as driving monitoring information to be executed, and determining the information reminding intensity level according to the driving monitoring information to be executed.
Specifically, the warning information in the driving state information may be configured as the highest priority, the indication signal information in the driving environment information may be configured as the second priority, the vehicle information and the pedestrian information may be configured as the third priority, and the speed information and the driving control information may be configured as the fourth priority.
When a plurality of types of driving monitoring information are collected, whether the corresponding preset judging conditions are met or not can be judged for each type of driving monitoring information. If two or more than two different types of driving monitoring information are judged to meet the corresponding preset judging conditions, the driving monitoring information with the highest priority can be selected, and the corresponding navigation data display strategy is executed.
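The priority arbitration described above can be sketched as follows; the priority table mirrors the ordering in the text (warning > indication signal > vehicle/pedestrian > speed/driving control), and the type names themselves are illustrative:

```python
from typing import List, Optional

PRIORITY = {  # smaller number = higher priority
    "warning": 1,
    "indication_signal": 2,
    "vehicle": 3,
    "pedestrian": 3,
    "speed": 4,
    "control": 4,
}


def select_info_to_execute(satisfied: List[str]) -> Optional[str]:
    """Among the types whose preset condition is met, pick the highest-priority one."""
    if not satisfied:
        return None
    return min(satisfied, key=lambda info_type: PRIORITY[info_type])
```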
For example, when the vehicle turns right, the collected driving monitoring information indicates that the current traffic light is red, that pedestrians are present around the vehicle, and that the driver is braking, i.e., indication signal information, pedestrian information, and driving control information are all collected. If each of them satisfies its preset judgment condition against the preset parameter thresholds, then based on the preconfigured priorities the indication signal information is selected, and its fourth information reminding intensity level and corresponding fourth navigation data display strategy are executed.
In this example embodiment, the method further includes: defining a driving risk level based on driving monitoring information; and configuring the corresponding information reminding intensity level and the navigation data display strategy corresponding to the information reminding intensity level for each driving risk level in advance.
Specifically, for each type of driving environment information and driving state information, a corresponding driving risk level may be preconfigured according to the risk degree of the collected data. For example, when the risk degree of the collected data is high, indicating that the vehicle is currently strongly affected by environmental factors, a correspondingly weak information reminding intensity can be configured, so that the navigation reminding information does not distract the driver. When the risk degree of the collected data is low, or there is no risk, indicating that the vehicle is currently little affected by environmental factors, a correspondingly strong information reminding intensity can be configured, so that the driver pays more attention to the navigation reminding information.
For example, the highest, first driving risk level may be configured for the warning information, and the corresponding information reminding intensity may be defined as the weakest, first information reminding intensity level. Correspondingly, the warning information may be configured as the driving monitoring information with the highest priority. Next, the risk levels of the indication signal information, the vehicle information, and the pedestrian information may be configured to decrease in turn, with the corresponding information reminding intensities increasing in turn. Correspondingly, their priorities as driving monitoring information also decrease in turn.
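Taken together, the preconfiguration described in these paragraphs might be tabulated as below; the risk levels and the inverse risk/intensity relation come from the text, the strategy names echo those used elsewhere in the description, and the exact table is an illustrative assumption:

```python
# info type -> (driving risk level, display strategy); risk level 1 is the
# highest risk, and the reminding intensity level shares its index, with
# level 1 being the weakest (least distracting) reminder.
CONFIG = {
    "warning":           (1, "first navigation data display strategy"),
    "indication_signal": (2, "fourth navigation data display strategy"),
    "vehicle":           (3, "second navigation data display strategy"),
    "pedestrian":        (3, "third navigation data display strategy"),
}


def intensity_and_strategy(info_type: str):
    """Look up the preconfigured reminding intensity level and display strategy."""
    risk_level, strategy = CONFIG[info_type]
    return risk_level, strategy  # intensity level index equals risk level index
```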
In this example embodiment, based on the foregoing, the method further includes: and determining a corresponding driving risk level according to the driving monitoring information, and determining a corresponding information reminding intensity level according to the driving risk level.
Specifically, when a plurality of types of driving monitoring information are collected, it may be first determined whether a corresponding preset determination condition is satisfied. If a plurality of driving monitoring information meeting the preset judging conditions exists, selecting an information reminding intensity level corresponding to the driving monitoring information with the highest driving risk level and a corresponding navigation data display strategy.
For example, the currently collected driving monitoring information includes warning information and pedestrian information, and whether each satisfies its corresponding preset judgment condition may be judged. If both are satisfied, then since the warning information corresponds to the highest driving risk level, the first information reminding intensity level and the first navigation data display strategy corresponding to the warning information are executed.
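As a sketch, this risk-based arbitration differs from the priority-based one only in the table it consults (risk levels rather than priorities); the names are illustrative:

```python
from typing import List, Optional

RISK_LEVEL = {"warning": 1, "indication_signal": 2, "vehicle": 3, "pedestrian": 3}


def select_by_risk(satisfied: List[str]) -> Optional[str]:
    """Pick the satisfied information type with the highest driving risk (level 1)."""
    if not satisfied:
        return None
    return min(satisfied, key=lambda info_type: RISK_LEVEL[info_type])
```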
In this example embodiment, when the current driving monitoring information is collected, the method further includes:
step S31, determining the current driving scene type according to the navigation data and the current position information;
And step S32, when the current driving scene type is the target driving scene type, controlling the display style of navigation data corresponding to the AR-HUD equipment according to the determined information reminding intensity level.
Specifically, a plurality of driving scene types may be predefined according to actual driving environments, for example a turning scene, a low-speed scene, a high-speed scene, a cut-in scene, and the like. A turning scene may refer to a driving scene of turning or making a U-turn at a crossroads or T-junction. A low-speed scene may be a driving scene with a driving speed below 60 km/h; a high-speed scene may be one with a driving speed of 60 km/h or above.
During driving, the driving monitoring information can be collected in real time; meanwhile, the path information of the navigation data and the current position information can be acquired in real time. From the collected current position information, the specific position within the planned path can be determined, and thus the current driving scene type. In addition, whether the current scene is a low-speed or high-speed scene can be determined from the acquired speed information.
The target driving scene type may be a turning scene. When the vehicle is determined to be currently in a turning scene according to the navigation data and the current position information, the judgment of whether the acquired driving monitoring information satisfies the preset conditions can be triggered.
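Steps S31 and S32 can be sketched as follows; the 60 km/h low/high-speed split and the turning-scene trigger come from the text, while the scene labels and function names are assumptions:

```python
TARGET_SCENE = "turn"  # the target driving scene type in this embodiment


def scene_type(turning_at_junction: bool, speed_kmh: float) -> str:
    """Classify the current driving scene from position and speed information."""
    if turning_at_junction:  # position within the planned path indicates a turn
        return "turn"
    return "low_speed" if speed_kmh < 60.0 else "high_speed"


def should_judge_conditions(scene: str) -> bool:
    """Condition judgment is triggered only in the target (turning) scene."""
    return scene == TARGET_SCENE
```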
In this example embodiment, when switching from the current navigation data display strategy to the navigation data display strategy to be executed, a dynamic change process of the navigation icons may be configured. For example, referring to figs. 10, 11, and 12, the dynamic change process may first reduce the spacing between the plurality of navigation arrows, and then reduce them to one or two arrows. In addition, based on the collected pedestrian position and vehicle position, the display position of the navigation icons can be configured so that they do not overlap real objects in the scene.
For example, when signal light information is collected, it may be first determined whether the signal light color is green; if yes, controlling to execute the navigation data initial state strategy; if not, judging whether the color of the signal lamp is yellow, and if so, controlling and executing a display mode corresponding to the yellow signal indicator lamp in the fourth navigation data display strategy; if not, judging whether the signal lamp is red, if so, controlling and executing a display mode corresponding to the red signal indicator lamp in the fourth navigation data display strategy.
When the surrounding vehicle information is acquired, acquiring the relative position coordinates of the vehicle, judging whether the vehicle is in a preset monitoring influence area according to the relative position coordinates, and if not, controlling to execute the navigation data initial state strategy; if yes, judging whether the relative distance is smaller than a preset threshold value, if not, controlling to execute the navigation data initial state strategy, and if yes, controlling to execute the second navigation data display strategy.
When the surrounding pedestrian information is acquired, firstly acquiring the relative position coordinates of pedestrians, and judging whether the pedestrians are in a preset pedestrian monitoring area or not according to the relative position coordinates; if not, controlling to execute the navigation data initial state strategy; if yes, controlling to execute a third navigation data display strategy.
And when judging that the ADAS alarm event exists according to the alarm information, controlling to execute a corresponding first navigation data display strategy, otherwise, executing a navigation data initial state strategy.
When a brake signal is acquired, whether the vehicle speed is smaller than a preset threshold value or not can be judged first; if not, executing the navigation data initial state strategy; if yes, executing a fifth navigation data display strategy.
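The five branches above can be sketched as a set of small dispatch functions; the strategy names mirror the description (initial state, first to fifth strategies), while the parameter names and the 5 km/h brake threshold are illustrative assumptions:

```python
def signal_strategy(color: str) -> str:
    """Green keeps the initial state; yellow and red select modes of the fourth strategy."""
    if color == "yellow":
        return "fourth strategy, yellow mode"
    if color == "red":
        return "fourth strategy, red mode"
    return "initial state"  # green or unrecognized


def vehicle_strategy(in_monitored_area: bool, gap_m: float, threshold_m: float) -> str:
    """Second strategy only when the vehicle is in the area and closer than the threshold."""
    if in_monitored_area and gap_m < threshold_m:
        return "second strategy"
    return "initial state"


def pedestrian_strategy(in_pedestrian_area: bool) -> str:
    """Third strategy when a pedestrian is inside the preset monitoring area."""
    return "third strategy" if in_pedestrian_area else "initial state"


def warning_strategy(adas_event: bool) -> str:
    """First strategy while an ADAS alarm event is present."""
    return "first strategy" if adas_event else "initial state"


def brake_strategy(speed_kmh: float, threshold_kmh: float = 5.0) -> str:
    """Fifth strategy only once the speed has dropped below the preset threshold."""
    return "fifth strategy" if speed_kmh < threshold_kmh else "initial state"
```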
The navigation icons can be computed by a display operation unit of the AR-HUD system; the display operation unit may be a three-dimensional engine tool. Referring to fig. 13, a virtual three-dimensional camera is set up at the driver's eye point coordinates, with a field angle (FOV) identical to that of the AR-HUD device, and the virtual camera's environment size parameters are kept consistent with the real world; the graphics are then attached at equal scale to the ground of the environment in which the virtual camera is located.
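The virtual-camera idea can be illustrated with a minimal pinhole projection: a camera at the driver's eye point, sharing the AR-HUD's field of view, maps a point on the ground plane to normalized image coordinates. All names and numbers here are illustrative assumptions, not the actual engine implementation:

```python
import math


def project_ground_point(x_m: float, z_m: float, eye_height_m: float,
                         fov_deg: float) -> tuple:
    """Project a ground point (x, 0, z) seen from an eye at the given height.

    Returns normalized (u, v) coordinates on a unit image plane.
    """
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)  # focal length for the given FOV
    u = f * x_m / z_m
    v = f * (0.0 - eye_height_m) / z_m  # ground lies below the eye, so v is negative
    return u, v
```

Because the virtual camera's FOV and environment dimensions match the real ones, a navigation arrow drawn at the projected coordinates appears attached to the real road surface.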
According to the information display control method provided by the embodiments of the disclosure, driving monitoring information can be collected in real time while the vehicle is being driven, specific parameters of the driving monitoring information can be judged, and, when the preset judgment conditions are satisfied, the display style of the navigation icons can be adjusted using a preconfigured navigation data display strategy. The driving monitoring information may include surrounding vehicles, surrounding pedestrians, traffic light information, the driver stepping on the brake, and the like. When these environment variables change, the number, posture, and shape of the navigation indication arrows change accordingly, so that the navigation arrows vary their reminding intensity across driving scenes and the interference to the driver is reduced. For example, for traffic light information at an intersection, when the vehicle approaches the intersection and encounters a yellow or red light, the number, shape, and posture of the indication arrows in navigation/AR-HUD navigation can be changed to reduce the reminding intensity and the interference to the driver. For surrounding vehicle information, when the preceding vehicle decelerates, stops, or turns, or when other vehicles are present on the turning side (including in the vehicle's blind area), the indication arrows can likewise be changed to reduce the reminding intensity and the interference to the driver.
For surrounding pedestrian information, when pedestrians appear in front of the vehicle, or pedestrians pass on the turning side (including in the vehicle's blind area) during a turn, the number, shape, and posture of the indication arrows in navigation/AR-HUD navigation are changed to reduce the reminding intensity and the interference to the driver. When an ADAS alarm event occurs, such as forward collision warning (FCW), blind spot detection (BSD), or pedestrian collision warning (PCW), the indication arrows can be changed in the same way. When the driver actively steps on the brake, the number, shape, and posture of the indication arrows can be changed after the vehicle speed has dropped to a preset speed.
It is noted that the above-described figures are only schematic illustrations of processes involved in a method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Further, referring to fig. 14, in the present example embodiment there is also provided an information display control apparatus 140, including: a data acquisition module 1401 and a display style control module 1402. Wherein:
the data acquisition module 1401 may be configured to acquire current driving monitoring information; the driving monitoring information comprises at least one of driving environment information and driving state information.
The display style control module 1402 may be configured to determine an information reminding intensity level according to the driving monitoring information, and control a display style of navigation data corresponding to the AR-HUD device according to the information reminding intensity level.
The specific details of each module in the information display control apparatus 140 are described in detail in the corresponding display control method, and thus are not described herein.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Fig. 15 shows a schematic diagram of an electronic device suitable for use in implementing embodiments of the invention.
It should be noted that, the electronic device 1000 shown in fig. 15 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present disclosure.
As shown in fig. 15, the electronic apparatus 1000 includes a central processing unit (Central Processing Unit, CPU) 1001 that can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 1002 or a program loaded from a storage section 1008 into a random access Memory (Random Access Memory, RAM) 1003. In the RAM 1003, various programs and data required for system operation are also stored. The CPU 1001, ROM 1002, and RAM 1003 are connected to each other by a bus 1004. An Input/Output (I/O) interface 1005 is also connected to bus 1004.
The following components are connected to the I/O interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output portion 1007 including a Cathode Ray Tube (CRT), a liquid crystal display (Liquid Crystal Display, LCD), and a speaker; a storage portion 1008 including a hard disk or the like; and a communication section 1009 including a network interface card such as a LAN (Local Area Network ) card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The drive 1010 is also connected to the I/O interface 1005 as needed. A removable medium 1011, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is installed on the drive 1010 as needed, so that a computer program read out therefrom is installed into the storage section 1008 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program carried on a storage medium, the computer program containing program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 1009, and/or installed from the removable medium 1011. When executed by the central processing unit (CPU) 1001, the computer program performs the various functions defined in the system of the present application.
Specifically, the electronic device may be an intelligent mobile electronic device such as a mobile phone, a tablet computer or a notebook computer. Alternatively, the electronic device may be an intelligent electronic device such as a desktop computer.
It should be noted that, the storage medium shown in the embodiments of the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any storage medium that is not a computer readable storage medium and that can transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present invention may be implemented by software, or may be implemented by hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
It should be noted that, as another aspect, the present application also provides a storage medium, which may be included in an electronic device; or may exist alone without being incorporated into the electronic device. The storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the methods described in the embodiments below. For example, the electronic device may implement the steps of the information display control method shown in fig. 1.
Furthermore, the above-described drawings are only schematic illustrations of processes included in the method according to the exemplary embodiment of the present application, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. An information display control method, characterized in that the method comprises:
collecting current driving monitoring information; wherein the driving monitoring information comprises at least one of driving environment information and driving state information;
and determining an information reminding intensity level according to the driving monitoring information, and controlling the display style of navigation data corresponding to the AR-HUD equipment according to the information reminding intensity level.
2. The information display control method according to claim 1, wherein when the current driving monitoring information is collected, the method further comprises:
determining the current driving scene type according to the navigation data and the current position information;
and when the current driving scene type is the target driving scene type, controlling the display style of navigation data corresponding to the AR-HUD equipment according to the determined information reminding intensity level.
3. The information display control method according to claim 1 or 2, wherein the determining an information reminding intensity level according to the driving monitoring information and controlling a display style of navigation data corresponding to the AR-HUD device according to the information reminding intensity level comprise:
When the driving monitoring information is judged to meet a preset judgment condition, determining a corresponding information reminding intensity level and a navigation data display strategy corresponding to the information reminding intensity level;
and executing the navigation data display strategy to control the display style of the navigation data corresponding to the AR-HUD equipment.
4. The information display control method according to claim 1, characterized in that the method further comprises:
defining a driving risk level based on driving monitoring information;
and configuring the corresponding information reminding intensity level and the navigation data display strategy corresponding to the information reminding intensity level for each driving risk level in advance.
5. The information display control method according to claim 4, wherein when the current driving monitoring information is collected, the method further comprises:
and determining a corresponding driving risk level according to the driving monitoring information, and determining a corresponding information reminding intensity level according to the driving risk level.
6. The information display control method according to claim 1, wherein each type of information in the driving monitoring information is preconfigured with a corresponding priority;
The method further comprises the steps of:
when a plurality of types of driving monitoring information are acquired, determining the driving monitoring information with the highest priority as driving monitoring information to be executed, and determining the information reminding intensity level according to the driving monitoring information to be executed.
7. The information display control method according to claim 1, wherein the running environment information includes: at least one of vehicle information, pedestrian information and indication signal information of a preset distance range;
the driving state information comprises: at least one of alarm information, speed per hour information and driving control information.
8. The information display control method according to claim 3, wherein the navigation data display strategy comprises:
any one or more of the number, color, size, arrangement, and display position of the navigation icons.
9. An information display control apparatus, characterized by comprising:
the data acquisition module is used for acquiring current driving monitoring information; wherein the driving monitoring information comprises at least one of driving environment information and driving state information;
and the display style control module is used for determining an information reminding intensity level according to the driving monitoring information and controlling the display style of navigation data corresponding to the AR-HUD equipment according to the information reminding intensity level.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to execute the information display control method of any one of claims 1 to 8 via execution of the executable instructions.
11. A storage medium having stored thereon a computer program, wherein the computer program when executed by a processor implements the information display control method according to any one of claims 1 to 8.
CN202310468653.1A 2023-04-27 2023-04-27 Information display control method and device, electronic equipment and storage medium Pending CN116572837A (en)

Publications (1)

Publication Number: CN116572837A, published 2023-08-11

Family ID: 87538820


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117075350A (en) * 2023-09-27 2023-11-17 江苏泽景汽车电子股份有限公司 Driving interaction information display method and device, storage medium and electronic equipment


Similar Documents

Publication Publication Date Title
US9855894B1 (en) Apparatus, system and methods for providing real-time sensor feedback and graphically translating sensor confidence data
US11377025B2 (en) Blocked information displaying method and system for use in autonomous vehicle
US11827274B2 (en) Turn path visualization to improve spatial and situational awareness in turn maneuvers
US10452930B2 (en) Information display device mounted in vehicle including detector
US20200376961A1 (en) Method for displaying the course of a trajectory in front of a transportation vehicle or an object by a display unit, and device for carrying out the method
JP6443716B2 (en) Image display device, image display method, and image display control program
WO2019060891A9 (en) Augmented reality dsrc data visualization
US20210323540A1 (en) Vehicle driving and monitoring system, vehicle including the vehicle driving and monitoring system, method for maintaining a situational awareness at a sufficient level, and computer readable medium for implementing the method
CN111169381A (en) Vehicle image display method and device, vehicle and storage medium
JP2023174676A (en) Vehicle display control device, method and program
CN116572837A (en) Information display control method and device, electronic equipment and storage medium
CN114298908A (en) Obstacle display method and device, electronic equipment and storage medium
CN110834626B (en) Driving obstacle early warning method and device, vehicle and storage medium
US11766938B1 (en) Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object
US11845429B2 (en) Localizing and updating a map using interpolated lane edge data
CN114170846B (en) Vehicle lane change early warning method, device, equipment and storage medium
WO2022168540A1 (en) Display control device and display control program
CN115649167A (en) Vehicle lane change determining method and device, electronic equipment and storage medium
JP2019144971A (en) Control system and control method for moving body
JP7167812B2 (en) VEHICLE CONTROL DEVICE, METHOD, PROGRAM AND VEHICLE DISPLAY DEVICE
JP2022516849A (en) Heads-up display systems, methods, data carriers, processing systems and vehicles
CN110979319A (en) Driving assistance method, device and system
WO2022270207A1 (en) Vehicular display control device and vehicular display control program
CN218287728U (en) Road lane line scene reconstruction system and vehicle
JP3222638U (en) Safe driving support device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination