CN116767262A - Driving auxiliary line display method, device, equipment and medium - Google Patents

Driving auxiliary line display method, device, equipment and medium Download PDF

Info

Publication number
CN116767262A
CN116767262A (application CN202310751628.4A)
Authority
CN
China
Prior art keywords
current
current road
road condition
vehicle
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310751628.4A
Other languages
Chinese (zh)
Inventor
王梓安
于欢
李木犀
陈后立
杨雪珠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Group Corp filed Critical FAW Group Corp
Priority to CN202310751628.4A
Publication of CN116767262A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18 Propelling the vehicle
    • B60W30/18009 Propelling the vehicle related to particular drive situations
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/06 Road conditions
    • B60W40/10 Estimation or calculation of non-directly measurable driving parameters related to vehicle motion
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a driving auxiliary line display method, device, equipment and medium. The method comprises the following steps: acquiring current road condition information and current vehicle information of a target vehicle; generating a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information, wherein the driving assistance strategy comprises a safety level and a driving auxiliary line; and displaying the driving auxiliary line according to the safety level through augmented reality technology. The technical scheme of the invention can effectively reduce the driving pressure on the driver and lower the risk of driving accidents.

Description

Driving auxiliary line display method, device, equipment and medium
Technical Field
The invention relates to the technical field of augmented reality, in particular to a driving auxiliary line display method, device, equipment and medium.
Background
With the rapid development of society and technology, car navigation has become an important aid in driving a vehicle. However, with conventional car navigation the navigation screen sits below the windshield, so the driver can only check the driving route on the map with peripheral vision or by briefly lowering the head, and this distraction may cause driving accidents.
In addition, when passing through a roundabout or multiple branch roads, conventional car navigation generally informs the driver of the route by voice, and the voice prompt may fail to convey accurate information to the driver owing to various uncertain factors. Moreover, when the driver looks at the navigation map, the branch roads ahead may be so closely spaced that the specific traveling direction cannot be made out on the map.
Therefore, how to improve the utilization efficiency of vehicle navigation while ensuring driving safety is a problem that urgently needs to be solved.
Disclosure of Invention
The invention provides a driving auxiliary line display method, device, equipment and medium, which can solve the problems of low utilization efficiency of vehicle navigation and inability to ensure driving safety.
According to an aspect of the present invention, there is provided a display method of a driving assistance line, including:
acquiring current road condition information and current vehicle information of a target vehicle;
generating a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information; the driving assistance strategy comprises a safety level and a driving assistance line;
and displaying the driving auxiliary line according to the safety level through an augmented reality technology.
According to another aspect of the present invention, there is provided a display device of a driving assistance line, including:
the data acquisition module is used for acquiring the current road condition information and the current vehicle information of the target vehicle;
the auxiliary strategy generation module is used for generating a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information; the driving assistance strategy comprises a safety level and a driving assistance line;
and the auxiliary line display module is used for displaying the driving auxiliary line according to the safety level through an augmented reality technology.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, and the computer program is executed by the at least one processor, so that the at least one processor can execute the driving assistance line display method according to any embodiment of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor, when the instructions are executed, to implement the driving assistance line display method according to any one of the embodiments of the present invention.
According to the technical scheme of the embodiments of the invention, a driving assistance strategy corresponding to a target vehicle is generated based on the acquired current road condition information and current vehicle information, the driving assistance strategy comprising a safety level and a driving auxiliary line; the driving auxiliary line is then displayed according to the safety level through augmented reality technology. This solves the problems of low utilization efficiency of vehicle-mounted navigation and inability to guarantee driving safety, effectively reduces the driving pressure on the driver, and lowers the risk of driving accidents.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for displaying a driving assistance line according to a first embodiment of the present invention;
Fig. 2 is a flowchart of a method for displaying a driving assistance line according to a second embodiment of the present invention;
Fig. 3 is a flowchart of the training process of a target safety calculation model according to the second embodiment of the present invention;
Fig. 4 is a flowchart of the usage process of the target safety calculation model according to the second embodiment of the present invention;
Fig. 5 is a flowchart of an alternative method for displaying a driving assistance line according to the second embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a display device for a driving auxiliary line according to a third embodiment of the present invention;
Fig. 7 is a schematic structural diagram of an electronic device implementing a method for displaying a driving assistance line according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "target," "current," and the like in the description and claims of the present invention and the above-described drawings are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a driving assistance line display method according to a first embodiment of the present invention. The method is applicable to enhancing the current map navigation while a vehicle is being driven, and may be performed by a driving assistance line display device, which may be implemented in hardware and/or software and configured in an electronic device, for example in the vehicle-mounted system of a vehicle. As shown in Fig. 1, the method includes:
s110, acquiring current road condition information and current vehicle information of the target vehicle.
The target vehicle may refer to a vehicle for which a driving auxiliary line currently needs to be generated. The current road condition information may refer to the road condition information of the road on which the target vehicle is currently traveling. The current vehicle information may refer to the current vehicle parameter information of the target vehicle.
In an alternative embodiment, obtaining the current road condition information and the current vehicle information of the target vehicle may include: collecting current road condition information and current vehicle information of a target vehicle in real time through a vehicle-mounted camera or a vehicle-mounted radar of the target vehicle; the current road condition information comprises: the current road type, the current lane and the current road condition; the current vehicle information includes: current vehicle speed and current vehicle weight.
The current road type may refer to the road type of the road on which the target vehicle is currently traveling, for example a curve type or a normal road type. The current lane may refer to the position of the lane the target vehicle occupies on the current road, for example the leftmost lane, the middle lane, or the like. The current road condition may refer to the road surface and traffic condition of the current road, for example rainy road conditions, snowy road conditions, or the presence of traffic-light facilities. The current vehicle speed may refer to the speed at which the target vehicle is currently traveling. The current vehicle weight may refer to the vehicle body weight of the target vehicle at the current moment.
Specifically, the current road condition information and the current vehicle information corresponding to the target vehicle can be acquired in real time through the vehicle-mounted camera or the vehicle-mounted radar of the target vehicle, so that an effective basis is provided for subsequent operation.
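For illustration only (the patent itself contains no code), the collected information could be organized roughly as follows; the field names and the sensor interface are assumptions, not part of the disclosure:

    from dataclasses import dataclass

    @dataclass
    class RoadConditionInfo:
        road_type: str       # e.g. "curve" or "normal"
        lane: str            # e.g. "leftmost" or "middle"
        road_condition: str  # e.g. "rainy", "snowy", "traffic_light_ahead"

    @dataclass
    class VehicleInfo:
        speed_kmh: float     # current vehicle speed
        weight_kg: float     # current vehicle weight

    def collect_sensor_frame(sensor) -> tuple[RoadConditionInfo, VehicleInfo]:
        # "sensor" stands for a hypothetical wrapper around the vehicle-mounted
        # camera or radar; read() is an assumed method returning one fused frame.
        raw = sensor.read()
        road = RoadConditionInfo(raw["road_type"], raw["lane"], raw["condition"])
        vehicle = VehicleInfo(raw["speed_kmh"], raw["weight_kg"])
        return road, vehicle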
S120, generating a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information; the driving assistance strategy comprises a safety level and a driving assistance line.
The driving assistance policy may refer to the driving assistance instruction corresponding to the target vehicle at the current moment. The safety level may refer to safety advice information generated for the target vehicle according to the current road condition information and the current vehicle information. Illustratively, safety levels may be categorized as safe, low risk and high risk. Specifically, when the target vehicle can travel normally, the safety level may be matched to safe; when a traffic light, a crosswalk, a school or a similar facility exists ahead of the target vehicle, the safety level may be matched to low risk; and when the current speed of the target vehicle exceeds the speed limit of the current road segment, the safety level may be matched to high risk.
The driving auxiliary line may be an auxiliary line for the driving process constructed according to the current road condition information and the current vehicle information. For example, the driving auxiliary line may indicate the path direction in which the target vehicle needs to travel, and may also mark an obstacle existing in front of the target vehicle.
And S130, displaying the driving auxiliary line according to the safety level through an augmented reality technology.
For example, the color of the driving auxiliary line may be determined according to the safety level: if the safety level matches safe, the driving auxiliary line may be shown in blue; if the safety level matches low risk, the driving auxiliary line may be shown in yellow; and if the safety level matches high risk, the driving auxiliary line may be shown in red.
Specifically, when the target vehicle can travel normally, the safety level matches safe and a blue driving auxiliary line may be generated and displayed; when a traffic light, a crosswalk, a school or a similar facility exists ahead of the target vehicle, the safety level matches low risk and a yellow driving auxiliary line may be generated and displayed; and when the current speed of the target vehicle exceeds the speed limit of the current road segment, the safety level matches high risk and a red driving auxiliary line may be generated and displayed.
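A minimal sketch of this level-to-color mapping (the level names and color values simply follow the example above and are not mandated by the patent):

    # Map each safety level to the color of the driving auxiliary line.
    SAFETY_COLORS = {
        "safe": "blue",
        "low_risk": "yellow",
        "high_risk": "red",
    }

    def line_color(safety_level: str) -> str:
        # Fall back to the most cautious color for any unknown level.
        return SAFETY_COLORS.get(safety_level, "red")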
According to the technical scheme of this embodiment, a driving assistance strategy corresponding to the target vehicle is generated based on the acquired current road condition information and current vehicle information, the driving assistance strategy comprising a safety level and a driving auxiliary line; the driving auxiliary line is then displayed according to the safety level through augmented reality technology. This solves the problems of low utilization efficiency of vehicle-mounted navigation and inability to guarantee driving safety, effectively reduces the driving pressure on the driver, and lowers the risk of driving accidents.
Example two
Fig. 2 is a flowchart of a driving assistance line display method according to a second embodiment of the present invention. This embodiment refines, on the basis of the foregoing embodiment, the operation of generating a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information, which may specifically include: analyzing and processing the current road condition information, and determining the current road type, the current lane and the current road condition of the target vehicle; if the current road type is a curve, generating a driving assistance strategy corresponding to the current road condition and the current vehicle information according to a target safety calculation model; and if the current road type is a normal road, generating a driving assistance strategy corresponding to the current lane based on the current vehicle information, the current road condition and the current map navigation. As shown in Fig. 2, the method includes:
s210, acquiring current road condition information and current vehicle information of a target vehicle in real time through a vehicle-mounted camera or a vehicle-mounted radar of the target vehicle; the current road condition information comprises: the current road type, the current lane and the current road condition; the current vehicle information includes: current vehicle speed and current vehicle weight.
S220, analyzing and processing the current road condition information, and determining the current road type, the current lane and the current road condition of the target vehicle.
S230, judging the current road type, and if the current road type is a curve, executing a step S240; if the current road type is the normal road, step S250 is executed.
The analysis processing may refer to an operation of analyzing a current road type, a current lane and a current road condition in the current road condition information. For example, image processing may be used.
Specifically, after the current road condition information and the current vehicle information of the target vehicle are obtained, the current road type, the current lane and the current road condition corresponding to the target vehicle can be determined by analyzing and processing the current road condition information.
S240, generating a driving assistance strategy corresponding to the current road condition and the current vehicle information according to the target safety calculation model.
The target safety calculation model may refer to a model trained in advance for calculating a current safety level of the target vehicle.
In an optional embodiment, before generating the driving assistance policy corresponding to the current road condition and the current vehicle information according to the target safety calculation model if the current road type is a curve, the method further includes: constructing an initial data set, and performing data processing on the initial data set to generate a target data set and corresponding safety tags; and inputting the target data set and the corresponding safety tags into a preset learning algorithm model for training to obtain the target safety calculation model.
The initial data set may refer to the actual data initially collected. In general, the actual data in the initial data set may not completely match the safety influencing factors, and there may be missing values in the actual data. Data processing may refer to filtering the actual data in the initial data set and compensating for missing values. The target data set may refer to the final data set obtained by processing the initial data set. The safety tag may refer to the safety level label corresponding to each piece of data in the target data set. The preset learning algorithm model may refer to a pre-selected learning algorithm; by way of example, a random forest may be used.
Fig. 3 is a flowchart of the training process of the target safety calculation model according to an embodiment of the present invention. Specifically, an initial data set containing a large amount of actual data is first constructed; the initial data set is then processed to obtain the target data set; a preset learning algorithm model is selected; and the target data set is used to train the preset learning algorithm model, so as to obtain a target safety calculation model that meets the training standard.
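As a purely illustrative sketch of such a training step, assuming the preset learning algorithm is a random forest and that pandas and scikit-learn are available (the file name, column names and split ratio are assumptions):

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Initial data set: actual driving records with a safety label per row.
    df = pd.read_csv("initial_dataset.csv")

    # Data processing: keep only safety-relevant columns and drop rows with missing
    # values (a real pipeline might instead impute the missing values).
    df = df[["road_condition", "speed_kmh", "weight_kg", "safety_label"]].dropna()

    X = pd.get_dummies(df[["road_condition", "speed_kmh", "weight_kg"]])  # target data set
    y = df["safety_label"]                                                # safety tags

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))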
In an optional embodiment, if the current road type is a curve, generating the driving assistance policy corresponding to the current road condition and the current vehicle information according to the target safety calculation model may include: if the current road type is a curve, inputting the current road condition and the current vehicle information into a target safety calculation model to generate a safety calculation result; and determining a safety level according to the safety calculation result, determining a driving auxiliary line corresponding to the current road according to the current map navigation, and combining the safety level and the driving auxiliary line to generate a driving auxiliary strategy corresponding to the target vehicle.
The safety calculation result may refer to a safety level calculation result corresponding to the current vehicle. The current map navigation may refer to a navigation result generated by the in-vehicle navigation according to a travel path of the target vehicle.
Fig. 4 is a flowchart of the usage process of the target safety calculation model according to an embodiment of the present invention. Specifically, if the current road type is a curve, the current road condition and the current vehicle information can be input into the trained target safety calculation model to generate a safety calculation result. For example, if the current road condition is a rainy road condition, the current vehicle weight is 2000 kg and the current vehicle speed is 40 km/h, these three values may be input into the trained target safety calculation model and the safety calculation result obtained may be safe. The corresponding safety level can then be determined according to the safety calculation result, the driving auxiliary line corresponding to the current road can be determined in combination with the current map navigation, and the driving assistance strategy for the curve road type is thereby completed.
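Continuing the sketch above, the curve-case inference for the rainy, 2000 kg, 40 km/h example might look as follows; the encoding must mirror the training features, and this is an assumption rather than the patented implementation:

    # Build one sample matching the example in the description.
    sample = pd.DataFrame([{"road_condition": "rainy", "speed_kmh": 40.0, "weight_kg": 2000.0}])
    # One-hot encode and align the columns with those used during training.
    sample = pd.get_dummies(sample).reindex(columns=X.columns, fill_value=0)
    safety_level = model.predict(sample)[0]  # expected to be "safe" for this example
    print(safety_level)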
S250, generating a driving assistance strategy corresponding to the current lane based on the current vehicle information, the current road condition and the current map navigation.
In an optional embodiment, if the current road type is a normal road, generating a driving assistance policy corresponding to the current lane based on the current vehicle information, the current road condition and the current map navigation includes: if the current road type is a normal road, determining the safety level of the target vehicle based on the current vehicle information and the current road condition; and determining a driving auxiliary line corresponding to the current road based on the current map navigation, and combining the safety level and the driving auxiliary line to generate a driving auxiliary strategy corresponding to the target vehicle.
Specifically, if the current road type is a normal road, the safety level of the target vehicle can be determined according to the current vehicle speed and the current road condition, the driving auxiliary line corresponding to the current road can be determined according to the current map navigation, and then the safety level and the driving auxiliary line are combined to generate the driving auxiliary strategy corresponding to the target vehicle in the current lane.
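A minimal rule-based sketch of the normal-road case; the concrete thresholds and trigger conditions are assumptions, since the description only names the inputs:

    def normal_road_safety_level(speed_kmh: float, speed_limit_kmh: float,
                                 hazard_ahead: bool) -> str:
        # hazard_ahead: e.g. a traffic light, crosswalk or school detected ahead.
        if speed_kmh > speed_limit_kmh:
            return "high_risk"   # speeding on the current road segment
        if hazard_ahead:
            return "low_risk"    # caution required but otherwise normal driving
        return "safe"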
And S260, displaying the driving auxiliary line on a front windshield corresponding to the target vehicle according to the target color corresponding to the safety level through an augmented reality technology.
The target color may refer to a color matched with a safety level corresponding to the driving assistance line.
Specifically, after the driving assistance policy corresponding to the target vehicle is generated, the corresponding target color can be matched to the driving auxiliary line according to its safety level, and the driving auxiliary line is displayed on the front windshield of the target vehicle in the target color through augmented reality technology, so that the driving auxiliary line is superimposed on the real road.
It should be noted that if the safety level is low risk or high risk, a synchronized prompt may also be given by voice broadcast. For example, if the reason for the high risk is an excessive vehicle speed, the driver may be prompted by voice broadcast to slow down.
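Putting the display step and the voice prompt together, a sketch could look like this; the hud and tts objects are hypothetical placeholders, since the patent names no concrete AR or text-to-speech interface:

    def show_assistance_line(hud, tts, line_points, safety_level: str) -> None:
        # Draw the auxiliary line on the windshield HUD in the target color.
        hud.draw_polyline(line_points, color=SAFETY_COLORS[safety_level])
        # Give a synchronized voice prompt for the risky levels.
        if safety_level in ("low_risk", "high_risk"):
            tts.speak("Please slow down and drive carefully.")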
According to the technical scheme of this embodiment, the current road condition information and the current vehicle information of the target vehicle are collected in real time through the vehicle-mounted camera or the vehicle-mounted radar of the target vehicle, and the current road condition information is then analyzed and processed to determine the current road type, the current lane and the current road condition of the target vehicle. If the current road type is a curve, a driving assistance strategy corresponding to the current road condition and the current vehicle information is generated according to the target safety calculation model; if the current road type is a normal road, a driving assistance strategy corresponding to the current lane is generated based on the current vehicle information, the current road condition and the current map navigation. Finally, the driving auxiliary line is displayed on the front windshield of the target vehicle in the target color corresponding to the safety level through augmented reality technology. This solves the problems of low utilization efficiency of vehicle-mounted navigation and inability to guarantee driving safety, effectively reduces the driving pressure on the driver, and lowers the risk of driving accidents.
Fig. 5 is a flowchart of an alternative driving assistance line display method according to an embodiment of the present invention. Specifically, the current road condition information and the current vehicle information of the target vehicle are first collected in real time through the vehicle-mounted camera or the vehicle-mounted radar of the target vehicle; the current road condition information is then analyzed and processed to determine the current road type, the current lane and the current road condition of the target vehicle; next, a driving assistance strategy corresponding to the target vehicle is generated based on the current road condition, the current vehicle information and the current map navigation; and finally, the driving auxiliary line is displayed on the front windshield of the target vehicle in the target color corresponding to the safety level through augmented reality technology.
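Combining the sketches above, one possible reading of the Fig. 5 flow as a single display cycle (every interface here, i.e. sensor, navigation, hud and tts, is an assumed placeholder, not part of the disclosure):

    def display_cycle(sensor, navigation, hud, tts) -> None:
        road, vehicle = collect_sensor_frame(sensor)
        if road.road_type == "curve":
            # Encode the inputs exactly as in the training sketch, then query the model.
            features = pd.get_dummies(pd.DataFrame([{
                "road_condition": road.road_condition,
                "speed_kmh": vehicle.speed_kmh,
                "weight_kg": vehicle.weight_kg,
            }])).reindex(columns=X.columns, fill_value=0)
            safety_level = model.predict(features)[0]
        else:
            safety_level = normal_road_safety_level(vehicle.speed_kmh,
                                                    navigation.speed_limit_kmh(),
                                                    navigation.hazard_ahead())
        # The auxiliary line itself comes from the current map navigation.
        show_assistance_line(hud, tts, navigation.route_polyline(), safety_level)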
Example III
Fig. 6 is a schematic structural diagram of a display device for driving auxiliary lines according to a third embodiment of the present invention. As shown in fig. 6, the apparatus includes: a data acquisition module 310, an auxiliary policy generation module 320, and an auxiliary line display module 330;
the data acquisition module 310 is configured to acquire current road condition information and current vehicle information of a target vehicle;
the auxiliary policy generating module 320 is configured to generate a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information; the driving assistance strategy comprises a safety level and a driving assistance line;
and the auxiliary line display module 330 is used for displaying the driving auxiliary line according to the safety level through an augmented reality technology.
According to the technical scheme of this embodiment, a driving assistance strategy corresponding to the target vehicle is generated based on the acquired current road condition information and current vehicle information, the driving assistance strategy comprising a safety level and a driving auxiliary line; the driving auxiliary line is then displayed according to the safety level through augmented reality technology. This solves the problems of low utilization efficiency of vehicle-mounted navigation and inability to guarantee driving safety, effectively reduces the driving pressure on the driver, and lowers the risk of driving accidents.
Optionally, the data acquisition module 310 may specifically be configured to: collecting current road condition information and current vehicle information of a target vehicle in real time through a vehicle-mounted camera or a vehicle-mounted radar of the target vehicle; the current road condition information comprises: the current road type, the current lane and the current road condition; the current vehicle information includes: current vehicle speed and current vehicle weight.
Optionally, the auxiliary policy generating module 320 may specifically include: the system comprises a data analysis unit, a first strategy generation unit and a second strategy generation unit;
the data analysis unit is used for analyzing and processing the current road condition information and determining the current road type, the current lane and the current road condition of the target vehicle;
the first strategy generation unit is used for generating a driving auxiliary strategy corresponding to the current road condition and the current vehicle information according to the target safety calculation model if the current road type is a curve;
and the second strategy generation unit is used for generating a driving auxiliary strategy corresponding to the current lane based on the current vehicle information, the current road condition and the current map navigation if the current road type is a normal road.
Optionally, the display device of the driving assistance line may further include: a model construction module, configured to, before a driving assistance strategy corresponding to the current road condition and the current vehicle information is generated according to the target safety calculation model when the current road type is a curve, construct an initial data set and perform data processing on the initial data set to generate a target data set and corresponding safety tags; and input the target data set and the corresponding safety tags into a preset learning algorithm model for training to obtain the target safety calculation model.
Optionally, the first policy generating unit may specifically be configured to:
if the current road type is a curve, inputting the current road condition and the current vehicle information into a target safety calculation model to generate a safety calculation result;
and determining a safety level according to the safety calculation result, determining a driving auxiliary line corresponding to the current road according to the current map navigation, and combining the safety level and the driving auxiliary line to generate a driving auxiliary strategy corresponding to the target vehicle.
Optionally, the second policy generating unit may specifically be configured to:
if the current road type is a normal road, determining the safety level of the target vehicle based on the current vehicle information and the current road condition;
and determining a driving auxiliary line corresponding to the current road based on the current map navigation, and combining the safety level and the driving auxiliary line to generate a driving auxiliary strategy corresponding to the target vehicle.
Optionally, the auxiliary line display module 330 may specifically be configured to: and displaying the driving auxiliary line on a front windshield corresponding to the target vehicle according to the target color corresponding to the safety level through an augmented reality technology.
The display device of the driving auxiliary line provided by the embodiment of the invention can execute the display method of the driving auxiliary line provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 7 shows a schematic diagram of an electronic device 410 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in Fig. 7, the electronic device 410 includes at least one processor 420, and a memory communicatively coupled to the at least one processor 420, such as a Read Only Memory (ROM) 430 and a Random Access Memory (RAM) 440. The memory stores a computer program executable by the at least one processor, and the processor 420 may perform various suitable actions and processes according to the computer program stored in the ROM 430 or the computer program loaded from the storage unit 490 into the RAM 440. The RAM 440 may also store various programs and data required for the operation of the electronic device 410. The processor 420, the ROM 430 and the RAM 440 are connected to each other by a bus 450. An input/output (I/O) interface 460 is also connected to the bus 450.
Various components in the electronic device 410 are connected to the I/O interface 460, including: an input unit 470 such as a keyboard, a mouse, etc.; an output unit 480 such as various types of displays, speakers, and the like; a storage unit 490, such as a magnetic disk, an optical disk, or the like; and a communication unit 4100, such as a network card, modem, wireless communication transceiver, etc. The communication unit 4100 allows the electronic device 410 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunications networks.
Processor 420 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of processor 420 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any other suitable processor, controller or microcontroller. The processor 420 performs the various methods and processes described above, such as the driving assistance line display method.
The method comprises the following steps:
acquiring current road condition information and current vehicle information of a target vehicle;
generating a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information; the driving assistance strategy comprises a safety level and a driving assistance line;
and displaying the driving auxiliary line according to the safety level through an augmented reality technology.
In some embodiments, the method of driving assistance line display may be implemented as a computer program, which is tangibly embodied on a computer-readable storage medium, such as the storage unit 490. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 410 via the ROM 430 and/or the communication unit 4100. When the computer program is loaded into RAM440 and executed by processor 420, one or more steps of the above-described driving assistance line display method may be performed. Alternatively, in other embodiments, the processor 420 may be configured to perform the method of displaying the driving assistance line in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs, where the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor; the programmable processor may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service expansibility of traditional physical hosts and Virtual Private Server (VPS) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A display method of a driving assistance line, comprising:
acquiring current road condition information and current vehicle information of a target vehicle;
generating a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information; the driving assistance strategy comprises a safety level and a driving assistance line;
and displaying the driving auxiliary line according to the safety level through an augmented reality technology.
2. The method according to claim 1, wherein the obtaining the current road condition information and the current vehicle information of the target vehicle includes:
collecting current road condition information and current vehicle information of a target vehicle in real time through a vehicle-mounted camera or a vehicle-mounted radar of the target vehicle; the current road condition information comprises: the current road type, the current lane and the current road condition; the current vehicle information includes: current vehicle speed and current vehicle weight.
3. The method of claim 1, wherein generating a driving assistance policy corresponding to the target vehicle based on the current road condition information and the current vehicle information comprises:
analyzing and processing the current road condition information, and determining the current road type, the current lane and the current road condition of the target vehicle;
if the current road type is a curve, generating a driving auxiliary strategy corresponding to the current road condition and the current vehicle information according to the target safety calculation model;
and if the current road type is a normal road, generating a driving assistance strategy corresponding to the current lane based on the current vehicle information, the current road condition and the current map navigation.
4. The method of claim 3, further comprising, before generating a driving assistance policy corresponding to the current road condition and the current vehicle information according to the target safety calculation model if the current road type is a curve:
constructing an initial data set, and performing data processing on the initial data set to generate a target data set and corresponding safety tags;
and inputting the target data set and the corresponding safety tags into a preset learning algorithm model for training to obtain the target safety calculation model.
5. The method of claim 4, wherein if the current road type is a curve, generating a driving assistance policy corresponding to the current road condition and the current vehicle information according to the target safety calculation model comprises:
if the current road type is a curve, inputting the current road condition and the current vehicle information into a target safety calculation model to generate a safety calculation result;
and determining a safety level according to the safety calculation result, determining a driving auxiliary line corresponding to the current road according to the current map navigation, and combining the safety level and the driving auxiliary line to generate a driving auxiliary strategy corresponding to the target vehicle.
6. The method of claim 3, wherein generating a driving assistance policy corresponding to the current lane based on the current vehicle information, the current road condition, and the current map navigation if the current road type is a normal road comprises:
if the current road type is a normal road, determining the safety level of the target vehicle based on the current vehicle information and the current road condition;
and determining a driving auxiliary line corresponding to the current road based on the current map navigation, and combining the safety level and the driving auxiliary line to generate a driving auxiliary strategy corresponding to the target vehicle.
7. The method of claim 1, wherein the displaying the driving assistance line by augmented reality technology according to a security level comprises:
and displaying the driving auxiliary line on a front windshield corresponding to the target vehicle according to the target color corresponding to the safety level through an augmented reality technology.
8. A display device for a driving assistance line, comprising:
the data acquisition module is used for acquiring the current road condition information and the current vehicle information of the target vehicle;
the auxiliary strategy generation module is used for generating a driving assistance strategy corresponding to the target vehicle based on the current road condition information and the current vehicle information; the driving assistance strategy comprises a safety level and a driving assistance line;
and the auxiliary line display module is used for displaying the driving auxiliary line according to the safety level through an augmented reality technology.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the driving assistance line display method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing computer instructions for causing a processor to execute the method of displaying a drive line according to any one of claims 1 to 7.
CN202310751628.4A 2023-06-25 2023-06-25 Driving auxiliary line display method, device, equipment and medium Pending CN116767262A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310751628.4A CN116767262A (en) 2023-06-25 2023-06-25 Driving auxiliary line display method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310751628.4A CN116767262A (en) 2023-06-25 2023-06-25 Driving auxiliary line display method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN116767262A true CN116767262A (en) 2023-09-19

Family

ID=88005978

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310751628.4A Pending CN116767262A (en) 2023-06-25 2023-06-25 Driving auxiliary line display method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN116767262A (en)

Similar Documents

Publication Publication Date Title
CN115480726B (en) Display method, display device, electronic equipment and storage medium
CN113052047B (en) Traffic event detection method, road side equipment, cloud control platform and system
CN114463985A (en) Driving assistance method, device, equipment and storage medium
CN115273477A (en) Crossing driving suggestion pushing method, device and system and electronic equipment
CN113435392A (en) Vehicle positioning method and device applied to automatic parking and vehicle
CN117168488A (en) Vehicle path planning method, device, equipment and medium
CN116767262A (en) Driving auxiliary line display method, device, equipment and medium
US20230029628A1 (en) Data processing method for vehicle, electronic device, and medium
CN114998863B (en) Target road identification method, device, electronic equipment and storage medium
CN114694401B (en) Method and device for providing reference vehicle speed in high-precision map and electronic equipment
EP3989201B1 (en) Light state data processing method, apparatus and system
CN115782919A (en) Information sensing method and device and electronic equipment
CN115959154A (en) Method and device for generating lane change track and storage medium
CN115063765A (en) Road side boundary determining method, device, equipment and storage medium
CN113591569A (en) Obstacle detection method, obstacle detection device, electronic apparatus, and storage medium
CN114911813B (en) Updating method and device of vehicle-mounted perception model, electronic equipment and storage medium
CN113806361B (en) Method, device and storage medium for associating electronic monitoring equipment with road
CN115857176B (en) Head-up display, height adjusting method and device thereof and storage medium
CN115771460B (en) Display method and device for lane change information of vehicle, electronic equipment and storage medium
CN116299199A (en) Radar distance display method and device, electronic equipment and storage medium
CN117593896A (en) Opposite incoming vehicle early warning method, device, equipment and storage medium
CN117407311A (en) Test scene generation method and device, electronic equipment and storage medium
CN117533301A (en) Method and device for controlling vehicle to run based on vehicle running information
CN116520837A (en) Method, device, equipment and storage medium for predicting real-time track of vehicle
CN118062010A (en) Channel change decision method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination