CN113895458B - Vehicle driving behavior management method and device, vehicle and storage medium - Google Patents
- Publication number
- CN113895458B CN113895458B CN202111248414.2A CN202111248414A CN113895458B CN 113895458 B CN113895458 B CN 113895458B CN 202111248414 A CN202111248414 A CN 202111248414A CN 113895458 B CN113895458 B CN 113895458B
- Authority
- CN
- China
- Prior art keywords
- vehicle
- driving
- driving behavior
- node
- behavior
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/005—Handover processes
- B60W60/0053—Handover processes from vehicle to occupant
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W40/09—Driving style or behaviour
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/30—Driving style
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
Abstract
The application provides a method and device for managing the driving behavior of a vehicle, a vehicle, and a storage medium. The method is applied to a vehicle comprising a central control display screen that displays a basic driving interface including 3D map navigation, and the method comprises: in an automatic driving state, calculating, according to the expected driving behavior of the vehicle, key nodes at which the driving behavior changes; and displaying a node graph representing each key node, the node graph being placed at the road surface position in the 3D map navigation where the driving behavior changes. In this way, the key nodes corresponding to the positions where the expected driving behavior of the vehicle changes are presented visually, so that occupants can grasp the driving dynamics in advance through the displayed key nodes and can intuitively and correctly judge whether the vehicle remains suited to its automatic driving state, thereby avoiding hidden driving safety risks and improving the occupants' trust in the vehicle's automatic driving capability.
Description
Technical Field
Embodiments of the invention relate to the technical field of automatic driving, and in particular to a method and device for managing the driving behavior of a vehicle, a vehicle, and a storage medium.
Background
Autonomous vehicles are increasingly entering public view; they achieve automated driving mainly through an integrated automatic driving control system.
Currently, automatic driving is implemented mainly by an automatic driving control system that combines environmental information around the vehicle with the vehicle's running information to make driving decisions in real time or in advance. Meanwhile, an information display device configured in the vehicle can present the running state of automatic driving in the form of a graphical interface.
However, the existing graphical interface can only present the current automatic driving state, and during navigation users are reminded only relative to static buildings and positions, for example, "perform a driving operation 50 meters ahead of the building". Moreover, when the vehicle can no longer continue automatic driving, the interface cannot effectively inform the user in advance, so the user can neither intuitively and correctly judge the vehicle's automatic driving state nor take over in time when needed, leaving a hidden driving safety risk.
Disclosure of Invention
The embodiments of the present application provide a method and device for managing the driving behavior of a vehicle, a vehicle, and a storage medium, which achieve effective display of the expected driving behavior of the vehicle in an automatic driving state.
In a first aspect, an embodiment of the present application provides a method for managing the driving behavior of a vehicle, applied to a vehicle comprising a central control display screen that displays a basic driving interface including 3D map navigation. The method comprises:
in an automatic driving state, calculating key nodes of driving behavior change in the expected driving behavior according to the expected driving behavior of the vehicle;
and displaying a node graph representing the key nodes, wherein the node graph is placed at a road surface position with changed driving behavior in the 3D map navigation.
According to the method for managing the driving behavior of a vehicle provided by this embodiment, in an automatic driving state, key nodes at which the driving behavior changes can be calculated from the expected driving behavior of the vehicle, and a node graph representing each key node is then displayed. Executing the method means that, while the vehicle runs in an automatic driving state, its navigation route can be displayed together with a visual presentation of the key nodes corresponding to the positions where the expected driving behavior changes; that is, the key nodes can be represented by node graphs displayed on the basic driving interface or on the navigated road surface. In this way, occupants can grasp the driving dynamics in advance through the displayed key nodes, intuitively and correctly judge whether the vehicle remains suited to its automatic driving state, and take over the vehicle in time when required, thereby avoiding hidden driving safety risks and improving the occupants' trust in the vehicle's automatic driving capability.
Further, calculating a key node at which the driving behavior changes, according to the expected driving behavior of the vehicle, comprises:
acquiring the expected driving behavior of the vehicle within a first preset time period of automatic driving, wherein the first preset time period is a period of a first set duration extending forward from the current moment;
and determining a key node of the expected driving behavior, the key node being a position node at which a driving behavior changes the state of the vehicle.
This optional feature gives a specific implementation of the key node calculation, thereby providing technical support for visually displaying, on the central control display screen, the key nodes representing a change of driving behavior, and ensuring the feasibility of the method provided by this embodiment.
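As a rough sketch of this key-node calculation (not the patented implementation: the data structures, names, and the simple "behavior differs from its predecessor" test are illustrative assumptions), the logic can be expressed as:

```python
from dataclasses import dataclass

@dataclass
class ExpectedBehavior:
    name: str          # e.g. "cruise", "accelerate", "lane_change"
    start_time: float  # seconds after the current moment

def key_nodes(behaviors, horizon_s):
    """Return the expected behaviors inside the first preset time period
    whose onset differs from the preceding behavior, i.e. the points at
    which the vehicle state changes."""
    in_window = [b for b in behaviors if 0.0 <= b.start_time <= horizon_s]
    nodes = []
    for prev, nxt in zip(in_window, in_window[1:]):
        if nxt.name != prev.name:  # a change of driving behavior
            nodes.append(nxt)
    return nodes

plan = [ExpectedBehavior("cruise", 0.0),
        ExpectedBehavior("accelerate", 4.0),
        ExpectedBehavior("lane_change", 30.0)]
```

With a 10-second first preset period, only the acceleration at 4 s would qualify as a key node; a longer horizon would also pick up the lane change.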
Further, displaying a node graph representing the key node comprises:
determining the interval duration between the moment at which the expected driving behavior changes and the current moment;
determining, in the road running environment, the specific road position at which the driving behavior change occurs, according to the current running speed of the vehicle, the interval duration, and the current position;
determining the driving-behavior change position of the key node in the 3D map navigation from that specific road position;
and marking the driving-behavior change position with a node graph, and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle;
wherein the presentation position of the node graph matches the actual position at which the driving behavior change occurs and lies on the driving route of the vehicle.
Further, marking the driving-behavior change position with a node graph, and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle, comprises:
determining the driving behavior that is about to change, recording it as the driving behavior to be changed, and determining the node presentation attributes matched with that driving behavior;
and marking the driving-behavior change position with a node graph having those node presentation attributes, and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle.
This optional feature and the preceding one both provide technical support for visually displaying, on the central control display screen, the key nodes representing a change of driving behavior, for example by giving a specific implementation of displaying the node graph corresponding to a key node, further ensuring the feasibility of the method provided by this embodiment.
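These display steps amount to dead-reckoning the change position from the current speed and the interval duration, then anchoring a node graph with matching presentation attributes at that position on the 3D-map route. A minimal sketch under a constant-speed assumption; `route.point_at` and all other names are hypothetical, not APIs from the patent:

```python
def change_road_position(current_pos_m, speed_mps, interval_s):
    """Distance along the route (metres) at which the driving behavior
    change occurs, assuming roughly constant speed over the interval."""
    return current_pos_m + speed_mps * interval_s

def place_node_graph(route, change_pos_m, node_attrs):
    """Anchor a node graph on the 3D-map route at the change position.
    `route.point_at` is a hypothetical API returning map-frame coordinates
    on the road surface at a given distance along the route."""
    x, y, z = route.point_at(change_pos_m)
    return {"position": (x, y, z), **node_attrs}  # e.g. {"shape": "rect"}
```

For example, a vehicle at route-position 120 m travelling at 15 m/s, with a behavior change 4 s away, yields a node at 120 + 15 × 4 = 180 m along the route.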
Further, the method further comprises:
receiving automatic driving intervention triggering operation;
displaying a driving intervention control area through a central control display screen, wherein the driving intervention control area comprises at least one driving control assembly;
receiving a control operation and issuing an operation instruction, wherein the control operation is that the driver selects a target driving control component from the driving intervention control area and drags it to a target point on the basic driving interface, and the operation instruction is an instruction to execute the target driving control component at the target point;
and responding to the operation instruction, and controlling the driving behavior of the vehicle.
This optional feature is a newly added function of the method for managing the driving behavior of a vehicle provided by this embodiment. With it, when the driver needs to operate the vehicle in the automatic driving state, the driving behavior can be operated flexibly through the presented driving control components. Compared with the prior art, which first switches to driver mode and only then responds to the driver, this feature requires no switching of driving modes, thereby avoiding the potential running danger during mode switching and the impact of frequent mode switching on the vehicle's driving performance. Moreover, the driver only needs to select and drag a driving control component for the vehicle to respond automatically to the control action, without any substantial manual operation of the vehicle; this better reflects the dominant position of automatic driving and effectively increases the automatic-driving value of the vehicle.
Further, controlling driving behavior of the vehicle in response to the operation instruction includes:
extracting a target driving control component and a target point in the operation instruction;
obtaining an actual environment position corresponding to the target point in a road running environment and a target driving behavior corresponding to the target driving control component;
controlling the vehicle to perform the target driving behavior at the actual environmental position.
These method steps provide a specific implementation of controlling the vehicle to change its driving behavior in response to the operation instruction, defining the control timing, the specific control position, and so on, and providing technical support for the driver to operate the vehicle flexibly in the automatic driving state.
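A minimal sketch of this response flow; the component names, the `map_to_world` conversion, and the `schedule` interface are all illustrative assumptions rather than the patented implementation:

```python
COMPONENT_TO_BEHAVIOR = {
    "accelerate": "vehicle_acceleration",
    "lane_change": "vehicle_lane_change",
    "overtake": "vehicle_overtaking",
    "pull_over": "parking",
}

def handle_operation(instruction, map_to_world, vehicle):
    """Dispatch one drag-and-drop operation instruction: resolve the dragged
    control component to a target driving behavior, convert the target point
    on the basic driving interface to an actual road-environment position,
    and schedule the behavior there."""
    component = instruction["component"]        # the dragged control component
    target_point = instruction["target_point"]  # point on the interface
    world_pos = map_to_world(target_point)      # actual environment position
    behavior = COMPONENT_TO_BEHAVIOR[component]
    vehicle.schedule(behavior, at=world_pos)    # execute at that position
    return behavior, world_pos

class StubVehicle:
    """Minimal stand-in for the vehicle's control interface."""
    def __init__(self):
        self.scheduled = []
    def schedule(self, behavior, at):
        self.scheduled.append((behavior, at))
```

The design keeps interface handling separate from vehicle control: the same dispatch works whether the target point is resolved from a camera view or a simulated 3D scene, as long as `map_to_world` converts it to a road-environment position.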
Further, each driving control component corresponds to one driving behavior to be controlled of the vehicle;
the driving behavior to be controlled comprises at least one of the following: vehicle acceleration, vehicle deceleration, vehicle lane change, vehicle overtaking, vehicle turning around, vehicle steering, parking, and take over vehicle maneuvers.
This optional feature gives the specific attributes of the driving control components and clarifies the driving behaviors that the driver can control through them, ensuring that the driver has a wider controllable range in the automatic driving state.
Further, the method further comprises:
and displaying, with an identification graph, the driving behavior of the vehicle within a second preset time period on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle, wherein the second preset time period is a period of a second set duration extending forward from the current moment.
On the basis of visualizing the places where the driving behavior changes and of the driver's flexible control over the driving behavior, this optional feature further enriches the visual presentation of the vehicle's driving behavior by displaying it graphically on the central control display screen. Occupants can thus understand the driving state more intuitively, the vehicle's driving intention over a certain future period is displayed completely and effectively when the driving behavior changes, and the user's experience of the vehicle is comprehensively improved.
Further, displaying, with an identification graph, the driving behavior of the vehicle within the second preset time period comprises:
obtaining the driving behaviors included within the second preset time period and recording each of them as a driving behavior to be displayed;
and controlling each driving behavior to be displayed to be shown in its corresponding identification-graph display form, according to the execution order of the driving behaviors to be displayed.
This optional feature provides technical support for the complete visual display of the vehicle's driving behavior on the central control display screen, ensuring the feasibility of the method provided by this embodiment.
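A small sketch of selecting and ordering the behaviors to display within the second preset time period; the dictionary layout and field names are illustrative assumptions:

```python
def behaviors_to_display(expected, now_s, second_period_s):
    """Pick out the driving behaviors falling inside the second preset time
    period and order them by execution time, so that each can be rendered in
    its matching identification-graph display form."""
    window = [b for b in expected
              if now_s <= b["start_s"] <= now_s + second_period_s]
    return sorted(window, key=lambda b: b["start_s"])

plan = [{"name": "lane_change", "start_s": 8.0},
        {"name": "cruise", "start_s": 0.0},
        {"name": "turn", "start_s": 30.0}]
```

With a 10-second second preset period starting now, the cruise and lane-change behaviors would be displayed, in that order, while the turn at 30 s falls outside the window.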
Further, the method further comprises:
and if the driving behavior of the vehicle changes within the second preset time period, controlling the displayed identification graph to change, as a reminder, with a matching dynamic effect.
This optional feature is another newly added function of the method provided by this embodiment. It mainly optimizes the graphical display of driving behavior, and the dynamic-effect presentation at the moment the driving behavior changes further improves the visual effect of the graphical display.
Further, the display width of the identification graph is smaller than the width of one lane of the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle;
the presentation length of the identification graph covers the distance driven forward, from the current position of the vehicle, within the second preset time period;
and the identification graph includes key nodes representing the moments at which the vehicle's driving behavior changes, the node presentation attributes of the key nodes being matched with the driving behavior to be changed.
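Under this definition, the presentation length reduces to the current speed multiplied by the second preset duration. A one-function sketch (units and names are illustrative, not from the patent):

```python
def identification_graph_length_m(current_speed_mps, second_period_s):
    """Presentation length of the identification graph: the distance the
    vehicle would drive forward from its current position during the second
    preset time period, at its current speed."""
    return current_speed_mps * second_period_s
```

For instance, at 20 m/s with a 5-second second preset period, the identification graph would extend 100 m ahead of the vehicle along the route.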
Further, the identification graph of each driving behavior is displayed as a basic path, represented either by a continuous curve or straight line of a set width, or by a group of unit graphics arranged along a curve or straight line;
different driving behaviors correspond to different presentation sizes and colors of the basic path, or to different colors, shapes, and sizes of the unit graphics forming the basic path;
the shape of a unit graphic is at least one of a circle, a dot, a triangle, an arrow, a square, and a three-dimensional figure;
and the spacing between adjacent unit graphics represents the current speed of the vehicle.
This optional feature and the preceding one define the static characteristics of the identification graph, enriching its attribute features.
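Since the spacing of adjacent unit graphics represents the current vehicle speed, one simple assumed mapping is a linear gap function; the coefficients below are purely illustrative:

```python
def unit_graphic_positions(start_m, length_m, speed_mps,
                           base_gap_m=2.0, gap_per_mps=0.3):
    """Positions (metres along the route) of the unit graphics forming a
    basic path. The gap between adjacent unit graphics grows linearly with
    the current vehicle speed, so the spacing itself conveys speed."""
    gap = base_gap_m + gap_per_mps * speed_mps
    positions, pos = [], start_m
    while pos <= start_m + length_m:
        positions.append(pos)
        pos += gap
    return positions
```

At 10 m/s the unit graphics would sit 5 m apart; at standstill they would bunch up to the 2 m base gap, giving occupants an at-a-glance sense of speed.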
In a second aspect, an embodiment of the present application provides a management device for driving behavior of a vehicle, configured to be configured with the vehicle, where the vehicle includes a central control display screen, and the central control display screen displays a basic driving interface including 3D map navigation, where the device includes:
The key node calculation module is used for calculating key nodes of driving behavior change in the expected driving behavior according to the expected driving behavior of the vehicle in an automatic driving state;
and the key node display module is used for displaying a node graph representing the key node, and the node graph is arranged at a road surface position where the driving behavior is changed in the 3D map navigation.
In a third aspect, embodiments of the present application provide a vehicle, including:
a central control display screen;
one or more controllers;
a storage means for storing one or more programs;
when the one or more programs are executed by the one or more controllers, the one or more controllers are caused to implement the method for managing driving behavior of a vehicle according to the first aspect of the embodiment described above.
In a fourth aspect, embodiments of the present application provide a storage medium containing computer-executable instructions that, when executed by a computer processor, implement a method of managing driving behavior of a vehicle as described in the first aspect of the above embodiments.
According to the method for managing the driving behavior of a vehicle provided by the embodiments of the present application, the key nodes corresponding to the positions where the expected driving behavior changes in the automatic driving state are presented visually. Occupants can therefore grasp the driving dynamics in advance through the displayed key nodes, intuitively and correctly judge whether the vehicle remains suited to its automatic driving state, and take over the vehicle in time when a driver takeover is required, better avoiding hidden driving safety risks. Meanwhile, when the driver needs to operate the vehicle in the automatic driving state, the driving behavior can be operated flexibly through the presented driving control components without switching driving modes, which avoids the potential running danger during mode switching and the impact of frequent mode switching on driving performance. In addition, on the basis of visualizing the key nodes of driving-behavior change and enabling the driver's flexible operation, the method further enriches the visual presentation of the vehicle's driving behavior, in particular by displaying it graphically on the central control display screen; occupants can thus understand the driving state more intuitively, and the vehicle's driving intention over a certain future period is displayed completely and effectively when the driving behavior changes.
In summary, the method provided by the embodiments effectively increases the automatic-driving value of the vehicle and comprehensively improves the user's experience of the vehicle.
Drawings
Fig. 1 is a flow chart of a method for managing driving behavior of a vehicle according to an embodiment of the present application;
FIG. 2 is an exemplary effect diagram of a graphical display of nodes in the method provided by the present application;
FIG. 3 is a flowchart illustrating the calculation of a key node in the method according to the embodiment of the present application;
FIG. 4 is a flowchart illustrating an implementation of node graphical display in a method according to an embodiment of the present application;
FIG. 5 is a flow chart showing the response of the method according to the present embodiment to the operation of the driver in the automatic driving state;
FIG. 6 is a diagram illustrating a scenario for implementing a driving behavior response operated by a driver in a method for managing driving behavior of a vehicle according to an embodiment of the present disclosure;
FIG. 7 is a flow chart of an implementation of a method according to an embodiment of the present application in response to driver-triggered driving behavior;
FIG. 8 shows a display form of an identification pattern matched with a constant-speed driving of a vehicle in the method for managing driving behavior of a vehicle according to the embodiment;
FIG. 9 shows a display form of an identification pattern matched with a lane change of a vehicle in the method for managing driving behavior of the vehicle according to the present embodiment;
FIG. 10 shows a display form of an identification pattern matched with a vehicle overtaking in the method for managing driving behavior of a vehicle according to the present embodiment;
FIG. 11 shows a pattern of display of a logo matching the steering of a vehicle in the method for managing driving behavior of a vehicle according to the present embodiment;
FIG. 12 shows a display form of an identification pattern matched with a turning of a vehicle in the method for managing driving behavior of a vehicle according to the present embodiment;
FIG. 13 is a diagram showing a pattern of identification matching a vehicle stop in the method for managing driving behavior of a vehicle according to the present embodiment;
FIG. 14 is a diagram showing a pattern of identification matching for the end of automatic driving in the method for managing driving behavior of a vehicle according to the present embodiment;
FIG. 15 shows a pattern of display of a logo matched with a vehicle pre-warning in the method for managing driving behavior of a vehicle according to the present embodiment;
fig. 16 is a schematic structural diagram of a device for managing driving behavior of a vehicle according to an embodiment of the present application;
fig. 17 is a block diagram of a vehicle according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application.
The ordinal terms such as "first," "second," and the like in the embodiments of the present application are used for distinguishing a plurality of objects, and are not used to define the size, content, sequence, timing, application scenario, priority, importance, and the like of the plurality of objects.
Fig. 1 is a flow chart of a method for managing the driving behavior of a vehicle according to an embodiment of the present application. The method is applicable to managing the driving behavior of a vehicle in an automatic driving state and may be executed by the device for managing the driving behavior of a vehicle provided by an embodiment of the present application, which may be implemented in software and/or hardware and integrated in the vehicle. The vehicle further comprises a central control display screen, on which a basic driving interface with 3D map navigation is displayed.
It should be noted that the method provided by this embodiment may be applied while the vehicle is in an automatic driving state and running along a pre-planned route. While the vehicle is driving, a road environment picture is presented in the basic driving interface of the central control display screen. The road environment picture includes, but is not limited to, a real-time live road view based on a camera, or a road environment simulated by a 3D model and its algorithms based on lidar, millimeter-wave radar, ultrasonic radar, or visual recognition; the vehicle's 3D map navigation is also displayed in the road environment picture to show its driving path.
Specifically, as shown in fig. 1, the method for managing driving behavior of a vehicle according to the present embodiment includes:
s101, under an automatic driving state, calculating key nodes for driving behavior change in the expected driving behavior according to the expected driving behavior of the vehicle.
It should be noted that, while the steps of the method provided by this embodiment are executed, the vehicle can be considered to remain in an automatic driving state, so the execution subject of the method can be considered to be the automatic driving system in the vehicle.
In this embodiment, the automatic driving system in the vehicle may determine driving decisions for a certain future period by reading relevant information from the road running environment, and the predicted result may serve as the expected driving behavior.
In general, the driving behaviors of a vehicle may include vehicle acceleration, vehicle deceleration, vehicle lane change, vehicle U-turn, vehicle overtaking, vehicle steering, changing the automatic-driving grade, and so on, each driving behavior actually corresponding to a driving state of the vehicle. In this embodiment, when the expected driving behavior determined for a future period includes at least two driving behaviors, the vehicle can be considered to change from one driving behavior to another within that period. The position at which one driving behavior changes to another, changing the vehicle state, can be recorded as a key node.
In calculating a key node, one implementation is to determine the moment at which the vehicle changes from one driving state to another, and then determine the position at which the change occurs by combining the vehicle speed at the current moment with the vehicle's current position; that position can then be taken as the key node of the driving-state change.
S102. Display a node graph representing the key node, the node graph being placed at the road surface position in the 3D map navigation where the driving behavior changes.
In this embodiment, after the expected driving behavior is obtained through the above steps, if it is determined to contain a key node at which the driving state of the vehicle changes, the key node can be visualized through this step. Specifically, a key node is visualized as a node graph. The displayed node graph is preferably placed within the 3D navigation map, specifically at the road surface position where the driving behavior changes. The node graph presented in the 3D navigation map can be displayed in the basic driving interface or on the navigated road surface outside the vehicle.
The key node may represent, in the basic driving interface, the starting position of the driving-behavior change corresponding to the moment at which the vehicle's driving behavior changes. To display a key node, its starting position in the 3D navigation map is determined; this position can be derived from the specific change position in the vehicle's actual road running environment, which in turn can be determined from the moment at which the driving behavior changes, the current running speed of the vehicle, and other information.
The step can obtain the corresponding driving behavior change position in the 3D navigation map based on the determined specific change position conversion, and display the key node in a node graph at the driving behavior change position. The node graph displayed can be a graph with a given size or a given shape (such as a rectangle with a certain size or a circle with a certain radius).
It can be understood that the information given by the central control display screen may show the driving road surface outside the vehicle from the driver's viewing angle, or may show the basic driving interface of the vehicle driving on the road from a viewing angle above the road surface. The viewing angle of the central control display screen does not affect the display of the node graph: when the driving road surface is shown from the driver's viewing angle, the node graph can be displayed on the driving road surface outside the vehicle; when the vehicle is shown driving on the road from a viewing angle above the road surface, the node graph can be displayed on the basic driving interface.
By way of example, FIG. 2 illustrates an example effect diagram of a node graph display in the method provided herein. As shown in FIG. 2, the presented interface is mainly a basic driving interface displayed from a viewing angle above the road surface. The vehicle 1 is regarded as the target vehicle and is currently driving in the second lane, and the navigation route 01 of the vehicle is presented in the basic driving interface. When the method of this embodiment predicts that the expected driving behavior includes acceleration of the vehicle within a future period of time, this corresponds to a driving behavior change relative to the current uniform driving, so the node graph 10 is displayed in the basic driving interface, specifically on the navigation route 01 of the vehicle. The node graph 10 is presented in rectangular form and represents the initial change position of the driving state when the vehicle 1 executes the driving behavior of acceleration.
The method for managing the driving behavior of a vehicle provided by this embodiment of the application means that, when the vehicle runs in an automatic driving state, the driving navigation route of the vehicle can be displayed, and the key node corresponding to the position at which the driving behavior changes in the expected driving behavior of the vehicle can be presented visually, namely through the node graph displayed on the basic driving interface or on the navigation road surface of the vehicle. In this way, the persons in the vehicle can grasp the driving dynamics in advance through the displayed key node, can intuitively and correctly judge whether the automatic driving state continues to be suitable for the vehicle, and can take over the vehicle in time when required, so that potential driving safety hazards are avoided and the trust of the persons in the vehicle in the automatic driving capability of the vehicle is improved.
As a first alternative embodiment of the present embodiment, based on the above embodiment, this first alternative embodiment further details the calculation of key nodes of driving behavior changes from the expected driving behavior of the vehicle. FIG. 3 shows a flowchart of the key node calculation in the method provided by the embodiment of the present application. As shown in FIG. 3, the first alternative embodiment specifically includes the following steps:
S1011, acquiring the expected driving behavior of the vehicle under automatic driving within a first preset time period, wherein the first preset time period is a period of a first set length starting from the current moment.
In this embodiment, the driving decision of the automatic driving system for the vehicle within the first preset time period may be obtained through this step and recorded as the expected driving behavior. The first set length may preferably be selected from 2 seconds to 10 seconds. Since the determination of the driving decision of the vehicle by the automatic driving system is an existing implementation, it is not described again here.
S1012, determining a key node of the expected driving behavior, wherein the key node is the position node at which the vehicle state changes due to a driving behavior change.
In this embodiment, through this step, it may be predicted whether the state of the vehicle will change when the vehicle travels according to the obtained expected driving behavior. If so, the driving behavior of the vehicle may be considered to change, and the position node at which the driving behavior change occurs may be referred to as a key node.
As described above, each driving behavior actually corresponds to a driving change state of the vehicle, and each driving change state has a start time of the change. This embodiment may record that start time as the moment at which the driving behavior change causes the vehicle state to change; to determine a key node of a driving behavior change, the time point at which the vehicle state changes may therefore be determined first.
Illustratively, for each driving behavior change of the vehicle, such as:
1) The vehicle accelerates uniformly, which corresponds to continuously increasing the vehicle speed at a constant rate, with the total speed increase exceeding 5 km/h. This embodiment may record the moment at which the vehicle speed starts to change as the moment at which the vehicle state changes to acceleration.
2) The vehicle decelerates uniformly, which corresponds to continuously reducing the vehicle speed at a constant rate, with the total speed reduction exceeding 5 km/h. This embodiment may record the moment at which the vehicle speed starts to change as the moment at which the vehicle state changes to deceleration.
3) The vehicle changes lanes from the current lane, which corresponds to the driving track of the vehicle deviating from the center of the current lane and moving towards the target adjacent lane until the vehicle drives centered in the target lane. This embodiment may record the moment at which the vehicle starts to steer as the moment at which the vehicle state changes to lane changing.
4) The vehicle turns around from the current state, which corresponds to a continuous decrease in vehicle speed accompanied by continuous steering until the driving direction becomes the opposite direction. This embodiment may record the moment at which the vehicle speed starts to change as the moment at which the vehicle state changes to turning around.
5) The vehicle overtakes from the current state, which corresponds to a continuous increase in vehicle speed, with the driving track moving in front of another vehicle in the adjacent lane, after which the vehicle maintains speed or decelerates to a constant-speed driving state and keeps centered in the lane. This embodiment may record the moment at which the vehicle speed starts to change as the moment at which the vehicle state changes to overtaking.
6) The vehicle turns from the current state, which corresponds to the vehicle steering from a constant speed, with the vehicle speed continuously reduced as the steering continues until the driving direction becomes the target direction. This embodiment may record the moment at which the vehicle speed starts to change as the moment at which the vehicle state changes to turning.
7) The driving level of the vehicle is adjusted, which corresponds to the automatic driving level of the vehicle changing while the vehicle travels at a constant speed, i.e., the vehicle switches its automatic driving function, for example degrading from automatic driving to adaptive cruise control. This embodiment may record the moment of the function switch as the moment at which the vehicle state changes to driving level adjustment.
Whatever driving behavior change the vehicle undergoes, the corresponding moment at which the vehicle state changes can be determined, and the corresponding position node can then be determined as the key node of the driving behavior change.
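The detection of the change-start moment described in items 1) to 7) can be sketched as follows. This is a hedged illustration, not the patent's implementation: the profile format, field names, and thresholds (the 5 km/h speed criterion from items 1 and 2, plus an assumed heading threshold for steering-type changes) are illustrative.

```python
from dataclasses import dataclass

@dataclass
class ProfilePoint:
    t: float            # seconds from the current moment
    speed_kmh: float    # predicted vehicle speed
    heading_deg: float  # predicted heading

def find_key_node_time(profile, speed_delta_kmh=5.0, heading_delta_deg=2.0):
    """Return the first moment at which the predicted state deviates from the
    current (first) point enough to count as a driving behavior change, or
    None if the vehicle keeps cruising at a constant speed and heading."""
    base = profile[0]
    for p in profile[1:]:
        if abs(p.speed_kmh - base.speed_kmh) > speed_delta_kmh:
            return p.t  # acceleration / deceleration / overtaking starts here
        if abs(p.heading_deg - base.heading_deg) > heading_delta_deg:
            return p.t  # lane change / turn / U-turn starts here
    return None

# Cruise at 60 km/h for 1.5 s, then accelerate by 6 km/h every 0.5 s:
profile = [ProfilePoint(t * 0.5, 60.0, 0.0) for t in range(4)] + \
          [ProfilePoint(2.0 + t * 0.5, 60.0 + 6.0 * t, 0.0) for t in range(1, 4)]
print(find_key_node_time(profile))  # 2.5 — first point exceeding 60 + 5 km/h
```

The returned moment corresponds to the time point at which the vehicle state changes; converting it to a position node is covered by the steps of the second alternative embodiment.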
This first alternative embodiment provides a specific implementation of the key node calculation, which provides technical support for visually displaying the key nodes representing driving behavior changes on the central control display screen and ensures the feasibility of the method provided by this embodiment.
As a second alternative embodiment of the present embodiment, based on the foregoing embodiment, this second alternative embodiment further details displaying a node graph representing the key node. FIG. 4 shows a flowchart of the node graph display in the method provided by the embodiment of the present application. As shown in FIG. 4, the second alternative embodiment specifically includes the following steps:
S1021, determining the interval duration between the moment at which the expected driving behavior produces the driving behavior change and the current moment.
In this embodiment, this step determines the interval duration. The execution body may acquire the system time in real time to obtain the current time information; once the moment of the driving behavior change is known, the interval duration between the two can be determined.
S1022, determining the specific road position where the driving behavior change occurs according to the current running speed of the vehicle, the interval duration and the current position in the road running environment.
This embodiment may preferably assume that the vehicle travels at a uniform speed before the driving behavior changes. The execution body can obtain the current driving speed of the vehicle and, given the known interval duration, determine the distance the vehicle will travel; combining this distance with the current position of the vehicle in the road driving environment yields the specific road position at which the driving behavior change occurs.
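Under the uniform-speed assumption above, the specific road position is simply the current position plus speed times interval duration. A minimal sketch (function and parameter names are illustrative, not from the patent):

```python
def change_road_position(current_pos_m, speed_kmh, interval_s):
    """Distance covered during the interval, added to the vehicle's current
    position along the route, gives the specific road position of the
    driving behavior change (positions in metres along the route)."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return current_pos_m + speed_ms * interval_s

# Vehicle at 100 m along the route, cruising at 72 km/h, change in 3 s:
print(change_road_position(100.0, 72.0, 3.0))  # 160.0 m along the route
```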
S1023, determining the driving behavior change position of the key node in the 3D map navigation by combining the specific road position.
In this embodiment, the specific road position corresponds to the road position in the actual driving environment at which the driving behavior of the vehicle changes. Meanwhile, the execution body of this embodiment can also acquire, within the presented 3D map navigation, the presentation ratio between the 3D map and the actual road driving environment. Thus, once the specific road position and the presentation ratio are known, the driving behavior change position of the vehicle in the 3D map navigation can be determined.
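The conversion from the real-world position to the 3D map position via the presentation ratio can be sketched as below. All names and the linear one-dimensional mapping are assumptions for illustration; a real system would map 2D or 3D coordinates.

```python
def to_map_position(road_pos_m, map_origin_m, presentation_ratio):
    """Map a real-world road position (metres along the route) into 3D map
    units. presentation_ratio = map units per metre of real road;
    map_origin_m = the real-world position where the rendered map begins."""
    return (road_pos_m - map_origin_m) * presentation_ratio

# Change occurs 160 m down the road; the rendered map starts at 40 m and
# draws 0.05 map units per metre:
print(to_map_position(160.0, 40.0, 0.05))  # 6.0 map units from the origin
```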
S1024, marking the driving behavior change position by using a node graph, and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle.
In this step, after the driving behavior change position of the key node in the 3D map navigation is determined through the above steps, the change position is marked with a node graph representing the key node. Meanwhile, considering the different viewing angles the 3D navigation map may present on the central control display screen, the node graph can be displayed directly at the corresponding change position on the basic driving interface, or at the corresponding change position on the vehicle driving road surface under the driver's viewing angle.
The presentation position of the node graph matches the actual position at which the driving behavior change occurs and lies on the driving route of the vehicle. Meanwhile, considering the diversity of vehicle driving behaviors, different node display attributes can be set for different driving behavior changes so that the node graph characterizes them better; the node display attributes can define the shape, size, color and the like of the node graph, so that the persons in the vehicle can quickly identify, through the different node graphs, what driving behavior change the vehicle is about to make.
Further, in the second alternative embodiment, marking the driving behavior change position with a node graph and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle may be implemented through the following operations:
determining the driving behavior that is about to change, recording it as the driving behavior to be changed, and determining the node display attributes matched with that driving behavior; marking the driving behavior change position with a node graph having those node display attributes; and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle.
In this embodiment, when the driving behavior change moment is predicted, the driving behavior change that is about to occur can also be predicted; in this step, that driving behavior change is determined to be the driving behavior to be changed corresponding to the driving behavior change moment.
It should be noted that, in this embodiment, different node display attributes are set for different driving behaviors. The node display attributes can be understood as attributes defining the presentation form of the corresponding key node, and may include the shape, size, fill color and the like of the key node. In this step, the node display attributes preset for the different driving behaviors can be looked up to determine the node display attributes matched with the driving behavior to be changed; finally, the key node can be displayed at the identified point of the basic driving interface according to the shape, size, color and the like set in the node display attributes.
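Such a preset attribute lookup might look like the sketch below. The behavior keys, shapes, sizes and colors are placeholders, not values specified by the patent; the point is only the mapping from the driving behavior to be changed to a display style, with a neutral fallback.

```python
# Illustrative node display attributes per driving behavior (all placeholder values).
NODE_DISPLAY_ATTRIBUTES = {
    "accelerate":   {"shape": "rectangle", "size": (2.0, 1.0), "color": "green"},
    "decelerate":   {"shape": "rectangle", "size": (2.0, 1.0), "color": "orange"},
    "lane_change":  {"shape": "arrow",     "size": (3.0, 1.5), "color": "blue"},
    "u_turn":       {"shape": "circle",    "size": (1.5, 1.5), "color": "purple"},
    "overtake":     {"shape": "arrow",     "size": (3.0, 1.5), "color": "red"},
    "turn":         {"shape": "arrow",     "size": (2.5, 1.2), "color": "yellow"},
    "level_adjust": {"shape": "diamond",   "size": (1.5, 1.5), "color": "gray"},
}

def display_attributes(behavior_to_change):
    """Look up the preset style for the driving behavior to be changed,
    falling back to a neutral default for behaviors without a preset."""
    return NODE_DISPLAY_ATTRIBUTES.get(
        behavior_to_change,
        {"shape": "circle", "size": (1.0, 1.0), "color": "white"})

print(display_attributes("lane_change")["color"])  # blue
```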
This second alternative embodiment of the present application provides technical support for visually displaying, on the central control display screen, the key node at the moment the driving behavior changes, further ensuring the feasibility of the method provided by this embodiment.
In addition, the first and second alternative embodiments of the present application provide the visual presentation of the key nodes. On this basis, this embodiment may preferably add a new functional feature, which can be described as giving a voice prompt for the driving behavior change event after the key node characterizing that event is determined.
Illustratively, the steps of operation of the added functionality may be described as:
a1, giving a voice early-warning prompt that the driving behavior of the vehicle is about to change.
This step may be performed directly after the driving behavior change event of the vehicle is predicted. The voice early-warning prompt may include the start moment of the driving behavior change, the actual change position, the change about to occur, and the like.
Through the voice early-warning prompt to the persons in the vehicle, the driver can be informed in advance of what change will happen next, so that the user has a better psychological expectation of the automatic driving state of the vehicle.
b1, when the vehicle starts to execute the driving behavior change, giving a voice broadcast prompt that the vehicle is executing the driving behavior change.
This step gives a voice reminder when the driving behavior change is about to be executed during driving. The reminder moment corresponds to the moment at which execution of the driving behavior change starts, and mainly informs the persons in the vehicle of what driving behavior the vehicle is about to present under the control of the automatic driving system.
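The two prompts a1 and b1 can be sketched as simple message builders. The wording, function names and units are illustrative assumptions; a real system would feed the text to a speech synthesizer.

```python
def early_warning_text(behavior, start_in_s, position_m):
    """a1: warn before the change, giving the start moment, the actual
    change position, and the change about to occur."""
    return (f"In {start_in_s:.0f} seconds, at {position_m:.0f} metres ahead, "
            f"the vehicle will {behavior}.")

def broadcast_text(behavior):
    """b1: announce at the moment execution of the change starts."""
    return f"The vehicle is now starting to {behavior}."

print(early_warning_text("change to the left lane", 3, 60))
print(broadcast_text("change to the left lane"))
```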
Through the optional features of this embodiment, after the driving behavior change event is determined, a voice early warning can be given for it, and a real-time voice reminder can be given when the driving behavior change moment arrives. This ensures that the persons in the vehicle effectively grasp the driving state of the vehicle, thereby avoiding potential driving safety hazards and improving the trust of the persons in the vehicle in the automatic driving capability of the vehicle.
As a third alternative embodiment of the present embodiment, on the basis of the above embodiments, this third alternative embodiment further adds a functional feature that can be described as enabling the driver to flexibly manipulate the driving behavior of the vehicle while it is in the automatic driving state.
It should be noted that the new function of the third alternative embodiment is not tied to the execution of the visual presentation of the key nodes; it is an independent functional unit and may be executed at any time while the vehicle is in an automatic driving state.
Specifically, FIG. 5 shows a flowchart of responding to driver operations in the automatic driving state in the method provided by this embodiment. The flow shown in FIG. 5 may be performed on the basis of the flow shown in FIG. 1, or independently of it. As shown in FIG. 5, the steps added in the third alternative embodiment may be described as follows:
In practice, while the vehicle is driving automatically, the driver is often required to assist or directly take over the vehicle to keep it driving safely. When the driver needs to participate in the automatic driving process, the automatic driving mode usually has to be switched to the driver driving mode, so that the vehicle can receive the driver's control signals normally and the driver can take over the vehicle.
However, switching from the automatic driving mode to the driver driving mode so that the driver takes over the vehicle takes time, so a timely takeover may not be achieved, and there is a risk that automatic driving control fails during that period. In addition, frequent driving mode changes also affect the driving performance of the vehicle; and in the driver driving mode the driver must exercise substantive control over the vehicle, the whole process remains human-led, and the automatic driving value of the vehicle cannot be well reflected.
Based on this, flexible manipulation of the driving behavior of the vehicle by the driver can be achieved through the method steps provided by this third alternative embodiment.
S103, receiving an automatic driving intervention triggering operation.
In this embodiment, the driving intervention triggering operation may specifically be an operation triggered in some form when a person in the vehicle (preferably the driver) needs to control the driving behavior of the vehicle. The driver may generate the driving intervention triggering operation by clicking, or clicking continuously, a certain button, a certain area, or an option in a certain taskbar on the central control display screen; the vehicle (specifically, the automatic driving control system in the vehicle), as the execution body of this embodiment, receives that operation.
S104, displaying a driving intervention manipulation area through the central control display screen, wherein the driving intervention manipulation area comprises at least one driving manipulation component.
In this embodiment, the execution body may respond to the received driving intervention triggering operation by displaying, through this step, a driving intervention manipulation area on the central control display screen. The driving intervention manipulation area may be understood as a selection area of manipulation items for the driver to control driving behavior, and it may be pre-integrated in the vehicle as a plug-in unit.
The driving intervention manipulation area comprises at least one driving manipulation component for the driver to select. Each driving manipulation component may be arranged transversely or longitudinally in the driving intervention manipulation area in the form of a draggable button, and each may be considered to correspond to one driving behavior of the vehicle to be manipulated. For example, the driving behaviors to be manipulated that the vehicle can present may include at least one of: vehicle acceleration, vehicle deceleration, vehicle lane change, vehicle overtaking, vehicle turning around, vehicle steering, parking, and taking over the vehicle.
Here, taking over the vehicle means that the control object of the vehicle changes from the automatic driving system to the driver, or the vehicle is degraded to an adaptive cruise control level.
S105, receiving a manipulation and issuing an operation instruction, wherein the manipulation is the driver selecting a target driving manipulation component from the driving intervention manipulation area and dragging it to a target point on the basic driving interface, and the operation instruction is the instruction corresponding to the target driving manipulation component for that target point.
In this embodiment, the manipulation received in this step may be the driver's triggering operation on a selected driving manipulation component in the driving intervention manipulation area. The driver may select a matching driving manipulation component according to his or her own requirements on the driving behavior of the vehicle, and that component may be recorded as the target driving manipulation component.
Specifically, the driver's triggering operation on the target driving manipulation component can be expressed as dragging the target driving manipulation component to a certain position point on the basic driving interface. The basic driving interface includes a road environment picture in which scenes such as landmark buildings, roads, forks and viaducts can be displayed. When the driver expects the vehicle to respond with a manipulation at a certain intersection, building or similar scene, the target driving manipulation component can be dragged to the position point of that scene. This embodiment may therefore record the position point to which the driver drags the component as the target point of the target driving manipulation component on the basic driving interface.
It should be noted that, considering that the driving manipulation component is presented in the driving intervention manipulation area in the form of a button, to facilitate dragging, this embodiment controls the target driving manipulation component, once selected by the driver, to be presented in the form of an anchor point; dragging then amounts to dragging the anchor point corresponding to the target driving manipulation component to the target point.
After receiving the manipulation triggered by the driver, this step can generate an operation instruction and send it to the relevant execution unit of the vehicle for execution. The operation instruction may be an instruction informing the control unit of the vehicle to execute, at the target point selected by the driver, the driving behavior corresponding to the target driving manipulation component.
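A hypothetical structure for packaging the drag gesture into such an operation instruction is sketched below; the field names, component identifiers and pixel-coordinate convention are all assumptions for illustration, not the patent's actual message format.

```python
from dataclasses import dataclass

@dataclass
class OperationInstruction:
    component: str          # e.g. "accelerate_5kmh", "park", "overtake"
    target_point_px: tuple  # screen position where the anchor point was dropped

def issue_instruction(component, drop_x, drop_y):
    """Package the driver's drag gesture (selected component + drop point)
    into an instruction for the vehicle's control unit."""
    return OperationInstruction(component, (drop_x, drop_y))

instr = issue_instruction("accelerate_5kmh", 420, 310)
print(instr.component, instr.target_point_px)  # accelerate_5kmh (420, 310)
```

The control unit would later resolve the screen coordinates to an actual road position, as described in the fourth alternative embodiment.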
S106, responding to the operation instruction, and controlling the driving behavior of the vehicle.
In this embodiment, after the execution body receives the operation instruction through the above steps, the instruction can be parsed in this step to obtain the driver's manipulation requirement on the driving behavior of the vehicle, and by responding to the operation instruction the driving behavior of the vehicle can be controlled to meet the manipulation requirement expected by the driver.
Specifically, this step can parse the driver's manipulation requirement contained in the operation instruction, for example vehicle acceleration, vehicle deceleration, vehicle steering, or vehicle lane change. This step can then control the vehicle to fulfil the expected requirement at the target point where the target driving manipulation component was placed, for example accelerating at the target point, decelerating at the target point, steering at the target point (the target point can represent the direction into which the vehicle is to turn), or changing lanes at the target point.
It can be seen that the target point can be regarded as the actual execution position of the driver's manipulation requirement; this step need only parse from the operation instruction the information characterized by the target point, such as the speed after acceleration or deceleration, the direction to be steered to, or the lane to be changed to.
It should be noted that, throughout the execution of the method steps of this embodiment, the vehicle may be considered to remain in the automatic driving state; that is, the vehicle's execution of the steps is carried out by its automatic driving system. Compared with the existing approach of switching to the driver driving mode and then responding to the driver's operations, this embodiment does not need to switch the driving mode.
To facilitate a better understanding of the driver's flexible operation of the driving behavior of the vehicle in the automatic driving state, an example is given. FIG. 6 illustrates a scenario of responding to a driving behavior manipulated by the driver in the method for managing the driving behavior of a vehicle according to the embodiment of the present application. As shown in FIG. 6, the scenario presents the driver selecting a target driving manipulation component in the driving intervention manipulation area 11 and dragging it to the target point 121 within the basic driving interface 12.
As can be seen from FIG. 6, the driving intervention manipulation area 11 can be presented above the basic driving interface 12, and in particular can be displayed at a position that does not interfere with the road environment picture. The driving manipulation components included in the driving intervention manipulation area 11 may be vehicle acceleration (+5 km/h), vehicle deceleration (-5 km/h), parking, taking over the vehicle, overtaking, and the like. In the picture shown in FIG. 6, the driver selects vehicle acceleration (+5 km/h) as the target driving manipulation component and drags it to the target point 121, thereby triggering the manipulation and causing the operation instruction to be issued.
In response to the above operation instruction for vehicle acceleration, the execution body of this method parses the driver's intention to accelerate the vehicle and controls the vehicle to start accelerating when it travels to the target point 121.
In addition, the execution body can control the vehicle to come to a stop at a certain position point in front of the road when the driver places the parking manipulation component at that point, and can control the vehicle to change lanes automatically at a certain point of the adjacent lane when the driver places the overtaking manipulation component at that point.
The method steps provided in this third alternative embodiment mean that, when the driver has a manipulation requirement, the driving behavior of the vehicle can be flexibly manipulated through the presented driving manipulation components while the vehicle is in the automatic driving state. Compared with the prior art, in which the driving manipulation is responded to only after switching to the driver driving mode, the approach of this third alternative embodiment needs no driving mode switch, so potential driving danger during mode switching is avoided, as is the influence of frequent mode switching on the driving performance of the vehicle. In addition, the driver only needs to select and drag a driving manipulation component and the vehicle responds automatically; no substantive operation of the vehicle is required, the leading role of automatic driving is better reflected, and the automatic driving value of the vehicle is effectively improved.
Meanwhile, it should be noted that, in executing the method provided by this third alternative embodiment, when the execution body responds to the driving behavior corresponding to a driving manipulation component triggered by the driver, the driving behavior of the vehicle may also be considered to change; the method provided by the present application may therefore display the relevant key node using the node graph corresponding to the changed driving behavior.
Further, as a fourth alternative embodiment of the present embodiment, based on the third alternative embodiment, this fourth alternative embodiment further details responding to the operation instruction and controlling the driving behavior of the vehicle. FIG. 7 shows a flowchart of responding to a driving behavior triggered by the driver in the method provided by the embodiment of the present application; as shown in FIG. 7, it specifically includes the following steps:
S1061, extracting the target driving manipulation component and the target point from the operation instruction.
In this embodiment, after receiving the operation instruction, the execution body of the method provided in this embodiment can parse it through this step and extract the target driving manipulation component corresponding to the instruction, the position information of the associated target point in the road environment picture, and so on.
S1062, obtaining the actual environment position corresponding to the target point in the road driving environment and the target driving behavior corresponding to the target driving manipulation component.
Following the above description, this step may determine the actual environment position of the target point in the road driving environment based on the position information of the target point, combined with information such as the display ratio between the road environment picture and the road driving environment in which the vehicle is actually located. Meanwhile, this step can also parse the driver's manipulation requirement corresponding to the target driving manipulation component, i.e., obtain the driving behavior that the driver expects the vehicle to present.
S1063, controlling the vehicle to execute the target driving behavior at the actual environment position.
After the above steps determine the place where the driver expects the driving behavior to change, that is, the actual environment position, as well as the target driving behavior the driver expects the vehicle to exhibit, this step may control the vehicle so that it exhibits the target driving behavior of the target driving control component when it travels to the actual environment position.
It should be noted that the specific control exerted on the vehicle by the execution subject is mainly determined by the target driving behavior corresponding to the target driving control component. For example, assuming the target driving behavior is parking, the execution subject needs to apply deceleration control to the vehicle from the current moment so that the vehicle stops exactly at the actual environment position. As another example, when the target driving behavior is overtaking, the execution subject needs to determine when to begin acceleration control so that the vehicle reaches the overtaking condition exactly at the actual environment position and can then start the overtaking maneuver. As a further example, when the target driving behavior is a lane change, the execution subject likewise controls the speed or direction of the vehicle in advance so that the vehicle can change lanes at the actual environment position.
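The parking case above can be sketched numerically. Assuming simple constant-deceleration kinematics (d = v²/2a) and an illustrative comfortable deceleration of 2 m/s², the execution subject could decide when to begin braking so that the vehicle stops exactly at the actual environment position; the function names and the chosen deceleration value are assumptions:

```python
def braking_start_distance(speed_mps, decel_mps2=2.0):
    """Distance needed to stop from the current speed at a constant
    deceleration: d = v^2 / (2a)."""
    return speed_mps ** 2 / (2.0 * decel_mps2)

def should_start_braking(dist_to_target_m, speed_mps, decel_mps2=2.0):
    """Begin deceleration once the remaining distance to the requested
    stop point no longer exceeds the required braking distance."""
    return dist_to_target_m <= braking_start_distance(speed_mps, decel_mps2)
```

The overtaking and lane-change cases would invert the same reasoning: work backwards from the actual environment position to find the moment at which acceleration or steering control must begin.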
This fourth alternative embodiment provides a specific implementation of responding to the control issuing operation and controlling the vehicle to change its driving behavior, thereby defining the control timing, the specific control position and the like of the driving behavior, and providing technical support for the driver to flexibly operate the vehicle in the automatic driving state.
As a fifth alternative embodiment of the present embodiment, on the basis of any of the above embodiments, this fifth alternative embodiment further adds an optimized functional feature, which may be described as displaying, through an identification graphic, the driving behavior of the vehicle within a certain period extending from the current moment into the future.
It should be noted that the new function of this fifth alternative embodiment is not limited to the execution stage of the target driving behavior corresponding to the driving control component selected by the driver; it is equivalent to an independent functional unit, and may be executed in the automatic driving state of the vehicle, or executed independently in the course of responding to the target driving behavior triggered by the driver.
Specifically, the operation step added by this fifth alternative embodiment may be described as:
displaying the driving behavior of the vehicle within a second predetermined period using an identification graphic on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle, wherein the second predetermined period is a period of a second set duration extending from the current moment into the future.
In this embodiment, the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle may be regarded as the relative limitation of the display viewing angle of the content to be displayed in this alternative embodiment, according to the different viewing angles used when information is displayed on the central control display screen. The identification graphic may be understood as a graphical representation of the driving intention behind the driving behavior of the vehicle. The second predetermined period may preferably range from 5 seconds to 10 seconds. The execution subject may also predict the driving behavior of the vehicle within the second predetermined period, thereby determining the driving behavior of the vehicle from the current moment to the end of the second predetermined period.
It can be seen that the driving behavior exhibited by the vehicle during this second predetermined period is likewise determined on the basis of the driving decisions given by the automatic driving system. After the driving behaviors included in the second predetermined period have been determined, the identification graphic may be displayed in the display form matched to each driving behavior.
On the basis of the visual and/or audible indication of the driving behavior change moment and the flexible control of the driving behavior of the vehicle by the driver, this fifth alternative embodiment further enriches the visual presentation of the driving behavior of the vehicle and realizes its graphical display on the central control display screen, so that the driver can grasp the driving state of the vehicle more intuitively, the driving intention of the vehicle for a certain future period can be displayed completely and effectively when the driving behavior changes, and the user's experience of the vehicle is comprehensively improved.
As a sixth alternative embodiment of the present embodiment, on the basis of the above fifth alternative embodiment, this sixth alternative embodiment specifies how the driving behavior of the vehicle within the second predetermined period is displayed using the identification graphic, as follows:
a2, obtaining the driving behaviors included in the second predetermined period of the vehicle, each recorded as a driving behavior to be displayed.
For example, the driving decisions of the automatic driving system on the vehicle may determine which driving behaviors (there may be one, two, or even more) are included within the second predetermined period, and these may be recorded as driving behaviors to be displayed; in addition, it may also be determined whether a target driving behavior triggered by the driver is received within the second predetermined period, and this may likewise be recorded as a driving behavior to be displayed.
b2, controlling, in the order of execution of the driving behaviors to be displayed, the display of the identification graphic for each driving behavior to be displayed in its corresponding identification graphic display form.
In this embodiment, the driving behaviors to be displayed may be ordered by their execution times and the matched display form of the identification graphic determined for each of them, after which the identification graphics of the driving behaviors to be displayed may be presented on the basic driving interface.
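A minimal sketch of step b2 might look as follows, assuming a hypothetical lookup table that maps each driving behavior to a unit-pattern shape and color (the table entries loosely follow the display forms described later in this embodiment, and all names are illustrative):

```python
# Hypothetical display-form table: behavior name -> (unit-pattern shape, color).
DISPLAY_FORMS = {
    "constant_speed": ("triangle", "blue"),
    "lane_change":    ("triangle", "blue"),
    "overtake":       ("triangle", "orange-red-yellow gradient"),
    "stop":           ("line_segment", "dark blue"),
}

def plan_marker_display(behaviors):
    """Order the driving behaviors to be displayed by execution time and
    attach the matched identification-graphic display form to each."""
    ordered = sorted(behaviors, key=lambda b: b["start_time"])
    return [(b["name"], DISPLAY_FORMS.get(b["name"], ("triangle", "blue")))
            for b in ordered]
```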
In this embodiment, the above identification graphic is mainly displayed attached to the road surface in the basic driving interface, or to the space above the driving road surface outside the vehicle. The execution subject may select different attachment modes by presenting the road environment picture from different viewing angles; for example, if a viewing angle above the road surface is selected, the identification graphic may be displayed attached to the space above the road surface.
In this embodiment, the presentation position of the identification graphic may be considered matched to the actual position at which the driving behavior of the vehicle occurs and presented on the driving route of the vehicle; by matching the position in the environment interface, the user can intuitively perceive the position and time at which the driving behavior occurs. Meanwhile, the display width of the identification graphic is preferably smaller than the width of one lane in the basic driving interface, and the identification graphic is located only on the route ahead of the vehicle in its driving direction, for example toward the vehicle head when driving forward and toward the vehicle tail when reversing.
In this embodiment, the presentation length of the identification graphic may preferably be the distance driven forward from the current position of the vehicle during the second predetermined period; for example, the distance covered in 5-10 seconds ahead of the vehicle position, which varies with the driving speed, with a practical minimum of not less than 20 meters.
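The presentation-length rule above (distance covered in 5-10 seconds, never less than 20 meters) can be expressed as a one-line calculation; the 8-second horizon chosen here is an assumption within the stated range:

```python
def marker_length_m(speed_mps, horizon_s=8.0, min_length_m=20.0):
    """Length of the identification graphic ahead of the vehicle: the
    distance covered in the second predetermined period (8 s assumed,
    within the 5-10 s range), with a practical floor of 20 metres."""
    return max(speed_mps * horizon_s, min_length_m)
```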
In addition, when it is determined that at least two driving behaviors exist within the second predetermined period, this is equivalent to a change of driving behavior occurring; therefore, the displayed identification graphic may further preferably include a key node representing the moment at which the driving behavior of the vehicle changes, thereby characterizing the change of driving behavior, and the key node is displayed according to the node display attribute of the driving behavior to be changed. For example, an overtaking node may be represented by a square on the basic path, meaning that the vehicle starts overtaking when it travels to the position indicated by the square.
Meanwhile, it should be further explained regarding the displayed identification graphics that the display form of the identification graphic of each driving behavior may preferably be a basic path, which is represented by a continuous curve or straight line of set width, or by a group of unit patterns forming a curve or straight line. Illustratively, the basic path may represent distance by a change in size: for a continuous curve or line, the width near the vehicle is greater than the width far away; for a path composed of unit patterns, a near unit pattern is larger than a far one. The overall size change conforms to the perspective relationship of the overall road environment interface.
In addition, when continuous curves or straight lines of set width are used for the representation, different basic paths correspond to different display sizes and display colors; when a group of unit patterns forming a curve or straight line is used, different basic paths correspond to different colors, shapes and sizes of the unit patterns.
Taking the color of the identification graphic as an example: first, the color of the identification graphic must be clearly distinguishable from the road environment interface so as to ensure recognizability. Second, the basic path may use different colors to characterize different driving behaviors. For example, a set of different colors may indicate that the vehicle is at different automatic driving levels, e.g. gray indicating that automatic driving is not activated, blue indicating the adaptive cruise level, and green indicating full automatic driving. A change of vehicle speed may be represented by a set of color changes, such as green, red and a color gradually transitioning between them; a path pattern gradually changing from green to red then indicates a vehicle speed gradually accelerating from the speed represented by green to the speed represented by red.
It should be noted that the spacing between adjacent unit patterns may indicate the vehicle speed when the vehicle travels to the corresponding position; for example, a portion with large spacing indicates fast travel and a portion with small spacing indicates slow travel.
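One possible encoding of this spacing rule, assuming linear scaling against a reference speed (both constants are illustrative choices, not values specified by the patent):

```python
def unit_pattern_spacing(speed_mps, base_spacing_m=2.0, ref_speed_mps=10.0):
    """Spacing between adjacent unit patterns at a point on the path,
    proportional to the speed the vehicle will have there: wider gaps
    read as faster travel, narrower gaps as slower travel."""
    return base_spacing_m * (speed_mps / ref_speed_mps)
```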
In the present embodiment, the identification graphic may be considered to have a certain directivity, which may preferably be the same as the driving direction of the vehicle. The shape of the unit pattern in this embodiment may be at least one of a circle, a dot, a triangle, an arrow, a square and a three-dimensional figure. Illustratively, this embodiment may indicate different automatic driving levels through different unit patterns, for example a path composed of dots indicating that automatic driving is not activated, of triangles indicating the adaptive cruise level, and of squares indicating full automatic driving. The unit patterns form the path through their arrangement, and the arrangement interval need not be unique.
As a seventh alternative embodiment of the present embodiment, in addition to the fifth alternative embodiment described above, the seventh alternative embodiment further preferably includes:
if the driving behavior of the vehicle changes within the second predetermined period, controlling the displayed identification graphic to present a change reminder according to the matched dynamic effect.
This seventh alternative embodiment may be regarded as a further optimization of the presented identification graphic: in addition to the static display of the different driving behaviors of the vehicle, it enables dynamic presentation by controlling the identification graphic. Specifically, the identification graphic may be controlled to present a dynamic effect when the driving intention or driving behavior of the vehicle changes within a certain distance. Illustratively, the presented dynamic effects may include: shape change, flipping, rotation, enlargement, reduction, cutting, combination, appearance, disappearance, color gradient and the like of the basic graphic elements.
This seventh alternative embodiment provides technical support for the complete visual display of the driving behavior of the vehicle on the central control display screen and ensures the feasibility of the method provided by this embodiment.
Also on the basis of the fifth alternative embodiment, this seventh alternative embodiment may be further optimized to include, after the driving behavior of the vehicle in the second predetermined period has been displayed using the identification graphic, vanishing processing of the identification graphic for the road section the vehicle has already traveled.
This newly added functional feature of the seventh alternative embodiment may also be regarded as a further optimization of the presented identification graphic. It mainly realizes the vanishing processing of identification graphics that have passed beyond the display range: after the identification graphic has been presented, the identification graphic of the road section the vehicle has already traveled may gradually disappear.
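A sketch of this vanishing processing, assuming markers are parameterized by arc length `s` along the route and fade linearly to zero opacity over an assumed 10-meter window behind the vehicle (all names and the window size are illustrative):

```python
def fade_traveled_markers(markers, ego_s, fade_window_m=10.0):
    """Gradually vanish unit patterns the vehicle has already passed:
    full opacity ahead of the vehicle, linear fade to zero over a
    fixed window behind it."""
    out = []
    for m in markers:
        behind = ego_s - m["s"]  # metres the vehicle has travelled past this marker
        if behind <= 0:
            alpha = 1.0          # marker still ahead: fully visible
        else:
            alpha = max(0.0, 1.0 - behind / fade_window_m)
        out.append({**m, "alpha": alpha})
    return out
```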
This seventh alternative embodiment may be considered another newly added function of the method provided by the present embodiment; it mainly realizes the functional optimization of the graphical display of driving behavior and further improves its visual effect by presenting a dynamic effect at the moment the driving behavior changes.
It should be noted, regarding the display of the identification graphics of the driving behavior of the vehicle on the basic driving interface, that it can be seen from the above description of this embodiment that the display forms of the identification graphics of different driving behaviors differ, and the display form matched to each driving behavior can be realized through flexible pre-configuration.
One such configuration is used below to illustrate the display forms of the identification graphics corresponding to several common driving behaviors of the vehicle.
Specifically, fig. 8 shows the display form of the identification graphic matched with constant-speed driving in the method for managing the driving behavior of a vehicle provided by this embodiment. As shown in fig. 8, the driving behavior corresponding to this display form is the vehicle driving at constant speed. It may be represented by a basic path with uniform spacing, a single color and a certain directivity, indicating that the vehicle will travel at constant speed over this distance. The basic path used is represented by a group of unit patterns forming a straight line; the color of the unit patterns may be configured as blue and their shape as triangles.
Fig. 9 shows the display form of the identification graphic matched with a lane change of the vehicle in the method for managing driving behavior of a vehicle provided by this embodiment. As shown in fig. 9, the driving behavior corresponding to this display form is a lane change, which further includes the cases of the vehicle entering a ramp or merging into a main road. A basic path with uniform spacing, a single color, a certain directivity and spanning lanes may indicate that a constant-speed lane change occurs along the illustrated path. The basic path used is represented by a group of unit patterns forming a curve; the color of the unit patterns may be configured as blue and their shape as triangles. The identification graphic includes a key node representing the moment at which the lane change occurs.
Fig. 10 shows the display form of the identification graphic matched with the vehicle overtaking in the method for managing driving behavior of a vehicle provided by this embodiment. As shown in fig. 10, the driving behavior corresponding to this display form is the vehicle overtaking. A basic path with a certain directivity and spanning lanes, whose spacing varies, may indicate that the vehicle will overtake at the position shown, with acceleration and deceleration changes where the spacing and color change. The basic path used is represented by a group of unit patterns forming a curve; the color of the unit patterns may be configured as blue for the constant-speed portion and as an orange-red-yellow gradient for the portion with acceleration and deceleration changes. The shape may likewise be configured as triangles, and the identification graphic includes a key node representing the moment at which the overtaking occurs.
Fig. 11 shows the display form of the identification graphic matched with the vehicle steering in the method for managing the driving behavior of a vehicle provided by this embodiment. As shown in fig. 11, the driving behavior corresponding to this display form is the vehicle turning. A basic path with a certain directivity, whose spacing and color vary, may indicate that the vehicle is steering along the illustrated path, with acceleration and deceleration changes where the spacing and color change. The basic path used is represented by a group of unit patterns forming a curve; the color of the unit patterns may be configured as dark blue for the constant-speed portion and as a light blue-green gradient for the portion with acceleration and deceleration changes. The shape may likewise be configured as triangles, and the identification graphic includes a key node representing the moment at which the steering occurs.
Fig. 12 shows the display form of the identification graphic matched with the vehicle turning around in the method for managing driving behavior of a vehicle provided by this embodiment. As shown in fig. 12, the driving behavior corresponding to this display form is the vehicle making a U-turn. A curved path graphic with a certain directivity and crossing the vehicle track, whose spacing varies, may indicate that the vehicle will turn around along this path, with acceleration and deceleration changes where the spacing and color change. The basic path used is represented by a group of unit patterns forming a curve; the color of the unit patterns may be configured as dark blue for the constant-speed portion and as a light blue-green gradient for the portion with acceleration and deceleration changes. The shape may likewise be configured as triangles, and the identification graphic includes a key node representing the moment at which the U-turn occurs.
Fig. 13 shows the display form of the identification graphic matched with the vehicle stopping in the method for managing driving behavior of a vehicle provided by this embodiment. As shown in fig. 13, the driving behavior corresponding to this display form is the vehicle stopping. The stop position of the vehicle may be represented by a graphic on the path that differs from the unit patterns, such as a transverse line segment. The corresponding deceleration behavior before the stop is indicated by the color and the unit spacing, and no path graphic is present after the stop position. The shape of the unit patterns representing the driving portion may be configured as triangles, the shape representing the stop as a transverse line segment, the color of the unit patterns of the driving portion as dark blue, and the portion with acceleration and deceleration changes as a light blue-green gradient. Meanwhile, the identification graphic includes a key node representing the moment at which the stop occurs.
Fig. 14 shows the display form of the identification graphic matched with the end of automatic driving in the method for managing driving behavior of a vehicle provided by this embodiment. As shown in fig. 14, the driving behavior corresponding to this display form is the vehicle ending automatic driving. The position at which automatic driving of the vehicle ends may be represented by a graphic on the path that differs from the unit patterns, such as a broken-chain symbol. The corresponding deceleration behavior beforehand is represented by the color and the unit spacing, and after the end position the path graphic changes to the pattern for automatic driving not being activated. The shape of the unit patterns representing the driving portion may be configured as triangles, the shape representing the position where automatic driving ends as a broken-chain symbol, the color of the unit patterns of the driving portion as dark blue, and the color of the unit patterns after automatic driving ends as light blue. Meanwhile, the identification graphic includes a key node representing the moment at which automatic driving ends.
It can thus be seen that the method provided by this embodiment is not limited to use in the automatic driving state of the vehicle, and may also be applied in the driver-operated state or the adaptive cruise state.
Fig. 15 shows the display form of the identification graphic matched with a vehicle early warning in the method for managing driving behavior of a vehicle provided by this embodiment. As shown in fig. 15, the driving behavior corresponding to this display form is a vehicle early warning. A graphic on the path that differs from the unit patterns may indicate that a road safety risk exists at that location, such as a narrowing of the lane. The shape of the unit patterns representing the driving portion may be configured as triangles, the shape of the portion representing the risk warning as a lane-narrowing symbol, the color of the unit patterns of the driving portion as dark blue, and the color of the unit patterns after the warning starts as dark blue with a relatively reduced display size. Meanwhile, the identification graphic includes a key node representing the moment at which the vehicle warning occurs.
Fig. 16 is a schematic structural diagram of a vehicle driving behavior management device provided in an embodiment of the present application. The device is suitable for controlling the driving behavior of a vehicle in the automatic driving state, may be implemented in software and/or hardware, and may be integrated in the vehicle. The vehicle further includes a central control display screen on which a basic driving interface including 3D map navigation is displayed. Specifically, as shown in fig. 16, the device includes a key node calculation module 21 and a key node display module 22.
The key node calculation module 21 is configured to calculate, in an automatic driving state, a key node of a driving behavior change in an expected driving behavior according to the expected driving behavior of the vehicle;
a key node display module 22 for displaying a node pattern representing the key node, the node pattern being placed at a road surface position of a driving behavior change within the 3D map navigation.
The vehicle driving behavior management device provided by this embodiment means that, when the vehicle travels in the automatic driving state, not only can the navigation route of the vehicle be displayed, but the key nodes corresponding to the positions at which the expected driving behavior of the vehicle changes are also visually presented, i.e. the key nodes can be represented by node graphs displayed on the basic driving interface or on the navigated road surface of the vehicle. Through this device, the persons in the vehicle can grasp the driving dynamics in advance from the displayed key nodes, can intuitively and correctly judge whether the vehicle remains suited to its automatic driving state, and can take over the vehicle in time when required, thereby avoiding potential driving safety hazards and improving the trust of the persons in the vehicle in its automatic driving capability.
Further, the key node calculation module 21 may be specifically configured to obtain the expected driving behavior of the vehicle within a first predetermined period when the vehicle is driving automatically, where the first predetermined period is a period of a first set duration extending from the current moment into the future; and to determine the key node of the expected driving behavior, where the key node is the position node at which the driving behavior of the vehicle changes.
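A minimal sketch of the key node calculation, assuming the expected driving behavior of the first predetermined period is available as a time-ordered list of (time, behavior) pairs (a hypothetical representation; the points where the behavior changes are the key nodes):

```python
def find_key_nodes(expected_behaviors):
    """Scan a time-ordered list of (time_s, behavior) pairs and return
    the moments where the driving behavior changes, i.e. the key nodes."""
    nodes = []
    for prev, cur in zip(expected_behaviors, expected_behaviors[1:]):
        if cur[1] != prev[1]:
            nodes.append({"time_s": cur[0], "behavior": cur[1]})
    return nodes
```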
Further, the key node display module 22 may specifically include:
a duration determining unit, configured to determine an interval duration between the time when the expected driving behavior changes and the current time;
a position determining unit configured to determine a specific road position at which the driving behavior change occurs, according to a current running speed of the vehicle, the interval duration, and a current position in a road running environment;
a change position determining unit, configured to determine a driving behavior change position of the key node in the 3D map navigation in combination with the specific road position;
the node display unit is used for marking the driving behavior change position by using a node graph and displaying the node graph on a driving road surface of the basic driving interface and/or a driving road surface outside a vehicle;
And the presentation position of the node graph is matched with the actual occurrence position of the driving behavior change and is positioned on the driving route of the vehicle.
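The position determining unit's calculation can be summarized as: change position = current position + current driving speed × interval duration. A sketch, using arc length along the route as a hypothetical one-dimensional road frame:

```python
def change_position_s(current_s_m, speed_mps, interval_s):
    """Road position (arc length along the driving route) at which the
    driving behavior change occurs: the current position plus the
    distance covered during the interval at the current speed."""
    return current_s_m + speed_mps * interval_s
```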
Further, the node display unit may be specifically configured to determine a driving behavior in which the driving behavior change occurs, record the driving behavior as a driving behavior to be changed, and determine a node display attribute that matches the driving behavior to be changed; marking the driving behavior change position by using a node graph with the node display attribute; and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle.
Further, the apparatus may further include: the system comprises a first receiving module, an information display module, a second receiving module and a driving control module.
The first receiving module is used for receiving automatic driving intervention triggering operation;
the information display module is used for displaying a driving intervention control area through the central control display screen, and the driving intervention control area comprises at least one driving control assembly;
the second receiving module is used for receiving a control operation and the issued operation instruction, where the control operation is the driver selecting a target driving control component from the driving intervention control area and dragging it to a target point on the basic driving interface, and the operation instruction is the instruction corresponding to the target driving control component for that target point;
And the driving control module is used for responding to the operation instruction and controlling the driving behavior of the vehicle.
Further, the driving control module may specifically be configured to:
extracting a target driving control component and a target point in the operation instruction;
obtaining an actual environment position corresponding to the target point in a road running environment and a target driving behavior corresponding to the target driving control component;
controlling the vehicle to perform the target driving behavior at the actual environmental position.
Further, each driving control component corresponds to one driving behavior to be controlled of the vehicle;
the driving behavior to be controlled includes at least one of the following: vehicle acceleration, vehicle deceleration, vehicle lane change, vehicle overtaking, vehicle U-turn, vehicle steering, parking, and taking over control of the vehicle.
Further, the apparatus may further include:
the identification graph display module is used for displaying driving behaviors of the vehicle in a second preset time period by adopting an identification graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle, wherein the second preset time period is a time period of a second set time length from the current moment to the back.
Further, the identification graphic presentation module may be specifically configured to:
obtaining driving behaviors included by the vehicle in the second preset time period, and respectively marking the driving behaviors as driving behaviors to be displayed;
and controlling each driving behavior to be displayed to display the identification graph in a corresponding identification graph display form according to the execution time sequence of each driving behavior to be displayed.
Further, the apparatus may further include:
and the dynamic effect display module is used for controlling the displayed identification graph to carry out change reminding according to the matched dynamic effect if the driving behavior of the vehicle is changed in the second preset time period.
Further, the display width of the identification graph is smaller than the lane width of one lane of the basic driving interface driving road surface and/or the driving road surface outside the vehicle;
the range of the presentation length of the identification graph is as follows: driving forward a distance of the second predetermined period of time from a current position of the vehicle;
the identification graph comprises key nodes representing the change time of the driving behavior of the vehicle, and node display attributes of the key nodes are matched with the driving behavior to be changed.
Further, the display form of the identification graph of each driving behavior is a basic path, represented either by a continuous curve or straight line of a set width, or by a group of unit graphs arranged along a curve or straight line;
different driving behaviors correspond to different display sizes and display colors of the basic path, or to different colors, shapes and sizes of the unit graphs used by the basic path;
the shape of a unit graph is at least one of a circle, a dot, a triangle, an arrow, a square and a three-dimensional figure;
and the spacing of adjacent unit graphs represents the current speed of the vehicle.
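Since the spacing of adjacent unit graphs represents the current speed, a renderer can place the unit graphs along the basic path at speed-proportional intervals. A minimal sketch, where the proportionality constant `spacing_per_mps` and the overlap floor are assumptions for illustration, not values from the patent:

```python
def unit_graph_positions(speed_mps, path_length_m, spacing_per_mps=0.2):
    """Place unit graphs along the basic path; a faster vehicle yields
    wider spacing, so spacing visually encodes current speed."""
    spacing = max(speed_mps * spacing_per_mps, 0.5)  # floor so dots never overlap
    positions = []
    d = 0.0
    while d <= path_length_m:
        positions.append(round(d, 2))
        d += spacing
    return positions

print(unit_graph_positions(speed_mps=10.0, path_length_m=10.0))
# spacing = 2.0 m -> [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
```

Doubling the speed doubles the gap between dots, so the same path length carries fewer, more widely spaced unit graphs.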
The vehicle driving behavior management device provided by this embodiment can execute the vehicle driving behavior management method provided by the method embodiments, and has functional modules and beneficial effects corresponding to the executed method. The implementation principle and technical effects of this embodiment are similar to those of the method embodiments above and are not repeated here.
Fig. 17 is a block diagram of a vehicle according to an embodiment of the present application. As shown in fig. 17, the vehicle may include a central control display screen 31, a controller 32, a storage device 33, an input device 34 and an output device 35. The number of controllers 32 in the vehicle may be one or more; one controller 32 is taken as an example in fig. 17. The central control display screen 31, the controller 32, the storage device 33, the input device 34 and the output device 35 in the vehicle may be connected by a bus or by other means; connection by a bus is taken as the example in fig. 17.
The central control display screen 31 is used for presenting a basic driving interface during the driving process of the vehicle.
The storage device 33 is a computer-readable storage medium that can be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the vehicle driving behavior management method in the embodiments of the present application (e.g., the key node calculation module 21 and the key node display module 22). By running the software programs, instructions and modules stored in the storage device 33, the controller 32 executes the various functional applications and data processing of the vehicle, that is, implements the vehicle driving behavior management method described above.
The storage device 33 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functions; the storage data area may store data created according to the use of the terminal, etc. In addition, the storage 33 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the storage device 33 may further include memory remotely located with respect to the controller 32, which may be connected to the vehicle via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 34 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the vehicle. The output means 35 may comprise a display device such as a display screen.
The embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by a computer processor, performs a method of managing the driving behavior of a vehicle, the method comprising:
in an automatic driving state, calculating key nodes of driving behavior change in the expected driving behavior according to the expected driving behavior of the vehicle; and displaying a node graph representing the key nodes, the node graph being placed at the road surface position where the driving behavior changes in the 3D map navigation.
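The key-node computation recited here (and detailed further in claim 3) amounts to converting the time remaining until each behavior change into a forward road distance using the current speed. A hedged sketch under that reading, with all class, field and function names invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class ExpectedBehavior:
    behavior: str         # e.g. "lane_change"
    change_time_s: float  # moment of the change, seconds after the current moment

def key_node_road_offsets(expected, speed_mps):
    """For each driving-behavior change, return the forward road distance
    (from the vehicle's current position) at which its node graph is
    placed on the 3D-map driving road surface."""
    return [(b.behavior, speed_mps * b.change_time_s) for b in expected]

plan = [ExpectedBehavior("accelerate", 2.0), ExpectedBehavior("lane_change", 6.0)]
print(key_node_road_offsets(plan, speed_mps=20.0))
# [('accelerate', 40.0), ('lane_change', 120.0)]
```

Each offset is then mapped onto the vehicle's planned route in the 3D map navigation, so the node graph appears at the road position where the change will actually begin.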
Of course, the computer program of the computer readable storage medium provided in the embodiments of the present application is not limited to the method operations described above, but may also perform related operations in the vehicle driving behavior management method provided in any embodiment of the present application.
From the above description of the embodiments, those skilled in the art will clearly understand that the present application may be implemented by software plus the necessary general-purpose hardware, or by hardware alone, although in many cases the former is the preferred implementation. Based on this understanding, the technical solution of the present application, or the part of it contributing over the prior art, may be embodied in the form of a software product. The software product may be stored in a computer-readable storage medium such as a floppy disk, a read-only memory (ROM), a random access memory (RAM), a flash memory (FLASH), a hard disk or an optical disk, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
It should be noted that, in the embodiment of the vehicle driving behavior management apparatus described above, the included units and modules are divided only according to functional logic; the division is not limited to the above as long as the corresponding functions can be implemented. In addition, the specific names of the functional units are only for mutual distinction and are not intended to limit the protection scope of the present application.
It should be noted that the above are only preferred embodiments of the present application and the technical principles applied therein. Those skilled in the art will appreciate that the present application is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made without departing from the scope of the present application. Therefore, although the present application has been described in detail through the above embodiments, it is not limited to them and may include many other equivalent embodiments without departing from its spirit, the scope of which is defined by the appended claims.
Claims (14)
1. A method of managing the driving behavior of a vehicle, characterized in that it is applied to a vehicle comprising a central control display screen, the central control display screen displaying a basic driving interface including 3D map navigation, and the method comprises the following steps:
calculating, in an automatic driving state, key nodes of driving behavior change in the expected driving behavior of the vehicle according to the expected driving behavior of the vehicle; wherein a key node represents, in the basic driving interface, the starting position of the driving behavior change of the vehicle corresponding to the moment of the driving behavior change;
displaying a node graph representing the key nodes, wherein the node graph is placed at a road surface position of the driving behavior change in the 3D map navigation;
wherein the method further comprises:
receiving automatic driving intervention triggering operation;
displaying a driving intervention control area through a central control display screen, wherein the driving intervention control area comprises at least one driving control assembly;
receiving a control operation and an issued operation instruction, wherein the control operation is the driver selecting a target driving control component in the driving intervention control area and dragging it to a target point on the basic driving interface, and the operation instruction is an instruction to execute the target driving control component at the target point;
and responding to the operation instruction, and controlling the driving behavior of the vehicle.
2. The method of claim 1, wherein calculating key nodes of driving behavior alterations in the expected driving behavior based on the expected driving behavior of the vehicle comprises:
acquiring the expected driving behavior of the vehicle, for automatic driving of the vehicle, within a first preset time period, wherein the first preset time period is a period of a first set duration extending onward from the current moment;
and determining a key node of the expected driving behavior, wherein the key node is a position node at which the driving behavior changes the state of the vehicle.
3. The method of claim 1, wherein displaying a node graph representing the key nodes comprises:
determining the interval duration from the current moment to the moment at which the expected driving behavior changes;
determining a specific road position where the driving behavior change occurs according to the current running speed of the vehicle, the interval duration and the current position in a road running environment;
determining a driving behavior change position of the key node in the 3D map navigation by combining the specific road position;
marking the driving behavior change position by using a node graph, and displaying the node graph on a driving road surface of the basic driving interface and/or a driving road surface outside a vehicle;
and the presentation position of the node graph is matched with the actual occurrence position of the driving behavior change and is positioned on the driving route of the vehicle.
4. A method according to claim 3, wherein marking the driving behavior change position using a node graph, and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle, comprises:
determining the driving behavior resulting from the change, marking it as the driving behavior to be changed, and determining the node display attributes matching the driving behavior to be changed;
marking the driving behavior change position by using a node graph with the node display attribute;
and displaying the node graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle.
5. The method of claim 1, wherein controlling the driving behavior of the vehicle in response to the operating instruction comprises:
extracting a target driving control component and a target point in the operation instruction;
obtaining an actual environment position corresponding to the target point in a road running environment and a target driving behavior corresponding to the target driving control component;
controlling the vehicle to perform the target driving behavior at the actual environmental position.
6. The method of claim 2, wherein each driving control component corresponds to one driving behavior of the vehicle to be controlled;
the driving behavior to be controlled comprises at least one of the following: vehicle acceleration, vehicle deceleration, vehicle lane change, vehicle overtaking, vehicle U-turn, vehicle steering, parking, and taking over control of the vehicle.
7. The method of any one of claims 1-6, further comprising:
and displaying the driving behavior of the vehicle within a second preset time period by means of an identification graph on the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle, wherein the second preset time period is a period of a second set duration extending onward from the current moment.
8. The method of claim 7, wherein presenting the driving behavior of the vehicle for a second predetermined period of time using an identification graphic comprises:
obtaining the driving behaviors of the vehicle within the second preset time period, and marking each of them as a driving behavior to be displayed;
and displaying the identification graph of each driving behavior to be displayed, in its corresponding identification graph display form, according to the execution order of the driving behaviors to be displayed.
9. The method as recited in claim 7, further comprising:
and if the driving behavior of the vehicle changes within the second preset time period, controlling the displayed identification graph to present a change reminder with a matching dynamic effect.
10. The method according to claim 8 or 9, wherein,
the display width of the identification graph is smaller than the width of one lane of the driving road surface of the basic driving interface and/or the driving road surface outside the vehicle;
the display length of the identification graph spans the distance the vehicle drives forward, from its current position, within the second preset time period;
and the identification graph comprises key nodes representing the moments at which the driving behavior of the vehicle changes, the node display attributes of the key nodes matching the driving behavior to be changed.
11. The method according to claim 8 or 9, wherein,
the display form of the identification graph of each driving behavior is a basic path, represented either by a continuous curve or straight line of a set width, or by a group of unit graphs arranged along a curve or straight line;
different driving behaviors correspond to different display sizes and display colors of the basic path, or to different colors, shapes and sizes of the unit graphs used by the basic path;
the shape of a unit graph is at least one of a circle, a dot, a triangle, an arrow, a square and a three-dimensional figure;
and the spacing of adjacent unit graphs represents the current speed of the vehicle.
12. A management device for driving behavior of a vehicle, the management device being configured in the vehicle, the vehicle comprising a central control display screen, the central control display screen displaying a basic driving interface including 3D map navigation, the device comprising:
the key node calculation module is used for calculating, in an automatic driving state, key nodes of driving behavior change in the expected driving behavior of the vehicle according to the expected driving behavior of the vehicle; wherein a key node represents, in the basic driving interface, the starting position of the driving behavior change of the vehicle corresponding to the moment of the driving behavior change;
the key node display module is used for displaying a node graph representing the key node, and the node graph is arranged at a road surface position where driving behavior is changed in the 3D map navigation;
wherein the apparatus further comprises: the system comprises a first receiving module, an information display module, a second receiving module and a driving control module;
the first receiving module is used for receiving automatic driving intervention triggering operation;
the information display module is used for displaying a driving intervention control area through the central control display screen, and the driving intervention control area comprises at least one driving control assembly;
the second receiving module is used for receiving a control operation and an issued operation instruction, wherein the control operation is the driver selecting a target driving control component in the driving intervention control area and dragging it to a target point on the basic driving interface, and the operation instruction is an instruction to execute the target driving control component at the target point;
and the driving control module is used for responding to the operation instruction and controlling the driving behavior of the vehicle.
13. A vehicle, characterized by comprising:
a central control display screen;
one or more controllers;
a storage means for storing one or more programs;
when the one or more programs are executed by the one or more controllers, the one or more controllers are caused to implement the method of managing vehicle driving behavior as recited in any one of claims 1-11.
14. A storage medium containing computer executable instructions which, when executed by a computer processor, implement the method of managing vehicle driving behaviour according to any one of claims 1 to 11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111248414.2A CN113895458B (en) | 2021-10-26 | 2021-10-26 | Vehicle driving behavior management method and device, vehicle and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111248414.2A CN113895458B (en) | 2021-10-26 | 2021-10-26 | Vehicle driving behavior management method and device, vehicle and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113895458A CN113895458A (en) | 2022-01-07 |
CN113895458B true CN113895458B (en) | 2023-06-30 |
Family
ID=79026378
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111248414.2A Active CN113895458B (en) | 2021-10-26 | 2021-10-26 | Vehicle driving behavior management method and device, vehicle and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113895458B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115123303A (en) * | 2022-07-18 | 2022-09-30 | 腾讯科技(深圳)有限公司 | Vehicle driving state display method and device, electronic equipment and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008241249A (en) * | 2007-03-23 | 2008-10-09 | Pioneer Electronic Corp | Navigation device, navigation method, and program, recording medium, and information display device and method |
CN106080744A (en) * | 2015-04-27 | 2016-11-09 | 丰田自动车株式会社 | Automatic driving vehicle system |
CN113436455A (en) * | 2020-03-23 | 2021-09-24 | 爱信艾达株式会社 | Driving support system and driving support program |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104890670B (en) * | 2014-03-06 | 2019-09-10 | 富顶精密组件(深圳)有限公司 | Driving assistance system and driving assistance method |
JP6241341B2 (en) * | 2014-03-20 | 2017-12-06 | アイシン・エィ・ダブリュ株式会社 | Automatic driving support device, automatic driving support method and program |
JP6531983B2 (en) * | 2015-07-31 | 2019-06-19 | パナソニックIpマネジメント株式会社 | Automatic driving apparatus, automatic driving support method and automatic driving support program |
JP5945999B1 (en) * | 2015-07-31 | 2016-07-05 | パナソニックIpマネジメント株式会社 | Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle |
JP6728558B2 (en) * | 2016-01-25 | 2020-07-22 | 日立オートモティブシステムズ株式会社 | Automatic operation control device and automatic operation control method |
WO2017158764A1 (en) * | 2016-03-16 | 2017-09-21 | 本田技研工業株式会社 | Vehicle control system, vehicle control method, and vehicle control program |
JP6375568B2 (en) * | 2016-04-28 | 2018-08-22 | 本田技研工業株式会社 | Vehicle control system, vehicle control method, and vehicle control program |
KR101919889B1 (en) * | 2017-05-31 | 2018-11-19 | 엘지전자 주식회사 | Display device and vehicle comprising the same |
JP6621032B2 (en) * | 2017-01-30 | 2019-12-18 | パナソニックIpマネジメント株式会社 | Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle |
JP2018147450A (en) * | 2017-03-09 | 2018-09-20 | オムロン株式会社 | Mode change control device, mode change control system, mode change control method, and program |
US10471963B2 (en) * | 2017-04-07 | 2019-11-12 | TuSimple | System and method for transitioning between an autonomous and manual driving mode based on detection of a drivers capacity to control a vehicle |
CN109987095B (en) * | 2018-01-02 | 2022-09-23 | 奥迪股份公司 | Driving assistance system and method |
CN109709966B (en) * | 2019-01-15 | 2021-12-07 | 阿波罗智能技术(北京)有限公司 | Control method and device for unmanned vehicle |
KR102061750B1 (en) * | 2019-05-15 | 2020-01-03 | 주식회사 라이드플럭스 | Method and apparatus for controlling a vehicle’s driving operation using advance information |
CN110207719A (en) * | 2019-05-21 | 2019-09-06 | 北京百度网讯科技有限公司 | A kind of processing method and processing device in automatic Pilot path |
KR20210081939A (en) * | 2019-12-24 | 2021-07-02 | 엘지전자 주식회사 | Xr device and method for controlling the same |
CN112298185B (en) * | 2020-11-06 | 2021-12-14 | 苏州挚途科技有限公司 | Vehicle driving control method and device and electronic equipment |
2021-10-26: application CN202111248414.2A filed in China (granted as CN113895458B, status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN113895458A (en) | 2022-01-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113474205B (en) | Method for operating a driver information system in a vehicle and driver information system | |
CN108140311B (en) | Parking assistance information display method and parking assistance device | |
CN105261224B (en) | Intelligent vehicle control method and apparatus | |
US10699579B2 (en) | Autonomous driving system and autonomous driving vehicle | |
JP6558733B2 (en) | Driving support method, driving support device, driving control device, vehicle, and driving support program using the same | |
US11449200B2 (en) | Vehicular display device and display method in vehicular display device | |
EP2711908A1 (en) | Lane change assistant system | |
US20060040239A1 (en) | Driving simulator having artificial intelligence profiles, replay, hazards, and other features | |
CN113748039A (en) | Method for operating a driver information system in a vehicle and driver information system | |
US20140067250A1 (en) | Lane change assist information visualization system | |
CN111148676A (en) | Adaptive spacing selection for optimized efficiency | |
JP6540453B2 (en) | Information presentation system | |
WO2016170764A1 (en) | Driving assistance method and driving assistance device, driving control device, vehicle, and driving assistance program using such method | |
JP2018520045A (en) | Vehicle speed control method | |
CN113895458B (en) | Vehicle driving behavior management method and device, vehicle and storage medium | |
Wang et al. | Augmented reality-based advanced driver-assistance system for connected vehicles | |
JP6825683B1 (en) | Vehicle display control device, vehicle display device, vehicle display control method and program | |
JP7472983B2 (en) | Control device, control method and program | |
CN110789342A (en) | Display device for vehicle | |
CN107848420A (en) | Select the method and motor vehicle of the analysis object for the function in motor vehicle | |
CN114863668B (en) | Vehicle formation running control method and device and electronic equipment | |
JP4631075B2 (en) | Vehicle risk avoidance guide device | |
JP2021088356A (en) | Vehicle display control device, vehicle display device, vehicle display control method and program | |
JP2019049513A (en) | Vehicle travel control method and device | |
JP7533536B2 (en) | Automatic driving control device and automatic driving control program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
Effective date of registration: 20220411 Address after: 201821 Building 2, No. 1688, Yecheng Road, Jiading District, Shanghai Applicant after: Shanghai Jidu Automobile Co.,Ltd. Address before: 201815 zone B, floor 1, building 2, No. 468, Huirong Road, Jiading District, Shanghai Applicant before: Jidu Automobile Co.,Ltd. |
|
GR01 | Patent grant | ||
GR01 | Patent grant |