CN114655231B - Truck standard driving assistance method and system - Google Patents
Truck standard driving assistance method and system
- Publication number
- CN114655231B CN114655231B CN202210431952.3A CN202210431952A CN114655231B CN 114655231 B CN114655231 B CN 114655231B CN 202210431952 A CN202210431952 A CN 202210431952A CN 114655231 B CN114655231 B CN 114655231B
- Authority
- CN
- China
- Prior art keywords
- data
- driving
- safety
- truck
- behavior
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0097—Predicting future conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2530/00—Input parameters relating to vehicle conditions or values, not covered by groups B60W2510/00 or B60W2520/00
- B60W2530/201—Dimensions of vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/20—Static objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/402—Type
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/06—Combustion engines, Gas turbines
- B60W2710/0605—Throttle position
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/18—Braking system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/20—Steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/30—Auxiliary equipments
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Human Computer Interaction (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a truck standard driving assistance method in the technical field of driving assistance, addressing the technical problem of driving safety. The method comprises the following steps: pre-establishing a safety behavior plan database; acquiring environmental data around the vehicle body and action data of the vehicle control components, the action data comprising brake pedal data, accelerator data, steering wheel data, retarder data, vehicle lamp data and parking action data; identifying the environmental data to judge the current driving scene, and identifying the action data to evaluate the space-time expected driving behavior; querying the safety behavior plan database according to the driving scene to obtain a matched safety behavior plan; comparing the space-time expected driving behavior with the safety behavior plan to obtain the risk degree of the space-time expected driving behavior; and if the risk degree exceeds a preset safety threshold, sending alarm information to the driver. The invention also discloses a truck standard driving assistance system.
Description
Technical Field
The invention relates to the technical field of driving assistance, and in particular to a truck normative driving assistance method and system.
Background
Driving safety awareness is an important determinant of traffic safety. As fuel prices rise year by year, many drivers still cling to outdated ideas such as saving fuel by coasting in neutral, so non-standard driving behaviors are common and create hidden traffic-safety risks. The problem is more serious in truck driving because of the higher fuel consumption, with some drivers even shutting off the engine or using the retarder incorrectly, and the danger is greater because a truck's inertia is large.
The patent with publication number CN106314420B, "System and method for improving vehicle driving behavior", optimizes the brake pedal signal, accelerator pedal signal and steering wheel angle signal to find an optimal path as far as possible, thereby reducing bad driving behaviors such as frequent stopping, rapid acceleration, flooring the accelerator, frequent lane changing, aggressive curve driving, prolonged low-speed driving, hard braking and frequent gear changing. This markedly reduces unnecessary fuel consumption and emissions, eliminates potential safety hazards, and the vehicle prompts the driver so as to improve the driving behavior and help the driver form good driving habits.
The above provides a solution for improving driving safety, but it is mainly real-time optimization of action signals: there is a certain time lag from signal processing to the driver receiving the prompt and reacting to it, so for a vehicle moving at high speed the scheme is laggy and not sufficient to cope with the driver's proactive behavior.
Disclosure of Invention
The first object of the invention is to overcome the above shortcoming of the prior art by providing a truck normative driving assistance method capable of improving driving safety.
The second object of the invention is to provide a truck normative driving assistance system capable of improving driving safety.
In order to achieve the above object, the present invention provides a truck normative driving assistance method, including:
pre-establishing a safety behavior plan database;
acquiring environmental data around a vehicle body and action data of a vehicle control component; the action data comprises brake pedal data, accelerator data, steering wheel data, retarder data, car lamp data and parking action data;
identifying the environmental data, judging the current driving scene, identifying the action data and evaluating the space-time expected driving behavior;
inquiring the safety behavior plan database according to the driving scene to obtain a matched safety behavior plan;
comparing the space-time expected driving behavior with the safety behavior plan to obtain the risk degree of the space-time expected driving behavior; and if the risk degree exceeds a preset safety threshold value, sending alarm information to the driver.
As a further refinement, evaluating the space-time expected driving behavior from the action data comprises:
classifying the driving actions according to the recognition result of the action data to obtain m sample data x = (x_1, x_2, ..., x_m) under the various driving actions;
the corresponding m possible values of the latent variable z are z = {z_1, z_2, ..., z_m}, with joint distribution p(x, z | θ) and conditional distribution p(z | x, θ);
setting θ as the parameters of the Gaussian components;
assuming the Gaussian mixture p(x | θ) = Σ_{i=1..y} γ_i N(x | μ_i, Σ_i), where γ_i is the weight of the i-th Gaussian component C_i, μ_i is its mean, Σ_i is its covariance matrix, and y is the number of Gaussian components, the following steps are established:
E step: compute the conditional probability expectation of the joint distribution: Q_i(z^(i)) = P(z^(i) | x_i, θ_k);
M step: maximize the likelihood function L(θ, θ_k) = Σ_i Σ_{z^(i)} Q_i(z^(i)) log p(x_i, z^(i) | θ) with respect to θ to obtain θ_{k+1};
the E step and the M step are iterated alternately to obtain the estimated θ, and the space-time expected driving behavior is determined from the estimated θ and a preset driving-behavior demarcation standard;
where k is the iteration index, L() is the likelihood function, and Q_i is the probability density function of the latent variable z^(i).
Further, the method also comprises: when the space-time expected driving behavior is executed, correcting the action data that do not conform to the safety behavior plan so that the action data meet the real-time safety behavior plan.
Further, the environmental data around the vehicle body includes image data and radar feature data.
Furthermore, a driving guidance sign scene, a guardrail scene, an uphill scene, a curve scene and a downhill scene are recognized in the image data, and the safety plans corresponding to these scenes require the retarder to be switched on.
Further, vehicle parallel-driving information, vehicle following information and vehicle type information are obtained from the radar feature data.
Further, the environment data further includes: position data and road condition data of the vehicle and other surrounding vehicles; identifying the environmental data comprises calculating and determining other vehicles closest to the surroundings and relative distances according to the position data and road condition data of the vehicle and the other vehicles.
Further, identifying the environmental data further comprises:
when the radar is triggered, comparing and verifying the image recognition result against the radar feature recognition result, and if the verification fails, stopping the correction of action data that do not conform to the safety behavior plan;
when the radar is not triggered, if the image recognition result involves other vehicles, comparing and verifying it against the determination of the nearest surrounding vehicles and their relative distances, and if the verification fails, executing an image recognition self-check; and if the image recognition self-check is abnormal, sending abnormality warning information to the driver.
And further, acquiring the environment data and the action data through an OBD bus or a vehicle-mounted central control.
In order to achieve the second objective, the truck normative driving assistance system comprises a memory and a processor, wherein the memory stores a computer program which can be loaded by the processor and executes the truck normative driving assistance method.
Advantageous effects
Compared with the prior art, the invention has the advantages that:
according to the method and the device, the current driving environment of the vehicle is identified, and the driver can carry out various driving actions, so that the time-space expected driving behaviors which are possibly generated next can be predicted, then the time-space expected driving behaviors are compared with the safety plan matched with the driving environment, and early warning is timely carried out when the risk is too high.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The invention will be further described with reference to specific embodiments shown in the drawings.
Referring to fig. 1, a truck normative driving assistance method includes:
a safety behavior plan database is established in advance, and a preset safety behavior plan is stored in the safety behavior plan database;
acquiring environmental data around the vehicle body and action data of the vehicle control components; the action data comprises brake pedal data, accelerator data, steering wheel data, retarder data, vehicle lamp data and parking action data; the environmental data and the action data can be acquired through an OBD bus or the vehicle-mounted central control, that is, where an OBD bus is provided, part of the data can be obtained over the OBD bus, and the remaining data can be obtained by routing corresponding transmission lines to the vehicle-mounted central control unit;
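The split between signals read over the OBD bus and signals routed through the vehicle-mounted central control can be pictured with the minimal sketch below. It only illustrates the data model implied by this paragraph: the field names, the `read` interface of the two bus objects, and the assumption that brake pedal, steering, retarder, lamp and parking signals arrive via the central control (they are typically vendor-specific CAN signals rather than standard OBD PIDs) are all hypothetical and not prescribed by the patent.

```python
from dataclasses import dataclass

@dataclass
class ActionData:
    """Action data of the vehicle control components at one sampling instant."""
    brake_pedal: float      # 0.0-1.0, pedal travel
    accelerator: float      # 0.0-1.0, throttle position
    steering_wheel: float   # degrees, signed
    retarder_on: bool
    lamps: dict             # e.g. {"turn_left": False, "brake": True}
    parking: bool

@dataclass
class EnvironmentData:
    """Environmental data around the vehicle body."""
    image_frame: bytes      # latest camera frame
    radar_targets: list     # radar feature data (range, bearing, relative speed)
    gps_position: tuple     # (lat, lon) of the ego vehicle
    road_condition: dict    # road/traffic data shared via navigation software

def acquire(obd_bus, central_control):
    """Read part of the data over the OBD bus where available and the rest over
    transmission lines connected to the vehicle-mounted central control
    (both bus objects are placeholders with a hypothetical read() method)."""
    action = ActionData(
        brake_pedal=central_control.read("brake_pedal"),
        accelerator=obd_bus.read("throttle_position"),
        steering_wheel=central_control.read("steering_angle"),
        retarder_on=central_control.read("retarder_state"),
        lamps=central_control.read("lamp_states"),
        parking=central_control.read("parking_brake"),
    )
    env = EnvironmentData(
        image_frame=central_control.read("front_camera"),
        radar_targets=central_control.read("radar_targets"),
        gps_position=central_control.read("gps"),
        road_condition=central_control.read("road_condition"),
    )
    return action, env
```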
identifying the environmental data and judging the current driving scene, identifying the action data and evaluating the space-time expected driving behavior, namely predicting the space-time expected driving behavior which possibly occurs next;
inquiring a safety behavior plan database according to the driving scene to obtain a matched safety behavior plan;
comparing the space-time expected driving behavior with the safety behavior plan to obtain the risk degree of the space-time expected driving behavior; and if the risk degree exceeds a preset safety threshold value, sending alarm information to the driver.
The method thus recognizes the current driving environment of the vehicle and, as the driver performs various driving actions, predicts the space-time expected driving behaviors that are likely to occur next; these are then compared with the safety plan matched to the driving environment, and an early warning is given in time when the risk is too high.
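The overall flow of Fig. 1 can be sketched as follows. This is a minimal illustration that assumes dict-based data and trivial stand-in implementations for scene recognition and behavior prediction (the real method uses image/radar recognition and the GMM/EM estimator described below); the threshold value and the example entries are arbitrary.

```python
def identify_scene(env: dict) -> str:
    """Stand-in for scene recognition from image and radar data."""
    return env.get("scene", "default")

def predict_behavior(action: dict) -> set:
    """Stand-in for the GMM/EM prediction of the space-time expected behavior."""
    return {name for name, value in action.items() if value}

def risk_degree(behavior: set, plan: set) -> float:
    """Share of predicted driving actions that are absent from the safety plan."""
    return 0.0 if not behavior else len(behavior - plan) / len(behavior)

def driving_assist_step(env: dict, action: dict, safety_plan_db: dict,
                        safety_threshold: float = 0.5) -> float:
    scene = identify_scene(env)                   # judge the current driving scene
    plan = safety_plan_db.get(scene, set())       # query the matched safety behavior plan
    behavior = predict_behavior(action)           # space-time expected driving behavior
    risk = risk_degree(behavior, plan)            # compare the behavior with the plan
    if risk > safety_threshold:
        print(f"WARNING: risk {risk:.2f} in scene '{scene}', "
              f"unplanned actions {behavior - plan}")   # alarm information to the driver
    return risk

# Example: coasting in neutral on a downhill with the retarder off is flagged as risky.
db = {"downhill": {"retarder_on", "engine_brake"}}
driving_assist_step({"scene": "downhill"},
                    {"neutral_coasting": True, "retarder_on": False}, db)
```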
Evaluating the space-time expected driving behavior from the action data includes:
classifying the driving actions according to the recognition result of the action data to obtain m sample data x = (x_1, x_2, ..., x_m) under the various driving actions;
the corresponding m possible values of the latent variable z are z = {z_1, z_2, ..., z_m}, with joint distribution p(x, z | θ) and conditional distribution p(z | x, θ);
setting θ as the parameters of the Gaussian components;
assuming the Gaussian mixture p(x | θ) = Σ_{i=1..y} γ_i N(x | μ_i, Σ_i), where γ_i is the weight of the i-th Gaussian component C_i, μ_i is its mean, Σ_i is its covariance matrix, and y is the number of Gaussian components, the following steps are established:
E step: compute the conditional probability expectation of the joint distribution: Q_i(z^(i)) = P(z^(i) | x_i, θ_k);
M step: maximize the likelihood function L(θ, θ_k) = Σ_i Σ_{z^(i)} Q_i(z^(i)) log p(x_i, z^(i) | θ) with respect to θ to obtain θ_{k+1};
the E step and the M step are iterated alternately to obtain the estimated θ, and the space-time expected driving behavior is determined from the estimated θ and a preset driving-behavior demarcation standard;
where k is the iteration index, L() is the likelihood function, and Q_i is the probability density function of the latent variable z^(i).
That is, a Gaussian mixture model is built from the action recognition results and its parameters are estimated with the EM (expectation-maximization) algorithm, alternately iterating the E step and the M step, so that the driving behavior most likely to occur next is obtained.
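A compact numpy sketch of this estimation step is given below, assuming diagonal covariances and a fixed number of iterations for brevity; the feature encoding of the action samples, the choice of y and the mapping from fitted components to behavior labels (the "demarcation standard") are assumptions of the sketch, not details given by the patent.

```python
import numpy as np

def fit_gmm_em(X, y, n_iter=100, seed=0):
    """Estimate theta = (weights gamma, means mu, variances) of a y-component
    Gaussian mixture by alternately iterating the E step and the M step
    (diagonal covariances for simplicity)."""
    rng = np.random.default_rng(seed)
    m, d = X.shape
    gamma = np.full(y, 1.0 / y)                      # component weights
    mu = X[rng.choice(m, size=y, replace=False)]     # initial means
    var = np.tile(X.var(axis=0) + 1e-6, (y, 1))      # diagonal covariances

    for _ in range(n_iter):
        # E step: Q_i(z) = P(z | x_i, theta_k), the responsibility of each component
        log_p = np.empty((m, y))
        for c in range(y):
            diff = X - mu[c]
            log_p[:, c] = (np.log(gamma[c])
                           - 0.5 * np.sum(np.log(2 * np.pi * var[c]))
                           - 0.5 * np.sum(diff ** 2 / var[c], axis=1))
        log_p -= log_p.max(axis=1, keepdims=True)    # stabilise before exponentiating
        Q = np.exp(log_p)
        Q /= Q.sum(axis=1, keepdims=True)

        # M step: maximise the expected complete-data log-likelihood w.r.t. theta
        Nc = Q.sum(axis=0) + 1e-12
        gamma = Nc / m
        mu = (Q.T @ X) / Nc[:, None]
        for c in range(y):
            diff = X - mu[c]
            var[c] = (Q[:, c, None] * diff ** 2).sum(axis=0) / Nc[c] + 1e-6

    return gamma, mu, var, Q
```

With the fitted parameters, `Q.argmax(axis=1)` gives the most probable component for each recent action sample, which a preset demarcation standard would then map to a space-time expected driving behavior.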
The risk degree of the space-time expected driving behavior is determined from the comparison result, specifically by calculating the similarity; a simple calculation counts the matching items, i.e., whether each predicted driving action exists in the safety plan.
The method further comprises: when the risk degree exceeds the preset safety threshold and the space-time expected driving behavior starts to be executed, correcting the action data that do not conform to the safety behavior plan so that the action data meet the real-time safety behavior plan.
For example, when the driver ignores the early warning and continues with the predicted driving behavior, and the risk is judged to be too high, the machine intervenes to limit the driver's operation and executes the matched safety behavior plan instead, so as to ensure driving safety. This is particularly valuable when a truck driver is fatigued on a long-distance drive, where such intervention can effectively avoid accidents.
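A possible shape for this correction step is sketched below; the plan fields (`retarder_required`, `max_throttle`, `max_steering_deg`) and the specific limits are illustrative placeholders, not values from the patent.

```python
def correct_action(action: dict, plan: dict) -> dict:
    """Overwrite action data that do not conform to the real-time safety
    behavior plan (field names and limits are illustrative)."""
    corrected = dict(action)
    if plan.get("retarder_required") and not action.get("retarder_on"):
        corrected["retarder_on"] = True                  # enforce retarder use
    max_throttle = plan.get("max_throttle", 1.0)
    if action.get("accelerator", 0.0) > max_throttle:
        corrected["accelerator"] = max_throttle          # limit accelerator input
    max_steer = plan.get("max_steering_deg")
    if max_steer is not None:
        s = action.get("steering_wheel", 0.0)
        corrected["steering_wheel"] = max(-max_steer, min(max_steer, s))
    return corrected
```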
The environmental data around the vehicle body includes image data and radar feature data, acquired through cameras and radars mounted on the vehicle body; the more complete the camera and radar installation, the better the precision and the larger the effective range.
A driving guidance sign scene, a guardrail scene, an uphill scene, a curve scene and a downhill scene are recognized in the image data, and the safety plans corresponding to these scenes require the retarder to be switched on. Because improper truck operation in such scenes easily causes accidents due to inertia, requiring the retarder improves driving safety.
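An illustrative layout of such a scene-indexed safety behavior plan database is shown below; the scene labels follow the text, while the per-scene entries and numbers are invented placeholders.

```python
# Hypothetical safety behavior plan database; contents are placeholders.
SAFETY_PLAN_DB = {
    "guide_sign": {"retarder_required": True, "max_throttle": 0.6},
    "guardrail":  {"retarder_required": True, "max_throttle": 0.6},
    "uphill":     {"retarder_required": True, "max_throttle": 0.9},
    "curve":      {"retarder_required": True, "max_throttle": 0.5,
                   "max_steering_deg": 180},
    "downhill":   {"retarder_required": True, "max_throttle": 0.3,
                   "max_speed_kph": 60},
    "default":    {"retarder_required": False, "max_throttle": 1.0},
}

def query_plan(scene: str) -> dict:
    """Query the safety behavior plan matched to the recognized driving scene."""
    return SAFETY_PLAN_DB.get(scene, SAFETY_PLAN_DB["default"])
```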
Vehicle parallel-driving information, vehicle following information and vehicle type information can be obtained from the radar feature data; the vehicle type information includes large vehicles, small vehicles, extra-long vehicles and the like.
The environmental data further includes: position data and road condition data of the vehicle and other surrounding vehicles; identifying the environmental data includes calculating and determining other vehicles closest to the surroundings and relative distances according to the position data of the vehicle and the other vehicles and the road condition data.
Specifically: the position data are acquired from an on-board GPS unit, and the road condition data can be obtained from navigation software. At present, some navigation software can share the positions of all members of a group in real time once a user joins the group, from which the relative distances can be obtained directly. It should be understood that the nearest surrounding other vehicles are determined as follows: according to the road condition data, the vehicles with the minimum relative distance in front, behind, to the left and to the right in the same direction of travel; and, if image recognition determines that there is no solid guardrail in the middle of the bidirectional road, the vehicles with the minimum relative distance in front, behind, to the left and to the right in either direction.
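The nearest-vehicle determination described here could look like the following sketch, which assumes positions shared as (lat, lon) pairs and headings shared by the navigation software; the 90° same-direction test and the data layout are assumptions of the sketch.

```python
import math

def haversine_m(p1, p2):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(a))

def nearest_vehicle(ego, others, solid_median_guardrail):
    """Return (distance_m, vehicle) for the surrounding vehicle with the minimum
    relative distance. `ego` and each entry of `others` are dicts holding
    'pos' = (lat, lon) and 'heading' in degrees, as shared via navigation
    software; with a solid median guardrail only same-direction traffic counts."""
    candidates = []
    for v in others:
        delta = abs((v["heading"] - ego["heading"] + 180) % 360 - 180)
        if solid_median_guardrail and delta >= 90:
            continue                    # oncoming traffic is separated by the guardrail
        candidates.append((haversine_m(ego["pos"], v["pos"]), v))
    return min(candidates, key=lambda c: c[0], default=(float("inf"), None))
```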
Identifying the environmental data further comprises:
When the radar is triggered, the image recognition result and the radar feature recognition result are compared and verified; if the verification fails, correction of the action data that do not conform to the safety behavior plan is suspended. For example, if the radar in front of the vehicle body reports a vehicle while image recognition finds no vehicle ahead, the comparison fails; in this case the system only prompts the driver to observe the driving environment and does not intervene actively, so that a machine fault does not interfere with driving. This arrangement is adopted mainly because vehicle-body radars can be falsely triggered by mechanical faults during driving, which is common on many vehicles.
When the radar is not triggered, if the image recognition result involves other vehicles, it is compared and verified against the determination of the nearest surrounding vehicles and their relative distances; if the verification fails, an image recognition self-check is executed, and if the self-check is abnormal, abnormality warning information is sent to the driver. That is, more distant vehicles are recognized and verified: if the relative-distance calculation indicates a vehicle within 100 m ahead but no vehicle is recognized in the frames extracted from the front camera video, the comparison fails. This arrangement is adopted mainly because the cameras of a vehicle, especially a truck, inevitably become soiled depending on the driving environment and the degree of body cleaning and maintenance, which in turn affects image recognition.
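The cross-verification logic of these two paragraphs can be condensed into a small decision function; the boolean inputs are deliberate simplifications of the image, radar and position-based recognition results, and the returned flags are illustrative.

```python
def cross_verify(radar_triggered: bool, image_sees_vehicle: bool,
                 radar_sees_vehicle: bool, gps_vehicle_within_100m: bool,
                 image_self_check_ok: bool) -> dict:
    """Cross-compare image, radar and position-based results and decide how
    the assistance system may act."""
    decision = {"allow_intervention": True, "prompt_driver": False,
                "anomaly_warning": False}
    if radar_triggered:
        # Radar reports a target; require the camera to agree before the
        # system is allowed to correct the driver's action data.
        if radar_sees_vehicle and not image_sees_vehicle:
            decision["allow_intervention"] = False   # possible radar false trigger
            decision["prompt_driver"] = True         # only guide the driver to observe
    else:
        # No radar trigger: verify image detections against the GPS-based
        # nearest-vehicle estimate; on mismatch rely on the camera self-check.
        if image_sees_vehicle != gps_vehicle_within_100m:
            if not image_self_check_ok:
                decision["anomaly_warning"] = True   # camera may be soiled or faulty
    return decision
```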
In this way the environmental data are cross-compared and verified, the probability of scene misjudgment is reduced, and the effectiveness of the system is ensured.
The truck standard driving assistance system comprises a memory and a processor, wherein the memory stores a computer program which can be loaded by the processor and executes the truck standard driving assistance method.
The above is only a preferred embodiment of the present invention, and it should be noted that it is obvious to those skilled in the art that several variations and modifications can be made without departing from the structure of the present invention, which will not affect the effect of the implementation of the present invention and the utility of the patent.
Claims (9)
1. A truck normative driving assistance method is characterized by comprising the following steps:
pre-establishing a safety behavior plan database;
acquiring environmental data around a vehicle body and action data of a vehicle control component; the action data comprises brake pedal data, accelerator data, steering wheel data, retarder data, car lamp data and parking action data;
identifying the environmental data, judging the current driving scene, identifying the action data and evaluating the space-time expected driving behavior;
inquiring the safety behavior plan database according to the driving scene to obtain a matched safety behavior plan;
comparing the space-time expected driving behavior with the safety behavior plan to obtain the risk degree of the space-time expected driving behavior; if the risk degree exceeds a preset safety threshold value, sending alarm information to a driver;
evaluating the space-time expected driving behavior from the action data includes:
classifying the driving actions according to the recognition result of the action data to obtain m sample data x = (x_1, x_2, ..., x_m) under the various driving actions;
the corresponding m possible values of the latent variable z are z = {z_1, z_2, ..., z_m}, with joint distribution p(x, z | θ) and conditional distribution p(z | x, θ);
setting θ as the parameters of the Gaussian components;
assuming the Gaussian mixture p(x | θ) = Σ_{i=1..y} γ_i N(x | μ_i, Σ_i), where γ_i is the weight of the i-th Gaussian component C_i, μ_i is its mean, Σ_i is its covariance matrix, and y is the number of Gaussian components, establishing the following steps:
E step: compute the conditional probability expectation of the joint distribution: Q_i(z^(i)) = P(z^(i) | x_i, θ_k);
M step: maximize the likelihood function L(θ, θ_k) = Σ_i Σ_{z^(i)} Q_i(z^(i)) log p(x_i, z^(i) | θ) with respect to θ to obtain θ_{k+1};
alternately iterating the E step and the M step to obtain the estimated θ, and determining the space-time expected driving behavior from the estimated θ and a preset driving-behavior demarcation standard;
where k is the iteration index, L() is the likelihood function, and Q_i is the probability density function of the latent variable z^(i).
2. The truck normative driving assistance method according to claim 1, further comprising: when the space-time expected driving behavior is executed, correcting the action data that do not conform to the safety behavior plan so that the action data meet the real-time safety behavior plan.
3. The truck normative driving assistance method according to claim 1 or 2, wherein the environmental data around the vehicle body includes image data and radar feature data.
4. The truck normative driving assistance method according to claim 3, wherein a driving guidance sign scene, a guardrail scene, an uphill scene, a curve scene and a downhill scene are recognized in the image data, and the safety plans corresponding to these scenes require the retarder to be switched on.
5. The truck normative driving assistance method as claimed in claim 3, wherein vehicle parallel information, vehicle following information and vehicle type information are obtained according to the radar feature data.
6. The truck normative driving assistance method according to claim 3, wherein the environmental data further includes: position data and road condition data of the vehicle and other surrounding vehicles; identifying the environmental data comprises calculating and determining other vehicles closest to the surroundings and relative distances according to the position data of the vehicle and the other vehicles and the road condition data.
7. The truck normative driving assistance method of claim 6, wherein identifying the environmental data further comprises:
when the radar is triggered, comparing and verifying the image recognition result against the radar feature recognition result, and if the verification fails, stopping the correction of action data that do not conform to the safety behavior plan;
when the radar is not triggered, if the image recognition result involves other vehicles, comparing and verifying it against the determination of the nearest surrounding vehicles and their relative distances, and if the verification fails, executing an image recognition self-check; and if the image recognition self-check is abnormal, sending abnormality warning information to the driver.
8. The truck-specification driving assistance method according to claim 1, wherein the environmental data and the motion data are acquired through an OBD bus or an on-board central control.
9. A truck normative driving assistance system comprising a memory and a processor, wherein the memory stores thereon a computer program that can be loaded by the processor and that executes a truck normative driving assistance method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210431952.3A CN114655231B (en) | 2022-04-22 | 2022-04-22 | Truck standard driving assistance method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210431952.3A CN114655231B (en) | 2022-04-22 | 2022-04-22 | Truck standard driving assistance method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114655231A CN114655231A (en) | 2022-06-24 |
CN114655231B true CN114655231B (en) | 2023-03-28 |
Family
ID=82036718
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210431952.3A Active CN114655231B (en) | 2022-04-22 | 2022-04-22 | Truck standard driving assistance method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114655231B (en) |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9688286B2 (en) * | 2009-09-29 | 2017-06-27 | Omnitracs, Llc | System and method for integrating smartphone technology into a safety management platform to improve driver safety |
EP3361466B1 (en) * | 2017-02-14 | 2024-04-03 | Honda Research Institute Europe GmbH | Risk-based driver assistance for approaching intersections of limited visibility |
CN109101011A (en) * | 2017-06-20 | 2018-12-28 | 百度在线网络技术(北京)有限公司 | Sensor monitoring method, device, equipment and the storage medium of automatic driving vehicle |
US20200122741A1 (en) * | 2018-10-22 | 2020-04-23 | Bendix Commercial Vehicle Systems Llc | System and Method for Providing User-Specific Driver Assistance |
CN110386145B (en) * | 2019-06-28 | 2020-07-07 | 北京理工大学 | Real-time prediction system for driving behavior of target driver |
CN110304075B (en) * | 2019-07-04 | 2020-06-26 | 清华大学 | Vehicle track prediction method based on hybrid dynamic Bayesian network and Gaussian process |
CN111639882B (en) * | 2020-06-15 | 2023-05-19 | 江苏电力信息技术有限公司 | Deep learning-based electricity risk judging method |
CN113642108B (en) * | 2021-08-11 | 2023-11-10 | 北京航空航天大学 | Method for generating key test cases of traffic scene of unmanned vehicle crossroad |
- 2022-04-22: application CN202210431952.3A filed in CN, granted as patent CN114655231B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN114655231A (en) | 2022-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190155291A1 (en) | Methods and systems for automated driving system simulation, validation, and implementation | |
CN110155046B (en) | Automatic emergency braking hierarchical control method and system | |
US10921814B2 (en) | Vehicle control system and method, and travel assist server | |
CN113165652B (en) | Verifying predicted trajectories using a mesh-based approach | |
CN111062240B (en) | Monitoring method and device for automobile driving safety, computer equipment and storage medium | |
US11518380B2 (en) | System and method for predicted vehicle incident warning and evasion | |
US20200108808A1 (en) | Emergency braking for autonomous vehicles | |
CN112937520B (en) | Emergency braking method and device for vehicle, commercial vehicle and storage medium | |
CN112026761A (en) | Automobile auxiliary driving method based on data sharing | |
CN113570747B (en) | Driving safety monitoring system and method based on big data analysis | |
US20210390857A1 (en) | Information processing device, program, and information processing method | |
CN116872921A (en) | Method and system for avoiding risks of vehicle, vehicle and storage medium | |
CN115042782B (en) | Vehicle cruise control method, system, equipment and medium | |
CN117022323A (en) | Intelligent driving vehicle behavior analysis and prediction system and method | |
CN110103954B (en) | Electric control-based automobile rear-end collision prevention early warning device and method | |
CN113104045B (en) | Vehicle collision early warning method, device, equipment and storage medium | |
CN105946578A (en) | Accelerator pedal control method and device and vehicle | |
CN114655231B (en) | Truck standard driving assistance method and system | |
CN116901963A (en) | Brake control method and device for automatic driving vehicle, vehicle and medium | |
CN115320407B (en) | Vehicle control method and vehicle control device | |
CN111784142A (en) | Task complexity quantification model of advanced driving assistance system | |
CN114889635A (en) | Vehicle automatic driving control system, method, commercial vehicle and storage medium | |
CN113386773A (en) | Method and device for judging reliability of visual identification | |
JP7538300B2 (en) | Method and apparatus for improving object recognition rate of self-driving vehicles | |
CN115139999B (en) | Vehicle and pedestrian anti-collision control method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||