CN117698762A - Intelligent driving assistance system and method based on environment perception and behavior prediction - Google Patents

Intelligent driving assistance system and method based on environment perception and behavior prediction Download PDF

Info

Publication number
CN117698762A
Authority
CN
China
Prior art keywords
data
driver
behavior
sensor
module
Prior art date
Legal status
Pending
Application number
CN202311700918.2A
Other languages
Chinese (zh)
Inventor
江晓
Current Assignee
Haishi Yantai Information Technology Co ltd
Original Assignee
Haishi Yantai Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Haishi Yantai Information Technology Co ltd filed Critical Haishi Yantai Information Technology Co ltd
Priority to CN202311700918.2A priority Critical patent/CN117698762A/en
Publication of CN117698762A publication Critical patent/CN117698762A/en
Pending legal-status Critical Current

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The invention relates to the field of traffic control, in particular to an intelligent driving assistance system and method based on environment awareness and behavior prediction. Firstly, the behavior data of the driver are deeply analyzed, the behavior characteristics of the driver are fused with the sensor data of the vehicle, and the driver's next action is predicted; secondly, an annular time delay structure is designed to capture the inherent mode and change trend of the data, and the sensor features are mapped into a high-dimensional feature space; finally, a heuristic function is introduced to simulate the influence of traffic flow on speed, and the path is fine-tuned according to real-time traffic information. The method addresses the following problems in the prior art: long processing and response times after sensor data are received; reliance on a single sensor and failure to consider the influence of the external environment when analyzing driver behavior; and conventional driving assistance systems that are based on fixed algorithms and parameters rather than adapting to real-time data, and therefore perform poorly.

Description

Intelligent driving assistance system and method based on environment perception and behavior prediction
Technical Field
The invention relates to the field of traffic control, in particular to an intelligent driving assistance system and method based on environment awareness and behavior prediction.
Background
With the rapid development of automatic driving and intelligent transportation systems, improving driving safety and efficiency has become an increasingly important problem. While existing driving assistance systems help to some extent to improve driving safety, they are generally effective only in certain situations or conditions. For example, some systems may perform well only in highway environments but are not accurate enough, or do not respond quickly enough, in complex urban traffic environments. In addition, existing systems often ignore the combined impact of driver behavior and environmental factors, resulting in inaccurate predictions and responses.
Therefore, there is a need to develop an intelligent driving assistance system based on environmental awareness and behavior prediction that can provide highly accurate and real-time support under a variety of driving environments and conditions. Such systems need to be able to analyze large amounts of sensor data in real time, including but not limited to cameras, radar, ultrasonic sensors, etc., as well as driver behavior data. In addition, the system needs to be highly adaptive to be able to dynamically adjust to real-time environmental and traffic conditions.
Chinese patent application number: CN202111474434.1, publication date: 2022.03.18 discloses a safety education system for intelligently assisting driving behaviors and electronic equipment, which are characterized by comprising a vehicle data processing module, a scoring module, a learning module and a message touch module; the vehicle data processing module is used for acquiring and analyzing intelligent auxiliary driving data so as to determine irregular intelligent auxiliary driving behaviors; the scoring module is used for acquiring the nonstandard intelligent auxiliary driving behaviors determined by the vehicle data processing module and scoring the nonstandard intelligent auxiliary driving behaviors based on a preset scoring mechanism so as to determine the safe driving score of the user; the learning module is used for acquiring the safe driving score determined by the scoring module and pushing corresponding safe driving learning content; the message touch module is used for receiving a message work order generated based on the safe driving learning content and notifying a user in a corresponding message reminding mode. The method and the system realize that the user is educated continuously in the process of using the intelligent auxiliary driving function.
However, the above technology has at least the following technical problems: the prior art has long processing and response time after receiving the sensor data, which may lead to insufficient response in emergency situations and increase accident risk; the accuracy of obstacle recognition is degraded depending on only a single sensor; in analyzing the driver behavior, the influence of the external environment is not considered, resulting in inaccuracy of prediction; conventional driving assistance systems often are based on fixed algorithms and parameters, rather than adaptive adjustments based on real-time data, and perform poorly.
Disclosure of Invention
The intelligent driving assistance system and method based on environment perception and behavior prediction solve the problems in the prior art that processing and response times are long after sensor data are received, so that the reaction is not timely enough in an emergency and the accident risk is increased; that the accuracy of obstacle recognition is degraded by relying on only a single sensor; that the influence of the external environment is not considered when analyzing driver behavior, resulting in inaccurate prediction; and that conventional driving assistance systems are often based on fixed algorithms and parameters rather than adaptive adjustment based on real-time data, and therefore perform poorly. The behavior and environment data of the driver are analyzed in real time through deep learning and multi-sensor fusion technology, optimal path suggestions are provided for the driver, and obstacles are accurately identified, so that driving safety and efficiency are greatly improved.
The application provides an intelligent driving assistance system and method based on environment awareness and behavior prediction, and specifically comprises the following technical scheme:
an intelligent driving assistance system based on environmental awareness and behavior prediction, comprising:
the system comprises a data acquisition module, a driver behavior recognition module, a sensor analysis module, an obstacle recognition module, an early warning module and a dynamic path planning module;
the driver behavior recognition module is used for deeply analyzing the behavior data of the driver and calculating the weighting characteristics by using the behavior data and the environment data of the driver; the driver behavior recognition module is used for fusing behavior characteristics of a driver with sensor data of the vehicle; the driver behavior recognition module is connected with the dynamic path planning module in a data transmission mode;
the sensor analysis module is used for designing an annular time delay structure for each sensor data, capturing the inherent mode and the change trend of the data, and connecting the sensor analysis module with the obstacle recognition module in a data transmission mode;
the obstacle recognition module fuses data from multiple sensors, namely radar, camera and ultrasonic sensors; when an obstacle or another vehicle is detected, the obstacle recognition module transmits the information to the early warning module; the obstacle recognition module is connected with the early warning module and the dynamic path planning module in a data transmission mode;
the dynamic path planning module is used for providing optimal driving advice according to the behavior of a driver and real-time traffic information; and introducing a heuristic function to simulate the influence of traffic flow on speed, and performing fine adjustment on the path according to the real-time traffic information.
The intelligent driving assistance method based on the environment awareness and the behavior prediction is applied to an intelligent driving assistance system based on the environment awareness and the behavior prediction, and comprises the following steps of:
s100: carrying out deep analysis on the behavior data of the driver, fusing the behavior characteristics of the driver and the sensor data of the vehicle, and predicting the next action of the driver;
s200: designing an annular time delay structure, capturing the inherent mode and the change trend of data, and mapping the sensor characteristics into a high-dimensional characteristic space;
s300: and introducing a heuristic function to simulate the influence of traffic flow on speed, and performing fine adjustment on the path according to the real-time traffic information.
Preferably, the S100 specifically includes:
the driver behavior data is deeply analyzed through the driver behavior recognition module, the weighted feature F(t) is calculated using the driver behavior data X(t) and the environment data E(t), and the calculation formula of the weighted feature is as follows:
wherein t represents time and ω(t) is a time-dependent weighting factor, whose calculation formula is:
where k is a positive number used to control the slope of the function, and θ is a threshold used to determine the sensitivity to the environmental factor; the formula for ω(t) is derived from a logistic function, which maps any input to between 0 and 1, yielding a weight between 0 and 1.
Preferably, the step S100 further includes:
and carrying out feature fusion on the behavior features of the driver and the sensor data of the vehicle, describing the dynamic response of the driver by using a difference equation, and predicting the next action of the driver.
Preferably, the S200 specifically includes:
and (3) adopting a time delay embedding theory, designing a ring-shaped time delay structure for each sensor data based on the inherent dynamics of a time sequence, and capturing the inherent mode and the change trend of the data.
Preferably, the step S200 further includes:
defining a feature vector for each sensor, said feature vector being not only a simple representation of the raw data, but also obtained by taking into account historical information of the data; for radar sensors, not only current data but also data from the past few seconds are considered; thereby capturing motion information of the object, including velocity and acceleration.
Preferably, the step S200 further includes:
after extracting features from all sensors, the sensor features are mapped into a high-dimensional unified feature space.
Preferably, the step S200 further includes:
further optimization is performed based on the expanded mapped features.
Preferably, the step S300 specifically includes:
introducing a heuristic function, wherein the heuristic function considers the current speed, the target direction, the average speed of surrounding vehicles and the traffic flow, and the influence of the traffic flow on the speed is simulated by using a logistic function; the speed decreases as the traffic flow increases.
Preferably, the step S300 further includes:
the A algorithm is modified using the heuristic function; based on the idea of a sliding window, a window is defined on the initial path, and the path within the window is fine-tuned according to real-time traffic information.
The beneficial effects are that:
the technical solutions provided in the embodiments of the present application have at least the following technical effects or advantages:
1. by deeply analyzing the behavior data of the driver and predicting the next action of the driver, the system can warn or take measures in advance to avoid potential dangerous situations, thereby greatly improving the driving safety; the system can dynamically adjust the path according to the real-time traffic information and the behavior of the driver, ensure that the optimal path can be found in various traffic environments, and improve the driving efficiency;
2. the obstacle recognition module fused by multiple sensors can accurately recognize obstacles and other vehicles in the external environment, and timely react to avoid collision; the system not only considers the behavior of the driver, but also combines environmental data such as road conditions, traffic flow, weather conditions and the like, so that the prediction is more accurate and practical;
3. by means of feature fusion of the behavior features of the driver and the sensor data of the vehicle, the system can obtain a more comprehensive feature representation, and therefore prediction accuracy is improved; the system adopts a time delay structure, and can capture the inherent mode and the change trend of the data, so that the system can be excellent in different driving scenes.
4. The technical scheme of the present application can effectively solve the problems that the processing and response time after receiving sensor data is long, so that the response is not timely enough in an emergency and the accident risk is increased; that the accuracy of obstacle recognition is degraded by relying on only a single sensor; that the influence of the external environment is not considered when analyzing driver behavior, resulting in inaccurate prediction; and that conventional driving assistance systems are often based on fixed algorithms and parameters rather than adaptive adjustment based on real-time data, and therefore perform poorly. The intelligent driving assistance system and method based on environment perception and behavior prediction analyze the behavior and environment data of the driver in real time through deep learning and multi-sensor fusion technology, provide optimal path suggestions for the driver and accurately identify obstacles, so that driving safety and efficiency are greatly improved.
Drawings
FIG. 1 is a block diagram of an intelligent driving assistance system based on context awareness and behavior prediction as described herein;
FIG. 2 is a flow chart of an intelligent driving assistance method based on context awareness and behavior prediction as described herein.
Detailed Description
According to the intelligent driving assistance system and the intelligent driving assistance method based on environment perception and behavior prediction, the problems that in the prior art, after sensor data are received, processing and responding time is long, reaction is not timely enough in an emergency situation, and accident risk is increased are solved; the accuracy of obstacle recognition is degraded depending on only a single sensor; in analyzing the driver behavior, the influence of the external environment is not considered, resulting in inaccuracy of prediction; conventional driving assistance systems often are based on fixed algorithms and parameters, rather than adaptive adjustments based on real-time data, and perform poorly.
The technical scheme in the embodiment of the application aims to solve the problems, and the overall thought is as follows:
by deeply analyzing the behavior data of the driver and predicting the next action of the driver, the system can warn or take measures in advance to avoid potential dangerous situations, thereby greatly improving the driving safety; the system can dynamically adjust the path according to the real-time traffic information and the behavior of the driver, ensure that the optimal path can be found in various traffic environments, and improve the driving efficiency; the obstacle recognition module fused by multiple sensors can accurately recognize obstacles and other vehicles in the external environment, and timely react to avoid collision; the system not only considers the behavior of the driver, but also combines environmental data such as road conditions, traffic flow, weather conditions and the like, so that the prediction is more accurate and practical; by means of feature fusion of the behavior features of the driver and the sensor data of the vehicle, the system can obtain a more comprehensive feature representation, and therefore prediction accuracy is improved; the system adopts a time delay structure, and can capture the inherent mode and the change trend of the data, so that the system can be excellent in different driving scenes.
In order to better understand the above technical solutions, the following detailed description will refer to the accompanying drawings and specific embodiments.
Referring to fig. 1, an intelligent driving assistance system based on environmental awareness and behavior prediction described in the present application includes the following parts:
the system comprises a data acquisition module 10, a driver behavior recognition module 20, a sensor analysis module 30, an obstacle recognition module 40, an early warning module 50 and a dynamic path planning module 60;
the data acquisition module 10 is used for acquiring data about road conditions, traffic flow, weather conditions and the like from vehicle-mounted sensors, wherein the data are called environmental data, and in addition, the data acquisition module 10 can also acquire the environmental data from an external data source (such as a weather station), and is connected with the driver behavior recognition module 20 and the sensor analysis module 30 in a data transmission mode;
the driver behavior recognition module 20 is configured to perform deep analysis on the behavior data of the driver, and calculate a weighting characteristic using the behavior data of the driver and the environmental data; in addition, the driver behavior recognition module fuses the behavior characteristics of the driver with the sensor data of the vehicle to obtain a more comprehensive characteristic representation; the driver behavior recognition module 20 is connected with the dynamic path planning module 60 in a data transmission mode;
the sensor analysis module 30 is configured to design an annular time delay structure for each sensor data, and capture the natural mode and the variation trend of the data, so as to implement efficient feature extraction, where the sensor analysis module 30 is connected to the obstacle recognition module 40 by a data transmission manner;
the obstacle recognition module 40 is configured to accurately recognize obstacles and other vehicles by data fusion of multiple sensors such as radar, camera, and ultrasonic; when an obstacle or other vehicle is detected, the module will transmit information to the early warning module 50; the obstacle recognition module 40 is connected with the early warning module 50 and the dynamic path planning module 60 in a data transmission mode;
the early warning module 50 is configured to send out a warning to remind the driver of paying attention to the obstacle or other vehicles in front when receiving the information of the obstacle or other vehicles;
the dynamic path planning module 60 is configured to provide an optimal driving suggestion according to the behavior of the driver and real-time traffic information; a heuristic function is introduced to simulate the influence of traffic flow on speed, and fine adjustment of the path is carried out according to real-time traffic information.
Referring to fig. 2, the intelligent driving assistance method based on environment awareness and behavior prediction described in the application comprises the following steps:
s100: carrying out deep analysis on the behavior data of the driver, fusing the behavior characteristics of the driver and the sensor data of the vehicle, and predicting the next action of the driver;
When the intelligent driving assistance system based on environment perception and behavior prediction is started, each sensor begins to work and transmits data to a central processing unit in real time. The sensors include cameras, radar, ultrasonic sensors and the like. In order to monitor the driver's behavior in real time and predict the next action, a driver behavior recognition module based on deep learning is designed.
During driving, the driver's behavior is affected by the surrounding environment. For example, a driver's behavior pattern on a highway may differ from that on a city street. Therefore, in order to predict the driver's next action more accurately, the environment must be taken into account.
Specifically, the data acquisition module acquires data about road conditions, traffic flows, weather conditions, and the like from the in-vehicle sensors, which are referred to as environmental data E (t), which can be obtained by the in-vehicle sensors and external data sources (e.g., weather stations).
In order to better understand the interaction between the driver and the surrounding environment, the driver behavior recognition module performs an in-depth analysis of the driver's behavior data and uses the driver behavior data X(t) and the environmental data E(t) to calculate the weighted feature F(t), where the calculation formula of the weighted feature is:
wherein t represents time and ω(t) is a time-dependent weighting factor that better reflects the environmental influence; its calculation formula is:
where k is a positive number used to control the slope of the function, and θ is a threshold that determines the sensitivity to the environmental factor. This formula is derived from a logistic function, which maps any input to between 0 and 1, yielding a weight between 0 and 1.
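The weighting formulas themselves are not reproduced in this text (they appear as images in the original publication). A minimal Python sketch of one plausible reading is given below, assuming ω(t) is a logistic function of the environment signal E(t) and that F(t) blends the behavior data X(t) and the environment data E(t) using that weight; the blending form, function names and parameter values are illustrative assumptions, not taken from the patent.

import math

def omega(e_t, k=1.0, theta=0.5):
    # Assumed logistic weighting factor: maps any environment reading into (0, 1).
    # k controls the slope of the curve, theta the sensitivity threshold.
    return 1.0 / (1.0 + math.exp(-k * (e_t - theta)))

def weighted_feature(x_t, e_t, k=1.0, theta=0.5):
    # Assumed blend F(t) = w(t) * X(t) + (1 - w(t)) * E(t).
    w = omega(e_t, k, theta)
    return w * x_t + (1.0 - w) * e_t

# The same behavior sample is weighted differently under calm and busy environments.
print(weighted_feature(x_t=0.8, e_t=0.2))
print(weighted_feature(x_t=0.8, e_t=0.9))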
The behavior characteristics of the driver and the sensor data of the vehicle are both very important information sources. In order to improve the discriminability of the features and the accuracy of prediction, the driver behavior recognition module fuses the two together to obtain a more comprehensive feature representation. Considering that different sensor data may have different scales and distributions, sensor data from different sources are first normalized to ensure that they are on the same scale. The normalization formula is:
wherein S_norm(t) is the normalized sensor data, S(t) is the raw sensor data, μ_S and σ_S are respectively the mean and standard deviation of the sensor data, and ε is a small constant that prevents the denominator from being zero. Feature fusion is then carried out on the behavior features of the driver and the sensor data of the vehicle:
where G(t) is the fused feature and λ and γ are weighting parameters that control the relative contribution of the different data sources. These parameters are obtained by cross-validation to ensure optimal fusion.
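As a concrete illustration of the normalization and fusion step, the sketch below z-scores a sensor stream and combines it linearly with the weighted driver feature. The linear form G(t) = λ·F(t) + γ·S_norm(t) and the parameter values are assumptions made for illustration; in practice λ and γ would be chosen by cross-validation as described above.

import numpy as np

def normalize(sensor, eps=1e-8):
    # S_norm(t) = (S(t) - mean) / (std + eps); eps keeps the denominator non-zero.
    return (sensor - sensor.mean()) / (sensor.std() + eps)

def fuse(f_t, s_norm, lam=0.6, gamma=0.4):
    # Assumed linear fusion G(t) = lam * F(t) + gamma * S_norm(t).
    return lam * f_t + gamma * s_norm

speed = np.array([12.0, 13.5, 15.2, 14.8])    # hypothetical vehicle-speed sensor stream
driver_feat = np.array([0.4, 0.5, 0.7, 0.6])  # weighted driver features F(t)
print(fuse(driver_feat, normalize(speed)))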
Driving is a continuous, dynamic process. In order to predict the next action of the driver, it is necessary to consider the past behavior of the driver and the current environmental state. The driver behavior recognition module uses a differential equation to describe the dynamic response of the driver, and the specific calculation formula is as follows:
wherein A(t) represents the driver's dynamic response at time t; β₁ is an autoregressive coefficient representing the relationship between the driver's action at time t and the action at time t-1; β₂ is the influence factor of the fused feature G(t) on the driver's action at time t; β₃ is an integral coefficient representing the effect of the cumulative influence of the fused feature G(t) over a period of time on the driver's current action; δ is a differential coefficient representing the effect of the driver's action at time t-2 on the action at time t; and T is a time window that accounts for past driving behavior, with the integration variable ranging over [t-T, t]. Key features are extracted from the driver's behavior and environmental factors, these features are fused with the other sensor data, and finally a dynamic system model is used to predict the driver's next action. This not only improves the accuracy of prediction but also provides a deeper understanding of driver behavior, thereby providing strong support for the intelligent driving assistance system based on environment perception and behavior prediction.
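The dynamic-response equation is likewise not reproduced above. The sketch below shows one form consistent with the stated coefficients (autoregressive terms at t-1 and t-2, a direct term in G(t), and a windowed cumulative term over the last T steps); the exact equation and coefficient values in the patent may differ.

def predict_action(a_hist, g_hist, beta1=0.5, beta2=0.3, beta3=0.1, delta=0.1, T=5):
    # Assumed form:
    # A(t) = beta1*A(t-1) + delta*A(t-2) + beta2*G(t) + beta3 * sum of G over [t-T, t].
    cumulative = sum(g_hist[-T:])        # discrete stand-in for the integral term
    return (beta1 * a_hist[-1] + delta * a_hist[-2]
            + beta2 * g_hist[-1] + beta3 * cumulative)

a_hist = [0.2, 0.3]                          # past dynamic responses A(t-2), A(t-1)
g_hist = [0.1, 0.2, 0.25, 0.3, 0.35, 0.4]    # fused features up to time t
print(predict_action(a_hist, g_hist))        # predicted response A(t)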
S200: designing an annular time delay structure, capturing the inherent mode and the change trend of data, and mapping the sensor characteristics into a high-dimensional characteristic space;
in order to accurately identify obstacles and other vehicles in the external environment, a multi-sensor fusion obstacle identification module is introduced. The module realizes accurate identification of obstacles and other vehicles through data fusion of multiple sensors such as a radar, a camera, ultrasonic waves and the like.
The early warning module may issue a warning when an obstacle or other vehicle is detected. Considering the time sequence of the sensor data, the sensor analysis module designs an annular time delay structure for each sensor data based on the inherent dynamics of the time sequence by adopting a time delay embedding theory, and captures the inherent mode and the variation trend of the data, thereby realizing efficient feature extraction.
In particular, a feature vector is defined for each sensor, which is not just a simple representation of the raw data, but is derived by taking into account historical information of the data. For example, for one radar sensor, not only the current data but also the data of the past few seconds are considered. Thereby capturing motion information of the object, such as velocity and acceleration. To achieve this, the following mathematical formula is used:
wherein H_i(t) represents the feature vector of the i-th sensor at time t, σ is the activation function, W_circ,n is the n-th row of the circular weight matrix, X_i(t-mτ) represents the raw data of the i-th sensor at time t-mτ, τ is the time delay, m is the number of time delays, and b_circ represents the bias vector. A weight matrix W_circ and a bias vector b_circ are defined for the data of each sensor. These weights and biases are learned from training data, ensuring that the extracted features are meaningful.
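A minimal sketch of the circular time-delay feature extraction is shown below, assuming each feature vector is an activation over delayed copies of the raw signal, i.e. H_i(t) = σ(W_circ · [X_i(t), X_i(t-τ), ..., X_i(t-mτ)] + b_circ); the weight values and the choice of tanh as the activation are illustrative assumptions (in the patent the weights are learned from training data).

import numpy as np

def ring_delay_features(x, w_circ, b_circ, tau=1, m=3):
    # Stack the delayed samples X(t), X(t - tau), ..., X(t - m*tau) of one sensor
    # and pass them through an affine map plus activation (assumed form).
    delayed = np.array([x[-1 - j * tau] for j in range(m + 1)])
    return np.tanh(w_circ @ delayed + b_circ)

rng = np.random.default_rng(0)
radar = np.cumsum(rng.normal(size=20))   # hypothetical radar range readings over time
W = rng.normal(size=(4, 4)) * 0.1        # 4 feature rows x (m + 1) = 4 delayed samples
b = np.zeros(4)
print(ring_delay_features(radar, W, b))  # feature vector H_i(t) for this sensor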
After extracting the characteristics from all the sensors, the sensor characteristics are mapped into a high-dimensional unified characteristic space so as to perform characteristic fusion, and the specific formula is as follows:
wherein H_expand represents the mapped feature vector and E represents the mapping matrix. The mapping matrix maps the features of all sensors into the same high-dimensional space, so that the features can be effectively fused at the same semantic level.
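As an illustration of the mapping step, the sketch below concatenates the per-sensor feature vectors and projects them with a shared matrix E into one common space; the dimensions and the random matrix values are arbitrary placeholders, not values from the patent.

import numpy as np

rng = np.random.default_rng(1)
h_radar = rng.normal(size=4)                  # per-sensor features H_i(t)
h_camera = rng.normal(size=4)
h_ultrasonic = rng.normal(size=4)
h_all = np.concatenate([h_radar, h_camera, h_ultrasonic])   # stacked 12-dim vector
E = rng.normal(size=(32, 12)) * 0.1           # mapping matrix into a 32-dim unified space
h_expand = E @ h_all                          # unified high-dimensional feature H_expand
print(h_expand.shape)                         # (32,)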
To ensure that the decoded data is of high quality, further optimization is performed on the expanded mapped features; the specific formula is:
wherein H represents the sensor feature vector, F_reg is the regularized sensor feature, λ(t) is a time-dependent regularization parameter, and p is the order of the regularization, which determines its strength. Ensuring that the resulting features are sparse is valuable in practical applications: it reduces the computational effort and improves the interpretability of the model.
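The regularization formula is not reproduced here. As one common way to obtain sparse features with an L_p-style penalty (taking p = 1), the sketch below applies soft-thresholding with a time-dependent strength λ(t); this is an illustrative stand-in, not the patent's exact operator.

import numpy as np

def regularize(h, lam_t):
    # Soft-thresholding, the proximal operator of the L1 penalty:
    # F_reg = sign(H) * max(|H| - lam_t, 0), which zeroes out small components.
    return np.sign(h) * np.maximum(np.abs(h) - lam_t, 0.0)

h_expand = np.array([0.02, -0.9, 0.4, -0.03, 1.3])   # mapped features (hypothetical)
print(regularize(h_expand, lam_t=0.05))               # small entries become exactly zero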
Considering that the regularized features may not be suitable for direct input into a deep learning model, the regularized sensor features are converted into a two-dimensional "image" format. The obstacle recognition module uses a convolutional neural network (itself prior art) to recognize obstacles and vehicles: the converted images are fed into the network, which processes them through a number of convolutional layers, pooling layers and fully connected layers and finally outputs a probability map representing, for each pixel, the probability of an obstacle or a vehicle.
A predetermined threshold is used to determine which pixels have a sufficiently high probability of being considered an obstacle or vehicle. To ensure that each obstacle or vehicle is detected only once, non-maximum suppression (NMS), which is prior art, is used to eliminate overlapping detection boxes. Therefore, a powerful obstacle and vehicle identification tool is provided for the intelligent driving-assisting safety system, and the safety and efficiency of driving are improved.
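As an illustration of the post-processing described above (thresholding the probability map into candidate boxes, then suppressing duplicates), the sketch below implements a standard IoU-based non-maximum suppression; the box coordinates and threshold values are illustrative only.

def iou(a, b):
    # Intersection-over-union of two boxes given as (x1, y1, x2, y2).
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def nms(boxes, scores, score_thr=0.5, iou_thr=0.4):
    # Keep the highest-scoring box, drop boxes that overlap it too much, repeat.
    order = sorted((i for i, s in enumerate(scores) if s >= score_thr),
                   key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thr for j in keep):
            keep.append(i)
    return keep

boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 80, 140, 120)]
scores = [0.9, 0.75, 0.6]
print(nms(boxes, scores))   # -> [0, 2]: the overlapping duplicate box is suppressed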
S300: and introducing a heuristic function to simulate the influence of traffic flow on speed, and performing fine adjustment on the path according to the real-time traffic information.
According to the behavior of the driver and the real-time traffic information, the dynamic path planning module can provide the optimal driving suggestion, quickly find an initial path in the traffic environment and finely adjust the path according to the real-time traffic information.
A heuristic function is introduced that takes into account the current speed of the vehicle, the target direction, the average speed of the surrounding vehicles and the traffic flow. Specifically, a heuristic function is defined as:
wherein h(o) represents the heuristic function, d(o) is the Euclidean distance from node o to the target, v(o) is the average speed at node o, f(o) is the traffic flow, and k₁ and k₂ are weight parameters. The effect of traffic flow on speed is simulated using a logistic function: as the traffic flow increases, the speed decreases.
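The heuristic formula itself is not reproduced above. A minimal sketch of one plausible reading is given below, assuming the logistic term damps the average speed v(o) as the traffic flow f(o) grows and that the heuristic is a weighted travel-time estimate d(o) divided by that effective speed; the exact combination of d(o), v(o), f(o), k1 and k2 in the patent may differ, and the flow midpoint f0 is an added illustrative parameter.

import math

def heuristic(d_o, v_o, f_o, k1=1.0, k2=0.1, f0=30.0):
    # Assumed form: a logistic factor reduces the effective speed as traffic grows,
    # and the heuristic cost is the weighted travel time d(o) / effective speed.
    effective_speed = v_o * (1.0 / (1.0 + math.exp(k2 * (f_o - f0))))
    return k1 * d_o / max(effective_speed, 1e-6)

print(heuristic(d_o=500.0, v_o=15.0, f_o=10.0))   # light traffic: lower cost
print(heuristic(d_o=500.0, v_o=15.0, f_o=60.0))   # heavy traffic: much higher cost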
The A algorithm is modified using the heuristic function described above to quickly find an initial path. After the initial path is found, it needs to be fine-tuned according to real-time traffic information. Based on the idea of a sliding window, a window is defined on the initial path, and the path within the window is fine-tuned based on real-time traffic information. Specifically, a sliding window of size W is defined, covering W nodes on the initial path. Within the window, path fine-tuning is performed using the following formula:
p(o) = h(o) + α·τ(o) + β·∫₀^W g(o) do
where p(o) represents the predicted path cost of node o, τ(o) is the real-time traffic delay at node o, g(o) is the rate of change of traffic flow, and α and β are weight parameters. The window is slid forward one node and the above steps are repeated until the target is reached. The method can quickly find the initial path in the traffic environment and fine-tune it according to real-time traffic information, thereby realizing real-time path planning.
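A minimal sketch of the sliding-window fine-tuning step is given below. It approximates the integral of the traffic-flow change rate g(o) over the window by a discrete sum, evaluates p(o) for each node in the window, and flags windows whose average predicted cost is too high so they can be locally replanned; the flagging criterion and all numeric values are assumptions made for illustration, not the patent's exact procedure.

def window_cost(nodes, alpha=0.5, beta=0.2):
    # p(o) = h(o) + alpha*tau(o) + beta*sum(g) over the window (discrete integral).
    g_integral = sum(n["g"] for n in nodes)
    return [n["h"] + alpha * n["tau"] + beta * g_integral for n in nodes]

def fine_tune(path, w=3, cost_limit=50.0):
    # Slide a window of w nodes along the initial path; flag windows whose
    # average predicted cost exceeds cost_limit so they can be locally replanned.
    flagged = []
    for start in range(0, len(path) - w + 1):
        window = path[start:start + w]
        costs = window_cost(window)
        if sum(costs) / w > cost_limit:
            flagged.append((start, start + w))
    return flagged

# Hypothetical initial path: each node carries heuristic cost h, delay tau, flow change g.
path = [{"h": 40, "tau": 5, "g": 1}, {"h": 42, "tau": 20, "g": 4},
        {"h": 45, "tau": 25, "g": 5}, {"h": 38, "tau": 2, "g": 0}]
print(fine_tune(path))   # both windows overlapping the congested middle nodes are flagged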
In summary, the intelligent driving assistance system and the method based on the environment awareness and the behavior prediction are completed.
The technical scheme in the embodiment of the application at least has the following technical effects or advantages:
1. by deeply analyzing the behavior data of the driver and predicting the next action of the driver, the system can warn or take measures in advance to avoid potential dangerous situations, thereby greatly improving the driving safety; the system can dynamically adjust the path according to the real-time traffic information and the behavior of the driver, ensure that the optimal path can be found in various traffic environments, and improve the driving efficiency;
2. the obstacle recognition module fused by multiple sensors can accurately recognize obstacles and other vehicles in the external environment, and timely react to avoid collision; the system not only considers the behavior of the driver, but also combines environmental data such as road conditions, traffic flow, weather conditions and the like, so that the prediction is more accurate and practical;
3. by means of feature fusion of the behavior features of the driver and the sensor data of the vehicle, the system can obtain a more comprehensive feature representation, and therefore prediction accuracy is improved; the system adopts a time delay structure, and can capture the inherent mode and the change trend of the data, so that the system can be excellent in different driving scenes.
Effect investigation:
the technical scheme of the present application can effectively solve the problems that after sensor data are received, the processing and response time is long, the response is not timely enough in an emergency, and the accident risk is increased; that the accuracy of obstacle recognition is degraded by relying on only a single sensor; that the influence of the external environment is not considered when analyzing driver behavior, resulting in inaccurate prediction; and that conventional driving assistance systems are often based on fixed algorithms and parameters rather than adaptive adjustment based on real-time data, and therefore perform poorly. A series of effect investigations verify that the resulting intelligent driving assistance system and method based on environment perception and behavior prediction, through deep learning and multi-sensor fusion technology, analyze the behavior and environment data of the driver in real time, provide optimal path suggestions for the driver, and accurately identify obstacles, thereby greatly improving driving safety and efficiency.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. An intelligent driving assistance system based on environmental awareness and behavior prediction, characterized by comprising the following parts:
the system comprises a data acquisition module, a driver behavior recognition module, a sensor analysis module, an obstacle recognition module, an early warning module and a dynamic path planning module;
the driver behavior recognition module is used for deeply analyzing the behavior data of the driver and calculating the weighting characteristics by using the behavior data and the environment data of the driver; the driver behavior recognition module is used for fusing behavior characteristics of a driver with sensor data of the vehicle; the driver behavior recognition module is connected with the dynamic path planning module in a data transmission mode;
the sensor analysis module is used for designing an annular time delay structure for each sensor data, capturing the inherent mode and the change trend of the data, and connecting the sensor analysis module with the obstacle recognition module in a data transmission mode;
the obstacle recognition module fuses data from multiple sensors, namely radar, camera and ultrasonic sensors; when an obstacle or another vehicle is detected, the obstacle recognition module transmits the information to the early warning module; the obstacle recognition module is connected with the early warning module and the dynamic path planning module in a data transmission mode;
the dynamic path planning module is used for providing optimal driving advice according to the behavior of a driver and real-time traffic information; and introducing a heuristic function to simulate the influence of traffic flow on speed, and performing fine adjustment on the path according to the real-time traffic information.
2. An intelligent driving assistance method based on environment awareness and behavior prediction, applied to the intelligent driving assistance system based on environment awareness and behavior prediction as claimed in claim 1, characterized by comprising the following steps:
s100: carrying out deep analysis on the behavior data of the driver, fusing the behavior characteristics of the driver and the sensor data of the vehicle, and predicting the next action of the driver;
s200: designing an annular time delay structure, capturing the inherent mode and the change trend of data, and mapping the sensor characteristics into a high-dimensional characteristic space;
s300: and introducing a heuristic function to simulate the influence of traffic flow on speed, and performing fine adjustment on the path according to the real-time traffic information.
3. The intelligent driving assistance method based on environmental awareness and behavior prediction according to claim 2, wherein S100 specifically comprises:
the driver behavior data is deeply analyzed through the driver behavior recognition module, the weighted feature F(t) is calculated using the driver behavior data X(t) and the environment data E(t), and the calculation formula of the weighted feature is as follows:
wherein t represents time and ω(t) is a time-dependent weighting factor, whose calculation formula is:
where k is a positive number used to control the slope of the function, and θ is a threshold used to determine the sensitivity to the environmental factor; the formula for ω(t) is derived from a logistic function, which maps any input to between 0 and 1, yielding a weight between 0 and 1.
4. The intelligent driving assistance method based on environmental awareness and behavior prediction according to claim 3, wherein S100 further comprises:
and carrying out feature fusion on the behavior features of the driver and the sensor data of the vehicle, describing the dynamic response of the driver by using a difference equation, and predicting the next action of the driver.
5. The intelligent driving assistance method based on environmental awareness and behavior prediction according to claim 2, wherein S200 specifically comprises:
and (3) adopting a time delay embedding theory, designing a ring-shaped time delay structure for each sensor data based on the inherent dynamics of a time sequence, and capturing the inherent mode and the change trend of the data.
6. The intelligent driving assistance method based on environmental awareness and behavior prediction according to claim 5, wherein S200 further comprises:
defining a feature vector for each sensor, said feature vector being not only a simple representation of the raw data, but also obtained by taking into account historical information of the data; for radar sensors, not only current data but also data from the past few seconds are considered; thereby capturing motion information of the object, including velocity and acceleration.
7. The intelligent driving assistance method based on environmental awareness and behavior prediction according to claim 6, wherein S200 further comprises:
after extracting features from all sensors, the sensor features are mapped into a high-dimensional unified feature space.
8. The intelligent driving assistance method based on environmental awareness and behavior prediction according to claim 7, wherein S200 further comprises:
further optimization is performed based on the expanded mapped features.
9. The intelligent driving assistance method based on environmental awareness and behavior prediction according to claim 2, wherein S300 specifically comprises:
introducing a heuristic function, wherein the heuristic function considers the current speed, the target direction, the average speed of surrounding vehicles and the traffic flow, and the influence of the traffic flow on the speed is simulated by using a logistic function; the speed decreases as the traffic flow increases.
10. The intelligent driving assistance method based on environmental awareness and behavior prediction according to claim 9, wherein S300 further comprises:
the A algorithm is modified using the heuristic function; based on the idea of a sliding window, a window is defined on the initial path, and the path within the window is fine-tuned according to real-time traffic information.
CN202311700918.2A 2023-12-12 2023-12-12 Intelligent driving assistance system and method based on environment perception and behavior prediction Pending CN117698762A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311700918.2A CN117698762A (en) 2023-12-12 2023-12-12 Intelligent driving assistance system and method based on environment perception and behavior prediction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311700918.2A CN117698762A (en) 2023-12-12 2023-12-12 Intelligent driving assistance system and method based on environment perception and behavior prediction

Publications (1)

Publication Number Publication Date
CN117698762A true CN117698762A (en) 2024-03-15

Family

ID=90152724

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311700918.2A Pending CN117698762A (en) 2023-12-12 2023-12-12 Intelligent driving assistance system and method based on environment perception and behavior prediction

Country Status (1)

Country Link
CN (1) CN117698762A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102788931A (en) * 2012-07-18 2012-11-21 嘉兴学院 Power transformer winding fault diagnosing method
CN106570560A (en) * 2016-11-02 2017-04-19 温州大学 Driving style quantitative evaluation method based on standardized driving behaviors and phase space reconstruction
CN106864460A (en) * 2017-03-20 2017-06-20 上海识加电子科技有限公司 Driver's unlawful practice monitoring method, device and installing mechanism
CN108400975A (en) * 2018-02-02 2018-08-14 常州高清信息技术有限公司 A kind of wireless telecommunication system for bus operation
WO2021238303A1 (en) * 2020-05-29 2021-12-02 华为技术有限公司 Motion planning method and apparatus
CN114038218A (en) * 2021-12-28 2022-02-11 江苏泰坦智慧科技有限公司 Chained feedback multi-intersection signal lamp decision system and method based on road condition information
CN116646568A (en) * 2023-06-02 2023-08-25 陕西旭氢时代科技有限公司 Fuel cell stack parameter optimizing method based on meta heuristic
CN117008574A (en) * 2023-07-31 2023-11-07 载合汽车科技(苏州)有限公司 Intelligent network allies oneself with car advanced auxiliary driving system and autopilot system test platform
CN117037115A (en) * 2023-08-09 2023-11-10 新疆大学 Automatic driving obstacle avoidance system and method based on machine vision
CN117141519A (en) * 2023-10-18 2023-12-01 柳州工学院 Unmanned system and method based on image processing

Similar Documents

Publication Publication Date Title
RU2701051C2 (en) Method, system and machine-readable storage media for detecting objects using recurrent neural network and linked feature map
US9527384B2 (en) Driving context generation system for generating driving behavior description information
CN112700470B (en) Target detection and track extraction method based on traffic video stream
US11189171B2 (en) Traffic prediction with reparameterized pushforward policy for autonomous vehicles
CN113261035A (en) Trajectory prediction method and related equipment
CN109109863B (en) Intelligent device and control method and device thereof
US11242050B2 (en) Reinforcement learning with scene decomposition for navigating complex environments
CN114462667A (en) SFM-LSTM neural network model-based street pedestrian track prediction method
US11691634B1 (en) On-vehicle driving behavior modelling
CN116901975B (en) Vehicle-mounted AI security monitoring system and method thereof
US11975742B2 (en) Trajectory consistency measurement for autonomous vehicle operation
CN116331221A (en) Driving assistance method, driving assistance device, electronic equipment and storage medium
Zhang et al. The AD4CHE dataset and its application in typical congestion scenarios of traffic jam pilot systems
CN110097571B (en) Quick high-precision vehicle collision prediction method
US20240017746A1 (en) Assessing present intentions of an actor perceived by an autonomous vehicle
CN115214708A (en) Vehicle intention prediction method and related device thereof
CN117698762A (en) Intelligent driving assistance system and method based on environment perception and behavior prediction
CN115223144A (en) Unmanned mine car sensor data screening method and device based on cloud data
EP3654246B1 (en) Method, vehicle, system, and storage medium for indicating anomalous vehicle scenario using encoder network and replay buffer
EP4062333A2 (en) Ensemble of narrow ai agents
WO2024093321A1 (en) Vehicle position acquiring method, model training method, and related device
US12033399B1 (en) Turn and brake action prediction using vehicle light detection
EP4145242B1 (en) Perception field based driving related operations
US20230033243A1 (en) Systems and methods for object proximity monitoring around a vehicle
US20240190420A1 (en) Method and apparatus of predicting possibility of accident in real time during vehicle driving

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination