CN114913711A - Auxiliary driving system and method based on predicted vehicle cut-in possibility - Google Patents
- Publication number
- CN114913711A (application number CN202110183027.9A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- distance
- cut
- surrounding
- parameters
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
Abstract
The present disclosure relates to a driving assistance system based on predicted vehicle cut-in likelihood, the system comprising: a training data set collection unit configured to collect physical parameters and distance parameters around a reference vehicle together with the cut-in situations of surrounding vehicles; a model training unit configured to train a model by using the physical parameters, the distance parameters, and the cut-in situations of the surrounding vehicles as inputs to a deep neural network; and an execution unit configured to perform driving assistance using the trained model. The driving assistance system and method according to the present disclosure can predict the cut-in likelihood of surrounding vehicles by measuring certain physical parameters and distance parameters of those vehicles.
Description
Technical Field
The present disclosure relates to the field of vehicle safe driving technology, and more particularly, to a driving assistance system and method based on a predicted vehicle cut-in possibility.
Background
During vehicle travel, determining a predicted vehicle cut-in probability may be used to identify a cut-in target so that an Adaptive Cruise Control (ACC) function may begin to react to the vehicle before it is actually selected as the true target. These targets are then provided to the control function along with the actual cut-in probability.
In determining the predicted cut-in probability, three individual probabilities are typically calculated from three parameters: the lateral velocity of the target; the overlap ratio between the target and the lane; and the target's turn-indicator information. These three individual probabilities are then combined into a total probability that serves as the predicted cut-in probability. The disadvantage of this approach is that it performs poorly when surrounding vehicles are driven aggressively; in some cases the target's cut-in is predicted too late to prevent a collision.
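The text above does not specify how the three individual probabilities are merged into a total probability. A minimal sketch, assuming a noisy-OR combination rule (the rule and the function name are illustrative, not from the patent):

```python
def cut_in_probability(p_lateral: float, p_overlap: float, p_indicator: float) -> float:
    """Combine three individual cut-in probabilities via a noisy-OR rule.

    Each argument is one of the conventional cues described above:
    lateral velocity, lane overlap ratio, and turn-indicator state.
    """
    # Noisy-OR: the target cuts in unless every cue says it does not.
    return 1.0 - (1.0 - p_lateral) * (1.0 - p_overlap) * (1.0 - p_indicator)
```

Any single strong cue drives the combined value toward 1, which matches the intuition that e.g. an active turn indicator alone should already raise the predicted probability.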
Accordingly, there is a need for improved methods of determining predicted vehicle cut-in probabilities.
Disclosure of Invention
The purpose of the present disclosure is to provide a driving assistance system and method based on predicted vehicle cut-in likelihood, in which certain physical parameters and distance parameters of the cutting-in vehicle are recorded before the cut-in behavior begins. The technical solution provided by the disclosure can be used to detect the cut-in likelihood of surrounding vehicles.
Thus, according to a first aspect of the present disclosure, there is provided a driving assistance system based on a predicted vehicle cut-in likelihood, the system comprising:
a training data set collection unit configured to collect physical parameters and distance parameters around a reference vehicle and cut-in situations of surrounding vehicles;
a model training unit configured to train a model using the physical parameters and the distance parameters and the cut-in conditions of the surrounding vehicles as inputs to a deep neural network;
an execution unit configured to perform driving assistance using the trained model.
In a preferred embodiment, in the training data set collection unit, the data of the physical parameter and the distance parameter includes: the average speed of all vehicles in the lane where the reference vehicle is located; average speed of all vehicles in the left lane; average speed of all vehicles in the right lane; vehicle density of the current road; a lateral distance between the reference vehicle and a surrounding vehicle; a longitudinal distance between the reference vehicle and a surrounding vehicle; a speed of the reference vehicle; the speed of the surrounding vehicle; a road type; weather conditions; lateral distance of surrounding vehicles to lane line.
In a preferred embodiment, the data of the physical parameters and the distance parameters are acquired by sensors arranged around the reference vehicle, the sensors being any one of, or a combination of, the following: an imaging device, a lidar, a millimeter-wave radar, and an ultrasonic sensor, preferably an imaging device.
In a preferred embodiment, the cut-in situations of the surrounding vehicles comprise two cases: vehicle cut-in and no vehicle cut-in.
In a preferred embodiment, the physical parameters and the distance parameters and the cut-in situation of the surrounding vehicles are stored in an online server, and the training data set collecting unit obtains the physical parameters and the distance parameters and the cut-in situation of the surrounding vehicles through the online server. For example, a vehicle in a certain area uploads the acquired physical parameters and distance parameters around the reference vehicle and the cut-in condition of the surrounding vehicle to the online server.
In a preferred embodiment, the training data set collection unit and the model training unit are located on an online server side.
In a preferred embodiment, the execution unit sends the trained model to a vehicle in need thereof. Preferably, the vehicle in need comprises an information acquisition unit and a prediction unit, and the current vehicle obtains the trained model from the online server via wireless communication.
According to a second aspect of the present disclosure, there is provided a driving assistance method based on a predicted vehicle cut-in likelihood, the method comprising:
(1) collecting physical parameters and distance parameters around a reference vehicle and the cut-in conditions of surrounding vehicles;
(2) taking the physical parameters, the distance parameters and the cut-in conditions of surrounding vehicles as the input of a deep neural network to train a model;
(3) and performing auxiliary driving by using the trained model.
In a preferred embodiment, in (1), the data of the physical parameter and the distance parameter includes: the average speed of all vehicles in the lane where the reference vehicle is located; average speed of all vehicles in the left lane; average speed of all vehicles in the right lane; vehicle density of the current road; a lateral distance between the reference vehicle and a surrounding vehicle; a longitudinal distance between the reference vehicle and a surrounding vehicle; a speed of the reference vehicle; the speed of the surrounding vehicle; a road type; weather conditions; lateral distance of surrounding vehicles to lane line.
In a preferred embodiment, in (1), the data of the physical parameters and the distance parameters are acquired by sensors arranged around the reference vehicle, the sensors being any one of, or a combination of, the following: an imaging device, a lidar, a millimeter-wave radar, and an ultrasonic sensor, preferably an imaging device.
In a preferred embodiment, in (1), the cut-in situations of the surrounding vehicles comprise two cases: vehicle cut-in and no vehicle cut-in.
In a preferred embodiment, in (2), the physical parameters and the distance parameters and the cut-in conditions of the surrounding vehicles are stored in an online server, and the model is trained by obtaining the physical parameters and the distance parameters and the cut-in conditions of the surrounding vehicles from the online server as input of the deep neural network. For example, a vehicle in a certain area uploads the acquired physical parameters and distance parameters around the reference vehicle and the cut-in condition of the surrounding vehicle to the online server.
In a preferred embodiment, (1) and (2) are performed at the online server side. And the current vehicle acquires the trained model from the online server through wireless communication.
In a preferred embodiment, (3) comprises sending the trained model to a vehicle in need thereof. Preferably, the vehicle in need acquires the physical parameters and distance parameters of its surroundings and calculates the cut-in likelihood of surrounding vehicles based on those parameters and the trained model.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method according to the second aspect of the present disclosure.
With the system and method of the present disclosure, a neural network model is provided that predicts the cut-in likelihood of a vehicle from measured physical parameters and distance parameters of that vehicle, so that the cut-in likelihood of surrounding vehicles can be calculated and possible cut-in events prepared for in advance.
Drawings
The present disclosure may be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings, in which like reference numerals identify identical or functionally similar elements.
Fig. 1 shows a schematic view of a system according to an embodiment of the present disclosure.
FIG. 2 shows a schematic diagram of a deep neural network model in accordance with systems and methods of one embodiment of the present disclosure.
Fig. 3 shows a block flow diagram of a method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure are described with reference to the drawings. The following detailed description and drawings are included to illustrate the principles of the disclosure, which is not to be limited to the preferred embodiments described, but is to be defined by the claims. The disclosure will now be described in detail with reference to exemplary embodiments thereof, some of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same reference numerals in different drawings represent the same or similar elements unless otherwise specified. The aspects described in the following exemplary embodiments do not represent all aspects of the present disclosure. Rather, these aspects are merely examples of the driving assistance systems and methods of the various aspects of the present disclosure recited in the appended claims.
According to the system of the present disclosure, the cut-in probability of surrounding vehicles is calculated by measuring certain physical parameters and distance parameters of those vehicles. The system may thus be applied to a vehicle, for example with a receiving device provided on the vehicle for receiving the trained model, or to an application (APP) on a mobile device. The vehicle may be an internal combustion engine vehicle using an internal combustion engine as its drive source, an electric vehicle or a fuel cell vehicle using an electric motor as its drive source, a hybrid vehicle using both as drive sources, or a vehicle with another drive source. The vehicle is preferably an autonomous vehicle, which is not operated by a driver in the driver's seat and therefore has an even greater need for a system according to the present disclosure.
The autonomous vehicles referred to herein include fully autonomous vehicles, as well as vehicles having autonomous driving modes. The automatic driving vehicle applicable to the present disclosure has the following basic features: for example, such vehicles are mounted with a plurality of sensors or positioning devices, such as an image pickup device, a laser radar, a millimeter wave radar, an ultrasonic sensor, a vehicle-mounted communication (V2X) device, a Highly Automated Driving (HAD) map, and the like, which are capable of detecting the environment around the vehicle such as surrounding objects, obstacles, infrastructure, and the like; these vehicles are able to detect the position of the current vehicle through a Global Navigation Satellite System (GNSS) and one or a combination of sensor detection results and HAD maps; the vehicles can obtain navigation paths through an online server; these vehicles are able to plan a route to be traveled based on the perception and location results; such vehicles can also send control commands to the powertrain, steering system, braking system, etc. based on the planned route.
Fig. 1 shows a schematic view of a system 100 for predicting a vehicle cut-in likelihood according to an embodiment of the present disclosure. As shown in FIG. 1, the system 100 includes a training data set collection unit 110, a model training unit 120, and an execution unit 130.
In fig. 1, the training data set collection unit 110 is configured to collect physical parameters and distance parameters around the reference vehicle and the cut-in situations of surrounding vehicles. In the present disclosure, a vehicle cut-in means that a vehicle traveling in an adjacent lane merges into the reference vehicle's lane ahead of it. In a preferred example, the data of the physical parameters and the distance parameters may include: the average speed of all vehicles in the lane where the reference vehicle is located (AvgVego); the average speed of all vehicles in the left lane (AvgVleft); the average speed of all vehicles in the right lane (AvgVright); the vehicle density of the current road (Density); the lateral (X-direction) distance between the reference vehicle and a surrounding vehicle (Dx); the longitudinal (Y-direction) distance between the reference vehicle and a surrounding vehicle (Dy); the speed of the reference vehicle (Vego); the speed of the surrounding vehicle (Vobject); the road type (RT); the weather conditions (Wc); and the lateral distance of the surrounding vehicle to the lane line (Lx). The reference vehicle is the vehicle with respect to which it is examined whether another vehicle cuts in ahead of it. The data on the physical and distance parameters around the reference vehicle and the cut-in situations of surrounding vehicles may be obtained from detection devices, vehicle equipment, and/or other means. The detection devices may or may not be part of the training data set collection unit 110. There may be one or more detection devices, mounted on the reference vehicle or on roadside facilities. For example, the detection device may be any one of, or a combination of, the following devices mounted on the reference vehicle: an imaging device, a lidar, a millimeter-wave radar, and an ultrasonic sensor, preferably an imaging device.
In the present disclosure, the cut-in situations of surrounding vehicles comprise vehicle cut-in and no vehicle cut-in. Because the likelihood of a surrounding vehicle cutting in is detected with respect to the reference vehicle, the data for the physical parameters and distance parameters are computed statistically relative to the reference vehicle. The average speed of all vehicles in the lane where the reference vehicle is located, in the left lane, and in the right lane is calculated, for example, over vehicles within 100 meters, 50 meters, or 20 meters ahead of and behind the reference vehicle's position. The distances ahead of and behind the reference vehicle may be the same or different; for example, 50 meters ahead and 20 meters behind. Similarly, the vehicle density of the current road is the density within a distance window around the reference vehicle, for example vehicles within 100 meters, 50 meters, or 20 meters ahead and behind. The vehicle density may be expressed as a number of vehicles per unit distance, e.g. the average number of vehicles per 10 meters or per 20 meters. Surrounding vehicles are the vehicles ahead of and behind the reference vehicle in the lanes on either side of it, for example within 10 meters or 5 meters of lateral distance and within 20 meters or 10 meters of longitudinal distance. The lateral distance between the reference vehicle and a surrounding vehicle is the distance between the lines parallel to the lane passing through the two vehicles. The longitudinal distance between the reference vehicle and a surrounding vehicle is the distance between the lines perpendicular to the lane passing through the two vehicles.
The lateral distance of a surrounding vehicle to a lane line is the distance between the lane line and the line parallel to the lane passing through that vehicle. The physical parameters and distance parameters may be measured by sensors mounted on the reference vehicle or detected by sensors mounted on roadside facilities. When roadside facilities are used for detection, each vehicle can be treated as a reference vehicle for the statistics, so that a larger data volume is obtained and the subsequent model training is more comprehensive.
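The windowed statistics described above can be sketched as follows. This is a minimal illustration only: the `Track` structure, lane encoding, and default window sizes (50 m ahead, 20 m behind, density per 10 m) are assumptions chosen from the example values in the text, not a definitive implementation.

```python
from dataclasses import dataclass

@dataclass
class Track:
    lane: int     # -1 = left lane, 0 = reference vehicle's lane, +1 = right lane
    dy: float     # longitudinal offset from the reference vehicle (m)
    speed: float  # vehicle speed (m/s)

def lane_avg_speed(tracks, lane, ahead=50.0, behind=20.0):
    """Average speed of vehicles in `lane` within [-behind, +ahead] metres
    of the reference vehicle (AvgVego / AvgVleft / AvgVright)."""
    speeds = [t.speed for t in tracks
              if t.lane == lane and -behind <= t.dy <= ahead]
    return sum(speeds) / len(speeds) if speeds else 0.0

def vehicle_density(tracks, ahead=50.0, behind=20.0, per=10.0):
    """Average number of vehicles per `per` metres inside the window."""
    n = sum(1 for t in tracks if -behind <= t.dy <= ahead)
    return n * per / (ahead + behind)
```

With asymmetric windows, the example from the text (50 m ahead, 20 m behind) falls out directly from the default arguments.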
In the present disclosure, the road types include, for example, highways, urban roads, and suburban roads, which may be detected by sensors or obtained from online maps. Weather conditions include day, night, sunny, cloudy, light rain, moderate rain, heavy rain, fog, visibility, etc. These may be detected by sensors mounted on the reference vehicle or on roadside facilities, or taken from information released by the weather service. Sensors on the reference vehicle or on roadside facilities can provide more real-time and accurate road type and weather information; for example, standing water or icing on the road cannot be accurately known from information issued by the weather service alone.
In the present disclosure, the speed of the reference vehicle may be obtained from the vehicle's driving parameters or navigation device, and the speed of a surrounding vehicle may be measured by a sensor mounted on the reference vehicle. Alternatively, both speeds may be detected by sensors mounted on roadside facilities. If a surrounding vehicle has a wireless connection to the driving assistance system 100 of the present disclosure, its speed may be obtained from its own driving parameters or navigation device.
In the present disclosure, the physical parameters and distance parameters and the cut-in situations of surrounding vehicles are stored on an online server, and the training data set collection unit obtains them through the online server. For example, vehicles in a certain area upload the physical parameters and distance parameters around themselves and the cut-in situations of surrounding vehicles to the online server. Since driving habits may differ between countries and regions, and between cities within a country, the subsequent model training is preferably performed on a data set from a single region.
In fig. 1, the model training unit 120 is configured to train the model by using the physical parameters and the distance parameters and the cut-in situations of surrounding vehicles as inputs to the deep neural network. A neural network model for predicting vehicle cut-in is constructed in advance and trained with a large amount of real historical driving data to obtain a vehicle cut-in prediction model. The trained model can then accurately predict, from the physical parameters and distance parameters around a vehicle at a given moment, the likelihood that a surrounding vehicle will cut in at that moment. The parameters of the surrounding vehicle recorded before the cut-in begins are used as inputs to the deep neural network during training. Given sufficient training samples, the accuracy reaches an adequate level. In a preferred embodiment, the training data set collection unit and the model training unit are located on the online server side.
In an exemplary embodiment, fig. 2 shows a schematic diagram of the neural network. A deep neural network (DNN) is a discriminative model with at least one hidden layer, which can be trained using the back-propagation algorithm. As shown in FIG. 2, the neural network that may be used in the model training unit 120 includes an input layer, hidden layers, and an output layer that outputs a value of the vehicle cut-in likelihood, e.g., 80%, 75%, 65%, 96%, 1%, 89%, 43%, etc. The physical parameters and distance parameters around the reference vehicle and the cut-in situations of surrounding vehicles collected by the training data set collection unit 110 are fed to the input layer. In an example embodiment, the hidden layers may include a convolutional layer, a rectified linear unit (ReLU) layer, a max-pooling layer, a fully-connected layer, a batch normalization (BN) layer, a dropout layer, a recurrent neural network encoding layer, and a recurrent neural network decoding layer. In one embodiment, the recurrent encoding and decoding layers may be implemented with long short-term memory (LSTM) networks. For example, the part from the convolutional layer through the recurrent encoding layer can be regarded as an encoding process, and the LSTM after the encoding layer serves as the recurrent decoding layer, which can be regarded as decoding the encoded result. The convolutional layer, ReLU layer, and max-pooling layer form a convolutional neural network; the fully-connected layer, BN layer, and dropout layer form a fully-connected network. Existing model training methods can be used to train the preset neural network model on the training data set, yielding the vehicle cut-in likelihood prediction model.
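The mapping from the input parameters to an output likelihood can be illustrated with a toy forward pass. This is not the patent's full architecture (which chains convolutional, pooling, fully-connected, batch-normalization, dropout, and LSTM layers); it is a minimal sketch showing only how the scalar inputs could be mapped through a ReLU hidden layer to a likelihood in [0, 1] via a sigmoid. All function names and weights are illustrative.

```python
import math

def relu(xs):
    """Rectified linear unit applied elementwise."""
    return [max(0.0, v) for v in xs]

def dense(xs, weights, bias):
    """Fully-connected layer: one output per weight row."""
    return [sum(w * v for w, v in zip(row, xs)) + b
            for row, b in zip(weights, bias)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_cut_in(features, w1, b1, w2, b2):
    """features: the scalar inputs (AvgVego, ..., Lx); returns P(cut-in)."""
    hidden = relu(dense(features, w1, b1))
    return sigmoid(dense(hidden, w2, b2)[0])
```

The sigmoid output directly yields likelihood values of the kind the text lists (80%, 75%, 1%, etc.); in practice the weights would come from back-propagation training, not be hand-set.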
In one example, model training of the preset neural network model on the training data set proceeds as follows. The physical parameters and distance parameters around the reference vehicle and the cut-in situations of surrounding vehicles are collected, the data are parameterized, and one part of the parameterized data set is used as the training set for the neural network. The other part is used as a test set to verify the model during training. When the accuracy meets the requirement, remains stable when the test set is randomly re-drawn, and no overfitting occurs, the neural network is considered trained to standard. As training samples accumulate, the neural network can be updated in real time.
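The split-and-verify procedure described above can be sketched as follows, assuming samples are (features, label) pairs; the helper names, the 80/20 split, and the seeded shuffle are illustrative choices, not specified by the patent.

```python
import random

def train_test_split(samples, test_fraction=0.2, seed=0):
    """Randomly partition the parameterized data set into training and test parts."""
    rng = random.Random(seed)  # re-drawing with a new seed re-splits the data
    shuffled = samples[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]

def accuracy(model, test_set):
    """Fraction of test samples the model labels correctly."""
    correct = sum(1 for x, label in test_set if model(x) == label)
    return correct / len(test_set) if test_set else 0.0
```

Re-running the split with different seeds and checking that `accuracy` stays stable is one way to realize the "accuracy does not float when the test set is randomly re-drawn" criterion in the text.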
In fig. 1, the execution unit 130 is configured to perform driving assistance using the trained model. In one example, the vehicle in need may include an information acquisition unit and a prediction unit. For example, the trained model is sent to a vehicle in need, which acquires the physical parameters and distance parameters around itself and calculates the likelihood of a vehicle cutting in based on those parameters and the trained model. The physical parameters and distance parameters around the current vehicle are as described above for the reference vehicle. The current vehicle obtains the trained model from the online server via wireless communication. On a vehicle equipped with the system, the cut-in probability of all surrounding vehicles can be calculated.
In this embodiment, the physical parameters and distance parameters around the current vehicle are input into the trained neural network model, which outputs a value of the vehicle cut-in likelihood, e.g., 80%, 75%, 65%, 96%, 1%, 89%, 43%, etc. A cut-in likelihood can be predicted for each vehicle around the current vehicle. In one example, the current vehicle displays or announces the prediction to the driver for reference, for example through an in-vehicle display or speaker. Alternatively, the prediction is used to guide the automatic driving of the current vehicle. In one embodiment, the predicted result and the actual cut-in situations of surrounding vehicles are wirelessly transmitted back to the training data set collection unit 110 to further optimize the model in the model training unit 120.
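How the execution unit might consume a per-vehicle likelihood can be sketched as a simple decision rule. The thresholds (0.7 and 0.4) and action names below are assumptions for illustration; the patent does not prescribe specific thresholds or actions.

```python
def assist_action(cut_in_likelihood: float, warn_at: float = 0.7) -> str:
    """Map a predicted cut-in likelihood to a driver-assistance response."""
    if cut_in_likelihood >= warn_at:
        # High likelihood: warn the driver (display/speaker) and prepare the
        # control functions, e.g. ACC, for an early reaction.
        return "warn_and_prepare_braking"
    if cut_in_likelihood >= 0.4:
        # Moderate likelihood: keep tracking this vehicle as a candidate target.
        return "monitor_closely"
    return "no_action"
```

Applied to every surrounding vehicle, this turns the model's example outputs (96%, 43%, 1%) into graded responses, letting the current vehicle react before the cut-in actually occurs.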
An assist driving method based on a predicted vehicle cut-in likelihood according to an embodiment of the present disclosure will be described below with reference to the drawings. Fig. 3 is a flowchart illustrating a driving assistance method S100 based on a predicted vehicle cut-in likelihood according to an embodiment of the present disclosure. The driving assistance method S100 based on the predicted vehicle cut-in possibility is executed by the system 100 described above.
As shown in fig. 3, in step S110, physical parameters and distance parameters around the reference vehicle and the cut-in situations of surrounding vehicles are collected. In step S120, these physical parameters and distance parameters and the cut-in situations of surrounding vehicles are used as inputs to the deep neural network to train the model. The physical parameters and distance parameters around the reference vehicle, the cut-in situations of surrounding vehicles, and the deep neural network have been described in detail above; see in particular the descriptions of the training data set collection unit 110 and the model training unit 120.
In step S130, driving assistance is performed using the trained model. In one example, the likelihood of a vehicle cutting in around the current vehicle is calculated based on the physical parameters and distance parameters around the current vehicle and the trained model. These parameters, and the calculation of the cut-in likelihood, are also described in detail above and are not repeated here. With the method of the present disclosure, the cut-in likelihood of each surrounding vehicle can be calculated from its physical parameters, so that the current vehicle can react earlier.
The method of the present disclosure may be implemented by a computer program that processes the physical parameters and distance parameters and the cut-in situations of surrounding vehicles collected in step S110, trains the deep neural network model, and calculates the likelihood of a vehicle cutting in around the current vehicle based on the parameters around the current vehicle and the trained model. Accordingly, the present disclosure may also include a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method described in the embodiments of the present disclosure.
It will be understood by those skilled in the art that the division and order of the steps in the driving assistance method based on the predicted vehicle cut-in possibility of the present disclosure is merely illustrative and not restrictive, and that various omissions, additions, substitutions, modifications and changes may be made by those skilled in the art without departing from the spirit and scope of the present disclosure as set forth in the appended claims and equivalents thereof.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations are described here; nevertheless, any combination of these technical features that involves no contradiction should be considered within the scope of this specification.
While the present disclosure has been described in connection with embodiments, it is to be understood by those skilled in the art that the foregoing description and drawings are merely illustrative and not restrictive of the disclosed embodiments. Various modifications and variations are possible without departing from the spirit of the disclosure.
Claims (11)
1. A driving assistance system based on a predicted vehicle cut-in likelihood, the driving assistance system comprising:
a training data set collection unit configured to collect physical parameters and distance parameters around a reference vehicle and cut-in situations of surrounding vehicles;
a model training unit configured to train a model using the physical parameters, the distance parameters, and the cut-in situations of the surrounding vehicles as inputs to a deep neural network;
an execution unit configured to perform driving assistance using the trained model.
2. The driving assistance system according to claim 1, wherein in the training data set collection unit, the data of the physical parameter and the distance parameter include: the average speed of all vehicles in the lane where the reference vehicle is located; average speed of all vehicles in the left lane; average speed of all vehicles in the right lane; vehicle density of the current road; a lateral distance between the reference vehicle and a surrounding vehicle; a longitudinal distance between the reference vehicle and a surrounding vehicle; a speed of the reference vehicle; the speed of the surrounding vehicle; a road type; weather conditions; lateral distance of surrounding vehicles to lane line.
3. The driving assistance system according to claim 2, wherein the data of the physical parameters and the distance parameters are acquired by sensors provided around the reference vehicle, the sensors being any one of, or a combination of, the following: an imaging device, a lidar, a millimeter-wave radar, and an ultrasonic sensor, preferably an imaging device.
4. The driving assistance system according to claim 1 or 2, wherein the execution unit is configured to send the trained model to a vehicle in need thereof.
5. The driving assistance system according to claim 4, wherein the vehicle in need thereof acquires the physical parameters and distance parameters of its surroundings and calculates the cut-in likelihood of the surrounding vehicles based on those parameters and the trained model.
6. A driving assistance method based on a predicted vehicle cut-in likelihood, the method comprising:
(1) collecting physical parameters and distance parameters around a reference vehicle and the cut-in conditions of surrounding vehicles;
(2) training a model using the physical parameters, the distance parameters, and the cut-in situations of the surrounding vehicles as inputs to a deep neural network;
(3) performing driving assistance using the trained model.
7. The method of claim 6, wherein in (1), the data for the physical parameter and the distance parameter comprises: the average speed of all vehicles in the lane where the reference vehicle is located; average speed of all vehicles in the left lane; average speed of all vehicles in the right lane; vehicle density of the current road; a lateral distance between the reference vehicle and a surrounding vehicle; a longitudinal distance between the reference vehicle and a surrounding vehicle; a speed of the reference vehicle; the speed of the surrounding vehicle; a road type; weather conditions; lateral distance of surrounding vehicles to lane line.
8. The method of claim 7, wherein the data of the physical parameters and the distance parameters are acquired by sensors provided around the reference vehicle, the sensors being any one of, or a combination of, the following: an imaging device, a lidar, a millimeter-wave radar, and an ultrasonic sensor, preferably an imaging device.
9. The driving assistance method according to claim 6 or 7, wherein (3) comprises sending the trained model to a vehicle in need thereof.
10. The method of claim 9, wherein the vehicle in need thereof acquires the physical parameters and distance parameters of its surroundings and calculates the cut-in likelihood of the surrounding vehicles based on those parameters and the trained model.
11. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the method according to any one of the claims 6-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110183027.9A CN114913711A (en) | 2021-02-10 | 2021-02-10 | Auxiliary driving system and method based on predicted vehicle cut-in possibility |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114913711A true CN114913711A (en) | 2022-08-16 |
Family
ID=82761459
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110183027.9A Pending CN114913711A (en) | 2021-02-10 | 2021-02-10 | Auxiliary driving system and method based on predicted vehicle cut-in possibility |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114913711A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140195093A1 (en) * | 2013-01-04 | 2014-07-10 | Carnegie Mellon University | Autonomous Driving Merge Management System |
CN108604292A (en) * | 2015-11-26 | 2018-09-28 | Mobileye Vision Technologies Ltd. | Automatic prediction and altruistic response to a vehicle cutting in to a lane
US20190049970A1 (en) * | 2017-08-08 | 2019-02-14 | Uber Technologies, Inc. | Object Motion Prediction and Autonomous Vehicle Control
CN110267856A (en) * | 2017-03-01 | 2019-09-20 | Honda Motor Co., Ltd. | Vehicle control device, vehicle control method, and program
CN111645682A (en) * | 2020-04-20 | 2020-09-11 | Great Wall Motor Co., Ltd. | Cruise control method and system and vehicle
CN111845787A (en) * | 2020-08-03 | 2020-10-30 | Beijing Institute of Technology | Lane change intention prediction method based on LSTM
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||