CN113808382A - Auxiliary driving system and method based on vehicle cut-in critical level prediction - Google Patents
- Publication number
- CN113808382A (application number CN202010544873.4A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- cut
- parameters
- threshold level
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G08G1/0125: Traffic data processing (measuring and analyzing of parameters relative to traffic conditions)
- G06N3/044: Recurrent networks, e.g. Hopfield networks
- G06N3/045: Combinations of networks
- G06N3/084: Backpropagation, e.g. using gradient descent
- G08G1/162: Decentralised anti-collision systems, e.g. inter-vehicle communication, event-triggered
- G08G1/167: Driving aids for lane monitoring, lane changing, e.g. blind spot detection
Abstract
The present disclosure relates to a driving assistance system and method based on predicting the critical level of a vehicle cut-in. The driving assistance system includes: a training data set collection unit configured to collect parameters of a reference vehicle and a cut-in vehicle, together with cut-in critical level values; a model training unit configured to train a model using the parameters and the critical level values as inputs to a deep neural network; and an execution unit configured to perform driving assistance using the trained model. By collecting the parameters and cut-in critical level values of reference vehicles and cut-in vehicles, the system and method can train a deep neural network model that predicts the critical level of a vehicle cut-in.
Description
Technical Field
The present disclosure relates to the field of vehicle driving safety technology and, more particularly, to a driving assistance system and method based on predicting the critical level of a vehicle cut-in.
Background
While a vehicle is travelling, cut-ins by other vehicles can disturb its progress and may even lead to collision accidents. It is therefore important to determine the critical level of a vehicle cut-in.
When predicting the critical level of a vehicle cut-in, prior-art techniques typically calculate three individual probabilities from three parameters: the lateral velocity of the target vehicle, the overlap ratio between the target vehicle and the lane, and the target vehicle's turn indicator information. The three individual probabilities are then combined into a total probability used to predict the critical level of the cut-in. Predicting the cut-in critical level this way has disadvantages: it is poorly suited to regions where driving behavior is more aggressive and cut-ins occur at close range, and in some cases the cut-in of the target vehicle is predicted too late to prevent a collision. In such cases the driver must interrupt the ACC function, so current ACC performance cannot meet the demand.
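The prior-art heuristic just described can be sketched as follows. The exact combination rule, normalisation constants, and indicator weights are not given in this document, so everything below is an illustrative assumption:

```python
def cut_in_probability(lateral_velocity, lane_overlap_ratio, turn_signal_on,
                       v_ref=1.0, overlap_ref=0.5):
    """Combine the three individual cues into one cut-in probability.

    The normalisation constants and the geometric-mean combination are
    assumptions for illustration, not the method of any specific system.
    """
    p_velocity = min(abs(lateral_velocity) / v_ref, 1.0)    # faster lateral drift -> higher
    p_overlap = min(lane_overlap_ratio / overlap_ref, 1.0)  # more lane overlap -> higher
    p_signal = 0.9 if turn_signal_on else 0.3               # indicator is a strong cue
    return (p_velocity * p_overlap * p_signal) ** (1.0 / 3.0)
```

The weakness criticised above is visible here: a fixed-threshold rule only reacts once the drift or overlap is already large, which is why the prediction can come too late in aggressive traffic.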
Accordingly, there is a need for improved driving assistance systems and methods based on predicting the vehicle cut-in critical level.
Disclosure of Invention
The present disclosure aims to provide a driving assistance system and method based on predicting the vehicle cut-in critical level, by recording physical parameters and distance parameters related to the critical level of a vehicle cut-in. The technical solution provided by the present disclosure can be used to predict the cut-in critical level of a target vehicle.
Thus, according to a first aspect of the present disclosure, there is provided a driving assistance system based on predicting the critical level of a vehicle cut-in, the system comprising:
a training data set collection unit configured to collect parameters of a reference vehicle and a cut-in vehicle, together with cut-in critical level values;
a model training unit configured to train a model using the parameters and the critical level values as inputs to a deep neural network;
an execution unit configured to perform driving assistance using the trained model.
In a preferred embodiment, the parameters of the reference vehicle and the cut-in vehicle include the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the reference vehicle; the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the cut-in vehicle; and the lateral distance and longitudinal distance between the reference vehicle and the cut-in vehicle, as well as the closest distance between the reference vehicle and the cut-in vehicle's travel lane.
In a preferred embodiment, the parameters of the reference vehicle and the cut-in vehicle are acquired by sensors arranged on the reference vehicle or on the road infrastructure.
In one example, the sensor is any one of, or a combination of any plurality of, the following: an imaging device, a lidar, a millimeter-wave radar, or an ultrasonic sensor; preferably an imaging device.
In a preferred embodiment, the execution unit sends the trained model to a vehicle in need thereof.
In a preferred embodiment, the vehicle in need thereof is provided with: an information acquisition unit configured to acquire parameters of a current vehicle and a target vehicle; and a prediction unit configured to calculate a cut-in critical level value for the target vehicle based on the parameters of the current and target vehicles and the trained model.
In a preferred embodiment, the parameters of the current vehicle and the target vehicle include the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the current vehicle; the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the target vehicle; and the lateral distance and longitudinal distance between the current vehicle and the target vehicle, as well as the closest distance between the current vehicle and the target vehicle's travel lane.
In a preferred embodiment, the parameters of the current vehicle and the target vehicle are acquired by sensors disposed around the current vehicle.
In one example, the sensor is any one of, or a combination of any plurality of, the following: an imaging device, a lidar, a millimeter-wave radar, or an ultrasonic sensor; preferably an imaging device.
In a preferred embodiment, the parameters and cut-in critical level values of the reference vehicle and the cut-in vehicle are stored on the online server side, and the training data set collection unit obtains them through the online server. For example, vehicles in a certain region upload the parameters and cut-in critical level values they have acquired for reference vehicles and cut-in vehicles to the online server.
In a preferred embodiment, the training data set collection unit and the model training unit are located on an online server side from which vehicles in need thereof obtain the trained models via wireless communication.
According to a second aspect of the present disclosure, there is provided a driving assistance method based on predicting the critical level of a vehicle cut-in, the method comprising:
(1) collecting parameters and cut-in critical level values of a reference vehicle and a cut-in vehicle;
(2) training a model by using the parameters of the reference vehicle and the cut-in vehicle and the cut-in critical level values as inputs to a deep neural network;
(3) performing driving assistance using the trained model.
In a preferred embodiment, in (1), the parameters of the reference vehicle and the cut-in vehicle include the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the reference vehicle; the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the cut-in vehicle; and the lateral distance and longitudinal distance between the reference vehicle and the cut-in vehicle, as well as the closest distance between the reference vehicle and the cut-in vehicle's travel lane.
In a preferred embodiment, in (1), the parameters of the reference vehicle and the cut-in vehicle are acquired by sensors provided on the reference vehicle or on the road infrastructure.
In a preferred embodiment, the sensor is any one of, or a combination of any plurality of, the following: an imaging device, a lidar, a millimeter-wave radar, or an ultrasonic sensor; preferably an imaging device.
In a preferred embodiment, (3) comprises sending the trained model to a vehicle in need thereof.
In a preferred embodiment, the vehicle in need thereof: acquires parameters of the current vehicle and a target vehicle; and calculates a cut-in critical level value for the target vehicle based on the parameters of the current and target vehicles and the trained model.
In a preferred embodiment, the parameters of the current vehicle and the target vehicle include the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the current vehicle; the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the target vehicle; and the lateral distance and longitudinal distance between the current vehicle and the target vehicle, as well as the closest distance between the current vehicle and the target vehicle's travel lane.
In a preferred embodiment, the parameters of the current vehicle and the target vehicle are acquired by sensors disposed around the current vehicle.
In a preferred embodiment, the sensor is any one of, or a combination of any plurality of, the following: an imaging device, a lidar, a millimeter-wave radar, or an ultrasonic sensor; preferably an imaging device.
In a preferred embodiment, in (1), the parameters and cut-in critical level values of the reference vehicle and the cut-in vehicle are stored on an online server, and the training data set collection unit obtains them through the online server. For example, vehicles in a certain region upload the parameters and cut-in critical level values they have acquired for reference vehicles and cut-in vehicles to the online server.
In a preferred embodiment, (1) and (2) are performed on an online server side, from which a vehicle in need thereof acquires the trained model through wireless communication.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method according to the second aspect of the present disclosure.
With the system and method of the present disclosure, a deep neural network model is trained by collecting the parameters and cut-in critical level values of reference vehicles and cut-in vehicles. This model enables the critical level of a vehicle cut-in to be predicted, so that preparations for cut-in events can be made in advance.
Drawings
The present disclosure may be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings, in which like reference numerals identify identical or functionally similar elements.
Fig. 1 shows a schematic view of a system according to one embodiment of the present disclosure.
Fig. 2 shows a schematic diagram of a deep neural network model of a system and method according to one embodiment of the present disclosure.
Fig. 3 shows a block flow diagram of a method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure are described with reference to the drawings. The following detailed description and drawings are included to illustrate the principles of the disclosure, which is not to be limited to the preferred embodiments described, but is to be defined by the claims. The disclosure will now be described in detail with reference to exemplary embodiments thereof, some of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings, in which like reference numerals refer to the same or similar elements in different drawings unless otherwise indicated. The aspects described in the following exemplary embodiments do not represent all aspects of the present disclosure. Rather, these aspects are merely examples of systems and methods according to various aspects of the present disclosure recited in the appended claims.
According to the system of the present disclosure, the critical level of a vehicle cut-in is calculated from measured physical and distance parameters of the current vehicle and the cut-in vehicle. The system may thus be applied to a vehicle, for example via a receiving means on the vehicle for receiving the trained model, or by receiving the trained model in an app on a mobile device. The vehicle may be an internal combustion engine vehicle using an internal combustion engine as its drive source, an electric vehicle or fuel cell vehicle using an electric motor as its drive source, a hybrid vehicle using both, or a vehicle with another drive source. The vehicle is preferably an autonomous vehicle, which is not operated by a driver in the driver's seat and is therefore more likely to be equipped with a system according to the present disclosure.
The autonomous vehicles referred to herein include fully autonomous vehicles as well as vehicles having an autonomous driving mode. An autonomous vehicle applicable to the present disclosure has the following basic features: it is equipped with multiple sensors or positioning devices, such as imaging devices, lidar, millimeter-wave radar, ultrasonic sensors, vehicle-to-everything (V2X) communication devices, and Highly Automated Driving (HAD) maps, which can detect the environment around the vehicle, including surrounding objects, obstacles, and infrastructure; it can determine the current vehicle's location through a Global Navigation Satellite System (GNSS), sensor detection, HAD maps, or a combination thereof; it can obtain navigation paths from an online server; it can plan a route to be travelled based on the perception and localization results; and it can send control commands to the powertrain, steering system, braking system, and so on, based on the planned route.
Fig. 1 shows a schematic view of a driving assistance system 100 based on predicting the critical level of a vehicle cut-in according to an embodiment of the present disclosure. As shown in Fig. 1, the system 100 includes a training data set collection unit 110, a model training unit 120, and an execution unit 130.
In Fig. 1, the training data set collection unit 110 is configured to collect the parameters and cut-in critical level values of the reference vehicle and the cut-in vehicle. In the present disclosure, a vehicle cut-in means that a vehicle travelling in an adjacent lane merges into the reference vehicle's lane ahead of it. In a preferred example, the parameters of the reference vehicle and the cut-in vehicle include the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle, and yaw angle of the reference vehicle; the same quantities for the cut-in vehicle; and the lateral distance and longitudinal distance between the reference vehicle and the cut-in vehicle, as well as the closest distance between the reference vehicle and the cut-in vehicle's travel lane. The reference vehicle is the vehicle in front of which another vehicle cuts in. The data for these parameters and the cut-in critical level values may be obtained from detection devices, vehicle equipment, and/or other means. The detection devices may or may not be part of the training data set collection unit 110; there may be one or more of them, mounted on the reference vehicle or on the road infrastructure. For example, a detection device mounted on the reference vehicle may be any one of, or a combination of, the following: an imaging device, a lidar, a millimeter-wave radar, or an ultrasonic sensor; preferably an imaging device.
In the present disclosure, the critical level value of a vehicle cut-in refers to the risk level at the moment of the cut-in, which may be measured, for example, by the minimum time gap between the reference vehicle and the cut-in vehicle. The time gap may be calculated by dividing the distance between the two vehicles by their relative speed: the smaller the minimum time gap, the higher the risk level and the greater the critical level value of the cut-in. For example, the critical level value may be represented by 0-5, where 0 represents no risk (e.g., a minimum time gap greater than 2 seconds) and 5 represents the most dangerous case (e.g., a minimum time gap of less than 0.1 second). The distance and relative speed between the reference vehicle and the cut-in vehicle may be detected by a detection device mounted on the reference vehicle or on the road infrastructure, for example any one of, or a combination of, the following mounted on the reference vehicle: an imaging device, a lidar, a millimeter-wave radar, or an ultrasonic sensor; preferably an imaging device.
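The mapping just described can be sketched as follows. Only the 0 and 5 endpoints are specified in the text; the intermediate thresholds below are illustrative assumptions:

```python
def critical_level(distance_m, closing_speed_mps):
    """Map the minimum time gap (distance / closing speed) to a 0-5 level.

    Only the endpoints follow the text (gap > 2 s -> 0, gap < 0.1 s -> 5);
    the intermediate thresholds are assumed for illustration.
    """
    if closing_speed_mps <= 0.0:       # vehicles are not closing in: no risk
        return 0
    gap_s = distance_m / closing_speed_mps
    for threshold_s, level in [(2.0, 0), (1.5, 1), (1.0, 2), (0.5, 3), (0.1, 4)]:
        if gap_s > threshold_s:
            return level
    return 5
```

For example, a 30 m gap closing at 10 m/s gives a 3 s time gap and level 0, while a 1 m gap closing at 20 m/s gives 0.05 s and level 5.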
In the present disclosure, the lateral speed of a vehicle is its speed perpendicular to the direction of motion, and the longitudinal speed is its speed along the direction of motion. The lateral acceleration is the acceleration perpendicular to the direction of motion; for example, when the vehicle turns on a level road, this acceleration is provided by lateral friction. The longitudinal acceleration is the acceleration along the direction of motion, such as that produced when the vehicle accelerates or decelerates. The heading angle of the vehicle is the angle between the vehicle's direction of travel and the tangent to its path at that point, or equivalently the angle between the velocity of the vehicle's center of mass and the horizontal axis. The yaw angle of the vehicle is its yaw relative to the lane; it can be determined by the lane-line angle method, i.e., calculated from the angular deviation of the two lane lines in the image captured by the camera. The lateral distance between the reference vehicle and the cut-in vehicle is the distance between the lines parallel to the lane passing through the two vehicles, and the longitudinal distance is the distance between the lines perpendicular to the lane passing through the two vehicles. The closest distance between the reference vehicle and the cut-in vehicle's travel lane is the distance between the line parallel to the lane passing through the reference vehicle and the cut-in vehicle's travel lane.
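Assuming vehicle positions and the lane direction are available in a common ground coordinate frame (all names and the coordinate convention below are hypothetical, not from the disclosure), the lateral and longitudinal distances defined above reduce to projecting the displacement between the two vehicles onto the lane direction:

```python
import math

def lane_relative_distances(ref_xy, cut_xy, lane_heading_rad):
    """Lateral/longitudinal distance between two vehicles relative to the lane.

    ref_xy, cut_xy: (x, y) ground-frame positions of the reference and
    cut-in vehicles; lane_heading_rad: direction of the lane.
    """
    dx = cut_xy[0] - ref_xy[0]
    dy = cut_xy[1] - ref_xy[1]
    ux, uy = math.cos(lane_heading_rad), math.sin(lane_heading_rad)  # unit vector along lane
    longitudinal = dx * ux + dy * uy      # component along the lane
    lateral = -dx * uy + dy * ux          # component across the lane
    return lateral, longitudinal
```

With the lane running along the x-axis, a cut-in vehicle 10 m ahead and 3 m to the side yields a longitudinal distance of 10 m and a lateral distance of 3 m.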
The parameters of the reference vehicle and the cut-in vehicle may be measured by sensors mounted on the reference vehicle or detected by sensors mounted on the road infrastructure. When road-infrastructure detection is used, the infrastructure can record all vehicle cut-in events, yielding a larger data volume and making the subsequent model training more comprehensive.
In the present disclosure, the speed of the reference vehicle may be obtained from the vehicle's driving parameters or from a navigation device, while the speed of the cut-in vehicle may be measured by a sensor mounted on the reference vehicle. Alternatively, both speeds may be detected by sensors mounted on the road infrastructure. If the cut-in vehicle has a wireless connection to the driving assistance system 100, its speed may be calculated from its own driving parameters or obtained from its navigation device.
The system according to the present disclosure, in particular the training data set collection unit and the model training unit, may be located on an online server side where the model training is performed; vehicles then acquire the trained model from the online server through wireless communication. The parameters of the reference vehicle and the cut-in critical level values are stored on the online server, and the training data set collection unit obtains them through the online server; for example, the parameters and cut-in critical level values acquired for reference vehicles and cut-in vehicles are uploaded to the online server.
In Fig. 1, the model training unit 120 is configured to train the model using the parameters and critical level values as inputs to a deep neural network. A driving assistance neural network model for predicting the vehicle cut-in critical level is constructed in advance, and this model is trained on a large amount of real historical driving data to obtain a prediction model of the vehicle cut-in critical level. The prediction model can then accurately predict the cut-in critical level from the physical and distance parameters of the current vehicle and the target vehicle. The parameters of the reference vehicle and the cut-in vehicle serve as the inputs of the deep neural network during training; given sufficient training samples, the accuracy reaches an adequate level. In a preferred embodiment, the training data set collection unit and the model training unit are located on the online server side.
In an exemplary embodiment, Fig. 2 shows a schematic diagram of the neural network. As shown in Fig. 2, the following data are illustratively collected for 2000 cut-in events: the parameters of the current vehicle and the target vehicle (the speeds, accelerations, heading angles, yaw angles, and distance parameters described above) together with the critical level. A deep neural network (DNN) is a discriminative model with at least one hidden layer that can be trained using the backpropagation algorithm. As shown in Fig. 2, a neural network that may be used by the model training unit 120 includes an input layer, hidden layers, and an output layer. The parameters of the reference vehicle and the cut-in vehicle collected by the training data set collection unit 110, together with the cut-in critical levels, are fed to the input layer. In an example embodiment, the hidden layers may include a convolutional layer, a rectified linear unit (ReLU) layer, a max-pooling layer, a fully connected layer, a batch normalization (BN) layer, a dropout layer, a recurrent neural network encoding layer, and a recurrent neural network decoding layer. In one embodiment, the recurrent encoding and decoding layers may be implemented with long short-term memory (LSTM) networks: the part from the convolutional layer through the recurrent encoding layer can be regarded as an encoding process, and the LSTM following the encoding layer serves as the recurrent decoding layer, which can be regarded as decoding the encoded result. The convolutional layer, ReLU layer, and max-pooling layer form the convolutional part of the network, while the fully connected layer, BN layer, and dropout layer form the fully connected part.
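As a greatly simplified stand-in for the architecture above (the convolutional, pooling, BN, dropout, and LSTM encoder/decoder layers are omitted), the following NumPy sketch trains a single ReLU hidden layer by backpropagation on the 15 input parameters. The layer sizes, learning rate, and regression formulation are illustrative assumptions, not the disclosed design:

```python
import numpy as np

N_FEATURES = 15   # 6 reference-vehicle + 6 cut-in-vehicle + 3 distance parameters
N_HIDDEN = 32

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.1, (N_FEATURES, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.1, (N_HIDDEN, 1))
b2 = np.zeros(1)

def predict(x):
    """Forward pass: ReLU hidden layer, linear output (predicted critical level)."""
    h = np.maximum(x @ W1 + b1, 0.0)
    return h @ W2 + b2

def train_step(x, y, lr=0.1):
    """One backpropagation step on a mini-batch, minimising mean squared error."""
    global W1, b1, W2, b2
    h_pre = x @ W1 + b1
    h = np.maximum(h_pre, 0.0)
    err = (h @ W2 + b2) - y                    # dLoss/dPrediction for 0.5*MSE
    grad_W2 = h.T @ err / len(x)
    grad_b2 = err.mean(axis=0)
    d_hidden = (err @ W2.T) * (h_pre > 0.0)    # gradient gated by the ReLU
    grad_W1 = x.T @ d_hidden / len(x)
    grad_b1 = d_hidden.mean(axis=0)
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    return float((err ** 2).mean())
```

Repeated calls to `train_step` on collected (parameters, critical level) pairs drive the loss down, which is the backpropagation process the classification G06N3/084 refers to.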
Existing model training methods may be used: the preset neural network model is trained with the training data set to obtain the prediction model of the vehicle cut-in critical level.
In one example, the preset neural network model is trained with the training data set as follows. The parameters of the reference vehicle and the cut-in vehicle and the critical levels of the cut-ins are collected and parameterized; one part of the parameterized data set is used as the training set for the neural network, while the other part is used as a test set to validate the model during training. When the accuracy meets the requirement, does not fluctuate after the test set is randomly reshuffled, and no overfitting occurs, the training is considered to have reached the standard. As the number of training samples grows, the neural network can be updated in real time.
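The split described above can be sketched as follows; the 80/20 ratio and the function name are illustrative assumptions:

```python
import random

def split_dataset(samples, train_frac=0.8, seed=42):
    """Randomly split parameterised cut-in events into training and test sets.

    `samples` is a list of (feature_vector, critical_level) pairs; a fixed
    seed keeps the split reproducible when the accuracy is re-checked.
    """
    shuffled = list(samples)
    random.Random(seed).shuffle(shuffled)
    n_train = int(len(shuffled) * train_frac)
    return shuffled[:n_train], shuffled[n_train:]
```

Re-splitting with a different seed and re-evaluating approximates the check that the accuracy "does not fluctuate after the test set is randomly reshuffled".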
In Fig. 1, the execution unit 130 is configured to perform driving assistance using the trained model. For example, the trained model is sent to a vehicle in need thereof, which calculates the cut-in critical level value of a target vehicle based on its parameters and the trained model. In one example, the vehicle in need thereof includes an information acquisition unit and a prediction unit: the information acquisition unit is configured to acquire the parameters of the current vehicle and the target vehicle, and the prediction unit is configured to calculate the cut-in critical level value of the target vehicle based on those parameters and the trained model. The parameters of the current vehicle and the target vehicle are as described above for the reference vehicle and the cut-in vehicle. The information acquisition unit and the prediction unit are located on the current vehicle, which acquires the trained model from the online server through wireless communication. On a vehicle equipped with the system, cut-in critical levels can be calculated for all cut-in vehicles.
In this embodiment, the parameters of the current vehicle and the target vehicle are input into the trained neural network model, which outputs a value of the vehicle cut-in critical level, for example a value from 0 to 5, where a larger value indicates a higher critical level and a greater degree of risk. In one example, the prediction unit displays or announces the prediction to the driver for reference, for example through an in-vehicle display or speaker. Alternatively, the prediction is used to guide the automatic driving of the current vehicle. In one embodiment, the predicted and actual vehicle cut-in critical levels are wirelessly transmitted back to the training data set collection unit 110 so that the model can be further optimized in the model training unit 120.
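The prediction unit's use of the model output can be sketched as follows; the warning thresholds and action names are illustrative assumptions, not part of the disclosure:

```python
def assist(current_params, target_params, model):
    """Build the input vector, query the trained model, and act on the level.

    `model` is any callable returning a 0-5 critical level; the thresholds
    separating monitoring, warning, and intervention are assumptions.
    """
    features = list(current_params) + list(target_params)
    level = model(features)
    if level >= 4:
        return "intervene"   # guide automatic driving (e.g. ease off, brake)
    if level >= 2:
        return "warn"        # show or announce the prediction to the driver
    return "monitor"         # no action needed yet
```

A level of 5 thus triggers intervention, a mid-range level a driver warning, and a low level only continued monitoring.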
A driving assistance method based on predicting the vehicle cut-in critical level according to an embodiment of the present disclosure is described below with reference to the drawings. Fig. 3 is a flowchart of a driving assistance method S100 based on predicting the vehicle cut-in critical level according to an embodiment of the present disclosure. The method S100 is performed by the system 100 described above.
As shown in Fig. 3, in step S110, the parameters and cut-in critical level values of the reference vehicle and the cut-in vehicle are collected. In step S120, these parameters and critical level values are used as inputs to the deep neural network to train the model. The parameters, the cut-in critical level values, and the deep neural network have been detailed above; see in particular the description of the training data set collection unit 110 and the model training unit 120.
In step S130, driving assistance is performed using the trained model. In one example, a cut-in critical level value for the target vehicle is calculated based on the parameters of the current vehicle and the target vehicle and the trained model. These parameters and the cut-in critical level value of the target vehicle are described in detail above and are not repeated here. With the method of the present disclosure, the cut-in critical level of a target vehicle can be computed algorithmically from the parameters of the current vehicle and the target vehicle, allowing the current vehicle to react earlier.
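One way step S130 might act on the predicted level is a simple mapping from level to reaction. The thresholds and action names below are purely illustrative assumptions, not part of the patent, which leaves the reaction (display, voice broadcast, or automated-driving guidance) open:

```python
# Hypothetical mapping from predicted cut-in critical level (1-5) to a
# driver-assistance reaction; thresholds and action names are illustrative.
def assist_action(level: int) -> str:
    if not 1 <= level <= 5:
        raise ValueError("cut-in critical level must be in 1..5")
    if level <= 2:
        return "monitor"          # low risk: keep tracking the target vehicle
    if level == 3:
        return "warn_driver"      # medium risk: in-vehicle display or speaker
    return "precharge_brakes"     # high risk: prepare or begin deceleration
```

Separating prediction from reaction like this lets the same trained model serve both driver warning and automated-driving guidance.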
The method of the present disclosure may be implemented by a computer program that processes the parameters and cut-in critical level values of the reference vehicle and the cut-in vehicle collected in step S110, trains the deep neural network model, and calculates the cut-in critical level of the target vehicle based on the parameters of the current vehicle and the target vehicle and the trained model. Accordingly, the present disclosure also encompasses a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method described in the embodiments of the present disclosure.
It will be understood by those skilled in the art that the division and order of the steps in the driving assistance method based on predicting the vehicle cut-in critical level are merely illustrative and not restrictive, and that various omissions, additions, substitutions, modifications and changes may be made without departing from the spirit and scope of the present disclosure as set forth in the appended claims and their equivalents.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations are described; however, any combination of these technical features that involves no contradiction should be considered within the scope of this specification.
While the present disclosure has been described in connection with embodiments, it is to be understood by those skilled in the art that the foregoing description and drawings are merely illustrative and not restrictive of the disclosed embodiments. Various modifications and variations are possible without departing from the spirit of the disclosure.
Claims (13)
1. A driving assistance system based on predicting a vehicle cut-in critical level, the driving assistance system comprising:
a training data set collection unit configured to collect parameters of a reference vehicle and a cut-in vehicle and cut-in critical level values;
a model training unit configured to train a model using the parameters and the critical level values as inputs to a deep neural network; and
an execution unit configured to perform driving assistance using the trained model.
2. The driving assistance system of claim 1, wherein the parameters of the reference vehicle and the cut-in vehicle include: the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle and yaw angle of the reference vehicle; the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle and yaw angle of the cut-in vehicle; and the lateral distance and longitudinal distance between the reference vehicle and the cut-in vehicle, and the closest distance from the reference vehicle to the travel lane of the cut-in vehicle.
3. The driving assistance system according to claim 2, wherein the parameters of the reference vehicle and the cut-in vehicle are obtained by sensors provided on the reference vehicle or on road infrastructure.
4. The driving assistance system according to claim 3, wherein the sensor is any one of, or a combination of any plurality of, an imaging device, a lidar, a millimeter-wave radar and an ultrasonic sensor; preferably an imaging device.
5. The driving assistance system according to claim 1 or 2, wherein the execution unit is configured to send the trained model to a vehicle in need thereof.
6. The driving assistance system of claim 5, wherein the vehicle in need thereof calculates a cut-in critical level value of a target vehicle based on its own parameters, the parameters of the target vehicle, and the trained model.
7. A driving assistance method based on predicting a vehicle cut-in critical level, the method comprising:
(1) collecting parameters of a reference vehicle and a cut-in vehicle and cut-in critical level values;
(2) training a model using the parameters of the reference vehicle and the cut-in vehicle and the cut-in critical level values as inputs to a deep neural network; and
(3) performing driving assistance using the trained model.
8. The driving assistance method according to claim 7, wherein in (1), the parameters of the reference vehicle and the cut-in vehicle include: the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle and yaw angle of the reference vehicle; the lateral speed, longitudinal speed, lateral acceleration, longitudinal acceleration, heading angle and yaw angle of the cut-in vehicle; and the lateral distance and longitudinal distance between the reference vehicle and the cut-in vehicle, and the closest distance from the reference vehicle to the travel lane of the cut-in vehicle.
9. The driving assistance method according to claim 8, wherein in (1), the parameters of the reference vehicle and the cut-in vehicle are acquired by sensors provided on the reference vehicle or on road infrastructure.
10. The driving assistance method according to claim 9, wherein the sensor is any one of, or a combination of any plurality of, an imaging device, a lidar, a millimeter-wave radar and an ultrasonic sensor; preferably an imaging device.
11. The driving assistance method according to claim 7 or 8, wherein (3) comprises sending the trained model to a vehicle in need thereof.
12. The driving assistance method according to claim 11, wherein the vehicle in need thereof calculates a cut-in critical level value of a target vehicle based on its own parameters, the parameters of the target vehicle, and the trained model.
13. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 7-12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010544873.4A CN113808382A (en) | 2020-06-15 | 2020-06-15 | Auxiliary driving system and method based on vehicle cut-in critical level prediction |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113808382A true CN113808382A (en) | 2021-12-17 |
Family
ID=78944223
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108216237A (en) * | 2016-12-16 | 2018-06-29 | 现代自动车株式会社 | For controlling the device and method of the autonomous driving of vehicle |
CN108428343A (en) * | 2018-05-17 | 2018-08-21 | 长沙理工大学 | A kind of more vehicle driving behavior analysis and danger early warning method and system |
CN109299784A (en) * | 2018-08-01 | 2019-02-01 | 广州大学 | A kind of auxiliary driving method neural network based, device and readable storage medium storing program for executing |
CN109808706A (en) * | 2019-02-14 | 2019-05-28 | 上海思致汽车工程技术有限公司 | Learning type assistant driving control method, device, system and vehicle |
CN111081065A (en) * | 2019-12-13 | 2020-04-28 | 北京理工大学 | Intelligent vehicle collaborative lane change decision model under road section mixed traveling condition |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20211217 |