CN108470460B - Peripheral vehicle behavior identification method based on smart phone and RNN - Google Patents


Info

Publication number
CN108470460B
Authority
CN
China
Prior art keywords
vehicle
behavior
rnn
relative
vehicles
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810320788.2A
Other languages
Chinese (zh)
Other versions
CN108470460A (en)
Inventor
蔡英凤
朱南楠
张云顺
孙晓强
陈龙
梁军
王海
储小军
何友国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu University
Original Assignee
Jiangsu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu University filed Critical Jiangsu University
Priority to CN201810320788.2A priority Critical patent/CN108470460B/en
Publication of CN108470460A publication Critical patent/CN108470460A/en
Application granted granted Critical
Publication of CN108470460B publication Critical patent/CN108470460B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096791 Systems involving transmission of highway information, e.g. weather, speed limits where the origin of the information is another vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24147 Distances to closest patterns, e.g. nearest neighbour classification
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72457 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Abstract

The invention discloses a peripheral vehicle behavior identification method based on a smart phone and an RNN (recurrent neural network), belonging to the field of intelligent driving and comprising the following steps: a. An off-line training link: typical surrounding-vehicle behaviors are summarized and categorized, and the relative features of the tracked vehicle and the host vehicle, together with the vehicle behaviors, are vector-coded using vehicle driving data collected by the smart phone, forming a training set for RNN parameter learning. b. An online detection link: based on the real-time traffic scene, the host vehicle combines, via 4G communication, the driving data of the tracked vehicle with its own data to form a new feature matrix, which is fed to the trained RNN to distinguish the behavior pattern of the surrounding vehicle. The smart phone serves as the hardware for data collection and inter-vehicle communication, which is feasible and convenient; by exploiting the RNN's strength in high-dimensional matrix operations, the relative features of the host vehicle and surrounding vehicles are enriched, the recognition rate is improved, and high real-time performance of behavior recognition is maintained.

Description

Peripheral vehicle behavior identification method based on smart phone and RNN
Technical Field
The invention belongs to the field of intelligent driving of vehicles, and particularly relates to a peripheral vehicle behavior identification method based on a smart phone and an RNN.
Background
In recent years, vehicle behavior recognition has been shifting from recognition by fixed-position monitoring systems toward recognition of surrounding-vehicle behavior performed on board a moving vehicle. The key to behavior recognition is to learn the behavior patterns of vehicles, establish a behavior recognition model, and then recognize, or even predict, vehicle behavior with the trained model.
To provide a training set for the surrounding-vehicle behavior recognition model, vehicle driving-state information is collected with smart-phone sensors, and a vehicle-to-vehicle communication link is established over a 4G network. At present there is little research on using smart phones for state collection and information transmission in intelligent vehicles. The foreign scholar Mucahit Karaduman used the GPS (global positioning system), accelerometer, and gyroscope of a smart phone as sensors for acquiring vehicle driving information and classified vehicle driving tracks with an HMM (hidden Markov model), achieving good results.
For peripheral vehicle behavior identification, existing solutions either actively acquire the driving data of surrounding vehicles with radar and cameras, or passively receive data through wireless vehicle-to-vehicle (V2V) transmission; both require additional expensive hardware. A smart phone, by contrast, acquires rich information without posing a serious cost obstacle to researchers, and its built-in GPS and inertial sensors conveniently provide positioning and speed measurement. Using the smart phone as the hardware for data acquisition and information transmission is therefore highly feasible and convenient.
In vehicle behavior recognition modeling, the traditional approach generally adopts an HMM, i.e. a hidden Markov model. Although an HMM meets the sequence-modeling requirements of vehicle behavior well, its classification capability is poor, and the observation sequence it takes as input can cover only quite limited vehicle driving features, so its misrecognition rate remains high. RNNs, recurrent neural networks, were proposed as early as the end of the 1990s but were not widely used until the recent surge of deep learning, particularly in the field of natural language processing (NLP), where RNNs have driven dramatic progress. In English sentence sentiment recognition, for example, a multi-input single-output RNN model is often used: each word of the sentence is one of multiple inputs, and the RNN outputs one sentiment class (happiness, anger, sadness, or joy) for the whole sentence. Inspired by this research, the invention encodes the relative features of the vehicle driving process as feature vectors, in the manner of word embedding in natural language processing, exploits the RNN's support for high-dimensional matrix operations, and uses Softmax as the classifier for the multi-class problem, thereby enriching the relative driving features of vehicles while effectively improving the accuracy and real-time performance of the recognition algorithm.
Disclosure of Invention
The invention provides a vehicle behavior identification method, which can accurately identify the behaviors of surrounding vehicles and provide a reference basis for the trajectory planning of intelligent vehicles.
The purpose of the invention is realized by the following technical scheme:
a peripheral vehicle behavior identification method based on a smart phone and an RNN is characterized by comprising the following steps:
Step 1, an off-line training link: summarize and categorize typical surrounding-vehicle behaviors; based on real-time traffic scenes, collect driving data of surrounding vehicles with smart phones placed at a specific position in each vehicle; encode the relative features of the host vehicle and a surrounding vehicle at the same moment as a feature vector, represent a complete vehicle behavior with a feature matrix, and manually mark the corresponding behavior as a label vector; use the collected and labeled relative feature data of surrounding vehicles (the feature matrices and their corresponding label vectors) as the input for RNN parameter learning, and update the model parameters;
Step 2, an online detection link: the tracked target vehicle transmits its collected driving information in real time, via its smart phone, to the smart phone on the host vehicle; the host vehicle combines the relative features of the two vehicles into a new feature matrix and uses the trained RNN to distinguish the behavior pattern of the tracked vehicle.
Typical surrounding-vehicle behaviors are summarized and categorized in step 1 as follows: front vehicle: braking; rear vehicle: following; left-side vehicle: lane change, overtaking, merging; right-side vehicle: lane change, overtaking, merging.
In step 1, driving data of surrounding vehicles are collected, based on real-time traffic scenes, with smart phones placed at a specific position in each vehicle, specifically: the smart phone is placed horizontally so that its plane is parallel to the plane of the vehicle's lateral and longitudinal axes, and the vehicle feature data are computed from the data it collects; that is, the smart-phone GPS establishes a unified map coordinate system for the tracked target vehicle and the host vehicle, a customized APP estimates the host vehicle's speed from the GPS history, the smart-phone gyroscope records the changing angle of the vehicle's longitudinal axis in the map coordinate system, and the accelerometer collects vehicle acceleration information.
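The patent does not detail how the customized APP estimates speed from GPS history; a minimal sketch, assuming timestamped (lat, lon) fixes and a great-circle distance helper (all names hypothetical):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (degrees)."""
    r = 6371000.0  # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_from_gps(prev_fix, curr_fix):
    """Estimate speed (m/s) from two timestamped fixes (lat, lon, t_seconds)."""
    (lat1, lon1, t1), (lat2, lon2, t2) = prev_fix, curr_fix
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("fixes must be time-ordered")
    return haversine_m(lat1, lon1, lat2, lon2) / dt
```

A fix 0.001° of latitude away (about 111 m) ten seconds later yields roughly 11 m/s; a real APP would additionally smooth over several fixes to suppress GPS noise.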
In the step 1, the relative features of the host vehicle and the surrounding vehicles at the same time are coded in a feature vector mode, and the method specifically comprises the following steps:
Step 1.1: define the trajectory-point coordinates of the tracked vehicle and the host vehicle at time t as (xp_t, yp_t) and (xh_t, yh_t), their speeds as up_t and uh_t, their accelerations as ap_t and ah_t, and the angles between each vehicle's longitudinal axis and the positive Y half-axis of the map coordinate system as αp_t and αh_t; the lateral and longitudinal relative distances between the two vehicles are Δx_t = xp_t − xh_t and Δy_t = yp_t − yh_t, the relative speed is Δu_t = up_t − uh_t, the relative acceleration is Δa_t = ap_t − ah_t, the relative angle of the vehicle longitudinal axes is Δα_t = |αp_t − αh_t|, and the angle between the bisector of the longitudinal-axis angle and the positive Y half-axis of the map coordinate system is β_t = αh_t + (αp_t − αh_t)/2;
Step 1.2: collect a large amount of data, remove outliers through statistical analysis, partition the value intervals of the six features from step 1.1 (lateral relative distance, longitudinal relative distance, relative speed, relative acceleration, relative longitudinal-axis angle, and the angle of the longitudinal-axis bisector), and tune the partitions so that the intervals are representative. Let the resulting numbers of intervals be n1, n2, n3, n4, n5 and n6 respectively; then a feature vector x<t> of dimension N = n1 + n2 + n3 + n4 + n5 + n6 represents the relative attributes between the tracked vehicle and the host vehicle at a given moment. An element of the feature vector is set to 1 when the corresponding feature falls in that interval and to 0 otherwise, so there are D = n1 × n2 × n3 × n4 × n5 × n6 possible feature vectors in total.
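The interval-based one-hot coding of step 1.2 can be sketched as follows; the interval boundaries below are hypothetical placeholders, since the actual partitions come from the statistical analysis described above:

```python
import numpy as np

# Hypothetical interval boundaries for the six relative features; in the patent
# these come from statistical analysis of collected driving data (step 1.2).
FEATURE_BINS = {
    "dx":     [-6.0, -3.0, 0.0, 3.0, 6.0],    # lateral relative distance, m
    "dy":     [-20.0, -5.0, 0.0, 5.0, 20.0],  # longitudinal relative distance, m
    "du":     [-10.0, 0.0, 10.0],             # relative speed, km/h
    "da":     [-3.0, 0.0, 3.0],               # relative acceleration, m/s^2
    "dalpha": [10.0, 30.0, 60.0],             # relative longitudinal-axis angle, deg
    "beta":   [45.0, 90.0, 135.0],            # bisector angle, deg
}

def encode(features):
    """One-hot encode the six relative features into one N-dimensional x<t>.

    A feature with k boundaries yields k+1 intervals, so N is the sum of the
    interval counts (n1 + ... + n6 in the patent's notation); exactly one
    element per feature is 1, so each x<t> contains six ones.
    """
    parts = []
    for name, bounds in FEATURE_BINS.items():
        idx = np.searchsorted(bounds, features[name], side="right")
        one_hot = np.zeros(len(bounds) + 1)
        one_hot[idx] = 1.0
        parts.append(one_hot)
    return np.concatenate(parts)
```

With these placeholder bins N = 6 + 6 + 4 + 4 + 4 + 4 = 28; the embodiment later in the description uses much finer partitions.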
In step 1 a feature matrix can represent a complete vehicle behavior, specifically: if a surrounding vehicle takes T time steps to complete a behavior, the complete behavior can be represented by the N × T feature matrix formed from the T N-dimensional feature vectors.
The corresponding behavior in step 1 is manually marked in the form of a label vector, specifically: an 8-dimensional label column vector y<t> represents the surrounding-vehicle behavior; its elements 0 to 7 correspond to the 8 surrounding-vehicle behaviors summarized above, the element matching the behavior is set to 1, and the remaining elements are set to 0.
The RNN in step 1 is specifically:
RNN is a recurrent neural network; the invention adopts a multi-input single-output recurrent neural network, whose structure is shown in Fig. 2 and simplified structure in Fig. 3. Here x<t> is the feature vector input to the hidden layer at the corresponding time step; the hidden layer also receives the previous hidden activation value a<t-1>, where a<0> is generally initialized directly to the zero vector, and the network finally outputs the prediction ŷ<t>. The input, activation, and output have corresponding weight matrices Wax, Waa, and Way, where Waa weights the previous step's activation value when computing the current one.
As in the propagation process of fig. 2 and 3, there are:
a<0> = 0 (1)
a<t> = g1(Waa a<t-1> + Wax x<t> + ba) (2)
ŷ<t> = g2(Way a<t> + by) (3)
where ba and by are two bias parameters; the activation function g1 is tanh, and g2 is Softmax regression, which handles the multi-class problem. The activation functions add a nonlinear factor to the model, overcoming the limitation that a linear model can only separate two classes.
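A minimal NumPy sketch of this forward propagation (zero-initialized a<0>, tanh hidden update, Softmax output over the 8 behavior classes); the dimensions and names are illustrative:

```python
import numpy as np

def softmax(z):
    """Numerically stable Softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn_forward(X, Waa, Wax, Way, ba, by):
    """Multi-input single-output RNN: consume T feature vectors, emit one ŷ.

    X is the N×T feature matrix (one x<t> per column). Implements
    a<0> = 0, a<t> = tanh(Waa a<t-1> + Wax x<t> + ba),
    ŷ = softmax(Way a<T> + by).
    """
    hidden = Waa.shape[0]
    a = np.zeros(hidden)                 # a<0> initialized to the zero vector
    for t in range(X.shape[1]):
        a = np.tanh(Waa @ a + Wax @ X[:, t] + ba)
    return softmax(Way @ a + by)         # distribution over the 8 behaviors
```

The returned vector is a probability distribution, so its argmax selects the recognized behavior class.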
In step 1, the collected and labeled relative feature data of surrounding vehicles (the feature matrices and their corresponding label vectors) are used as the input for RNN parameter learning, and the model parameters are updated, specifically: the feature data in the training set are fed into the initialized RNN model, a cost function is defined by Softmax regression, and gradient descent is iterated to minimize the cost value; the final RNN model is obtained when the iteration finishes.
The specific process of the step 2 comprises the following steps:
Step 2.1: the tracked vehicle and the host vehicle obtain their own driving information in real time using their smart-phone sensors;
Step 2.2: the tracked target vehicle transmits its own vehicle information to the host vehicle in real time through 4G communication;
Step 2.3: combine the host-vehicle information with the target-vehicle information, extract a new feature matrix as the input of the trained RNN, and let the RNN output the prediction ŷ<t>, which determines the behavior class of the surrounding vehicle.
The invention has the following beneficial effects:
(1) the smart phone serves as the hardware for acquiring driving data and transmitting information, and is inexpensive;
(2) the relative attributes of running vehicles are encoded as feature vectors, enriching the obtained feature information;
(3) because the RNN is good at high-dimensional matrix operations, the algorithm remains highly real-time even under very complex operating conditions;
(4) Softmax serves as the classifier of the RNN model and adds a nonlinear factor to it, so the multi-class problem is handled effectively;
(5) the driving data of surrounding vehicles are acquired by passively receiving information, avoiding the influence of traffic conditions and environmental factors on active detection;
(6) using the smart phone as the hardware for data acquisition and information transmission is highly feasible and convenient;
(7) the RNN's support for high-dimensional matrix operations and the Softmax classifier for the multi-class problem enrich the relative driving features of vehicles while effectively improving the accuracy and real-time performance of the recognition algorithm.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a block diagram of an RNN according to the present invention;
FIG. 3 is a simplified block diagram of the RNN of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. The embodiments are implemented on the premise of the technical scheme of the invention; detailed implementation modes and specific operation processes are given, but the protection scope of the invention is not limited to the following embodiments.
Assume that every vehicle participating in vehicle behavior recognition carries a smart phone placed at a specific position; the smart phones collect the required data and communicate with one another, informing each other of vehicle identities and exchanging data. Each vehicle can act as both a tracked vehicle and a host vehicle; once a host vehicle is set, the adjacent vehicles in front, behind, to the left, and to the right are treated as tracked target vehicles. Since adjacent vehicles are close, the maximum communication distance is set to 250 m.
As shown in fig. 1, a method for recognizing behavior of a neighboring vehicle based on a smart phone and an RNN includes:
Step 1, an off-line training link: summarize and categorize typical surrounding-vehicle behaviors; based on real-time traffic scenes, collect driving data of surrounding vehicles with smart phones placed at a specific position in each vehicle; encode the relative features of the host vehicle and a surrounding vehicle at the same moment as a feature vector, so that a complete vehicle behavior can be represented with a feature matrix, and manually mark the corresponding behavior as a label vector; use the collected and labeled relative feature data of surrounding vehicles (feature matrices and their corresponding label vectors) as the input for RNN parameter learning, and update the model parameters;
Step 2, an online detection link: the tracked target vehicle transmits its collected driving information in real time, via its smart phone, to the smart phone on the host vehicle; the host vehicle combines the relative features of the two vehicles into a new feature matrix and uses the trained RNN to distinguish the behavior pattern of the tracked vehicle.
Typical surrounding-vehicle behaviors are summarized and categorized in step 1 as follows: front vehicle: braking; rear vehicle: following; left-side and right-side vehicles: lane change, overtaking, merging.
Step 1 of collecting driving data of surrounding vehicles with smart phones placed at a specific position in each vehicle, based on real-time traffic scenes, is specifically: the smart phone lies parallel to the plane of the vehicle's lateral and longitudinal axes, so that the data it collects can be used to compute the vehicle feature data; that is, the smart-phone GPS establishes a unified map coordinate system for the tracked target vehicle and the host vehicle, a customized APP estimates the host vehicle's speed from the GPS history, the smart-phone gyroscope records the changing angle of the vehicle's longitudinal axis in the map coordinate system, and the accelerometer collects vehicle acceleration information.
In the step 1, the relative features of the host vehicle and the surrounding vehicles at the same time are coded in a feature vector mode, and the method specifically comprises the following steps:
Step 1.1: define the trajectory-point coordinates of the tracked vehicle and the host vehicle at time t as (xp_t, yp_t) and (xh_t, yh_t), their speeds as up_t and uh_t, their accelerations as ap_t and ah_t, and the angles between each vehicle's longitudinal axis and the positive Y half-axis of the map coordinate system as αp_t and αh_t; the lateral and longitudinal relative distances between the two vehicles are Δx_t = xp_t − xh_t and Δy_t = yp_t − yh_t, the relative speed is Δu_t = up_t − uh_t, the relative acceleration is Δa_t = ap_t − ah_t, the relative angle of the vehicle longitudinal axes is Δα_t = |αp_t − αh_t|, and the angle between the bisector of the longitudinal-axis angle and the positive Y half-axis of the map coordinate system is β_t = αh_t + (αp_t − αh_t)/2;
Step 1.2: collect a large amount of data, remove outliers through statistical analysis, partition the value intervals of the six features from step 1.1, and tune the partitions so that the intervals are representative. Let the resulting numbers of intervals for the lateral relative distance, longitudinal relative distance, relative speed, relative acceleration, relative longitudinal-axis angle, and bisector angle be n1, n2, n3, n4, n5 and n6 respectively; then a feature vector x<t> of dimension N = n1 + n2 + n3 + n4 + n5 + n6 can represent the relative attributes between the tracked vehicle and the host vehicle at a given moment. An element of the feature vector is set to 1 when the corresponding feature falls in that interval and to 0 otherwise, giving D = n1 × n2 × n3 × n4 × n5 × n6 possible feature vectors in total. In this embodiment: the lateral relative distance range [−6, 6] is divided evenly into 24 intervals of width 0.5, plus (−∞, −6) and (6, +∞), for 26 intervals in total, in m; the longitudinal relative distance is divided into the 18 intervals (−∞, −50), [−50, −20), [−20, −10), [−10, −5), [−5, −4), [−4, −3), [−3, −2), [−2, −1), [−1, 0), [0, 1), [1, 2), [2, 3), [3, 4), [4, 5), [5, 10), [10, 20), [20, 50), [50, +∞), in m; the relative speed is divided into the 8 intervals (−∞, −20), [−20, −10), [−10, −5), [−5, 0), [0, 5), [5, 10), [10, 20), [20, +∞), in km/h; the relative acceleration is divided into the 6 intervals (−∞, −6), [−6, −3), [−3, 0), [0, 3), [3, 6), [6, +∞), in m/s²; the relative longitudinal-axis angle range [0, 90] is divided evenly into 9 intervals of 10°, in degrees; and the range from the positive Y half-axis to the negative Y half-axis of the map coordinate system is divided evenly into 18 intervals of 10°, which serve as the partition basis for the bisector of the longitudinal-axis angle.
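Under a half-open-interval convention, the embodiment's partitions can be sketched as a one-hot encoder; the handling of values that land exactly on interval edges is an assumption, since the patent does not state which side an endpoint falls on:

```python
import numpy as np

# Interval boundaries taken from the embodiment in the description; each
# feature with k boundaries yields k+1 intervals under [a, b) semantics.
BINS = [
    np.arange(-6.0, 6.5, 0.5),                      # lateral distance: 26 intervals, m
    np.array([-50, -20, -10, -5, -4, -3, -2, -1, 0,
              1, 2, 3, 4, 5, 10, 20, 50], float),   # longitudinal: 18 intervals, m
    np.array([-20, -10, -5, 0, 5, 10, 20], float),  # relative speed: 8 intervals, km/h
    np.array([-6, -3, 0, 3, 6], float),             # relative accel: 6 intervals, m/s^2
    np.arange(10.0, 90.0, 10.0),                    # axis angle in [0, 90]: 9 intervals, deg
    np.arange(10.0, 180.0, 10.0),                   # bisector in [0, 180): 18 intervals, deg
]

N = sum(len(b) + 1 for b in BINS)  # 26 + 18 + 8 + 6 + 9 + 18 = 85

def encode_embodiment(dx, dy, du, da, dalpha, beta):
    """Build the N-dimensional one-hot feature vector x<t> of the embodiment."""
    x = np.zeros(N)
    offset = 0
    for bounds, value in zip(BINS, (dx, dy, du, da, dalpha, beta)):
        # searchsorted with side="right" selects the half-open interval [a, b)
        x[offset + np.searchsorted(bounds, value, side="right")] = 1.0
        offset += len(bounds) + 1
    return x
```

Each encoded vector has exactly six ones, one per feature, and dimension N = 85 by the interval counts above.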
In step 1 a feature matrix can represent a complete vehicle behavior; specifically, if a surrounding vehicle takes T time steps to complete a behavior, the complete behavior can be represented by the N × T feature matrix formed from the T N-dimensional feature vectors.
The corresponding behavior in step 1 is manually marked in the form of a label vector; specifically, an 8-dimensional label column vector y<t> represents the surrounding-vehicle behavior: its elements 0 to 7 correspond to the 8 surrounding-vehicle behaviors summarized above, the element matching the behavior is set to 1, and the remaining elements are set to 0.
The RNN in step 1 is specifically:
RNN (recurrent neural network): the invention adopts a multi-input single-output recurrent neural network, whose structure is shown in Fig. 2 and simplified structure in Fig. 3. Here x<t> is the feature vector input to the hidden layer at the corresponding time step; the hidden layer also receives the previous hidden activation value a<t-1>, where a<0> is generally initialized directly to the zero vector, and the network finally outputs the prediction ŷ<t>. The input, activation, and output have corresponding weight matrices Wax, Waa, and Way, where Waa weights the previous step's activation value when computing the current one.
As in the propagation process of fig. 2 and 3, there are:
a<0> = 0 (1)
a<t> = g1(Waa a<t-1> + Wax x<t> + ba) (2)
ŷ<t> = g2(Way a<t> + by) (3)
where ba and by are two bias parameters; the activation function g1 is tanh, and g2 is Softmax regression, which handles the multi-class problem. The activation functions add a nonlinear factor to the model, overcoming the limitation that a linear model can only separate two classes.
In step 1, the collected and labeled relative feature data of surrounding vehicles (feature matrices and their corresponding label vectors) are used as the input for RNN parameter learning, and the model parameters are updated, specifically: the feature data in the training set are fed into the initialized RNN model, a cost function is defined by Softmax regression, and gradient descent is iterated to minimize the cost value; the final RNN model is obtained when the iteration finishes.
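The exact cost function and update schedule are not spelled out further; a sketch under the assumption of a per-example cross-entropy cost, backpropagation through time, and plain gradient descent without batching:

```python
import numpy as np

def train_step(X, label, params, lr=0.01):
    """One gradient-descent step on the Softmax cross-entropy cost of a single
    (feature-matrix, behavior-label) training pair, via backpropagation
    through time; params = (Waa, Wax, Way, ba, by), updated in place."""
    Waa, Wax, Way, ba, by = params
    T = X.shape[1]
    # forward pass, caching the hidden activations a<0> ... a<T>
    A = [np.zeros(Waa.shape[0])]
    for t in range(T):
        A.append(np.tanh(Waa @ A[-1] + Wax @ X[:, t] + ba))
    z = Way @ A[-1] + by
    p = np.exp(z - z.max()); p /= p.sum()   # Softmax output ŷ
    loss = -np.log(p[label])                # cross-entropy cost
    # backward pass: loss is attached only to the final output
    dz = p.copy(); dz[label] -= 1.0
    dWay, dby = np.outer(dz, A[-1]), dz
    da = Way.T @ dz
    dWaa, dWax, dba = np.zeros_like(Waa), np.zeros_like(Wax), np.zeros_like(ba)
    for t in range(T - 1, -1, -1):
        dh = (1.0 - A[t + 1] ** 2) * da     # tanh' = 1 - tanh^2
        dWaa += np.outer(dh, A[t]); dWax += np.outer(dh, X[:, t]); dba += dh
        da = Waa.T @ dh
    # gradient-descent update
    Waa -= lr * dWaa; Wax -= lr * dWax; Way -= lr * dWay
    ba -= lr * dba; by -= lr * dby
    return loss
```

Repeated calls on the training set drive the cost down; in practice the loop would shuffle examples and monitor a validation cost to decide when iteration finishes.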
The specific process of the step 2 comprises the following steps:
Step 2.1: the tracked vehicle and the host vehicle obtain their own driving information in real time using their smart-phone sensors;
Step 2.2: the tracked target vehicle transmits its own vehicle information to the host vehicle in real time through 4G communication;
Step 2.3: combine the host-vehicle information with the target-vehicle information, extract a new feature matrix as the input of the trained RNN, and let the RNN output the prediction ŷ<t>, which determines the behavior class of the surrounding vehicle.
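The online link can be sketched as a sliding-window detector; the buffer length, the class ordering, and the predictor interface are assumptions for illustration:

```python
from collections import deque
import numpy as np

# Assumed ordering of the 8 behavior classes from the description.
BEHAVIOURS = ["front: braking", "rear: following",
              "left: lane change", "left: overtaking", "left: merging",
              "right: lane change", "right: overtaking", "right: merging"]

class OnlineDetector:
    """Buffer the last T relative-feature vectors received over 4G and
    classify them with a trained predictor (any callable mapping an
    N×T feature matrix to 8 class probabilities)."""

    def __init__(self, predictor, n_features, window):
        self.predictor = predictor
        self.n_features = n_features
        self.buffer = deque(maxlen=window)

    def push(self, x_t):
        """Add one feature vector; return a behavior name once T are buffered."""
        x_t = np.asarray(x_t, float)
        if x_t.shape != (self.n_features,):
            raise ValueError("unexpected feature-vector dimension")
        self.buffer.append(x_t)
        if len(self.buffer) < self.buffer.maxlen:
            return None                        # not enough history yet
        X = np.stack(self.buffer, axis=1)      # N×T feature matrix
        return BEHAVIOURS[int(np.argmax(self.predictor(X)))]
```

Each new 4G message yields one feature vector; once the window fills, every push produces an up-to-date behavior estimate for the tracked vehicle.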
The above detailed description is only a specific account of feasible embodiments of the invention. These embodiments are not intended to limit the protection scope of the invention; equivalent embodiments or modifications that do not depart from its technical spirit are included within that scope.

Claims (9)

1. A peripheral vehicle behavior identification method based on a smart phone and an RNN is characterized by comprising the following steps:
Step 1, an off-line training link: summarize and categorize typical surrounding-vehicle behaviors; based on real-time traffic scenes, collect driving data of surrounding vehicles with smart phones placed at a specific position in each vehicle; encode the relative features of the host vehicle and a surrounding vehicle at the same moment as a feature vector, represent a complete vehicle behavior with a feature matrix, and manually mark the corresponding behavior as a label vector; use the collected and labeled relative feature data of surrounding vehicles as the input for RNN parameter learning, and update the model parameters; the relative feature data of surrounding vehicles comprise the feature matrices and their corresponding label vectors;
step 2, an online detection link: the tracked target vehicle transmits its collected driving information through its smartphone in real time to the smartphone on the host vehicle; the host vehicle generates a new feature matrix from the relative features of the two vehicles, and the trained RNN is used to discriminate the behavior mode of the tracked target vehicle.
2. The method for recognizing the behavior of nearby vehicles based on a smartphone and an RNN as claimed in claim 1, wherein the typical behaviors of nearby vehicles are summarized and classified in step 1, specifically: behavior of the vehicle ahead: braking; behavior of the vehicle behind: following; behaviors of vehicles on the left and right: lane changing, overtaking, and merging.
3. The method for recognizing the behavior of the nearby vehicle based on the smart phone and the RNN as claimed in claim 1, wherein the step 1 of coding the relative features of the host vehicle and the nearby vehicle at the same time in the form of feature vectors specifically comprises the following steps:
step 1.1, defining the coordinates of the tracked target vehicle and the host vehicle track points at time t as (xp_t, yp_t) and (xh_t, yh_t) respectively, their speeds as up_t and uh_t, their accelerations as ap_t and ah_t, and the angles between each vehicle's longitudinal axis and the positive Y half-axis of the map coordinate system as αp_t and αh_t; the lateral and longitudinal relative distances of the two vehicles are Δx_t = xp_t − xh_t and Δy_t = yp_t − yh_t, the relative speed is Δu_t = up_t − uh_t, the relative acceleration is Δa_t = ap_t − ah_t, the relative angle between the vehicles' longitudinal axes is Δα_t = |αp_t − αh_t|, and the angle between the bisector of the longitudinal-axis angle and the positive Y half-axis of the map coordinate system is β_t = αh_t + (αp_t − αh_t)/2;
step 1.2, collecting a large amount of data, removing outliers through statistical analysis, dividing the value ranges of the 6 features in step 1.1 into intervals, and optimizing and adjusting the division so that the intervals are more representative; finally, letting the numbers of intervals for the lateral relative distance, the longitudinal relative distance, the relative speed, the relative acceleration, the relative longitudinal-axis angle, and the angle between the angle bisector and the positive Y half-axis of the map coordinate system be n1, n2, n3, n4, n5, n6 respectively, a feature vector x<t> of dimension N = n1 + n2 + n3 + n4 + n5 + n6 can then be used to represent the relative attributes between the tracked target vehicle and the host vehicle at a given moment; for the elements of the feature vector, an element is set to 1 when the corresponding feature falls within its interval and to 0 otherwise, so there are D = n1 × n2 × n3 × n4 × n5 × n6 possible feature vectors in total.
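The interval-based encoding in step 1.2 concatenates six one-hot segments into one N-dimensional binary vector. The sketch below illustrates this, assuming hypothetical bin edges: the patent leaves the actual interval divisions to statistical analysis of the collected data, so both the edges and the variable names here are illustrative.

```python
import numpy as np

# Hypothetical interval edges for the six relative features (the patent
# determines the real intervals from collected data, not given here).
BIN_EDGES = {
    "dx":     [-5.0, -1.5, 1.5, 5.0],      # lateral relative distance (m)
    "dy":     [-50.0, -10.0, 10.0, 50.0],  # longitudinal relative distance (m)
    "du":     [-5.0, -1.0, 1.0, 5.0],      # relative speed (m/s)
    "da":     [-2.0, -0.5, 0.5, 2.0],      # relative acceleration (m/s^2)
    "dalpha": [5.0, 15.0, 45.0],           # relative longitudinal-axis angle (deg)
    "beta":   [45.0, 135.0, 225.0, 315.0], # bisector angle vs. map Y axis (deg)
}

def encode_feature_vector(dx, dy, du, da, dalpha, beta):
    """Concatenate the one-hot encodings of the six binned relative
    features into a single N-dimensional binary feature vector x<t>."""
    segments = []
    for name, value in [("dx", dx), ("dy", dy), ("du", du),
                        ("da", da), ("dalpha", dalpha), ("beta", beta)]:
        edges = BIN_EDGES[name]
        one_hot = np.zeros(len(edges) + 1)      # n_i intervals per feature
        one_hot[np.searchsorted(edges, value)] = 1.0
        segments.append(one_hot)
    return np.concatenate(segments)             # dimension N = n1 + ... + n6
```

With these edges each feature yields 4-5 intervals, so N = n1 + ... + n6 while the number of distinct possible vectors is the product n1 × ... × n6, matching the claim.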
4. The method as claimed in claim 1, wherein a feature matrix is used in step 1 to represent a complete vehicle behavior, specifically: if the time a surrounding vehicle takes to complete a behavior is T, the complete behavior can be represented by an N × T feature matrix formed from T N-dimensional feature vectors.
5. The method as claimed in claim 4, wherein the corresponding behavior in step 1 is manually labeled in the form of a label vector, specifically: the behavior of the surrounding vehicle is represented by an 8-dimensional label column vector y<t>, whose elements 0-7 correspond to the 8 types of surrounding-vehicle behavior; the element corresponding to the observed behavior is set to 1 and the remaining elements are set to 0.
6. The method for recognizing behaviors of nearby vehicles based on a smartphone and an RNN as claimed in claim 1, wherein the RNN in step 1 is specifically designed as follows:
the RNN adopts a many-to-one (multi-input, single-output) recurrent neural network; the feature vector x<t> is the input to the hidden layer at time step t, and the hidden layer also receives the hidden-layer activation value a<t-1> from the previous step, where a<0> is directly initialized as a zero vector; the network finally outputs the prediction result ŷ; the input, activation, and output have corresponding weight matrices Wax, Waa, Way, where Waa carries the activation value of the previous step into the current step;
the propagation process is as follows:
a<0> = 0
a<t> = g1(Waa a<t-1> + Wax x<t> + ba)
ŷ = g2(Way a<T> + by)
where ba and by are bias parameters; the activation function g1 is tanh and g2 is Softmax regression, the activation functions serving to add nonlinear factors to the model.
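The propagation equations map directly to code. Below is a minimal NumPy sketch of the many-to-one forward pass; the hidden size, sequence length, and random weights are illustrative placeholders, not values from the patent.

```python
import numpy as np

def softmax(z):
    """g2: Softmax regression over the output scores."""
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn_forward(X, Wax, Waa, Way, ba, by):
    """Many-to-one forward pass: X holds x<1>..x<T> as rows; only the
    final hidden state a<T> feeds the output layer."""
    a = np.zeros(Waa.shape[0])                    # a<0> = zero vector
    for x_t in X:
        a = np.tanh(Waa @ a + Wax @ x_t + ba)     # g1 = tanh
    return softmax(Way @ a + by)                  # y_hat over 8 classes

# Illustrative shapes: N-dim inputs, H hidden units, 8 output classes
rng = np.random.default_rng(0)
N, H, C, T = 29, 16, 8, 40
y_hat = rnn_forward(rng.standard_normal((T, N)),
                    rng.standard_normal((H, N)) * 0.1,
                    rng.standard_normal((H, H)) * 0.1,
                    rng.standard_normal((C, H)) * 0.1,
                    np.zeros(H), np.zeros(C))
```

The returned ŷ is a probability vector over the 8 behavior classes; its largest element identifies the predicted behavior.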
7. The method for recognizing behaviors of the nearby vehicle based on a smartphone and an RNN according to claim 1, wherein the collected and labeled relative feature data of the nearby vehicles are used as the input for RNN parameter learning in step 1 to update the model parameters, specifically: feeding the feature data of the training set into the initialized RNN model, defining the cost function via Softmax regression, and iterating with gradient descent to minimize the cost value; the final RNN model is obtained when the iteration completes.
8. The method for recognizing the behavior of the nearby vehicle based on a smartphone and an RNN as claimed in claim 1, wherein the specific process of step 2 comprises the following steps:
step 2.1, acquiring running information of the tracked target vehicle and the main vehicle in real time by using a smart phone sensor;
step 2.2, the tracked target vehicle transmits its own vehicle information to the host vehicle in real time through wireless transmission;
step 2.3, combining the host vehicle's information with the target vehicle's information, extracting a new feature matrix as the input to the trained RNN; the RNN outputs the prediction result ŷ, which determines the behavior type of the surrounding vehicle.
9. The method for recognizing behaviors of nearby vehicles based on a smartphone and an RNN according to claim 8, wherein the wireless transmission in step 2.2 is performed over a 4G network.
CN201810320788.2A 2018-04-11 2018-04-11 Peripheral vehicle behavior identification method based on smart phone and RNN Active CN108470460B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810320788.2A CN108470460B (en) 2018-04-11 2018-04-11 Peripheral vehicle behavior identification method based on smart phone and RNN

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810320788.2A CN108470460B (en) 2018-04-11 2018-04-11 Peripheral vehicle behavior identification method based on smart phone and RNN

Publications (2)

Publication Number Publication Date
CN108470460A CN108470460A (en) 2018-08-31
CN108470460B true CN108470460B (en) 2020-08-28

Family

ID=63263155

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810320788.2A Active CN108470460B (en) 2018-04-11 2018-04-11 Peripheral vehicle behavior identification method based on smart phone and RNN

Country Status (1)

Country Link
CN (1) CN108470460B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109284701A (en) * 2018-09-06 2019-01-29 南京威尔思汽车部件科技有限公司 A kind of driving recognition methods based on regional correlation
CN109886304B (en) * 2019-01-22 2023-09-29 江苏大学 HMM-SVM double-layer improved model-based surrounding vehicle behavior recognition method under complex road conditions
CN109727490B (en) * 2019-01-25 2021-10-12 江苏大学 Peripheral vehicle behavior self-adaptive correction prediction method based on driving prediction field
CN109948654B (en) * 2019-02-15 2020-09-15 山东师范大学 Automobile running state identification method, system and equipment based on user behavior data

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8705849B2 (en) * 2008-11-24 2014-04-22 Toyota Motor Engineering & Manufacturing North America, Inc. Method and system for object recognition based on a trainable dynamic system
CN105719313B (en) * 2016-01-18 2018-10-23 青岛邃智信息科技有限公司 A kind of motion target tracking method based on intelligent real-time video cloud
CN107133974B (en) * 2017-06-02 2019-08-27 南京大学 Gaussian Background models the vehicle type classification method combined with Recognition with Recurrent Neural Network
CN106990714A (en) * 2017-06-05 2017-07-28 李德毅 Adaptive Control Method and device based on deep learning
CN107492251B (en) * 2017-08-23 2020-02-14 武汉大学 Driver identity recognition and driving state monitoring method based on machine learning and deep learning
CN107563332A (en) * 2017-09-05 2018-01-09 百度在线网络技术(北京)有限公司 For the method and apparatus for the driving behavior for determining unmanned vehicle

Also Published As

Publication number Publication date
CN108470460A (en) 2018-08-31

Similar Documents

Publication Publication Date Title
CN108470460B (en) Peripheral vehicle behavior identification method based on smart phone and RNN
CN110007675B (en) Vehicle automatic driving decision-making system based on driving situation map and training set preparation method based on unmanned aerial vehicle
CN109145939B (en) Semantic segmentation method for small-target sensitive dual-channel convolutional neural network
EP4152204A1 (en) Lane line detection method, and related apparatus
CN112700470B (en) Target detection and track extraction method based on traffic video stream
CN112347993B (en) Expressway vehicle behavior and track prediction method based on vehicle-unmanned aerial vehicle cooperation
CN107967486B (en) Method for recognizing behaviors of surrounding vehicles
CN112750150B (en) Vehicle flow statistical method based on vehicle detection and multi-target tracking
CN111310583A (en) Vehicle abnormal behavior identification method based on improved long-term and short-term memory network
CN111837156A (en) Vehicle weight recognition techniques utilizing neural networks for image analysis, viewpoint-aware pattern recognition, and generation of multi-view vehicle representations
JP2019527832A (en) System and method for accurate localization and mapping
JP2021515724A (en) LIDAR positioning to infer solutions using 3DCNN network in self-driving cars
CN109727490B (en) Peripheral vehicle behavior self-adaptive correction prediction method based on driving prediction field
CN112734808B (en) Trajectory prediction method for vulnerable road users in vehicle driving environment
CN112001378B (en) Lane line processing method and device based on feature space, vehicle-mounted terminal and medium
CN114023062A (en) Traffic flow information monitoring method based on deep learning and edge calculation
Wang et al. End-to-end self-driving approach independent of irrelevant roadside objects with auto-encoder
Ma et al. Deconvolution Feature Fusion for traffic signs detection in 5G driven unmanned vehicle
CN113076988B (en) Mobile robot vision SLAM key frame self-adaptive screening method based on neural network
CN114067142A (en) Method for realizing scene structure prediction, target detection and lane level positioning
CN104200226A (en) Particle filtering target tracking method based on machine learning
CN113804182A (en) Grid map creating method based on information fusion
CN111562605B (en) Self-adaptive GPS error observed value identification method
CN106650814B (en) Outdoor road self-adaptive classifier generation method based on vehicle-mounted monocular vision
WO2023179593A1 (en) Data processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant