CN114735010B - Intelligent vehicle running control method and system based on emotion recognition and storage medium - Google Patents

Intelligent vehicle running control method and system based on emotion recognition and storage medium

Info

Publication number
CN114735010B
CN114735010B (application CN202210541021.9A)
Authority
CN
China
Prior art keywords
vehicle
information
driver
driving
motion state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210541021.9A
Other languages
Chinese (zh)
Other versions
CN114735010A (en)
Inventor
王红钢
彭勇
王兴华
向国梁
郑孟
向飞
伍贤辉
许倩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Central South University
Original Assignee
Central South University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Central South University filed Critical Central South University
Priority to CN202210541021.9A priority Critical patent/CN114735010B/en
Publication of CN114735010A publication Critical patent/CN114735010A/en
Application granted granted Critical
Publication of CN114735010B publication Critical patent/CN114735010B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W40/09 Driving style or behaviour
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097 Predicting future conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0872 Driver physiology

Abstract

The invention relates to the technical field of intelligent traffic, and discloses an intelligent vehicle driving control method, system and storage medium based on emotion recognition.

Description

Intelligent vehicle running control method and system based on emotion recognition and storage medium
Technical Field
The invention relates to the technical field of intelligent traffic, in particular to an intelligent vehicle driving control method and system based on emotion recognition and a storage medium.
Background
On the one hand, the popularization of automobiles has made travel more convenient and efficient; on the other hand, it has also made road traffic accidents more frequent, posing a great threat to life and property. With the development of intelligent vehicle technology, driving assistance technology provides a new approach to solving this problem.
Driving behavior prediction is a key element of vehicle driving assistance technology. Emotion is a psychological response of the driver to the traffic environment; during driving, the driver may feel anger, disgust, excitement and the like, which can adversely affect the driver's perception, decision-making and operation. Therefore, accurately identifying the driver's emotional state, regulating adverse emotions, and integrating emotion into driving behavior prediction are the keys to providing accurate driving assistance for drivers in different emotional states. In addition, a road traffic system is highly interactive: the driving behavior of a driver is closely related to the behavior of other road users, and the development of intelligent networking technology provides reliable support for the driver to obtain the behavior of other road users. How to take the interaction between the driver's emotional state and other road users into account so as to achieve safe driving has therefore become an urgent problem to be solved.
Disclosure of Invention
The invention provides an intelligent vehicle running control method, system and storage medium based on emotion recognition, and aims to solve the problems in the prior art.
In order to achieve the purpose, the invention is realized by the following technical scheme:
in a first aspect, the invention provides an intelligent vehicle driving control method based on emotion recognition, which comprises the following steps:
collecting multi-mode signals, historical movement state information of a first vehicle, historical movement state information of a second vehicle and environmental road information when a driver drives the first vehicle, wherein the second vehicle is a vehicle around the first vehicle;
identifying an emotional state of the driver from the multi-modal signal;
predicting the driving track of the first vehicle according to the emotional state, the historical motion state information of the first vehicle, the historical motion state information of the second vehicle and the environmental road information;
and calculating the collision probability corresponding to the driving track according to the driving track and the future motion state information of the second vehicle, generating a collision avoidance deceleration and a steering angle under the condition that the collision probability exceeds a risk threshold, and controlling the vehicle to run based on the collision avoidance deceleration and the steering angle.
In a second aspect, the present application provides an intelligent vehicle driving control system based on emotion recognition, including:
the acquisition module is used for acquiring a multi-modal signal while a driver drives a first vehicle, first vehicle historical motion state information, second vehicle historical motion state information and environmental road information, wherein the second vehicle is a vehicle around the first vehicle;
the emotion recognition module is used for recognizing the emotional state of the driver according to the multi-mode signals;
the track prediction module is used for predicting the driving track of the first vehicle according to the emotional state, the historical motion state information of the first vehicle, the historical motion state information of the second vehicle and the environmental road information;
the collision risk evaluation module is used for calculating the collision probability corresponding to the driving track according to the driving track and the future motion state information of the second vehicle;
and the auxiliary control module is used for generating collision avoidance deceleration and a steering angle under the condition that the collision probability exceeds a risk threshold value, and controlling the vehicle to run based on the collision avoidance deceleration and the steering angle.
In a third aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the method steps as set forth in the first aspect.
Beneficial effects:
according to the intelligent vehicle driving control method based on emotion recognition, the emotion information complementation among different modalities is adopted, potential shared information of the modalities is fused, the emotion recognition effectiveness of a driver is improved, a foundation is laid for reasonably adjusting the bad emotion of the driver and accurately predicting the driving behaviors of the driver in different emotion states, the emotion state of the driver and the interaction influence of other vehicles on a road are fully fused, the driving track of the vehicle can be accurately predicted, the collision risk of the vehicle can be evaluated, a reasonable collision avoidance strategy can be formulated, the safety of the intelligent vehicle is effectively improved, and road traffic accidents caused by human factors are reduced.
Drawings
Fig. 1 is a flowchart of an intelligent vehicle driving control method based on emotion recognition according to a preferred embodiment of the present invention;
fig. 2 is a second flowchart of an intelligent vehicle driving control method based on emotion recognition according to a preferred embodiment of the present invention;
fig. 3 is a third flowchart of an intelligent vehicle driving control method based on emotion recognition according to a preferred embodiment of the present invention;
fig. 4 is a block diagram of an intelligent vehicle driving control system based on emotion recognition in accordance with a preferred embodiment of the present invention;
fig. 5 is a schematic diagram of signal connection of an intelligent vehicle driving control system based on emotion recognition according to a preferred embodiment of the present invention.
Detailed Description
The technical solutions of the present invention are described clearly and completely below, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this invention belongs. The use of "first," "second," and similar terms in the present application does not denote any order, quantity, or importance, but rather the terms are used to distinguish one element from another. Also, the use of the terms "a" or "an" and the like does not denote a limitation of quantity, but rather denotes the presence of at least one. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described changes, the relative positional relationships change accordingly.
Referring to fig. 1 to 3, the present application provides an intelligent vehicle driving control method based on emotion recognition, including:
collecting a multi-mode signal, historical motion state information of a first vehicle, historical motion state information of a second vehicle and environmental road information when a driver drives the first vehicle, wherein the second vehicle is a vehicle around the first vehicle;
recognizing the emotional state of the driver according to the multi-modal signals;
predicting the driving track of the first vehicle according to the emotional state, the historical motion state information of the first vehicle, the historical motion state information of the second vehicle and the environmental road information;
and calculating the collision probability corresponding to the driving track according to the driving track and the future motion state information of the second vehicle, generating collision avoidance deceleration and a steering angle under the condition that the collision probability exceeds a risk threshold, and controlling the vehicle to run based on the collision avoidance deceleration and the steering angle.
In this embodiment, the first vehicle is a vehicle driven by a driver, the second vehicle is a vehicle around the vehicle driven by the driver, the first vehicle historical moving state information is moving state information of the first vehicle in a past time period, and the second vehicle historical moving state information is moving state information of the second vehicle in the past time period.
The multi-mode signals can be collected by a multi-mode information collection device, the first vehicle historical motion state information can be collected by a vehicle information collection device, and the second vehicle historical motion state information and the environmental road information can be collected by a V2X device.
According to the intelligent vehicle driving control method based on emotion recognition, emotion information from different modalities is used to complement one another and the latent information shared across modalities is fused, which improves the effectiveness of driver emotion recognition and lays a foundation for reasonably regulating the driver's adverse emotions and accurately predicting driving behavior under different emotional states. The driver's emotional state and the interaction of other vehicles on the road are fully fused, so that the vehicle collision risk can be accurately evaluated and a reasonable collision avoidance strategy formulated, effectively improving the safety of the intelligent vehicle and reducing road traffic accidents caused by human factors.
Optionally, the multimodal signal comprises speech information, driver dynamic face sequence images, skin conductance and heart rate information;
identifying an emotional state of the driver from the multi-modal signals, comprising:
extracting tone features corresponding to the voice information, facial expression and motion features corresponding to the dynamic face sequence images of the driver, and physiological features of the driver's peripheral nervous system corresponding to the skin conductance and heart rate information;
and performing fusion, transformation and dimension reduction on the tone features, the facial expression and motion features, and the physiological features of the driver's peripheral nervous system, and identifying and quantifying the emotional state of the driver from the two dimensions of emotional valence and arousal, so as to obtain the emotional state of the driver.
In this embodiment, emotional valence is classified from negative to positive through a preset deep neural network model, distinguishing positive emotions from negative emotions; arousal is classified from calm to excited and reflects the degree of excitement of an emotion. The multi-modal information acquisition device acquires the driver's voice tone, facial expression, skin conductance and heart rate according to instructions from the intelligent vehicle driving auxiliary device. In this way, multi-modal signals of the driver are acquired, and the emotional state of the driver is obtained through comprehensive multi-dimensional feature analysis, which makes the analysis result more accurate. The multi-modal information acquisition device comprises one audio collector, two cameras (one infrared camera and one visible-light camera) mounted respectively at the interior rearview mirror and at the upper end of the inner side of the A-pillar of the intelligent vehicle, and an intelligent bracelet; it is used for acquiring the driver's behavioral expression and neurophysiological information and sending them to the driving auxiliary device. The vehicle information acquisition device comprises a GPS, a speed sensor and an acceleration sensor, and is used for acquiring the motion state of the intelligent vehicle and sending it to the driving auxiliary device. The V2X device is a 5G vehicle wireless communication device used for collecting the motion states of surrounding vehicles and road environment information and sending them to the driving auxiliary device.
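For illustration, the following is a minimal sketch, not the patented model, of how per-modality feature vectors could be fused, reduced in dimension, and scored on the valence and arousal dimensions with a small neural network; the module names, feature dimensions and network layout are assumptions introduced here.

```python
# Minimal sketch (assumptions throughout): fuse per-modality feature vectors and
# score emotion on the valence (negative -> positive) and arousal (calm -> excited) axes.
import torch
import torch.nn as nn

class MultimodalEmotionNet(nn.Module):
    def __init__(self, speech_dim=64, face_dim=128, physio_dim=16, latent_dim=32):
        super().__init__()
        # Modality-specific projections into a shared latent space
        self.speech_proj = nn.Linear(speech_dim, latent_dim)
        self.face_proj = nn.Linear(face_dim, latent_dim)
        self.physio_proj = nn.Linear(physio_dim, latent_dim)
        # Fusion followed by dimension reduction
        self.fusion = nn.Sequential(
            nn.Linear(3 * latent_dim, latent_dim), nn.ReLU(),
            nn.Linear(latent_dim, 16), nn.ReLU(),
        )
        # Two heads: valence and arousal
        self.valence_head = nn.Linear(16, 1)
        self.arousal_head = nn.Linear(16, 1)

    def forward(self, speech_feat, face_feat, physio_feat):
        z = torch.cat([
            torch.relu(self.speech_proj(speech_feat)),
            torch.relu(self.face_proj(face_feat)),
            torch.relu(self.physio_proj(physio_feat)),
        ], dim=-1)
        h = self.fusion(z)
        # Scores squashed to [-1, 1]; the sign of valence separates positive from
        # negative emotion, arousal quantifies the degree of excitement.
        return torch.tanh(self.valence_head(h)), torch.tanh(self.arousal_head(h))

# Example usage with one sample per modality
model = MultimodalEmotionNet()
valence, arousal = model(torch.randn(1, 64), torch.randn(1, 128), torch.randn(1, 16))
```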
Optionally, the method further includes:
and generating an emotion regulating scheme corresponding to the emotional state, and sending regulating information to the driver according to the emotion regulating scheme.
In this embodiment, the adjustment information may be music adjustment information or light adjustment information, so that adverse emotions of the driver are adjusted in a timely manner through light, music and the like, which can significantly reduce dangerous driving behavior caused by such emotions.
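As a purely illustrative sketch of such an adjustment scheme, a simple rule could map the recognized (valence, arousal) state to a music and light setting; the thresholds and setting names below are assumptions, not values from the invention.

```python
# Illustrative only: map a recognized (valence, arousal) state to a music/light adjustment.
def select_regulation_scheme(valence: float, arousal: float) -> dict:
    if valence < 0 and arousal > 0.5:    # e.g. anger or agitation
        return {"music": "slow_tempo_calming", "light": "soft_blue"}
    if valence < 0:                      # negative but low-arousal mood
        return {"music": "upbeat_moderate", "light": "warm_bright"}
    return {"music": "keep_current", "light": "keep_current"}

print(select_regulation_scheme(-0.6, 0.8))   # -> calming music, soft light
```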
Optionally, the first vehicle historical movement state information includes first position information, first speed information and first acceleration information of the vehicle; the second vehicle historical motion state information comprises second position information, second speed information and second acceleration information of the second vehicle; the environmental road information comprises lane information and signal lamp information;
predicting the driving track of the first vehicle according to the emotional state, the historical movement state information of the first vehicle, the historical movement state information of the second vehicle and the environmental road information, and comprising the following steps:
and inputting the first position information, the first speed information, the first acceleration information, the second position information, the second speed information and the second acceleration information into a preset neural network model, and acquiring a driving track hierarchical prediction result output by the neural network model.
In this embodiment, the vehicle information collection device collects position information, speed information, and acceleration information of the intelligent vehicle according to the instruction of the driving assistance device for the intelligent vehicle, and the V2X device collects position information, speed information, acceleration information, lane information, traffic light information, weather information, and the like of other surrounding vehicles according to the instruction of the driving assistance device for the intelligent vehicle.
In one example, the step of identifying and adjusting the emotional state of the driver specifically comprises the following steps:
for the voice information fed back by the multi-mode information acquisition device, the voice information of a driver is extracted through independent component analysis, then the voice information of the driver is preprocessed, and the rhythm characteristics, the fundamental tone frequency, the tone quality characteristics and the like of the driver are extracted.
And preprocessing the dynamic face sequence image of the driver fed back by the multi-mode information acquisition device, and extracting facial expression and motion characteristics of the driver.
And (4) preprocessing the skin electric conduction and heart rate information of the driver fed back by the multi-mode information acquisition device, and extracting the physiological characteristics of the peripheral nervous system of the driver.
And receiving the multi-modal characteristics, fusing, transforming and reducing dimensions of the multi-modal characteristics, and identifying and quantifying the emotional state of the driver from two dimensions of emotional valence and arousal degree based on a deep learning method.
Sending the emotion recognition result of the driver to an emotion induction module and a track prediction module;
the emotion induction module receives the emotion recognition result of the driver fed back by the emotion recognition module and controls the player and the atmosphere lamp to adjust the emotion state of the driver through proper music and light.
Further, the steps of evaluating the driving safety of the intelligent vehicle and assisting the driver in completing the collision avoidance operation specifically comprise the following steps:
and the track prediction module receives the emotion recognition result of the driver fed back by the emotion recognition module, the motion state of the intelligent vehicle fed back by the vehicle information acquisition device and the historical motion state of other surrounding vehicles fed back by the V2X device, and completes vehicle track layered prediction fusing the emotion states of the driver in the high-interaction traffic environment based on a random forest algorithm (RF) and a long-short term memory network (LSTM).
The vehicle track hierarchical prediction result comprises a first layer result and a second layer result;
the first layer result is used for representing a driving decision fusing the emotional state of a driver and the interaction influence between road users, and satisfies the following relational expression:
P(d_H \mid \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein P is a conditional probability, d_H is the driving decision of the driver, \theta_O is the second vehicle historical motion state information, \theta_H is the first vehicle historical motion state information, \hat{\theta}_O is the second vehicle future motion state information, and e_H is the emotional state of the driver;
the second layer result is used for representing a driving trajectory fusing the emotional state of the driver and the interaction effect between road users, and satisfies the following relational expression:
P(\hat{\theta}_H \mid d_H, \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein \hat{\theta}_H is the driving trajectory of the first vehicle.
In summary, the driving trajectory can then be expressed as:
P(\hat{\theta}_H \mid \theta_H, \theta_O, \hat{\theta}_O, e_H) = \sum_{d_H^k \in D_H} P(\hat{\theta}_H \mid d_H^k, \theta_H, \theta_O, \hat{\theta}_O, e_H) \, P(d_H^k \mid \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein D_H is the set of driving decisions of the driver, and d_H^k is the k-th driving decision in the set of driving decisions.
Thus, the collision probability corresponding to the driving track can be accurately calculated.
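A minimal structural sketch of this two-layer scheme is given below, assuming scikit-learn for the random forest decision layer and PyTorch for the LSTM trajectory layer; the decision set D_H, feature layouts, dimensions, and the dummy training data are illustrative assumptions rather than the trained models of the invention.

```python
# Structural sketch only: RF estimates P(d_H | context), one LSTM per decision decodes a
# trajectory, and the final trajectory marginalises over the decision set.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

DECISIONS = ["keep_lane", "change_left", "change_right", "decelerate"]  # assumed D_H
CONTEXT_DIM, HIST_DIM, HORIZON = 10, 6, 30                              # assumed sizes

# Layer 1: decision probabilities from a context vector (states + emotional state)
decision_model = RandomForestClassifier(n_estimators=100, random_state=0)
# Dummy fit so the sketch runs; in practice fitted on labelled driving data.
decision_model.fit(np.random.randn(80, CONTEXT_DIM),
                   np.random.randint(0, len(DECISIONS), 80))

class TrajectoryLSTM(nn.Module):
    """Layer 2: decode a fixed-horizon (x, y) trajectory from the motion history."""
    def __init__(self, feat_dim=HIST_DIM, horizon=HORIZON):
        super().__init__()
        self.horizon = horizon
        self.lstm = nn.LSTM(feat_dim, 64, batch_first=True)
        self.head = nn.Linear(64, horizon * 2)

    def forward(self, history):                          # history: (batch, T, feat_dim)
        _, (h, _) = self.lstm(history)
        return self.head(h[-1]).view(-1, self.horizon, 2)

trajectory_models = {d: TrajectoryLSTM() for d in DECISIONS}

def predict_trajectory(context_vec, history_seq):
    """Marginalise over decisions: sum_k P(traj | d_k, ...) * P(d_k | ...)."""
    raw = decision_model.predict_proba(context_vec.reshape(1, -1))[0]
    probs = np.zeros(len(DECISIONS))
    probs[decision_model.classes_] = raw                 # align with decision indices
    expected = np.zeros((HORIZON, 2))
    for k, d in enumerate(DECISIONS):
        traj_k = trajectory_models[d](history_seq).detach().numpy()[0]
        expected += probs[k] * traj_k
    return expected                                      # expected future (x, y) path

# Example call with a random context vector and a 20-step motion history
path = predict_trajectory(np.random.randn(CONTEXT_DIM), torch.randn(1, 20, HIST_DIM))
```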
Optionally, the risk threshold includes a first-stage risk threshold, a second-stage risk threshold and a third-stage risk threshold, wherein the risk level of the first-stage risk threshold is lower than the risk level of the second-stage risk threshold, and the risk level of the second-stage risk threshold is lower than the risk level of the third-stage risk threshold;
generating a collision avoidance deceleration and a steering angle when the collision probability exceeds a risk threshold, comprising:
and generating collision avoidance deceleration and a steering angle under the condition that the collision probability exceeds a third-stage risk threshold value.
Optionally, the method further includes:
and when the collision probability is lower than the first-stage risk threshold value or the collision probability is higher than the first-stage risk threshold value but lower than the third-stage risk threshold value, generating early warning information according to the driving track of the first vehicle and the future motion state information of the second vehicle.
In this optional implementation, by evaluating the safety risk, hidden dangers in the driving process can be found in time, ensuring driving safety. It should be noted that the future motion state information of the second vehicle may be obtained in the intelligent networked (connected vehicle) environment.
Specifically, the probability of collision between the first vehicle and the second vehicle can be calculated by using probability flow theory, and the collision risk level can be evaluated. The collision risk is ranked by the first-, second- and third-stage risk thresholds into levels ranging from safe and generally urgent to particularly critical.
A response is then generated according to the collision risk assessment result, the future motion state of the second vehicle and the road environment information: when the collision risk level is low (safe or generally urgent), the player and the display are controlled to issue corresponding early warning information and to display potential collision objects; when the collision risk level is high (particularly critical), in addition to issuing early warning information and displaying potential collision objects, the driving auxiliary control module searches, based on optimization theory, for the optimal deceleration and steering angle that the intelligent vehicle should adopt, and the steering mechanism, the accelerating mechanism and the braking mechanism are controlled to complete the collision avoidance operation of the intelligent vehicle.
In summary, the intelligent vehicle driving control method based on emotion recognition takes into account, when controlling the driving of the intelligent vehicle, the influence of the driver's emotional state and of the interaction between road users on the driver's driving behavior. The driver's emotional state is monitored and recognized in real time by combining behavioral expression with neurophysiological information, achieving high recognition accuracy. Based on the emotion recognition result, the driver's adverse emotions are regulated in a timely manner through light, music and the like, which significantly reduces dangerous driving behavior caused by such emotions. Considering the interaction among road users in a highly interactive traffic environment and combining it with the driver's emotional state, the vehicle trajectory formation process is abstracted into two layers, a discrete driving decision and a continuous driving behavior, so that hierarchical trajectory prediction with high accuracy is completed. When the collision risk level is low (safe or generally urgent), warning information is sent to the driver through the player and the display, and potential collision objects are displayed; when the risk level is high (particularly critical), an optimal collision avoidance deceleration and steering angle are additionally computed to assist the vehicle in avoiding the collision. Therefore, the invention not only effectively avoids dangerous driving behavior caused by the driver's adverse emotions, but also improves the safety of the intelligent vehicle and significantly reduces road traffic accidents.
Optionally, the present application further provides an intelligent vehicle driving control system based on emotion recognition, including:
the acquisition module is used for acquiring a multi-modal signal while a driver drives a first vehicle, first vehicle historical motion state information, second vehicle historical motion state information and environmental road information, wherein the second vehicle is a vehicle around the first vehicle;
the emotion recognition module is used for recognizing the emotional state of the driver according to the multi-mode signals;
the track prediction module is used for predicting the driving track of the first vehicle according to the emotional state, the historical motion state information of the first vehicle, the historical motion state information of the second vehicle and the environmental road information;
the collision risk evaluation module is used for calculating the collision probability corresponding to the driving track according to the driving track and the future motion state information of the second vehicle;
and the auxiliary control module is used for generating collision avoidance deceleration and a steering angle under the condition that the collision probability exceeds a risk threshold value, and controlling the vehicle to run based on the collision avoidance deceleration and the steering angle.
In a complete example, as shown in fig. 4, the intelligent vehicle driving control system based on emotion recognition may include:
1) The multi-modal information acquisition device includes one audio collector, two cameras (one infrared camera and one visible-light camera) mounted respectively at the interior rearview mirror and at the upper end of the inner side of the A-pillar of the intelligent vehicle, and an intelligent bracelet; it is used for acquiring the driver's behavioral expression and neurophysiological information and sending them to the driving auxiliary device.
2) The vehicle information acquisition device comprises a GPS, a speed sensor and an acceleration sensor and is used for acquiring the motion state of the intelligent vehicle and sending the motion state to the driving auxiliary device;
3) The V2X device is a 5G vehicle wireless communication device and is used for acquiring the motion state of surrounding vehicles and road environment information and sending the motion state and the road environment information to the driving auxiliary device;
4) The intelligent vehicle driving auxiliary device comprises a voice processing module, an image processing module, a physiological signal processing module, an emotion recognition module, an emotion induction module, a track prediction module, a collision risk assessment module, a driving auxiliary control module, a voice control module, a display control module and a vehicle body control module.
a) The voice processing module extracts the voice information of the driver, preprocesses the voice information of the driver, extracts the voice tone characteristics of the voice information, and sends a processing result to the emotion recognition module;
b) The image processing module is used for preprocessing the dynamic face sequence image of the driver, extracting facial expression and motion characteristics of the driver and sending a processing result to the emotion recognition module;
c) The physiological signal processing module is used for preprocessing skin electric conduction and heart rate information of a driver, extracting physiological characteristics of a peripheral nervous system of the driver and sending a processing result to the emotion recognition module;
d) The emotion recognition module is used for combining the multi-modal characteristics fed back by the voice processing module, the image processing module and the physiological signal processing module, fusing, transforming and reducing dimensions of the multi-modal characteristics, recognizing and quantifying the emotional state of the driver, and sending the recognition result to the emotion induction module and the track prediction module;
e) The emotion induction module is used for calling a corresponding emotion adjusting scheme according to the emotion recognition result of the driver and sending the corresponding emotion adjusting scheme to the player and the atmosphere lamp;
f) The track prediction module is used for predicting the future motion state of the intelligent vehicle by combining information fed back by the emotion recognition module, the vehicle information acquisition device and the V2X acquisition device and sending the predicted future motion state to the collision risk evaluation module;
g) The collision risk evaluation module is used for calculating the probability of collision between the intelligent vehicle and other surrounding vehicles by combining information fed back by the track prediction module and the V2X acquisition device, evaluating the collision risk level and sending the collision risk level to the driving auxiliary control module;
h) The driving auxiliary control module is used for sending an instruction to the voice control module to call corresponding preset warning voice prompt information when the collision risk level is low (safety and general emergency) by combining the information fed back by the collision risk evaluation module and the V2X acquisition device, and sending an instruction to the display control module to display information of other road users and obstacles with potential conflict with the intelligent vehicle; when the collision risk is high (particularly critical), besides sending early warning information and displaying potential collision objects, the optimal collision avoidance deceleration and steering angle are searched, and the calculation result is sent to the vehicle body control module to assist the intelligent vehicle in completing collision avoidance.
i) And the voice control module is used for calling corresponding preset warning voice prompt information according to the received instruction and sending the warning voice prompt information to the player.
j) And the display control module is used for controlling the display to display information of other road users and obstacles which have potential conflicts with the intelligent vehicle.
k) And the vehicle body control module is used for sending instructions to the steering mechanism, the accelerating mechanism and the braking mechanism.
Fig. 5 shows signal connections between the devices of the intelligent vehicle driving control system based on emotion recognition.
The intelligent vehicle running control system based on emotion recognition can realize each embodiment of the intelligent vehicle running control method based on emotion recognition, and can achieve the same beneficial effects, and the detailed description is omitted here.
Embodiments of the present application also provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the method steps as described above. The readable storage medium can implement the embodiments of the method described above, and can achieve the same beneficial effects, which are not described herein again.
The foregoing detailed description of the preferred embodiments of the invention has been presented. It should be understood that numerous modifications and variations could be devised by those skilled in the art in light of the present teachings without departing from the inventive concepts. Therefore, the technical solutions available to those skilled in the art through logic analysis, reasoning and limited experiments based on the prior art according to the concept of the present invention should be within the scope of protection defined by the claims.

Claims (8)

1. An intelligent vehicle running control method based on emotion recognition is characterized by comprising the following steps:
collecting a multi-modal signal of a driver while the driver drives a first vehicle, first vehicle historical motion state information, second vehicle historical motion state information and environmental road information, wherein the second vehicle is a vehicle around the first vehicle;
identifying an emotional state of the driver from the multi-modal signal;
predicting the driving track of the first vehicle according to the emotional state, the historical motion state information of the first vehicle, the historical motion state information of the second vehicle and the environmental road information;
calculating a collision probability corresponding to the driving track according to the driving track and the future motion state information of the second vehicle, generating a collision-avoidance deceleration and a steering angle under the condition that the collision probability exceeds a risk threshold, and controlling the vehicle to run based on the collision-avoidance deceleration and the steering angle;
the first vehicle historical movement state information comprises first position information, first speed information and first acceleration information of the vehicle; the second vehicle historical motion state information comprises second position information, second speed information and second acceleration information of a second vehicle; the environmental road information comprises lane information and signal lamp information;
the predicting of the driving track of the first vehicle according to the emotional state, the historical motion state information of the first vehicle, the historical motion state information of the second vehicle and the environmental road information comprises the following steps:
inputting the first position information, the first speed information, the first acceleration information, the second position information, the second speed information and the second acceleration information into a preset neural network model, and acquiring a driving track hierarchical prediction result output by the neural network model;
the driving track layered prediction result comprises a first layer result and a second layer result;
the first layer result is used for representing a driving decision fusing the emotional state of a driver and the interaction influence between road users, and satisfies the following relational expression:
P(d_H \mid \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein P is a conditional probability, d_H is the driving decision of the driver, \theta_O is the second vehicle historical motion state information, \theta_H is the first vehicle historical motion state information, \hat{\theta}_O is the second vehicle future motion state information, and e_H is the emotional state of the driver;
the second layer result is used for representing a driving trajectory fusing the emotional state of the driver and the interaction between road users, and satisfies the following relational expression:
P(\hat{\theta}_H \mid d_H, \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein \hat{\theta}_H is the driving trajectory of the first vehicle;
then the driving trajectory satisfies the following relation:
P(\hat{\theta}_H \mid \theta_H, \theta_O, \hat{\theta}_O, e_H) = \sum_{d_H^k \in D_H} P(\hat{\theta}_H \mid d_H^k, \theta_H, \theta_O, \hat{\theta}_O, e_H) \, P(d_H^k \mid \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein D_H is the set of driving decisions of the driver, and d_H^k is the k-th driving decision in the set of driving decisions.
2. The intelligent vehicle driving control method based on emotion recognition as recited in claim 1, wherein said multi-modal signals include voice information, driver dynamic face sequence images, skin conductance and heart rate information;
the identifying of the emotional state of the driver from the multi-modal signal includes:
extracting corresponding tone features of the voice information, facial expression and motion features corresponding to the dynamic face sequence image of the driver, and physiological features of a peripheral nervous system of the driver corresponding to the skin electric conduction and heart rate information;
and carrying out fusion, transformation and dimension reduction on the tone features, the facial expression and motion features and the physiological features of the peripheral nervous system of the driver, and identifying and quantifying the emotional state of the driver from two dimensions of emotional valence and arousal degree through a preset deep neural network model to obtain the emotional state of the driver.
3. The intelligent vehicle running control method based on emotion recognition according to claim 1, wherein the method further comprises:
and generating an emotion adjusting scheme corresponding to the emotion state, and sending adjusting information to a driver according to the emotion adjusting scheme.
4. The intelligent vehicle running control method based on emotion recognition as recited in claim 1, wherein the risk thresholds include a first-stage risk threshold, a second-stage risk threshold and a third-stage risk threshold, the risk level of the first-stage risk threshold is lower than the risk level of the second-stage risk threshold, and the risk level of the second-stage risk threshold is lower than the risk level of the third-stage risk threshold;
generating a collision avoidance deceleration and a steering angle under the condition that the collision probability exceeds a risk threshold, comprising:
generating a collision avoidance deceleration and a steering angle if the collision probability exceeds a third stage risk threshold.
5. The intelligent vehicle running control method based on emotion recognition according to claim 4, wherein the method further comprises:
and when the collision probability is lower than the first-stage risk threshold value, or the collision probability is higher than the first-stage risk threshold value but lower than the third-stage risk threshold value, generating early warning information according to the driving track of the first vehicle and the future motion state information of the second vehicle.
6. An intelligent vehicle driving control system based on emotion recognition, characterized by comprising:
the acquisition module is used for acquiring a multi-modal signal while a driver drives a first vehicle, first vehicle historical motion state information, second vehicle historical motion state information and environmental road information, wherein the second vehicle is a vehicle around the first vehicle;
the emotion recognition module is used for recognizing the emotion state of the driver according to the multi-mode signals;
the track prediction module is used for predicting the driving track of the first vehicle according to the emotional state, the historical movement state information of the first vehicle, the historical movement state information of the second vehicle and the environmental road information;
the collision risk evaluation module is used for calculating the collision probability corresponding to the driving track according to the driving track and the future motion state information of the second vehicle;
the auxiliary control module is used for generating collision avoidance deceleration and a steering angle under the condition that the collision probability exceeds a risk threshold value, and controlling the vehicle to run based on the collision avoidance deceleration and the steering angle;
the first vehicle historical motion state information comprises first position information, first speed information and first acceleration information of the vehicle; the second vehicle historical motion state information comprises second position information, second speed information and second acceleration information of a second vehicle; the environmental road information comprises lane information and signal lamp information;
the predicting of the driving track of the first vehicle according to the emotional state, the historical motion state information of the first vehicle, the historical motion state information of the second vehicle and the environmental road information comprises the following steps:
inputting the first position information, the first speed information, the first acceleration information, the second position information, the second speed information and the second acceleration information into a preset neural network model, and acquiring a driving track hierarchical prediction result output by the neural network model;
the driving track layered prediction result comprises a first layer result and a second layer result;
the first layer result is used for representing a driving decision fusing the emotional state of a driver and the interaction influence between road users, and satisfies the following relational expression:
P(d_H \mid \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein P is a conditional probability, d_H is the driving decision of the driver, \theta_O is the second vehicle historical motion state information, \theta_H is the first vehicle historical motion state information, \hat{\theta}_O is the second vehicle future motion state information, and e_H is the emotional state of the driver;
the second layer result is used for representing a driving trajectory fusing the emotional state of the driver and the interaction effect between road users, and satisfies the following relational expression:
P(\hat{\theta}_H \mid d_H, \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein \hat{\theta}_H is the driving trajectory of the first vehicle;
then the driving trajectory satisfies the following relation:
P(\hat{\theta}_H \mid \theta_H, \theta_O, \hat{\theta}_O, e_H) = \sum_{d_H^k \in D_H} P(\hat{\theta}_H \mid d_H^k, \theta_H, \theta_O, \hat{\theta}_O, e_H) \, P(d_H^k \mid \theta_H, \theta_O, \hat{\theta}_O, e_H)
wherein D_H is the set of driving decisions of the driver, and d_H^k is the k-th driving decision in the set of driving decisions.
7. The intelligent vehicle travel control system based on emotion recognition according to claim 6, further comprising:
and the emotion induction module is used for generating an emotion adjusting scheme corresponding to the emotion state and sending adjusting information to the driver according to the emotion adjusting scheme.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method steps of any one of claims 1 to 5.
CN202210541021.9A 2022-05-17 2022-05-17 Intelligent vehicle running control method and system based on emotion recognition and storage medium Active CN114735010B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210541021.9A CN114735010B (en) 2022-05-17 2022-05-17 Intelligent vehicle running control method and system based on emotion recognition and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210541021.9A CN114735010B (en) 2022-05-17 2022-05-17 Intelligent vehicle running control method and system based on emotion recognition and storage medium

Publications (2)

Publication Number Publication Date
CN114735010A CN114735010A (en) 2022-07-12
CN114735010B true CN114735010B (en) 2022-12-13

Family

ID=82286753

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210541021.9A Active CN114735010B (en) 2022-05-17 2022-05-17 Intelligent vehicle running control method and system based on emotion recognition and storage medium

Country Status (1)

Country Link
CN (1) CN114735010B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115227247B (en) * 2022-07-20 2023-12-26 中南大学 Fatigue driving detection method, system and storage medium based on multisource information fusion
CN117445805B (en) * 2023-12-22 2024-02-23 吉林大学 Personnel early warning and driving control method and system for bus driver and passenger conflict

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109941288A (en) * 2017-12-18 2019-06-28 现代摩比斯株式会社 Safe driving auxiliary device and method
CN110119844A (en) * 2019-05-08 2019-08-13 中国科学院自动化研究所 Introduce robot motion's decision-making technique, the system, device of Feeling control mechanism
CN111994075A (en) * 2020-09-09 2020-11-27 黄日光 Driving assistance method based on artificial intelligence
CN114043990A (en) * 2021-12-15 2022-02-15 吉林大学 Multi-scene traffic vehicle driving state analysis system and method considering auditory information

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2746126B1 (en) * 2012-12-18 2019-04-10 Honda Research Institute Europe GmbH Driver assistance system
US9429946B2 (en) * 2014-12-25 2016-08-30 Automotive Research & Testing Center Driving control system and dynamic decision control method thereof
JP2019209917A (en) * 2018-06-07 2019-12-12 本田技研工業株式会社 Vehicle control device, vehicle control method, and program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109941288A (en) * 2017-12-18 2019-06-28 现代摩比斯株式会社 Safe driving auxiliary device and method
CN110119844A (en) * 2019-05-08 2019-08-13 中国科学院自动化研究所 Introduce robot motion's decision-making technique, the system, device of Feeling control mechanism
CN111994075A (en) * 2020-09-09 2020-11-27 黄日光 Driving assistance method based on artificial intelligence
CN114043990A (en) * 2021-12-15 2022-02-15 吉林大学 Multi-scene traffic vehicle driving state analysis system and method considering auditory information

Also Published As

Publication number Publication date
CN114735010A (en) 2022-07-12

Similar Documents

Publication Publication Date Title
CN114735010B (en) Intelligent vehicle running control method and system based on emotion recognition and storage medium
CN110077414B (en) Vehicle driving safety guarantee method and system based on driver state monitoring
Xing et al. An ensemble deep learning approach for driver lane change intention inference
Alkinani et al. Detecting human driver inattentive and aggressive driving behavior using deep learning: Recent advances, requirements and open challenges
CN112041910B (en) Information processing apparatus, mobile device, method, and program
JP7080598B2 (en) Vehicle control device and vehicle control method
US11458972B2 (en) Vehicle control apparatus
CN107207013B (en) Automatic driving control apparatus, automatic driving control method, and program
US20190225232A1 (en) Passenger Experience and Biometric Monitoring in an Autonomous Vehicle
CN110371132B (en) Driver takeover evaluation method and device
CN112052776B (en) Unmanned vehicle autonomous driving behavior optimization method and device and computer equipment
Tang et al. Driver lane change intention recognition of intelligent vehicle based on long short-term memory network
US20200070848A1 (en) Method and System for Initiating Autonomous Drive of a Vehicle
CN106394555A (en) Unmanned automobile obstacle avoidance system and method based on 3D camera
CN110450783A (en) For running the control unit and method of autonomous vehicle
CN109740477A (en) Study in Driver Fatigue State Surveillance System and its fatigue detection method
CN111540222A (en) Intelligent interaction method and device based on unmanned vehicle and unmanned vehicle
CN116331221A (en) Driving assistance method, driving assistance device, electronic equipment and storage medium
CN107226024B (en) A kind of high-precision vehicle sighting distance acquisition and processing system and method
Corcoran et al. Traffic risk assessment: A two-stream approach using dynamic-attention
CN112455461B (en) Human-vehicle interaction method for automatically driving vehicle and automatically driving system
Meng et al. Application and development of AI technology in automobile intelligent cockpit
CN109866686A (en) The intelligent active safety DAS (Driver Assistant System) and method analyzed in real time based on video
Xing et al. Advanced driver intention inference: Theory and design
CN116129641A (en) Vehicle security situation calculation method and system based on multi-terminal collaborative identification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant