CN115626167A - Driving control method, device and equipment based on emotion recognition and storage medium - Google Patents

Driving control method, device and equipment based on emotion recognition and storage medium

Info

Publication number
CN115626167A
CN115626167A (application CN202211252790.3A)
Authority
CN
China
Prior art keywords
driver
emotion
target
target vehicle
driving control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211252790.3A
Other languages
Chinese (zh)
Inventor
邓鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meizu Technology Co Ltd
Original Assignee
Meizu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meizu Technology Co Ltd filed Critical Meizu Technology Co Ltd
Priority to CN202211252790.3A priority Critical patent/CN115626167A/en
Publication of CN115626167A publication Critical patent/CN115626167A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0015Planning or execution of driving tasks specially adapted for safety
    • B60W60/0016Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/005Handover processes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/005Handover processes
    • B60W60/0059Estimation of the risk associated with autonomous or manual driving, e.g. situation too complex, sensor failure or driver incapacity
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/22Psychological state; Stress level or workload

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application relates to the field of intelligent driving and discloses a driving control method, device, equipment and storage medium based on emotion recognition. The method comprises the following steps: in response to detecting the starting of a target vehicle, performing emotion recognition on the driver in the target vehicle based on a preset frequency to obtain the driver's emotion type; if the emotion type is a target emotion type, calculating the driver's target emotion intensity level; acquiring a safe driving strategy corresponding to that intensity level; and executing driving control on the target vehicle according to the safe driving strategy. The invention can actively sense the driver's emotion and actively switch the vehicle's driving control mode when the driver is in an unstable emotional state (i.e., an over-stimulated state, such as over-excitement, anger or sadness). This addresses the safety hazard that existing driving control modes pose when the driver is in an unstable emotional state.

Description

Driving control method, device and equipment based on emotion recognition and storage medium
Technical Field
The application relates to the field of intelligent driving, in particular to a driving control method, device, equipment and storage medium based on emotion recognition.
Background
With the continuous progress of sensing technology and automatic driving technology, automobile driving control has become increasingly intelligent. Switching of the driving control mode, for example to an auxiliary driving mode or an automatic driving mode, is generally selected actively by the driver, and the mode usually does not change once selected.
In the prior art, when a driver's emotions are unstable, aggressive driving may occur; an unchanged driving control mode then poses a safety hazard to the driver.
Disclosure of Invention
The embodiment of the application provides a driving control method based on emotion recognition, which can actively sense the driver's emotion and actively switch the vehicle's driving control mode when the driver is in an unstable emotional state. This can address the safety hazard that the existing driving control mode poses when the driver is in an unstable emotional state (i.e., an over-stimulated state, such as over-excitement, anger or sadness).
In a first aspect, an embodiment of the present application provides a driving control method based on emotion recognition, which is applied to a vehicle-mounted terminal, and includes: in response to detecting the starting of the target vehicle, performing emotion recognition on a driver in the target vehicle based on a preset frequency to obtain an emotion type of the driver; if the emotion type of the driver is the target emotion type, calculating the target emotion intensity level of the driver; acquiring a safe driving strategy corresponding to the target emotion intensity level; and executing driving control on the target vehicle according to the safe driving strategy.
According to the method provided by the embodiment of the application, the driver's emotion type is detected. If the detected type is the target emotion type, the driver's emotion intensity is further measured, the safe driving strategy corresponding to that intensity is obtained, and the vehicle is driven under that strategy. By actively sensing the driver's personal emotional state and adjusting the driving control mode in time, the method mitigates aggressive driving caused by personal emotion and improves driving safety.
In one possible implementation manner, the target vehicle comprises a camera device, and in response to detecting the starting of the target vehicle, emotion recognition is performed on a driver in the target vehicle based on a preset frequency, and obtaining the current emotion type of the driver comprises: in response to the detection of the starting of the target vehicle, calling a camera device to acquire a face image of a driver in the target vehicle based on a preset frequency; performing feature extraction on the collected face image to obtain a target face feature of the driver; and performing emotion recognition based on the target face characteristics of the driver to obtain the emotion type of the driver.
In one possible implementation manner, the target vehicle includes a recording device therein, and in response to detecting the starting of the target vehicle, performing emotion recognition on a driver in the target vehicle based on a preset frequency, and obtaining the current emotion type of the driver includes: calling a recording device to perform voice acquisition on a driver in the target vehicle based on a preset frequency in response to detecting the starting of the target vehicle; performing feature extraction on the collected voice to obtain a target voice feature of the driver; and performing emotion recognition based on the target voice characteristics of the driver to obtain the emotion type of the driver.
In this way, non-physiological characteristic data of the driver (i.e., face and voice) are collected. The collection is simple: it can be performed directly with a camera device (such as a driving recorder) and a recording device (such as a microphone) that are already common in vehicles, so recognition can be implemented purely as a software algorithm.
In one possible implementation manner, a driver wears a portable terminal with a biometric function, the portable terminal periodically collects physiological data of the driver, and the portable terminal is electrically connected with the vehicle-mounted terminal. In response to detecting the starting of the target vehicle, performing emotion recognition on the driver in the target vehicle based on a preset frequency to obtain the current emotion type of the driver includes: in response to detecting the starting of the target vehicle, sending a physiological data acquisition request to the portable terminal based on the preset frequency, wherein the portable terminal responds to the physiological data acquisition request by sending the current physiological data of the driver to the vehicle-mounted terminal; receiving the physiological data sent by the portable terminal, and performing feature extraction on the physiological data to obtain the physiological features of the driver; and performing emotion recognition based on the physiological features of the driver to obtain the emotion type of the driver.
In one possible implementation manner, the target vehicle includes an electroencephalogram acquisition device, the electroencephalogram acquisition device is disposed above a backrest of the driving seat, and in response to detecting the start of the target vehicle, performing emotion recognition on a driver in the target vehicle based on a preset frequency, and obtaining a current emotion type of the driver includes: calling an electroencephalogram acquisition device to acquire electroencephalograms of a driver in the target vehicle based on a preset frequency in response to the detection of the starting of the target vehicle; performing signal preprocessing on the acquired electroencephalogram signals; performing feature extraction on the preprocessed electroencephalogram signals to obtain target electroencephalogram features; and performing emotion recognition based on the target electroencephalogram characteristics to obtain the emotion type of the driver.
In one possible implementation manner, performing emotion recognition based on the target electroencephalogram features to obtain the emotion type of the driver includes: smoothing the target electroencephalogram features; performing feature dimension reduction on the smoothed target electroencephalogram features; and classifying or clustering the dimension-reduced target electroencephalogram features based on their statistical characteristics to obtain the emotion type of the driver.
In this way, physiological characteristic data of the driver (i.e., signals generated by the autonomic nervous system, such as heart rate) and brain wave data (i.e., signals generated by the central nervous system) are collected. Because these data are measured directly from the human nervous system, they cannot be artificially disguised the way a face or voice can, which improves the accuracy of driver emotion recognition.
In one possible implementation, the performing of the driving control on the target vehicle according to the safe driving policy includes: if the emotion intensity level is a first level, outputting a safe driving prompt to the driver; if the emotion intensity level is a second level, switching the target vehicle to an auxiliary driving mode; and if the emotion intensity level is a third level, switching the target vehicle to an automatic driving mode.
In this way, when the driver's emotional intensity is perceived to be low, the possibility of aggressive driving is judged to be low, so the vehicle remains under the driver's manual control and a safe-driving reminder is issued; when the emotional intensity is perceived to be moderate, aggressive driving is considered possible, so the vehicle is switched to the auxiliary driving mode; and when the emotional intensity is perceived to be too high, aggressive driving is considered likely, so the vehicle is switched to the automatic driving mode, thereby improving driving safety.
In a second aspect, an embodiment of the present application provides a driving control device based on emotion recognition, including: the emotion type recognition module is used for responding to the detection of the starting of the target vehicle and executing emotion recognition on a driver in the target vehicle on the basis of a preset frequency to obtain the emotion type of the driver; the emotion intensity recognition module is used for calculating a target emotion intensity level of the driver if the emotion type of the driver is a target emotion type; the driving strategy acquisition module is used for acquiring a safe driving strategy corresponding to the target emotion intensity level; and the driving control module is used for executing driving control on the target vehicle according to the safe driving strategy.
In a possible implementation manner, the target vehicle includes a camera, and the emotion recognition module specifically includes: the image acquisition unit is used for responding to the detection of the starting of the target vehicle and calling the camera device to acquire a human face image of a driver in the target vehicle based on a preset frequency; the human face feature extraction unit is used for performing feature extraction on the collected human face image to obtain the target human face feature of the driver; and the face feature recognition unit is used for executing emotion recognition based on the target face features of the driver to obtain the emotion type of the driver.
In a possible implementation manner, the target vehicle includes a recording device therein, and the emotion recognition module specifically includes: the voice acquisition unit is used for responding to the detection of the starting of the target vehicle and calling the recording device to perform voice acquisition on a driver in the target vehicle based on a preset frequency; the voice feature extraction unit is used for performing feature extraction on the collected voice to obtain the target voice feature of the driver; and the voice characteristic recognition unit is used for executing emotion recognition based on the target voice characteristics of the driver to obtain the emotion type of the driver.
In a possible implementation manner, a driver wears a portable terminal with a biological recognition function, the portable terminal periodically collects physiological data of the driver, the portable terminal is electrically connected with a vehicle-mounted terminal, and the emotion recognition module specifically comprises: a physiological data request unit for sending a physiological data acquisition request to the portable terminal based on a preset frequency in response to the detection of the start of the target vehicle, wherein the portable terminal sends the current physiological data of the driver to the vehicle-mounted terminal in response to the physiological data acquisition request; the physiological characteristic extraction unit is used for receiving the physiological data sent by the portable terminal and performing characteristic extraction on the physiological data to obtain the physiological characteristics of the driver; and the physiological characteristic identification unit is used for executing emotion identification based on the physiological characteristics of the driver to obtain the emotion type of the driver.
In a possible implementation, the target vehicle includes an electroencephalogram acquisition device disposed above the backrest of the driving seat, and the emotion recognition module specifically includes: the electroencephalogram signal acquisition unit is used for responding to the detection of the starting of the target vehicle and calling the electroencephalogram acquisition device to acquire electroencephalogram signals from the driver in the target vehicle based on a preset frequency; the electroencephalogram signal preprocessing unit is used for executing signal preprocessing on the acquired electroencephalogram signals; the electroencephalogram feature extraction unit is used for performing feature extraction on the preprocessed electroencephalogram signal to obtain a target electroencephalogram feature; and the electroencephalogram feature recognition unit is used for executing emotion recognition based on the target electroencephalogram feature to obtain the emotion type of the driver.
In a possible implementation manner, the electroencephalogram feature identification unit is specifically configured to: smoothing the target electroencephalogram characteristics; performing feature dimension reduction on the smoothed target electroencephalogram features; and classifying or clustering the target electroencephalogram characteristics subjected to characteristic dimension reduction based on the data statistical characteristics to obtain the emotion type of the driver.
In one possible implementation, the driving control module is specifically configured to: if the emotion intensity level is a first level, output a safe driving prompt to the driver; if the emotion intensity level is a second level, switch the target vehicle to an auxiliary driving mode; and if the emotion intensity level is a third level, switch the target vehicle to an automatic driving mode.
The technical effects of the second aspect and its various possible implementations are similar to those of the first aspect and its various possible implementations, and are not described here again.
In a third aspect, an embodiment of the present application provides a driving control apparatus based on emotion recognition, including: a memory and at least one processor, the memory having instructions stored therein; the at least one processor invokes the instructions in the memory to cause the emotion recognition-based driving control apparatus to perform the steps of the emotion recognition-based driving control method described above.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores instructions that, when executed on a computer, cause the computer to perform the steps of the driving control method based on emotion recognition.
Drawings
Fig. 1 is a flowchart of an embodiment of a driving control method based on emotion recognition provided in an embodiment of the present application;
Fig. 2 is a flowchart of another embodiment of a driving control method based on emotion recognition provided in an embodiment of the present application;
Fig. 3 is a flowchart of an embodiment of a driving control method based on emotion recognition provided in an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a driving control device based on emotion recognition according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of another driving control device based on emotion recognition according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a driving control device based on emotion recognition according to an embodiment of the present application.
Detailed Description
The embodiment of the application provides a driving control method based on emotion recognition, which can actively sense the driver's emotion and actively switch the vehicle's driving control mode when the driver is in an unstable emotional state. This can address the safety hazard that existing driving control modes pose when the driver is in an unstable emotional state (i.e., an over-stimulated state, such as over-excitement, anger or sadness).
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. Where the terms "first," "second," "third," "fourth," and the like (if any) in the description and claims of this application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be implemented in other sequences than those illustrated or described herein. Moreover, the terms "comprises," "comprising," or "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It is to be understood that any reference to data acquisition or collection in this application is intended to be authorized by the user.
It can be understood that the execution subject of the present application may be a driving control device based on emotion recognition, and may also be a vehicle-mounted terminal or a server, which is not limited herein.
For convenience of understanding, the embodiment of the present application describes the driving control method based on emotion recognition with the vehicle-mounted terminal as the execution subject. The specific flow is described below. Referring to fig. 1, an embodiment of the present application provides a driving control method based on emotion recognition, which is applied to a vehicle-mounted terminal and includes:
101. in response to detecting the starting of the target vehicle, performing emotion recognition on a driver in the target vehicle based on a preset frequency to obtain an emotion type of the driver;
It should be understood that intervening in driving control while the vehicle is moving may itself create safety hazards. The embodiment of the application therefore performs emotion recognition on the driver when the vehicle is started (i.e., after the engine is started) and applies driving control before the vehicle enters a driving state.
In order to save vehicle energy, the vehicle-mounted terminal performs emotion detection at a preset frequency; the frequency can be adjusted according to actual requirements and is not limited by the embodiment of the application. Emotion types may typically include anger, fear, sadness, happiness, and the like.
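As a minimal illustration of this periodic detection, the loop below polls a recognition callback at a fixed interval; the interval, poll count, and callback are placeholders for whatever the implementation actually uses, not details from the patent:

```python
import time

def poll_emotion(recognize, interval_s=5.0, max_polls=3):
    """Invoke an emotion-recognition callback at a fixed interval.

    `interval_s` corresponds to the patent's tunable "preset frequency";
    `recognize` stands in for any of the recognition pipelines below.
    """
    results = []
    for _ in range(max_polls):
        results.append(recognize())   # one emotion-recognition pass
        time.sleep(interval_s)        # wait until the next scheduled pass
    return results
```

A production implementation would run this on a timer or scheduler rather than blocking, but the control flow is the same.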
It is understood that the emotion recognition manner for the driver in the vehicle includes, but is not limited to, emotion recognition based on image, emotion recognition based on voice, emotion recognition based on physiological data, and the like, and the embodiments of the present application are not particularly limited thereto.
In a specific implementation of image-based emotion recognition, the target vehicle includes a camera device (e.g., a driving recorder). In response to detecting the starting of the target vehicle, the vehicle-mounted terminal invokes the camera device to capture face images of the driver in the target vehicle based on the preset frequency, performs feature extraction on the captured face images to obtain the target face features of the driver, and performs emotion recognition based on those target face features to obtain the driver's emotion type.
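A toy sketch of the final recognition step, assuming face features have already been extracted as numeric vectors. The centroid values and emotion labels below are invented for illustration; a real system would learn them from labelled face images:

```python
import numpy as np

# Hypothetical per-emotion centroids of extracted face-feature vectors.
# These numbers are placeholders, not trained values.
EMOTION_CENTROIDS = {
    "happy": np.array([0.9, 0.1, 0.2]),
    "angry": np.array([0.1, 0.9, 0.8]),
    "sad":   np.array([0.2, 0.3, 0.9]),
}

def classify_emotion(face_features: np.ndarray) -> str:
    """Nearest-centroid classification: return the emotion whose
    centroid is closest (Euclidean distance) to the feature vector."""
    return min(EMOTION_CENTROIDS,
               key=lambda e: np.linalg.norm(face_features - EMOTION_CENTROIDS[e]))
```

The speech-based pipeline described next has the same shape, with speech features in place of face features.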
In a specific implementation of speech-based emotion recognition, the target vehicle includes a recording device (such as a microphone). In response to detecting the starting of the target vehicle, the vehicle-mounted terminal invokes the recording device to collect the speech of the driver in the target vehicle based on the preset frequency, performs feature extraction on the collected speech to obtain the target speech features of the driver, and performs emotion recognition based on those target speech features to obtain the driver's emotion type.
In this way, non-physiological characteristic data of the driver (i.e., expressions and speech) are collected. The collection is simple: it can be performed directly with a camera device (such as a driving recorder) and a recording device (such as a microphone) that are already common in vehicles, so recognition can be implemented purely as a software algorithm.
In the specific implementation of emotion recognition based on physiological data, the driver wears a portable terminal with a biometric function (such as a smart watch, smart bracelet or smart glasses) that periodically collects the driver's physiological data. The portable terminal is connected with the vehicle-mounted terminal and the two can communicate via Bluetooth or a network. In response to detecting the starting of the target vehicle, the vehicle-mounted terminal sends a physiological data acquisition request to the portable terminal at the preset frequency; the portable terminal responds by sending the driver's current physiological data to the vehicle-mounted terminal. The vehicle-mounted terminal receives the physiological data, performs feature extraction to obtain the driver's physiological features, and performs emotion recognition on those features to obtain the driver's emotion type. The portable terminal collects physiological data through biometric sensors; the data may include heart rate, skin impedance, respiratory rate, and the like, which the embodiment of the application does not limit. In a preferred embodiment, the collected physiological data comprise electrocardiogram, heart rate and skin conductance, from which a more accurate emotion recognition result is obtained.
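The feature-extraction step for physiological data might look like the sketch below. The feature names, thresholds, and the simple arousal rule are illustrative assumptions, not the patent's actual algorithm:

```python
import statistics

def extract_physio_features(heart_rates, skin_conductance):
    """Summarise raw physiological samples into a small feature dict.

    Inputs are lists of samples received from the portable terminal;
    the chosen summary statistics are an assumption for illustration.
    """
    return {
        "hr_mean": statistics.mean(heart_rates),
        "hr_std": statistics.stdev(heart_rates),    # crude HRV proxy
        "scl_mean": statistics.mean(skin_conductance),
    }

def is_aroused(features, hr_thresh=100.0, scl_thresh=8.0):
    """Toy rule: elevated heart rate plus elevated skin conductance
    suggests an over-stimulated state. Thresholds are invented."""
    return features["hr_mean"] > hr_thresh and features["scl_mean"] > scl_thresh
```

A real recognizer would feed such features into a trained classifier rather than fixed thresholds.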
In one possible design, the vehicle-mounted terminal can perform emotion recognition based on a combination of several of the above recognition modes, thereby improving the accuracy of driver emotion recognition.
102. If the emotion type of the driver is the target emotion type, calculating the target emotion intensity level of the driver;
It should be understood that the target emotion type is a pre-specified type of emotion that affects driving behavior. Anger, for example, affects the driver's judgment while driving and is most directly reflected in hard accelerator depression; sadness can cause the driver's attention to wander while driving, increasing driving safety risks.
It should be understood that, in the embodiment of the present application, the target emotion intensity level is calculated according to the data characteristics of the data sample on which emotion recognition is based.
For example, if emotion recognition is performed based on the driver's face image, the target face feature of the image is compared with standard face features, each representing an emotion intensity level, and the standard face feature most similar to the target face feature determines the corresponding emotion intensity level. A standard face feature captures the specific facial muscle movements and expression patterns produced under a specific emotional intensity: when happy, for instance, the corners of the mouth turn up and creases appear around the eyes; when angry, people may frown and widen their eyes. Quantifying these characteristics yields the different emotion intensity levels.
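The comparison against standard features described above can be sketched as a nearest-template lookup. Cosine similarity and the template vectors are assumptions chosen for illustration; the patent does not fix a similarity measure:

```python
import numpy as np

def intensity_level(target, standards):
    """Return the intensity level whose standard feature vector is
    most similar (by cosine similarity) to the target feature vector.

    `standards` maps level -> template vector; the templates would be
    derived from quantified expression patterns as described above.
    """
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(standards, key=lambda lvl: cos(target, standards[lvl]))
```

The same lookup works unchanged for speech features, as in the next example.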
For another example, people's speech differs across emotional states: when happy, for instance, the voice tends to be louder and more animated. If emotion recognition is performed based on the driver's speech, the target speech feature is compared with standard speech features, each representing an emotion intensity level, and the standard speech feature most similar to the target speech feature determines the corresponding emotion intensity level.
103. Acquiring a safe driving strategy corresponding to the target emotion intensity level;
104. and executing driving control on the target vehicle according to the safe driving strategy.
The content of the safe driving strategy is not specifically limited in the embodiment of the present application; it may include, for example, limiting the vehicle's speed, or switching between different driving modes (such as auxiliary driving and automatic driving).
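Steps 103 and 104 amount to a lookup from intensity level to strategy. A minimal sketch, using the three-level scheme from the summary; the enum and function names are illustrative, not the patent's:

```python
from enum import Enum

class DrivingMode(Enum):
    MANUAL_WITH_PROMPT = 1   # level 1: keep manual control, output a safe-driving prompt
    ASSISTED = 2             # level 2: switch to the auxiliary driving mode
    AUTONOMOUS = 3           # level 3: switch to the automatic driving mode

def strategy_for(intensity_level: int) -> DrivingMode:
    """Map an emotion-intensity level to its safe driving strategy."""
    mapping = {
        1: DrivingMode.MANUAL_WITH_PROMPT,
        2: DrivingMode.ASSISTED,
        3: DrivingMode.AUTONOMOUS,
    }
    return mapping[intensity_level]
```

The vehicle-mounted terminal would then execute driving control according to the returned mode.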
According to the method provided by the embodiment of the present application, the emotion type of the driver is detected; if the detected emotion type is the target emotion type, the driver's emotion intensity is further detected, the safe driving strategy corresponding to that emotion intensity is acquired, and driving control is performed on the vehicle based on the safe driving strategy. By actively sensing the driver's emotional state, the driving control mode is adjusted in time, the problem of aggressive driving caused by personal emotion is avoided, and driving safety is improved.
Referring to fig. 2, an embodiment of the present application provides another driving control method based on emotion recognition, where a target vehicle includes an electroencephalogram acquisition device, and the electroencephalogram acquisition device is disposed above a backrest of a driving seat, and the driving control method based on emotion recognition includes:
201. calling an electroencephalogram acquisition device to acquire electroencephalograms of a driver in the target vehicle based on a preset frequency in response to the detection of the starting of the target vehicle;
It should be understood that the electroencephalogram acquisition device needs to contact the driver's scalp to acquire electroencephalogram signals, so it can be placed above the backrest of the driving seat, where it can easily contact the back of the driver's head. The device may be a wet-electrode device, a dry-electrode device, or even a brain-computer interface device; the embodiment of the present application does not limit this. In addition, because electroencephalogram signals are very weak, the acquired signal of the driver can be amplified by a signal amplifier after acquisition.
202. Performing signal preprocessing on the acquired electroencephalogram signals;
It should be understood that signal preprocessing separates interference signals from the driver's electroencephalogram signal. In a specific implementation, because power-line interference and electromagnetic interference usually occupy the high-frequency part of the spectrum, these interference signals can be separated from the electroencephalogram signal with a high-pass or low-pass filter; for interference signals that are not easily removed by filtering, Principal Component Analysis (PCA) can be used to identify and separate them.
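A minimal sketch of this preprocessing step is shown below, using a Butterworth band-pass filter (one composite realization of the high-pass/low-pass filtering described above) followed by PCA across channels; the sampling rate, cutoff frequencies, and filter order are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import PCA

def preprocess_eeg(eeg, fs=250.0, low=1.0, high=40.0):
    """Filter out drift and high-frequency interference, then decompose
    channels with PCA.

    eeg: array of shape (n_samples, n_channels).
    Returns (filtered signal, PCA component scores)."""
    # Zero-phase band-pass filtering suppresses slow drift below `low`
    # and power-line/electromagnetic interference above `high`.
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=0)
    # PCA separates the multichannel signal into uncorrelated components,
    # from which residual interference components can be identified.
    pca = PCA(n_components=min(eeg.shape))
    components = pca.fit_transform(filtered)
    return filtered, components
```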
203. Performing feature extraction on the preprocessed electroencephalogram signals to obtain target electroencephalogram features;
In the embodiment of the present application, Fast Fourier Transform (FFT) or Wavelet Transform (WT) is performed on the preprocessed electroencephalogram signal to obtain a time-frequency feature reflecting both time-domain and frequency-domain information, which serves as the target electroencephalogram feature.
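One simple FFT-based realization of the frequency-domain part of such features is per-band spectral power over the canonical EEG bands; the band boundaries below are conventional values, not ones specified by the application:

```python
import numpy as np

# Canonical EEG frequency bands in Hz; exact boundaries vary slightly
# across the literature.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(signal, fs=250.0):
    """Average FFT power per EEG band for a single-channel signal."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    # One feature per band: mean power of the bins inside the band.
    return np.array([power[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])
```

Computing these band powers over sliding windows, or using a wavelet transform instead, would add the time-domain dimension of the time-frequency feature.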
204. Performing emotion recognition based on the target electroencephalogram characteristics to obtain the emotion type of the driver;
In a specific implementation, the emotion type of the driver is obtained by smoothing the target electroencephalogram feature, performing feature dimension reduction on the smoothed feature, and finally classifying or clustering the dimension-reduced feature based on its statistical characteristics. For example, supervised learning can be performed on electroencephalogram features and their corresponding emotions based on a machine-learning classification model (such as naive Bayes, a support vector machine, or a decision tree) to obtain an emotion recognition model; the target electroencephalogram feature to be recognized is then input into the emotion recognition model for emotion classification, yielding the emotion type of the driver. As another example, the target electroencephalogram feature can be added to a preset electroencephalogram feature set and clustered without supervision based on a clustering algorithm to obtain the corresponding emotion type.
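The supervised route can be sketched as follows with a support vector machine, one of the classifiers mentioned above. The synthetic training data, feature dimensionality, and emotion labels are all toy stand-ins for a real labeled dataset:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Toy stand-in for labeled (EEG feature, emotion) pairs: two well-separated
# Gaussian clusters. A real model would be trained on recorded sessions.
rng = np.random.default_rng(0)
calm = rng.normal(loc=0.0, scale=0.3, size=(50, 5))
angry = rng.normal(loc=1.5, scale=0.3, size=(50, 5))
X = np.vstack([calm, angry])
y = np.array(["calm"] * 50 + ["angry"] * 50)

# Scaling plus SVM classifier, trained with supervised learning.
model = make_pipeline(StandardScaler(), SVC()).fit(X, y)

def recognize_emotion(target_feature):
    """Classify a single target EEG feature vector into an emotion type."""
    return model.predict(target_feature.reshape(1, -1))[0]
```

The unsupervised alternative would replace the classifier with a clustering algorithm (e.g. k-means) over the preset feature set plus the new target feature.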
205. If the emotion type of the driver is the target emotion type, calculating a target emotion intensity level of the driver;
206. acquiring a safe driving strategy corresponding to the target emotion intensity level;
207. and executing driving control on the target vehicle according to the safe driving strategy.
Steps 205 to 207 are similar to steps 102 to 104 described above and are not repeated here.
According to the method provided by the embodiment of the present application, brain-wave data of the driver (that is, signals generated by the central nervous system) are collected. Because these data are measured directly from the human biological nervous system, they cannot be artificially disguised in the way facial expressions or speech can, which improves the accuracy of driver emotion recognition.
Referring to fig. 3, an embodiment of the present application provides another driving control method based on emotion recognition, including:
301. in response to detecting the starting of the target vehicle, performing emotion recognition on a driver in the target vehicle based on a preset frequency to obtain an emotion type of the driver;
302. if the emotion type of the driver is the target emotion type, calculating a target emotion intensity level of the driver;
303. acquiring a safe driving strategy corresponding to the target emotion intensity level;
Steps 301 to 303 are similar to steps 101 to 103 described above and are not repeated here.
304. Performing driving control on the target vehicle according to a safe driving strategy, wherein performing driving control on the target vehicle according to the safe driving strategy includes: and if the emotion intensity level is a first level, outputting a safe driving prompt to prompt a driver, if the emotion intensity level is a second level, switching the target vehicle to an auxiliary driving mode, and if the emotion intensity level is a third level, switching the target vehicle to an automatic driving mode.
It should be appreciated that when the emotion intensity level is the first level, the driver is still allowed to perform driving control of the target vehicle fully manually, while a safe driving reminder is displayed on the display of the in-vehicle terminal or played to the driver through the in-vehicle speaker device.
When the emotion intensity is at the second level, the target vehicle is switched to the assisted driving mode: while the driver retains manual control of the target vehicle, driving assistance, such as lane-keeping assistance and braking assistance, is provided when necessary.
When the emotional intensity is at the third level, the target vehicle is switched to an automatic driving mode, in which driving control is completely executed by an automatic driving system of the target vehicle, and the automatic driving system may perform automatic driving control based on a touch sensor or based on a visual image, which is not limited in the embodiment of the present application.
Based on the method provided by the embodiment of the present application, when the driver's emotion intensity is low, the likelihood of aggressive driving is judged to be low, so the vehicle remains under the driver's manual control and the driver is reminded to drive safely; when the driver's emotion intensity is moderate, the vehicle is switched to the assisted driving mode; and when the driver's emotion intensity is sensed to be too high, the vehicle is switched to the automatic driving mode, thereby improving driving safety.
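The three-level policy above amounts to a simple mapping from intensity level to driving mode, sketched below; the enum and function names are illustrative, not from the application:

```python
from enum import Enum

class DrivingMode(Enum):
    MANUAL_WITH_REMINDER = 1   # driver keeps full control; reminder shown/played
    ASSISTED = 2               # lane keeping, brake assist on top of manual control
    AUTONOMOUS = 3             # automatic driving system takes over

def safe_driving_strategy(intensity_level):
    """Map a target emotion intensity level to the safe driving strategy,
    mirroring the first/second/third-level policy described above."""
    if intensity_level == 1:
        return DrivingMode.MANUAL_WITH_REMINDER
    if intensity_level == 2:
        return DrivingMode.ASSISTED
    if intensity_level == 3:
        return DrivingMode.AUTONOMOUS
    raise ValueError(f"unknown emotion intensity level: {intensity_level}")
```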
The foregoing describes the driving control method based on emotion recognition in the embodiment of the present application. Referring to fig. 4, the following describes a driving control device based on emotion recognition provided by an embodiment of the present application, including: an emotion type recognition module 401, configured to perform emotion recognition on a driver in a target vehicle based on a preset frequency in response to detecting the starting of the target vehicle, to obtain an emotion type of the driver; an emotion intensity recognition module 402, configured to calculate a target emotion intensity level of the driver if the emotion type of the driver is the target emotion type; a driving strategy obtaining module 403, configured to obtain a safe driving strategy corresponding to the target emotion intensity level; and a driving control module 404, configured to perform driving control on the target vehicle according to the safe driving strategy.
According to the device provided by the embodiment of the present application, the emotion type of the driver is detected; if the detected emotion type is the target emotion type, the driver's emotion intensity is further detected, the safe driving strategy corresponding to that emotion intensity is acquired, and driving control is performed on the vehicle based on the safe driving strategy. By actively sensing the driver's emotional state, the driving control mode is adjusted in time, the problem of aggressive driving caused by personal emotion is avoided, and driving safety is improved.
Referring to fig. 5, an embodiment of the present application provides another driving control device based on emotion recognition, including: the emotion type recognition module 401 is used for responding to the detection of the starting of the target vehicle, executing emotion recognition on a driver in the target vehicle based on a preset frequency, and obtaining an emotion type of the driver; the emotion intensity recognition module 402 is used for calculating a target emotion intensity level of the driver if the emotion type of the driver is the target emotion type; a driving strategy obtaining module 403, configured to obtain a safe driving strategy corresponding to the target emotion intensity level; a driving control module 404 for performing driving control on the target vehicle according to a safe driving strategy.
In one possible design, the target vehicle includes an electroencephalogram acquisition device, the electroencephalogram acquisition device is placed above the backrest of the driving seat, and the emotion recognition module 401 specifically includes: the electroencephalogram signal acquisition unit 4011 is used for responding to the starting of the detected target vehicle and calling an electroencephalogram acquisition device to acquire electroencephalogram signals for a driver in the target vehicle based on a preset frequency; the electroencephalogram signal preprocessing unit 4012 is used for executing signal preprocessing on the acquired electroencephalogram signals; the electroencephalogram feature extraction unit 4013 is used for performing feature extraction on the preprocessed electroencephalogram signal to obtain a target electroencephalogram feature; the electroencephalogram feature recognition unit 4014 is used for performing emotion recognition based on the target electroencephalogram feature to obtain the emotion type of the driver.
In one possible design, the electroencephalogram feature recognition unit 4014 is specifically configured to: smoothing the target electroencephalogram characteristics; performing feature dimensionality reduction on the smoothed target electroencephalogram features; and classifying or clustering the target electroencephalogram characteristics subjected to characteristic dimension reduction based on the data statistical characteristics to obtain the emotion type of the driver.
In one possible design, the driving control module 404 is specifically configured to: if the emotion intensity level is a first level, outputting a safe driving prompt to prompt a driver, if the emotion intensity level is a second level, switching the target vehicle to an auxiliary driving mode, and if the emotion intensity level is a third level, switching the target vehicle to an automatic driving mode.
According to the device provided by the embodiment of the present application, the modular design lets each part of the driving control device based on emotion recognition focus on realizing a single function, maximizing hardware performance; at the same time, the modular design reduces the coupling between the modules of the device, making it easier to maintain.
Fig. 4 and fig. 5 describe the driving control device based on emotion recognition in the embodiment of the present application in detail from the perspective of modular functional entities; the following describes the driving control device based on emotion recognition in the embodiment of the present application in detail from the perspective of hardware processing.
Fig. 6 is a schematic structural diagram of a driving control device 600 based on emotion recognition according to an embodiment of the present application. The device may differ considerably in configuration and performance and may include one or more processors 610 and a memory 620, as well as one or more storage media 630 (for example, one or more mass storage devices) storing applications 633 or data 632. The memory 620 and the storage medium 630 may be transient or persistent storage. The program stored in the storage medium 630 may include one or more modules (not shown), each of which may include a series of instruction operations for the driving control device 600. Further, the processor 610 may be configured to communicate with the storage medium 630 and execute the series of instruction operations in the storage medium 630 on the driving control device 600.
The driving control device 600 based on emotion recognition may also include one or more power supplies 640, one or more wired or wireless network interfaces 650, one or more input/output interfaces 660, and/or one or more operating systems 631, such as Windows Server, Mac OS X, Unix, Linux, and FreeBSD. Those skilled in the art will appreciate that the structure shown in fig. 6 does not constitute a limitation on the driving control device based on emotion recognition, which may include more or fewer components than shown, combine some components, or arrange the components differently.
The present application also provides a driving control device based on emotion recognition, comprising a memory and a processor. The memory stores computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of the driving control method based on emotion recognition in the above embodiments.
The present application also provides a computer-readable storage medium, which may be a non-volatile computer-readable storage medium, and which may also be a volatile computer-readable storage medium, having stored therein instructions, which, when run on a computer, cause the computer to execute the steps of the driving control method based on emotion recognition.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the part thereof that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The application is operational with numerous general purpose or special purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. A driving control method based on emotion recognition is applied to a vehicle-mounted terminal, and is characterized by comprising the following steps:
in response to detecting the starting of a target vehicle, performing emotion recognition on a driver in the target vehicle based on a preset frequency to obtain an emotion type of the driver;
if the emotion type of the driver is the target emotion type, calculating a target emotion intensity level of the driver;
acquiring a safe driving strategy corresponding to the target emotion intensity level;
and executing driving control on the target vehicle according to the safe driving strategy.
2. The driving control method based on emotion recognition according to claim 1, wherein a camera device is included in the target vehicle, and the performing emotion recognition on a driver in the target vehicle based on a preset frequency in response to detection of activation of the target vehicle, obtaining a current emotion type of the driver includes:
calling the camera device to acquire a face image of a driver in a target vehicle based on a preset frequency in response to the detection of the starting of the target vehicle;
performing feature extraction on the collected face image to obtain a target face feature of the driver;
and executing emotion recognition based on the target face characteristics of the driver to obtain the emotion type of the driver.
3. The driving control method based on emotion recognition of claim 1, wherein a recording device is included in the target vehicle, and the performing emotion recognition on a driver in the target vehicle based on a preset frequency in response to detection of start of the target vehicle, the obtaining of the current emotion type of the driver includes:
in response to detecting the start of a target vehicle, calling the sound recording device to perform voice acquisition on a driver in the target vehicle based on a preset frequency;
performing feature extraction on the collected voice to obtain a target voice feature of the driver;
and executing emotion recognition based on the target voice characteristics of the driver to obtain the emotion type of the driver.
4. The driving control method based on emotion recognition of claim 1, wherein the driver wears a portable terminal having a biometric function, the portable terminal periodically collects physiological data of the driver, the portable terminal is electrically connected to the in-vehicle terminal, and in response to detection of start of a target vehicle, emotion recognition is performed on the driver in the target vehicle based on a preset frequency, and obtaining the current emotion type of the driver includes:
in response to the detection of the starting of the target vehicle, sending a physiological data acquisition request to the portable terminal based on a preset frequency, wherein the portable terminal responds to the physiological data acquisition request and sends the current physiological data of the driver to the vehicle-mounted terminal;
receiving physiological data sent by the portable terminal, and performing feature extraction on the physiological data to obtain physiological features of the driver;
and performing emotion recognition based on the physiological characteristics of the driver to obtain the emotion type of the driver.
5. The driving control method based on emotion recognition of claim 1, wherein the target vehicle includes an electroencephalogram acquisition device, the electroencephalogram acquisition device is disposed above a backrest of a driving seat, and in response to detecting the start of the target vehicle, performing emotion recognition on a driver in the target vehicle based on a preset frequency to obtain a current emotion type of the driver includes:
calling the electroencephalogram acquisition device to acquire electroencephalogram signals of a driver in a target vehicle based on a preset frequency in response to the detection of the starting of the target vehicle;
performing signal preprocessing on the acquired electroencephalogram signals;
performing feature extraction on the preprocessed electroencephalogram signals to obtain target electroencephalogram features;
and executing emotion recognition based on the target electroencephalogram characteristics to obtain the emotion type of the driver.
6. The driving control method based on emotion recognition as recited in claim 5, wherein said performing emotion recognition based on the preprocessed electroencephalogram signal, and obtaining the emotion type of the driver includes:
smoothing the target electroencephalogram feature;
performing feature dimension reduction on the smoothed target electroencephalogram features;
and classifying or clustering the target electroencephalogram features subjected to feature dimensionality reduction based on the data statistical characteristics to obtain the emotion types of the drivers.
7. The emotion recognition-based driving control method according to any one of claims 1 to 6, wherein the performing of driving control on the target vehicle in accordance with the safe driving policy includes:
if the emotion intensity level is a first level, outputting a safe driving prompt to prompt the driver, if the emotion intensity level is a second level, switching the target vehicle to an auxiliary driving mode, and if the emotion intensity level is a third level, switching the target vehicle to an automatic driving mode.
8. A driving control apparatus based on emotion recognition, characterized by comprising:
the emotion type recognition module is used for responding to the detection of the starting of a target vehicle and executing emotion recognition on a driver in the target vehicle on the basis of a preset frequency to obtain the emotion type of the driver;
the emotion intensity recognition module is used for calculating a target emotion intensity level of the driver if the emotion type of the driver is a target emotion type;
the driving strategy obtaining module is used for obtaining a safe driving strategy corresponding to the target emotion intensity level;
and the driving control module is used for executing driving control on the target vehicle according to the safe driving strategy.
9. A driving control apparatus based on emotion recognition, characterized in that the driving control apparatus based on emotion recognition includes: a memory and at least one processor, the memory having instructions stored therein;
the at least one processor invokes the instructions in the memory to cause the emotion recognition based driving control apparatus to perform the steps of the emotion recognition based driving control method as recited in any one of claims 1 to 7.
10. A computer readable storage medium having instructions stored thereon, which when executed by a processor implement the steps of the driving control method based on emotion recognition according to any of claims 1-7.
CN202211252790.3A 2022-10-13 2022-10-13 Driving control method, device and equipment based on emotion recognition and storage medium Pending CN115626167A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211252790.3A CN115626167A (en) 2022-10-13 2022-10-13 Driving control method, device and equipment based on emotion recognition and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211252790.3A CN115626167A (en) 2022-10-13 2022-10-13 Driving control method, device and equipment based on emotion recognition and storage medium

Publications (1)

Publication Number Publication Date
CN115626167A true CN115626167A (en) 2023-01-20

Family

ID=84905638

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211252790.3A Pending CN115626167A (en) 2022-10-13 2022-10-13 Driving control method, device and equipment based on emotion recognition and storage medium

Country Status (1)

Country Link
CN (1) CN115626167A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117056799A (en) * 2023-08-03 2023-11-14 广东省机场管理集团有限公司工程建设指挥部 Processing method, device, equipment and medium for vehicle sensor data
CN117056799B (en) * 2023-08-03 2024-03-26 广东省机场管理集团有限公司工程建设指挥部 Processing method, device, equipment and medium for vehicle sensor data

Similar Documents

Publication Publication Date Title
Ali et al. Emotion recognition involving physiological and speech signals: A comprehensive review
US8400313B2 (en) Vehicle driver sleep state classification generating device based on Hidden Markov Model, sleep state classification device and warning device
Pratama et al. A review on driver drowsiness based on image, bio-signal, and driver behavior
CN108693973B (en) Emergency condition detection system fusing electroencephalogram signals and environmental information
US9603556B2 (en) Device and method for continuous biometric recognition based on electrocardiographic signals
EP3825826A1 (en) Method and system for providing a brain computer interface
US20120083668A1 (en) Systems and methods to modify a characteristic of a user device based on a neurological and/or physiological measurement
US20020077534A1 (en) Method and system for initiating activity based on sensed electrophysiological data
US20090062679A1 (en) Categorizing perceptual stimuli by detecting subconcious responses
US20070173733A1 (en) Detection of and Interaction Using Mental States
KR101518575B1 (en) Analysis method of user intention recognition for brain computer interface
Zou et al. Constructing multi-scale entropy based on the empirical mode decomposition (EMD) and its application in recognizing driving fatigue
EP3932303A1 (en) Method and system for detecting attention
CN112488002B (en) Emotion recognition method and system based on N170
CN115626167A (en) Driving control method, device and equipment based on emotion recognition and storage medium
CN109646024A (en) Method for detecting fatigue driving, device and computer readable storage medium
KR20210116309A (en) Techniques for separating driving emotion from media induced emotion in a driver monitoring system
Kumar et al. Neuro-phone: An assistive framework to operate Smartphone using EEG signals
Villa et al. Survey of biometric techniques for automotive applications
Lv et al. Design and implementation of an eye gesture perception system based on electrooculography
CN108491792B (en) Office scene human-computer interaction behavior recognition method based on electro-oculogram signals
Trigka et al. A survey on signal processing methods for EEG-based brain computer interface systems
Singh et al. Emotion recognition using electroencephalography (EEG): a review
Selvathi FPGA based human fatigue and drowsiness detection system using deep neural network for vehicle drivers in road accident avoidance system
WO2023027578A1 (en) Nose-operated head-mounted device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination