CN113516015B - Emotion recognition method, driving assisting device and application - Google Patents


Info

Publication number
CN113516015B
CN113516015B
Authority
CN
China
Prior art keywords
face
driver
area
tension
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110410762.9A
Other languages
Chinese (zh)
Other versions
CN113516015A (en)
Inventor
冯桑
邱宏波
王炳成
林启万
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202110410762.9A priority Critical patent/CN113516015B/en
Publication of CN113516015A publication Critical patent/CN113516015A/en
Application granted granted Critical
Publication of CN113516015B publication Critical patent/CN113516015B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Traffic Control Systems (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Image Analysis (AREA)

Abstract

The application provides an emotion recognition method, a driving assistance device and an application thereof, comprising the following steps: collecting a face image of the driver in a normal sitting posture; extracting the face area, eye area and mouth feature from that image as reference parameters for expression recognition; and, while the vehicle is running, periodically capturing face images of the driver and comparing the face area, eye area and mouth feature of each image with those of the normal-sitting-posture image to recognize tension. Because the driver's emotion is computed and recognized through an in-vehicle camera, no direct contact with the body is needed, so the driver's driving behavior is not disturbed. During emotion recognition while the vehicle is running, the face area, eye area and mouth feature are evaluated together, so that misjudgment caused by the miscalculation or omission of any single parameter is avoided, ensuring the accuracy and reliability of emotion recognition.

Description

Emotion recognition method, driving assisting device and application
Technical Field
The application relates to the field of automobile safety and intelligent driving assistance, and in particular to an emotion recognition method, a driving assistance device and an application thereof.
Background
A driver stays in a narrow, enclosed space for long periods; once complicated road conditions such as traffic jams are encountered, emotional changes are easily triggered, which increases the probability of traffic accidents.
At present, a subject's emotional and physiological information can be obtained directly through a galvanic skin response test, but this method requires constant contact with the body and disturbs driving behavior, so it is unsuitable for collecting a driver's emotions.
Therefore, as automobiles become increasingly intelligent, there is a strong demand for an emotion recognition system that recognizes a driver's tension and takes the necessary measures to ensure driving comfort and road safety.
Disclosure of Invention
The application aims to provide an emotion recognition method, a driving assistance device and an application thereof, which can judge the driver's emotion from video images without touching the body and without interfering with driving behavior.
The technical scheme of the application is as follows: a method of emotion recognition, comprising:
collecting a face image of the driver in a normal sitting posture;
extracting the face area, eye area and mouth feature from the face image as parameters for expression recognition;
and, while the vehicle is running, periodically capturing face images of the driver and comparing the face area, eye area and mouth feature of each image with those of the normal-sitting-posture image to recognize tension.
The step of comparing the face area, the eye area and the mouth characteristics in each face image with the face image under the normal sitting posture for tension emotion recognition includes:
setting: area S of face area to be measured face Area S of eye area to be measured eye And the mouth characteristic lambda to be measured;
face area S' face Area S 'of eye region' eye And a mouth characteristic lambda';
if S face ≥1.2S’ face And/orAnd/or +.>When the driver is in tension, the driver is determined.
After the "driver is determined to be in tension", the method includes:
according to the time that the driver is in the tension state, the vehicle is controlled step by step so as to ensure driving safety.
The "controlling the vehicle according to the time when the driver is in tension" includes:
the control time is from short to long and is respectively as follows: the audio-visual system of the automobile is controlled to play the light music expressing the emotion, or the automobile is limited in speed or the double flash of the automobile is opened while the automobile is limited in speed.
An application of the emotion recognition method as described above to the recognition of a driver's tension.
An auxiliary driving device to which the emotion recognition method as described above is applied, comprising:
a microprocessor module;
the in-vehicle camera is electrically connected with the micro-processing module and is used for collecting images of a driver in the vehicle;
the data transmission module is electrically connected with the micro-processing module and is used for transmitting the control instructions sent by the micro-processing module when the driver's tension is detected;
and the execution module is electrically connected with the data transmission module and is used for executing the control instruction transmitted by the data transmission module.
The execution module comprises:
the audio-video system is arranged in the vehicle to be controlled and is used for working according to the control instruction sent by the micro-processing module;
the accelerator controller is arranged in the vehicle to be controlled and is used for working according to the control instruction sent by the micro-processing module;
and the double-flash controller is arranged on the vehicle to be controlled and is used for working according to the control instruction sent by the micro-processing module.
The above-mentioned driving support device further includes:
and the satellite and communication module is in wireless communication with the micro-processing module and is used for sending information.
The above-mentioned driving support device further includes:
the storage module is arranged between the micro-processing module and the in-car camera and used for storing the picture information acquired by the in-car camera.
The above-mentioned driving support device further includes:
and the power supply module is respectively and electrically connected with the micro-processing module, the in-car camera, the data transmission module and the execution module and is used for supplying power.
The beneficial effects of the application include at least the following. The driver's emotion is computed and recognized through an in-vehicle camera, without direct contact with the body, so the driver's driving behavior is not disturbed. During emotion recognition while the vehicle is running, the face area, eye area and mouth feature are evaluated together, so that misjudgment caused by the miscalculation or omission of any single parameter is avoided, ensuring the accuracy and reliability of emotion recognition. By adopting a three-level control mechanism, traffic accidents caused by driver tension are avoided to a certain extent, ensuring the safety of the driver and the surrounding road.
Drawings
FIG. 1 is a flow chart of the method of the present application.
Fig. 2 is a schematic view of the mouth feature of the method of the present application.
Fig. 3 is another schematic view of the mouth feature of the method of the present application.
Fig. 4 is a further schematic representation of the mouth feature of the method of the present application.
Fig. 5 is a schematic diagram of face recognition in the method of the present application.
Fig. 6 is a block diagram of the driving assistance apparatus according to the present application.
Detailed Description
The application is further described below with reference to the accompanying drawings.
As shown in fig. 1, a method for emotion recognition includes:
collecting a face image of the driver in a normal sitting posture; extracting the face area, eye area and mouth feature from the face image as parameters for expression recognition; and, while the vehicle is running, periodically capturing face images of the driver and comparing the face area, eye area and mouth feature of each image with those of the normal-sitting-posture image to recognize tension.
As shown in fig. 2 to fig. 5, the shape of the mouth differs with emotion, so the mouth feature λ can be quantified as follows: define the coordinates of the two mouth corners as (a1, b1) and (a2, b2); connecting the two points gives a segment c, whose midpoint is ((a1+a2)/2, (b1+b2)/2); let the highest point of the mouth be (a3, b3); then λ = b3 − (b1+b2)/2. The mouth feature λ decreases when the driver is angry and increases when the driver smiles, so the driver's emotion can be judged from it.
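As a minimal sketch, the λ computation above can be written in a few lines of Python (the function name and the example pixel coordinates are illustrative, not taken from the patent):

```python
def mouth_feature(corner1, corner2, top):
    """Mouth feature λ per the patent's definition: the vertical offset of
    the mouth's highest point (a3, b3) from the midpoint of the segment c
    joining the two mouth corners (a1, b1) and (a2, b2)."""
    (_a1, b1), (_a2, b2) = corner1, corner2
    _a3, b3 = top
    return b3 - (b1 + b2) / 2.0

# Illustrative pixel coordinates (not from the patent):
lam = mouth_feature((40, 60), (80, 60), (60, 50))
print(lam)  # -10.0
```

With the usual image convention of y increasing downward, a mouth top above the corner line yields a negative λ; the comparisons in the method only use λ relative to its reference value, so the sign convention cancels out.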
A face image of the driver in a normal sitting posture is collected, and the face-region area S'_face, the eye-region area S'_eye and the mouth feature λ' of that image are recorded as the reference parameters for expression recognition and stored.
While the automobile is running, the in-vehicle camera captures the driver's face image in real time at 4 frames/s, and each frame is preprocessed and its features extracted: the face-region area S_face to be measured, the eye-region area S_eye to be measured, and the mouth feature λ to be measured. When S_face ≥ 1.2·S'_face, the driver is leaning forward, showing a tendency toward tension, and is pre-judged as "tense"; when S_face ≤ 0.6·S'_face, the driver is leaning backward and calm, and is pre-judged as "not tense"; when 0.6·S'_face < S_face < 1.2·S'_face, no pre-judgment is made and detection continues;
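The three-way pre-judgment on the face area can be sketched as follows, using the 1.2 and 0.6 coefficients quoted in the text (the function name is illustrative, not from the patent):

```python
def prejudge_by_face_area(s_face, s_face_ref):
    """Pre-judge tension from the face-region area alone:
    leaning forward enlarges the face in the image (tense tendency),
    leaning backward shrinks it (calm); in between, keep detecting."""
    if s_face >= 1.2 * s_face_ref:
        return "tense"      # driver leaning forward
    if s_face <= 0.6 * s_face_ref:
        return "not tense"  # driver leaning backward
    return None             # no pre-judgment; continue detection

print(prejudge_by_face_area(130.0, 100.0))  # tense
print(prejudge_by_face_area(50.0, 100.0))   # not tense
print(prejudge_by_face_area(100.0, 100.0))  # None
```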
Because the eye-region area S_eye to be measured changes as the body leans forward or backward, it would otherwise introduce interference, so it is evaluated in combination with the face-region area: when S_eye/S_face ≥ 0.2·S'_eye/S'_face, the driver's eyes tend to be open and "tense" is pre-judged; when the ratio falls below this threshold, the driver's eyes tend to close and "not tense" is pre-judged;
For the mouth feature λ to be measured, the interference caused by the driver leaning forward or backward is likewise considered, and it is evaluated in combination with the face-region area: when λ/S_face ≥ 1.5·λ'/S'_face, the driver is puckering and "tense" is pre-judged; when λ/S_face falls below a corresponding lower threshold, the driver is smiling and "not tense" is pre-judged; in between, no pre-judgment is made and detection continues.
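Combining the three face-area-normalized conditions gives the overall decision stated in claim 1 (the conjunctive "and" form; the disclosure also contemplates an and/or variant). A sketch with illustrative names and numbers:

```python
def is_tense(s_face, s_eye, lam, s_face_ref, s_eye_ref, lam_ref):
    """Claim-1 style decision: forward lean, widened eyes and puckered mouth,
    with eye and mouth terms normalized by the face-region area so that
    lean-induced scale changes cancel out."""
    face_cond = s_face >= 1.2 * s_face_ref
    eye_cond = (s_eye / s_face) >= 0.2 * (s_eye_ref / s_face_ref)
    mouth_cond = (lam / s_face) >= 1.5 * (lam_ref / s_face_ref)
    return face_cond and eye_cond and mouth_cond

# Reference (normal sitting posture) vs. measured values, illustrative only:
print(is_tense(130.0, 15.0, 5.0, 100.0, 10.0, 2.0))  # True
print(is_tense(100.0, 15.0, 5.0, 100.0, 10.0, 2.0))  # False: no forward lean
```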
According to how long the driver remains in a tense state, the vehicle is controlled step by step following the emotion calculation rule table below, so as to ensure driving safety, as follows:
(1) If 16 consecutive frames (4 s) are pre-judged as "tense", first-level tension is determined; the micro-processing module then sends an instruction to the automobile's audio-visual system to play soothing light music until the "tense" pre-judgment is released. (2) If 32 consecutive frames (8 s, including the previous 16) are pre-judged as "tense", second-level tension is determined; the micro-processing module then sends a speed-limiting instruction to the automobile's central control system (if the current speed is 80 km/h or above, the speed is limited to at most 80 km/h; if below 80 km/h, to at most 60 km/h) until the "tense" pre-judgment is released, and sends positioning information to a remote server. (3) If 48 consecutive frames (12 s, including the previous 32) are pre-judged as "tense", third-level tension is determined; the vehicle speed remains limited and the hazard lights are turned on until the "tense" pre-judgment is released.
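The three-level escalation above can be sketched as a small state machine over the per-frame pre-judgments (4 frames/s). The class name and the assumption that any non-"tense" frame releases the streak are illustrative, not spelled out in the patent:

```python
class TensionEscalator:
    """Track consecutive 'tense' pre-judgments and map them to the patent's
    three control levels: 16 frames (4 s), 32 frames (8 s), 48 frames (12 s)."""

    ACTIONS = [(48, "limit speed and turn on hazard lights"),
               (32, "limit speed"),
               (16, "play soothing light music")]

    def __init__(self):
        self.streak = 0  # consecutive frames pre-judged "tense"

    def update(self, prejudgment):
        """Feed one per-frame pre-judgment; return (tension level, action)."""
        self.streak = self.streak + 1 if prejudgment == "tense" else 0
        for level, (frames, action) in zip((3, 2, 1), self.ACTIONS):
            if self.streak >= frames:
                return level, action
        return 0, None

esc = TensionEscalator()
for _ in range(16):         # 16 frames = 4 s of sustained "tense"
    result = esc.update("tense")
print(result)  # (1, 'play soothing light music')
```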
Table I: emotion calculation rule table:
First-level tension: 16 consecutive "tense" frames (4 s); play soothing light music.
Second-level tension: 32 consecutive "tense" frames (8 s); limit the vehicle speed and send positioning information to the remote server.
Third-level tension: 48 consecutive "tense" frames (12 s); keep the speed limit and turn on the hazard lights.
as shown in fig. 6, an auxiliary driving device for identifying tension emotion of a driver, a camera, a storage module, a micro-processing module, a positioning and communication module, a power supply module, a data transmission module and an execution module in a vehicle, wherein: the in-car camera is used for collecting face images of a driver and has an infrared light supplementing function, so that the recognition can be carried out at night and in a darker environment; the micro-processing module is used for preprocessing images acquired by the camera in the vehicle, and has the core functions of emotion calculation and control strategy making according to calculation results; the storage module is used for storing the comparison rules of face area, eye area, mouth characteristics and emotion calculation acquired by the camera in the vehicle; the positioning and communication module can receive the positioning signal and send information to a remote server (such as a vehicle transportation management system) according to a control strategy; the power supply module comprises a rechargeable battery; the data transmission module is used for sending the control instruction sent by the micro-processing module to a central control system of the automobile, such as an execution module; the power supply module provides required power for the camera, the storage module, the micro-processing module, the positioning and communication module, the data transmission module and the like; the data transmission module and the power supply module are connected to the central control terminal of the automobile by using a bus to realize the functions of data transmission and charging, and simultaneously ensure the real-time transmission of control instructions.
It should be clarified that the micro-processing module may be a single-chip microcomputer module, and that a dedicated PCB may electrically connect it with the storage module, the positioning and communication module, the data transmission module and so on.
It should also be clarified that the driving assistance device provided by the application can be arranged in a single housing and installed as a unit. For convenient use, the housing can be attached to the vehicle's interior rear-view mirror by a detachable connecting part; the connecting part may be a clamp.
The working process of the driving assistance device provided by the application is as follows: after the driver gets in and adjusts the sitting posture, the vehicle is started and the driving assistance device begins to work; the micro-processing module preprocesses the face image captured by the in-vehicle camera, extracts the reference face-region area, eye-region area and mouth feature used for emotion calculation, and stores these data in the storage module. While the automobile is running, the micro-processing module preprocesses each frame to extract its features, computes and compares the emotion parameters, and sends instructions to control the vehicle according to the comparison result. After the automobile is switched off, the emotion recognition system turns off automatically.
The emotion recognition method can be applied to the technical direction of tension emotion recognition of automobile drivers.
The foregoing describes preferred embodiments of the application and is not intended to limit it; any modifications, equivalent substitutions and improvements made within the spirit and principles of the application shall fall within its scope of protection.

Claims (9)

1. A method of emotion recognition, comprising:
collecting a face image of a driver in a normal sitting posture;
collecting face area, eye area and mouth characteristics in the face image as parameters for expression recognition;
in the running process of the vehicle, face images of a driver are periodically collected, and the face area, the eye area and the mouth characteristics in each face image are compared with the face images under normal sitting postures to identify tension;
the step of comparing the face area, the eye area and the mouth characteristics in each face image with the face image under the normal sitting posture for tension emotion recognition includes:
setting: area of face region to be measuredS face Area of eye area to be measuredS eye And the mouth characteristics to be measuredλ;
Area of face regionS’ face Eye areaArea ofS’ eye And mouth characteristicsλ’;
If the number of the groups of groups is equal,S face ≥1.2S’ face and≥0.2/>and->≥1.5/>When the driver is determined to be in a tension state;
wherein, the coordinates of the two side mouth angles are defined as%)、(/>) Connecting these two points gives a segment c whose midpoint is (+.>) The highest point coordinate of the mouth is (++>) Thenλ=/>-/>
2. The emotion recognition method according to claim 1, wherein after the determination that the driver is in a state of tension, comprising:
according to the time that the driver is in the tension state, the vehicle is controlled step by step so as to ensure driving safety.
3. A method of emotion recognition according to claim 2, wherein said controlling the vehicle according to the time when the driver is in tension includes:
the control time is from short to long and is respectively as follows: the audio-visual system of the automobile is controlled to play the light music expressing the emotion, or the automobile is limited in speed or the double flash of the automobile is opened while the automobile is limited in speed.
4. Use of the emotion recognition method as defined in any one of claims 1 to 3 in a direction of recognition of a tension emotion of a driver.
5. A driving assisting apparatus applying the emotion recognition method according to any one of claims 1 to 3, characterized by comprising:
a microprocessor module;
the in-vehicle camera is electrically connected with the micro-processing module and is used for collecting images of a driver in the vehicle;
the data transmission module is electrically connected with the micro-processing module and is used for transmitting the control instructions sent by the micro-processing module when the driver's tension is detected;
the execution module is electrically connected with the data transmission module and is used for executing the control instruction transmitted by the data transmission module;
the tension emotion recognition is performed by comparing the face area, the eye area and the mouth feature in each face image with the face image under the normal sitting posture, comprising:
setting: area of face region to be measuredS face Area of eye area to be measuredS eye And the mouth characteristics to be measuredλ;
Area of face regionS’ face Area of eye areaS’ eye And mouth characteristicsλ’;
If the number of the groups of groups is equal,S face ≥1.2S’ face and≥0.2/>and->≥1.5/>When the driver is in tension, the driver is determined.
6. The driving assistance apparatus according to claim 5, wherein the execution module includes:
the audio-video system is arranged in the vehicle to be controlled and is used for working according to the control instruction sent by the micro-processing module;
the accelerator controller is arranged in the vehicle to be controlled and is used for working according to the control instruction sent by the micro-processing module;
and the double-flash controller is arranged on the vehicle to be controlled and is used for working according to the control instruction sent by the micro-processing module.
7. The driving assistance apparatus according to claim 5, characterized by further comprising:
and the satellite and communication module is in wireless communication with the micro-processing module and is used for sending information.
8. The driving assistance apparatus according to claim 5, characterized by further comprising:
the storage module is arranged between the micro-processing module and the in-car camera and used for storing the picture information acquired by the in-car camera.
9. The driving assistance apparatus according to claim 5, characterized by further comprising:
and the power supply module is respectively and electrically connected with the micro-processing module, the in-car camera, the data transmission module and the execution module and is used for supplying power.
CN202110410762.9A 2021-04-15 2021-04-15 Emotion recognition method, driving assisting device and application Active CN113516015B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110410762.9A CN113516015B (en) 2021-04-15 2021-04-15 Emotion recognition method, driving assisting device and application

Publications (2)

Publication Number Publication Date
CN113516015A CN113516015A (en) 2021-10-19
CN113516015B (en) 2023-12-05

Family

ID=78062353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110410762.9A Active CN113516015B (en) 2021-04-15 2021-04-15 Emotion recognition method, driving assisting device and application

Country Status (1)

Country Link
CN (1) CN113516015B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113780062A (en) * 2021-07-26 2021-12-10 岚图汽车科技有限公司 Vehicle-mounted intelligent interaction method based on emotion recognition, storage medium and chip

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101604382A (en) * 2009-06-26 2009-12-16 华中师范大学 A kind of learning fatigue recognition interference method based on human facial expression recognition
CN102785574A (en) * 2012-07-17 2012-11-21 东莞市泰斗微电子科技有限公司 Fatigue or drunk-driving detection and control method and corresponding system
CN106023341A (en) * 2016-05-05 2016-10-12 北京奇虎科技有限公司 Automobile data recorder emergency shooting control method and device
CN110395260A (en) * 2018-04-20 2019-11-01 比亚迪股份有限公司 Vehicle, safe driving method and device
CN110490139A (en) * 2019-08-21 2019-11-22 南京亨视通信息技术有限公司 Night fatigue driving judgment method based on recognition of face
CN111860437A (en) * 2020-07-31 2020-10-30 苏州大学 Method and device for judging fatigue degree based on facial expression

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019074791A (en) * 2017-10-12 2019-05-16 株式会社デンソー Driving support device for vehicle


Also Published As

Publication number Publication date
CN113516015A (en) 2021-10-19

Similar Documents

Publication Publication Date Title
US10908677B2 (en) Vehicle system for providing driver feedback in response to an occupant's emotion
CN112041910B (en) Information processing apparatus, mobile device, method, and program
US9558414B1 (en) Method for calculating a response time
CN110395260B (en) Vehicle, safe driving method and device
CN109664891A (en) Auxiliary driving method, device, equipment and storage medium
CN110027567A (en) The driving condition of driver determines method, apparatus and storage medium
JP2018181269A (en) Presentation control device, automatic operation control device, presentation control method, and automatic operation control method
JP2014096632A (en) Imaging system
US11279373B2 (en) Automated driving system
CN113516015B (en) Emotion recognition method, driving assisting device and application
KR20200020313A (en) Vehicle and control method for the same
CN117842022A (en) Driving safety control method and device for artificial intelligent cabin, vehicle and medium
CN114049677A (en) Vehicle ADAS control method and system based on emotion index of driver
CN114701503A (en) Method, device and equipment for adjusting driving behavior of vehicle driver and storage medium
US20190295400A1 (en) Driving assistance device
JP2010009484A (en) Onboard equipment control device and onboard equipment control method
CN112590735B (en) Emergency braking method and device based on driver habits and vehicle
CN111508232A (en) Road safety snapshot all-in-one
CN114469097A (en) Man-machine common driving takeover state testing method
CN115546920A (en) AI intelligent vehicle system based on Hilens
CN115384539A (en) Method and device for generating intelligent driving strategy of vehicle, vehicle and storage medium
CN114987328A (en) Equipment control method, device and storage medium
CN115019289A (en) Method for detecting the fatigue state of a driver, electronic fatigue detection system and motor vehicle
CN207059776U (en) A kind of motor vehicle driving approval apparatus
CN114644012A (en) Vehicle, control method thereof, storage medium, and electronic apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant