CN112172830A - Driver state monitoring method and system, readable storage medium and vehicle-mounted terminal - Google Patents
- Publication number: CN112172830A
- Application number: CN201910520987.2A
- Authority: CN (China)
- Prior art keywords: driver, emotion, state, monitoring, pupil
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models, related to drivers or passengers
- G06V40/171—Local features and components; facial parts; occluding parts, e.g. glasses; geometrical relationships
- G06V40/174—Facial expression recognition
- G06V40/18—Eye characteristics, e.g. of the iris
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
- B60W2040/0872—Driver physiology
Abstract
The invention provides a driver state monitoring method and system, a readable storage medium, and a vehicle-mounted terminal, which are used to improve the driver's mood. The monitoring method comprises the following steps: receiving basic parameters, acquired by an information acquisition device, for analyzing the driver's emotion; analyzing the basic parameters to identify the driver's emotion and/or degree of fatigue; and selecting a music type adapted to that emotion and sending a playing instruction. According to the invention, the information acquisition device detects the driver's facial expression, and the driver's emotion and fatigue are judged in real time by recognizing that expression, so that music helpful for relieving fatigue and improving mood is played automatically. The driver can then control the vehicle in a more relaxed and comfortable atmosphere, which improves both the driving experience and driving safety.
Description
Technical Field
The invention belongs to the technical field of safety monitoring and relates to a monitoring method and system, in particular to a driver state monitoring method and system, a readable storage medium, and a vehicle-mounted terminal.
Background
During driving, the driver's mood tends to fluctuate as distance and time increase. The variability and complexity of traffic conditions, such as weather changes, poor road surfaces, congestion, failed overtaking or being overtaken by other vehicles, and being blocked by a large truck, can also affect the driver's mood. Bad moods, and even road rage, therefore arise easily, which degrades the driving experience and can even create safety hazards.
Therefore, how to provide a driver state monitoring method and system, a readable storage medium, and a vehicle-mounted terminal that overcome the defects of the prior art, which cannot monitor the driver's current driving state or improve the driver's mood, creates safety hazards, and offers a poor driving experience, has become a technical problem to be urgently solved by those skilled in the art.
Disclosure of Invention
In view of the above drawbacks of the prior art, an object of the present invention is to provide a driver state monitoring method and system, a readable storage medium, and a vehicle-mounted terminal, which solve the problems that the prior art cannot monitor the driver's current driving state or improve the driver's mood, creates safety hazards, and offers the driver a poor driving experience.
To achieve the above and other related objects, one aspect of the present invention provides a driver state monitoring method for improving the driver's mood. The monitoring method comprises the following steps: receiving basic parameters, acquired by an information acquisition device, for analyzing the driver's emotion; analyzing the basic parameters to identify the driver's emotion and/or degree of fatigue; and selecting a music type adapted to that emotion and sending a playing instruction.
In an embodiment of the present invention, the information acquisition device includes a multimedia acquisition device installed in the vehicle cab and/or a physiological monitoring device worn on the driver's body.
In an embodiment of the present invention, the basic parameters include the driver's facial information, eyeball information, and heartbeat and pulse data. The multimedia acquisition device acquires facial features, including the driver's facial information and eyeball information; the physiological monitoring device acquires the driver's heartbeat and pulse data.
In an embodiment of the present invention, the facial information includes the direction and amplitude of changes in the corners of the driver's mouth, the degree to which the driver's mouth is open, whether there are tears on the face, the nodding frequency and amplitude, and the yawning frequency; the eyeball information includes pupil state information, eye actions, eye-closing duration, and blinking frequency.
In an embodiment of the invention, the step of analyzing the basic parameters to identify the driver's emotion includes: if the pupil state information indicates that the current pupil is smaller than a preset pupil and the facial information indicates that the mouth corners turn downward, identifying the driver's current emotion as sad; if the pupil state information indicates that the current pupil is larger than the preset pupil and the facial information indicates that the mouth corners turn upward, identifying the driver's current emotion as happy; if the eye action is squinting and the facial information indicates that the mouth corners turn upward, identifying the driver's current emotion as happy; if the eye action is wide-open eyes and the facial information indicates that the mouth is open, identifying the driver's current emotion as surprised; if the eye action is closed eyes and the facial information indicates that the mouth corners turn downward, identifying the driver's current emotion as sad; if the pupil state information indicates that the current pupil is larger than the preset pupil, the eye action is wide-open eyes, and the facial information indicates that the mouth corners turn downward, identifying the driver's current emotion as angry; or, if there are tears on the driver's face, identifying the driver's current emotion as sad. Further: if the eye-closing duration exceeds a preset eye-closing duration threshold, identifying that the driver is currently fatigued; if the blinking frequency exceeds a preset blinking frequency threshold, identifying that the driver is currently fatigued; if the nodding frequency and amplitude exceed a preset nodding frequency threshold and a preset
nodding amplitude threshold, identifying that the driver is currently fatigued; or, if the yawning frequency exceeds a preset yawning frequency threshold, identifying that the driver is currently fatigued.
In an embodiment of the invention, the step of analyzing the basic parameters to identify the driver's emotion further includes: comparing the driver's collected heartbeat and pulse data with pre-stored emotion characteristic data to identify the driver's emotion.
In an embodiment of the present invention, the music types adapted to emotions include soothing, cheerful, brisk, dynamic, and/or lyrical. The step of selecting a music type adapted to the driver's emotion includes: if the driver's current emotion is identified as sad, selecting a soothing music type; if happy, selecting a cheerful type; if angry, selecting a brisk type; if surprised, selecting a cheerful or lyrical type; or, if the driver is identified as currently fatigued, selecting a dynamic type.
Another aspect of the present invention provides a driver state monitoring system for improving the driver's mood. The monitoring system includes: a communication module for receiving basic parameters, acquired by the information acquisition device, for analyzing the driver's emotion; an analysis module for analyzing the basic parameters to identify the driver's emotion and/or degree of fatigue; and a selection module for selecting a music type adapted to the driver's emotion and sending a playing instruction through the communication module.
Yet another aspect of the present invention provides a readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of monitoring a driver's state.
A final aspect of the present invention provides a vehicle-mounted terminal comprising a processor and a memory. The memory stores a computer program, and the processor executes that program so that the vehicle-mounted terminal performs the above driver state monitoring method.
As described above, the driver state monitoring method and system, readable storage medium, and vehicle-mounted terminal of the present invention have the following advantages:
the information acquisition device detects the driver's facial expression, the driver's emotion and fatigue are judged in real time by recognizing that expression, and music helpful for relieving fatigue and improving mood is played automatically, so that the driver can control the vehicle in a more relaxed and comfortable atmosphere, improving both the driving experience and driving safety.
Drawings
Fig. 1 is a schematic diagram of a safe driving monitoring network to which the present invention is applied.
Fig. 2 is a flowchart illustrating a method for monitoring a driver status according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a monitoring system for driver status according to an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of the vehicle-mounted terminal of the present invention.
Description of the element reference numerals
1 safe driving monitoring network
11 information acquisition device
12 vehicle mounted terminal
111 multimedia acquisition equipment
112 physiological monitoring device
3 monitoring system of driver state
31 communication module
32 analysis module
33 selection module
4 vehicle-mounted terminal
41 processor
42 memory
43 transceiver
44 communication interface
45 system bus
S21-S23 steps of the monitoring method
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
The technical principles of the driver state monitoring method and system, readable storage medium, and vehicle-mounted terminal are as follows:
A camera is arranged in front of the driver's face to monitor the driver's facial expression in real time during driving, and/or a physiological monitoring device is worn by the driver; the video signals and physiological parameters are input to the main processor of the intelligent head unit for recognition. The driver's emotion is judged comprehensively from behaviors such as the nodding frequency and amplitude, the direction and amplitude of changes in the mouth corners, and the degree of mouth opening. If the driver is found to be in a bad mood, irritable, or angry, the intelligent head unit plays soft, slow music to improve the driver's mood, so that the driver's emotion and mental state improve and the driver can drive more attentively in a relatively pleasant mood, which improves the driving experience and reduces potential danger.
Example one
This embodiment provides a driver state monitoring method for improving the driver's mood; the method comprises the following steps:
receiving basic parameters, acquired by an information acquisition device, for analyzing the driver's emotion;
analyzing the basic parameters to identify the driver's emotion and/or degree of fatigue; and
selecting a music type adapted to that emotion and sending a playing instruction.
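The three steps above can be sketched as one monitoring cycle. The following Python sketch is illustrative only: the function names and the stub implementations are assumptions, not part of the patent.

```python
# One monitoring cycle: receive -> analyze -> select music -> play.
# All names here are hypothetical; the patent does not prescribe an API.

def monitor_driver(receive, analyze, choose_music, play):
    """Run one receive/analyze/select/play cycle of the monitoring method."""
    params = receive()                     # step 1: basic parameters
    emotion, fatigued = analyze(params)    # step 2: emotion / fatigue
    play(choose_music(emotion, fatigued))  # step 3: send playing instruction

# Demonstration with trivial stubs:
played = []
monitor_driver(
    receive=lambda: {"mouth_corners": "up"},
    analyze=lambda p: ("happy" if p["mouth_corners"] == "up" else "sad", False),
    choose_music=lambda e, f: "dynamic" if f else {"happy": "cheerful",
                                                   "sad": "soothing"}[e],
    play=played.append,
)
```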
The method for monitoring the driver's state according to the present embodiment will be described in detail with reference to the drawings. The monitoring method of the driver state according to this embodiment is applied to the safe driving monitoring network 1 as shown in fig. 1, where the safe driving monitoring network 1 includes an information acquisition device 11 and a vehicle-mounted terminal 12 wirelessly connected to the information acquisition device 11. In the present embodiment, the information collecting device 11 includes a multimedia collecting device 111 installed in the cab of the vehicle and/or a physiological monitoring device 112 worn on the body of the driver.
Please refer to fig. 2, which is a flowchart illustrating a method for monitoring a driver's status according to an embodiment. As shown in fig. 2, the method for monitoring the driver state specifically includes the following steps:
S21: receiving basic parameters, acquired by the information acquisition device, for analyzing the driver's emotion. In this embodiment, the basic parameters include the driver's facial information, eyeball information, and heartbeat and pulse data. The multimedia acquisition device 111 acquires facial features, including the driver's facial information and eyeball information, and the physiological monitoring device 112 acquires the driver's heartbeat and pulse data.
Specifically, the facial information includes the direction and amplitude of changes in the corners of the driver's mouth, the degree to which the driver's mouth is open, whether there are tears on the face, the nodding frequency and amplitude, and the yawning frequency. The eyeball information includes pupil state information, eye actions, eye-closing duration, and blinking frequency.
S22: analyzing the basic parameters to identify the driver's emotion and/or degree of fatigue.
In this embodiment, S22 specifically includes the following steps:
Analyzing the facial information and the eyeball information: if the pupil state information indicates that the current pupil is smaller than a preset pupil and the facial information indicates that the mouth corners turn downward, the driver's current emotion is identified as sad. The preset pupil is the size of a human pupil in the normal state.
Analyzing the facial information and the eyeball information: if the pupil state information indicates that the current pupil is larger than the preset pupil and the facial information indicates that the mouth corners turn upward, the driver's current emotion is identified as happy.
Analyzing the facial information and the eyeball information: if the eye action is squinting and the facial information indicates that the mouth corners turn upward, the driver's current emotion is identified as happy.
Analyzing the facial information and the eyeball information: if the eye action is wide-open eyes and the facial information indicates that the mouth is open, the driver's current emotion is identified as surprised.
Analyzing the facial information and the eyeball information: if the eye action is closed eyes and the facial information indicates that the mouth corners turn downward, the driver's current emotion is identified as sad.
Analyzing the facial information and the eyeball information: if the pupil state information indicates that the current pupil is larger than the preset pupil, the eye action is wide-open eyes, and the facial information indicates that the mouth corners turn downward, the driver's current emotion is identified as angry.
Analyzing the facial information: if there are tears on the driver's face, the driver's current emotion is identified as sad.
Analyzing the eyeball information: if the eye-closing duration exceeds a preset eye-closing duration threshold, the driver is identified as currently fatigued.
Analyzing the eyeball information: if the blinking frequency exceeds a preset blinking frequency threshold, the driver is identified as currently fatigued.
Analyzing the facial information: if the nodding frequency and amplitude exceed a preset nodding frequency threshold and a preset nodding amplitude threshold, the driver is identified as currently fatigued.
Or, analyzing the facial information: if the yawning frequency exceeds a preset yawning frequency threshold, the driver is identified as currently fatigued.
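The rules enumerated above amount to a small decision table. The sketch below is a non-authoritative illustration; the field encodings and the numeric thresholds are assumptions, since the patent leaves the preset values open.

```python
# Emotion rules from the embodiment; the string encodings are assumptions.
def identify_emotion(pupil, eye_action, mouth_corners, mouth_open, tears):
    """pupil: 'small'/'normal'/'large' relative to the preset (normal) pupil."""
    if tears:
        return "sad"
    if pupil == "large" and eye_action == "wide" and mouth_corners == "down":
        return "angry"
    if eye_action == "wide" and mouth_open:
        return "surprised"
    if eye_action == "closed" and mouth_corners == "down":
        return "sad"
    if eye_action == "squint" and mouth_corners == "up":
        return "happy"
    if pupil == "small" and mouth_corners == "down":
        return "sad"
    if pupil == "large" and mouth_corners == "up":
        return "happy"
    return None  # no rule matched

# Fatigue rules; the threshold values below are invented placeholders.
THRESHOLDS = {"eye_closed_s": 2.0, "blink_hz": 0.7,
              "nod_hz": 0.3, "nod_amp_deg": 15.0, "yawn_per_min": 3.0}

def is_fatigued(eye_closed_s=0.0, blink_hz=0.0, nod_hz=0.0,
                nod_amp_deg=0.0, yawn_per_min=0.0):
    return (eye_closed_s > THRESHOLDS["eye_closed_s"]
            or blink_hz > THRESHOLDS["blink_hz"]
            or (nod_hz > THRESHOLDS["nod_hz"]
                and nod_amp_deg > THRESHOLDS["nod_amp_deg"])
            or yawn_per_min > THRESHOLDS["yawn_per_min"])
```

Note that rule order matters: the anger rule (large pupil, wide eyes, downturned mouth corners) is checked before the simpler pupil-only rules so that the more specific combination wins.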
In this embodiment, S22 further includes the following step:
comparing the driver's collected heartbeat and pulse data with pre-stored emotion characteristic data to identify the driver's emotion.
Specifically, the collected heartbeat and pulse data are plotted as the current driver's heartbeat and pulse characteristic graph; an emotion recognition algorithm identifies the pattern of emotional change from the heartbeat and pulse waveform, and the characteristic graph is compared with one or more pre-stored characteristic graph models. When the matching degree between the characteristic graph and a pre-stored model exceeds a preset value, the level of the driver's current emotional fluctuation is judged: the larger the fluctuation, the higher the level, for example low, medium, and high from lowest to highest. In this embodiment, the pre-stored characteristic graph models may be one or more predetermined models, such as models for happiness, anger, sadness, and joy.
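As a rough illustration of the matching step, the sketch below scores a pulse trace against pre-stored templates with normalized correlation. The patent does not specify the emotion recognition algorithm or the matching measure, so both the correlation measure and the templates here are assumptions.

```python
import math

def match_degree(signal, template):
    """Normalized correlation in [-1, 1] between two equal-length traces."""
    n = len(signal)
    ms, mt = sum(signal) / n, sum(template) / n
    num = sum((s - ms) * (t - mt) for s, t in zip(signal, template))
    den = math.sqrt(sum((s - ms) ** 2 for s in signal)
                    * sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

def fluctuation_level(trace, templates, preset=0.8):
    """Best-matching template above the preset value, with a coarse level."""
    best = max(templates, key=lambda name: match_degree(trace, templates[name]))
    score = match_degree(trace, templates[best])
    if score < preset:
        return None, "low"
    return best, "high" if score > 0.95 else "medium"
```

For example, a rising-and-falling pulse trace scored against invented "calm" and "angry" templates would be labeled with the template whose shape it most resembles.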
S23: selecting a music type adapted to the driver's emotion and sending a playing instruction. In this embodiment, the music types include soothing, cheerful, brisk, dynamic, and/or lyrical.
Specifically, S23 includes:
if the driver's current emotion is identified as sad, selecting a soothing music type;
if the driver's current emotion is identified as happy, selecting a cheerful music type;
if the driver's current emotion is identified as angry, selecting a brisk music type;
if the driver's current emotion is identified as surprised, selecting a cheerful or lyrical music type; or
if the driver is identified as currently fatigued, selecting a dynamic music type.
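The selection rules of S23 reduce to a lookup table. The sketch below assumes the fatigue check takes precedence over the emotion mapping, which the patent does not state explicitly; where the rule allows either a cheerful or a lyrical type for surprise, the sketch simply returns the pair.

```python
# The S23 selection rules as a lookup; type names mirror the text above.
EMOTION_TO_MUSIC = {
    "sad": "soothing",
    "happy": "cheerful",
    "angry": "brisk",
    "surprised": ("cheerful", "lyrical"),  # either type is acceptable
}

def select_music(emotion, fatigued=False):
    """Return the music type to send in the playing instruction."""
    if fatigued:  # fatigue takes precedence (an assumption, not in the text)
        return "dynamic"
    return EMOTION_TO_MUSIC.get(emotion)
```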
It should be noted that the above is only an example; other recommendation strategies may be set according to the user's preferences. For example, when the driver is sad, brighter music could be recommended instead. Music may also be recommended according to divisions other than soothing, cheerful, brisk, dynamic, and lyrical: for a given mood, a specific musical form such as jazz or pop songs may be recommended, or music by particular singers. All of this can be configured by the user according to personal preference and is not described further here.
The present embodiment also provides a readable storage medium (also referred to as a computer-readable storage medium) on which a computer program is stored, which, when being executed by a processor, implements the above-mentioned method of monitoring the driver's state.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be performed by hardware associated with a computer program. The computer program may be stored in a computer-readable storage medium; when executed, it performs the steps of the method embodiments. The storage medium includes any medium that can store program code, such as ROM, RAM, or magnetic or optical disks.
According to the driver state monitoring method of this embodiment, the information acquisition device detects the driver's facial expression, and the driver's emotion and fatigue are judged in real time by recognizing that expression, so that music that improves mood and helps relieve fatigue is played automatically. The driver can then control the vehicle in a more relaxed and comfortable atmosphere, which improves both the driving experience and driving safety.
Example two
This embodiment provides a driver state monitoring system for improving the driver's mood; the monitoring system includes:
a communication module for receiving basic parameters, acquired by the information acquisition device, for analyzing the driver's emotion;
an analysis module for analyzing the basic parameters to identify the driver's emotion and/or degree of fatigue; and
a selection module for selecting a music type adapted to the driver's emotion and sending a playing instruction through the communication module.
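Structurally, the three modules collaborate around one monitoring cycle. The class and method names in this sketch are hypothetical illustrations of the module responsibilities, not an API defined by the patent.

```python
class DriverStateMonitor:
    """Wires the communication, analysis, and selection modules together."""

    def __init__(self, communication, analysis, selection):
        self.communication = communication  # receives params, sends instructions
        self.analysis = analysis            # identifies emotion / fatigue
        self.selection = selection          # maps the identified state to music

    def run_cycle(self):
        params = self.communication.receive()
        state = self.analysis.identify(params)
        music = self.selection.choose(state)
        self.communication.send_play_instruction(music)
        return music

# Minimal stand-ins for the three modules:
class Comm:
    def __init__(self): self.sent = []
    def receive(self): return {"emotion": "sad"}
    def send_play_instruction(self, music): self.sent.append(music)

class Analysis:
    def identify(self, params): return params["emotion"]

class Selection:
    def choose(self, state): return {"sad": "soothing"}.get(state, "lyrical")
```

Injecting the modules through the constructor keeps each replaceable, e.g. swapping the stub `Analysis` for a real expression classifier.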
The driver state monitoring system provided by this embodiment is described in detail below with reference to the drawings. Please refer to fig. 3, a schematic structural diagram of the driver state monitoring system in an embodiment. As shown in fig. 3, the driver state monitoring system 3 includes a communication module 31, an analysis module 32, and a selection module 33.
The communication module 31 receives the basic parameters, acquired by the information acquisition device, for analyzing the driver's emotion. In this embodiment, the basic parameters include the driver's facial information, eyeball information, and heartbeat and pulse data. The multimedia acquisition device 111 acquires facial features, including the driver's facial information and eyeball information, and the physiological monitoring device 112 acquires the driver's heartbeat and pulse data.
Specifically, the facial information includes the direction and amplitude of changes in the corners of the driver's mouth, the degree to which the driver's mouth is open, whether there are tears on the face, the nodding frequency and amplitude, and the yawning frequency. The eyeball information includes pupil state information, eye actions, eye-closing duration, and blinking frequency.
The analysis module 32, coupled to the communication module 31, analyzes the basic parameters to identify the driver's emotion and/or degree of fatigue.
In this embodiment, the analysis module 32 analyzes the facial information and the eyeball information as follows: if the pupil state information indicates that the current pupil is smaller than a preset pupil and the facial information indicates that the mouth corners turn downward, the driver's current emotion is identified as sad, where the preset pupil is the size of a human pupil in the normal state; if the current pupil is larger than the preset pupil and the mouth corners turn upward, the driver's current emotion is identified as happy; if the eye action is squinting and the mouth corners turn upward, the driver's current emotion is identified as happy; if the eye action is wide-open eyes and the mouth is open, the driver's current emotion is identified as surprised; if the eye action is closed eyes and the mouth corners turn downward, the driver's current emotion is identified as sad; if the current pupil is larger than the preset pupil, the eye action is wide-open eyes, and the mouth corners turn downward, the driver's current emotion is identified as angry; and if there are tears on the driver's face, the driver's current emotion is identified as sad.
Further: if the eye-closing duration exceeds a preset eye-closing duration threshold, the driver is identified as currently fatigued; if the blinking frequency exceeds a preset blinking frequency threshold, the driver is identified as currently fatigued; if the nodding frequency and amplitude exceed a preset nodding frequency threshold and a preset nodding amplitude threshold, the driver is identified as currently fatigued; or, if the yawning frequency exceeds a preset yawning frequency threshold, the driver is identified as currently fatigued.
In this embodiment, the analysis module 32 is further configured to compare the collected heartbeat and pulse data of the driver with pre-stored emotional characteristic data to identify the emotion of the driver.
Specifically, the analysis module 32 plots the acquired heartbeat and pulse data of the driver as a heartbeat-and-pulse feature graph of the current driver, identifies an emotion change pattern from the waveform through an emotion recognition algorithm, compares the feature graph with a pre-stored feature graph model, and, when the degree of matching between the two exceeds a preset value, determines the level of the driver's current emotional fluctuation: the larger the emotional fluctuation, the higher the level, for example low, medium, and high in order from low to high. In this embodiment, the pre-stored feature graph model may be one or more predetermined feature graph models, such as feature graph models of happiness, anger, sadness, and joy.
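One way to sketch this matching step is normalized correlation between the driver's pulse trace and per-emotion templates. This is a minimal illustration under assumptions of my own: the patent does not specify the emotion recognition algorithm, and the templates, threshold, and level cut-offs below are invented for the example.

```python
# Hypothetical sketch of matching a pulse trace against pre-stored
# per-emotion templates with normalized correlation, then mapping the
# match strength to the low/medium/high fluctuation levels in the text.
import numpy as np

def match_emotion(trace, templates, threshold=0.8):
    """Return (emotion, score) for the best-matching template, or
    (None, score) if no template exceeds the preset matching threshold."""
    trace = (trace - trace.mean()) / (trace.std() + 1e-9)  # zero-mean, unit-std
    best, best_score = None, -1.0
    for emotion, tmpl in templates.items():
        tmpl = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-9)
        score = float(np.dot(trace, tmpl) / len(trace))  # normalized correlation
        if score > best_score:
            best, best_score = emotion, score
    return (best, best_score) if best_score > threshold else (None, best_score)

def fluctuation_level(score, cuts=(0.8, 0.9, 0.95)):
    """Map match strength to an illustrative low/medium/high level."""
    if score >= cuts[2]:
        return "high"
    if score >= cuts[1]:
        return "medium"
    if score >= cuts[0]:
        return "low"
    return None
```

In practice the templates would be the pre-stored feature graph models (e.g. one per emotion such as happiness or anger), resampled to a common length before comparison.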
The selection module 33, coupled to the communication module 31 and the analysis module 32, is configured to select a music type adapted to the emotion of the driver according to the emotion of the driver and to send a playing instruction to a speaker. In the present embodiment, the music types adapted to emotion include gentle, cheerful, lively, dynamic, and/or lyrical.
Specifically, the selection module 33 is configured to: select the gentle music type if the current emotion of the driver is recognized as sad; select the cheerful music type if the current emotion of the driver is recognized as happy; select the lively music type if the current emotion of the driver is recognized as angry; select the cheerful or lyrical music type if the current emotion of the driver is recognized as surprised; or select the dynamic music type if the driver is identified as currently being in a fatigue state.
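The selection rule above is essentially a lookup table. A minimal sketch, with the state and type names taken from the mapping described in the text and a hypothetical instruction payload standing in for the message actually sent to the speaker:

```python
# Illustrative state-to-music mapping for the selection module; the
# play_instruction payload format is an assumption, not from the patent.
MUSIC_FOR_STATE = {
    "sad": "gentle",
    "happy": "cheerful",
    "angry": "lively",
    "surprised": "cheerful",   # the text also allows "lyrical" here
    "fatigued": "dynamic",
}

def select_music(state: str) -> str:
    # Fall back to gentle music for unrecognized states (an assumption).
    return MUSIC_FOR_STATE.get(state, "gentle")

def play_instruction(state: str) -> dict:
    # Payload the selection module might send to the speaker (illustrative).
    return {"command": "play", "music_type": select_music(state)}
```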
It should be noted that the division of the modules of the monitoring system is only a logical division; in an actual implementation, the modules may be wholly or partially integrated into one physical entity, or may be physically separate. These modules may all be implemented in the form of software invoked by a processing element; or all in the form of hardware; or some modules may be implemented in the form of software invoked by a processing element and the others in the form of hardware. For example, the x module may be a separately arranged processing element, or may be integrated into a chip of the monitoring system; alternatively, the x module may be stored in the memory of the monitoring system in the form of program code, and a processing element of the monitoring system invokes and executes the functions of the x module. The other modules are implemented similarly. All or some of the modules may be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software. The above modules may be one or more integrated circuits configured to implement the above method, for example: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), one or more Field Programmable Gate Arrays (FPGAs), and the like. When a module is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of invoking program code. These modules may be integrated together and implemented in the form of a System-on-a-Chip (SoC).
EXAMPLE III
Fig. 4 is a schematic structural diagram of a vehicle-mounted terminal 4. As shown in Fig. 4, the vehicle-mounted terminal 4 includes: a processor 41, a memory 42, a transceiver 43, a communication interface 44, and/or a system bus 45. The memory 42 and the communication interface 44 are connected to the processor 41 and the transceiver 43 through the system bus 45 and complete communication with one another; the memory 42 is used for storing a computer program, the communication interface 44 is used for communicating with other devices, and the processor 41 and the transceiver 43 are used for running the computer program, so that the vehicle-mounted terminal 4 performs the steps of the method for monitoring the driver state according to the first embodiment.
The above-mentioned system bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus. The communication interface is used for realizing communication between the database access device and other equipment (such as a client, a read-write library and a read-only library). The Memory may include a Random Access Memory (RAM), and may further include a non-volatile Memory (non-volatile Memory), such as at least one disk Memory.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The protection scope of the method for monitoring the driver state according to the present invention is not limited to the execution sequence of the steps illustrated in the embodiment, and all the solutions implemented by adding, subtracting, and replacing the steps in the prior art according to the principle of the present invention are included in the protection scope of the present invention.
The present invention also provides a driver state monitoring system, which can implement the driver state monitoring method of the present invention, but the implementation device of the driver state monitoring method of the present invention includes, but is not limited to, the structure of the driver state monitoring system listed in this embodiment, and all structural modifications and substitutions of the prior art made according to the principle of the present invention are included in the protection scope of the present invention.
In summary, the driver state monitoring method and system, readable storage medium, and vehicle-mounted terminal provided by the present invention detect the facial expression of the driver with an information acquisition device and judge the emotion and fatigue condition of the driver in real time by recognizing the facial expression, so that music conducive to relieving fatigue and improving emotion is played automatically, the driver can control the vehicle in a more relaxed and comfortable atmosphere, and driving experience and driving safety are improved. The present invention effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention shall still be covered by the claims of the present invention.
Claims (10)
1. A method for monitoring a driver state, characterized in that it is used for improving the emotion of the driver; the method for monitoring the driver state comprises the following steps:
receiving basic parameters for analyzing the emotion of the driver, which are acquired by an information acquisition device;
analyzing the basic parameters to identify the emotion and/or fatigue degree of the driver; and
selecting a music type adapted to the emotion according to the emotion of the driver, and sending a playing instruction.
2. The method for monitoring the state of the driver as claimed in claim 1, wherein the information collecting device comprises a multimedia collecting device installed in a vehicle cab and/or a physiological monitoring device worn on the body of the driver.
3. The method of monitoring the status of a driver as set forth in claim 2, wherein the basic parameters include facial information, eyeball information, and heartbeat and pulse data of the driver;
the multimedia acquisition equipment is used for acquiring facial features including facial information and eyeball information of a driver; the physiological monitoring equipment is used for acquiring heartbeat and pulse data of the driver.
4. The method for monitoring the state of the driver according to claim 3,
the facial information comprises the change direction and amplitude of the driver's mouth corners, the degree of mouth opening of the driver, whether there are tears on the face, the nodding frequency and amplitude, and the yawning frequency;
the eyeball information comprises pupil state information, eye action, eye-closing duration, and blink frequency.
5. The method for monitoring the driver state according to claim 4, wherein the step of analyzing the basic parameters to identify the emotion of the driver comprises:
if the pupil state information indicates that the current pupil is smaller than a preset pupil and the facial information indicates that the mouth corners are turning downward, recognizing that the current emotion of the driver is sad;
if the pupil state information indicates that the current pupil is larger than the preset pupil and the facial information indicates that the mouth corners are turning upward, recognizing that the current emotion of the driver is happy;
if the eye action is squinting and the facial information indicates that the mouth corners are turning upward, recognizing that the current emotion of the driver is happy;
if the eye action is opening the eyes wide and the facial information indicates that the mouth is open, recognizing that the current emotion of the driver is surprised;
if the eye action is closing the eyes and the facial information indicates that the mouth corners are turning downward, recognizing that the current emotion of the driver is sad;
if the pupil state information indicates that the current pupil is larger than the preset pupil, the eye action is opening the eyes wide, and the facial information indicates that the mouth corners are turning downward, recognizing that the current emotion of the driver is angry;
if there are tears on the driver's face, recognizing that the current emotion of the driver is sad;
if the eye-closing duration exceeds a preset eye-closing duration threshold, identifying that the driver is currently in a fatigue state;
if the blink frequency exceeds a preset blink frequency threshold, identifying that the driver is currently in a fatigue state;
if the nodding frequency and the nodding amplitude exceed a preset nodding frequency threshold and a preset nodding amplitude threshold, identifying that the driver is currently in a fatigue state; or
if the yawning frequency exceeds a preset yawning frequency threshold, identifying that the driver is currently in a fatigue state.
6. The method for monitoring the driver state according to claim 3, wherein the step of analyzing the basic parameters to identify the emotion of the driver further comprises:
and comparing the collected heartbeat and pulse data of the driver with prestored emotion characteristic data to identify the emotion of the driver.
7. The method for monitoring the driver state according to claim 5 or 6, wherein the music types adapted to emotion include gentle, cheerful, lively, dynamic, and/or lyrical; the step of selecting a music type adapted to the emotion according to the emotion of the driver comprises:
if the current emotion of the driver is recognized as sad, selecting the gentle music type;
if the current emotion of the driver is recognized as happy, selecting the cheerful music type;
if the current emotion of the driver is recognized as angry, selecting the lively music type;
if the current emotion of the driver is recognized as surprised, selecting the cheerful or lyrical music type; or
if the driver is identified as currently being in a fatigue state, selecting the dynamic music type.
8. A system for monitoring a driver state, characterized in that it is used for improving the emotion of the driver; the system for monitoring the driver state comprises:
the communication module is used for receiving basic parameters which are acquired by the information acquisition equipment and used for analyzing the emotion of the driver;
the analysis module is used for analyzing the basic parameters to identify the emotion and/or fatigue degree of the driver;
and the selection module is used for selecting the music type suitable for the emotion according to the emotion of the driver and sending a playing instruction through the communication module.
9. A readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a method for monitoring a driver's state according to any one of claims 1 to 7.
10. A vehicle-mounted terminal characterized by comprising: a processor and a memory;
the memory is used for storing a computer program, and the processor is used for executing the computer program stored by the memory so as to enable the vehicle-mounted terminal to execute the monitoring method for the driver state according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910520987.2A CN112172830A (en) | 2019-06-17 | 2019-06-17 | Driver state monitoring method and system, readable storage medium and vehicle-mounted terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112172830A true CN112172830A (en) | 2021-01-05 |
Family
ID=73914235
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910520987.2A Pending CN112172830A (en) | 2019-06-17 | 2019-06-17 | Driver state monitoring method and system, readable storage medium and vehicle-mounted terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112172830A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103873512A (en) * | 2012-12-13 | 2014-06-18 | 深圳市赛格导航科技股份有限公司 | Method for vehicle-mounted wireless music transmission based on face recognition technology |
CN105426404A (en) * | 2015-10-28 | 2016-03-23 | 广东欧珀移动通信有限公司 | Music information recommendation method and apparatus, and terminal |
CN106652378A (en) * | 2015-11-02 | 2017-05-10 | 比亚迪股份有限公司 | Driving reminding method and system for vehicle, server and vehicle |
CN109815817A (en) * | 2018-12-24 | 2019-05-28 | 北京新能源汽车股份有限公司 | A kind of the Emotion identification method and music method for pushing of driver |
CN109849660A (en) * | 2019-01-29 | 2019-06-07 | 合肥革绿信息科技有限公司 | A kind of vehicle safety control system |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112829754A (en) * | 2021-01-21 | 2021-05-25 | 浙江合众新能源汽车有限公司 | Vehicle-mounted intelligent robot and running method thereof |
CN113780062A (en) * | 2021-07-26 | 2021-12-10 | 岚图汽车科技有限公司 | Vehicle-mounted intelligent interaction method based on emotion recognition, storage medium and chip |
CN113771859A (en) * | 2021-08-31 | 2021-12-10 | 智新控制系统有限公司 | Intelligent driving intervention method, device and equipment and computer readable storage medium |
CN113771859B (en) * | 2021-08-31 | 2024-01-26 | 智新控制系统有限公司 | Intelligent driving intervention method, device, equipment and computer readable storage medium |
CN114132328A (en) * | 2021-12-10 | 2022-03-04 | 智己汽车科技有限公司 | Driving assistance system and method for automatically adjusting driving environment and storage medium |
CN114132328B (en) * | 2021-12-10 | 2024-05-14 | 智己汽车科技有限公司 | Auxiliary driving system and method for automatically adjusting driving environment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112172830A (en) | Driver state monitoring method and system, readable storage medium and vehicle-mounted terminal | |
CN106803423B (en) | Man-machine interaction voice control method and device based on user emotion state and vehicle | |
CN106682602B (en) | Driver behavior identification method and terminal | |
TWI603213B (en) | Method for selecting music based on face recognition, music selecting system and electronic apparatus | |
CN105488957B (en) | Method for detecting fatigue driving and device | |
Nguyen et al. | Eye tracking system to detect driver drowsiness | |
CN106361356A (en) | Emotion monitoring and early warning method and system | |
EP3618063B1 (en) | Voice interaction system, voice interaction method and corresponding program | |
CN110395260B (en) | Vehicle, safe driving method and device | |
CN109446375A (en) | Adaptive method for playing music and system applied to vehicle | |
KR20090091335A (en) | Awake state judging model making device, awake state judging device, and warning device | |
JP2007312824A (en) | Instrument for classifying blinking data, arousal evaluation instrument and arousal interpretation instrument | |
CN114648354A (en) | Advertisement evaluation method and system based on eye movement tracking and emotional state | |
CN109646024A (en) | Method for detecting fatigue driving, device and computer readable storage medium | |
CN113780062A (en) | Vehicle-mounted intelligent interaction method based on emotion recognition, storage medium and chip | |
CN110751381A (en) | Road rage vehicle risk assessment and prevention and control method | |
CN109949438A (en) | Abnormal driving monitoring model method for building up, device and storage medium | |
CN109801475A (en) | Method for detecting fatigue driving, device and computer readable storage medium | |
CN109276243A (en) | Brain electricity psychological test method and terminal device | |
CN116386277A (en) | Fatigue driving detection method and device, electronic equipment and medium | |
CN112820072A (en) | Dangerous driving early warning method and device, computer equipment and storage medium | |
CN114821796A (en) | Dangerous driving behavior recognition method, device, equipment and storage medium | |
CN111976732A (en) | Vehicle control method and system based on vehicle owner emotion and vehicle-mounted terminal | |
CN110956780A (en) | Fatigue driving reminding method and device and vehicle | |
Zhu et al. | Heavy truck driver's drowsiness detection method using wearable eeg based on convolution neural network |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20210105