WO2022226799A1 - 疲劳驾驶监控方法、装置、交通设备及存储介质 - Google Patents

疲劳驾驶监控方法、装置、交通设备及存储介质 (Fatigue driving monitoring method, apparatus, traffic device and storage medium)

Info

Publication number
WO2022226799A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
acceleration
score
stress
level
Prior art date
Application number
PCT/CN2021/090319
Other languages
English (en)
French (fr)
Inventor
高杨
陈超
徐吉睿
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2021/090319 priority Critical patent/WO2022226799A1/zh
Priority to CN202180087989.2A priority patent/CN116724339A/zh
Publication of WO2022226799A1 publication Critical patent/WO2022226799A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition

Definitions

  • the present application relates to the technical field of traffic safety, and in particular, to a fatigue driving monitoring method, device, traffic equipment and storage medium.
  • fatigue driving behavior is generally judged by analyzing images of the driver and performing head pose estimation, eye-closure detection, mouth-closure detection and the like. Because this depends on detection of the face or the eyes, once the driver's face or eye information cannot be observed, for example when the driver wears opaque sunglasses so that the eyes cannot be seen, the result of fatigue driving detection is not accurate enough.
  • the present application provides a fatigue driving monitoring method, device, traffic equipment and storage medium, so as to improve the accuracy of fatigue driving detection.
  • the present application provides a fatigue driving monitoring method, including:
  • acquiring the driver's current face information;
  • controlling an electronic device to output a stimulation signal that stimulates the driver, and acquiring stress information of a stress response that the driver produces based on the stimulation signal;
  • determining the current fatigue level of the driver according to the face information and the stress information.
  • the present application further provides a fatigue driving monitoring device, the fatigue driving monitoring device comprising a memory and a processor;
  • the memory is used to store computer programs
  • the processor is configured to execute the computer program and implement the following steps when executing the computer program:
  • acquiring the driver's current face information;
  • controlling an electronic device to output a stimulation signal that stimulates the driver, and acquiring stress information of a stress response that the driver produces based on the stimulation signal;
  • determining the current fatigue level of the driver according to the face information and the stress information.
  • the present application further provides a traffic device, wherein the traffic device includes the above-mentioned fatigue driving monitoring device.
  • the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the processor implements the above-mentioned fatigue driving monitoring method.
  • the fatigue driving monitoring method, device, traffic equipment and storage medium disclosed in the present application acquire the driver's current face information and control an electronic device to output a stimulus signal that stimulates the driver, so as to obtain the stress information of the stress response that the driver produces based on the stimulus signal. Since the stress information corresponding to the driver differs at different fatigue levels, combining the face information and the stress information to determine the driver's current fatigue level improves the accuracy of fatigue driving detection compared with relying only on face or eye detection results.
  • FIG. 1 is a schematic structural diagram of a traffic device provided by an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of steps of a fatigue driving monitoring method provided by an embodiment of the present application
  • FIG. 3 is a schematic flowchart of steps of another fatigue driving monitoring method provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of steps for determining the current degree of fatigue of the driver according to the face information, the stress information and the body state information provided by an embodiment of the present application;
  • FIG. 5 is a schematic flowchart of steps of another fatigue driving monitoring method provided by an embodiment of the present application.
  • FIG. 6 is a schematic block diagram of a fatigue driving monitoring device provided by an embodiment of the present application.
  • Embodiments of the present application provide a fatigue driving monitoring method, device, traffic equipment, and storage medium, which are used to improve the accuracy of fatigue driving detection.
  • FIG. 1 is a schematic structural diagram of a transportation device provided by an embodiment of the present application.
  • the transportation equipment 1000 may include a power device 100 and a fatigue driving monitoring device 200.
  • the power device 100 is used to drive or tow the transportation equipment 1000 to move, and the fatigue driving monitoring device 200 is used to detect the fatigue level of the driver of the transportation equipment 1000.
  • the transportation device 1000 may include automobiles, trams, trucks, and the like. Certainly, the transportation device 1000 may also be other types of mobile transportation vehicles, such as ships, and the embodiment of the present application is not limited thereto.
  • the fatigue driving monitoring device 200 is used to obtain the driver's current face information, control an electronic device such as a smart bracelet to output a stimulation signal that stimulates the driver, obtain the stress information of the stress response that the driver produces based on the stimulation signal, and combine the face information and the stress information to determine the driver's current fatigue level, thereby improving the accuracy of fatigue driving detection.
  • the fatigue driving monitoring method provided by the embodiments of the present application will be described in detail below based on the traffic equipment 1000 and the fatigue driving monitoring apparatus 200. It should be noted that the traffic equipment 1000 and the fatigue driving monitoring device 200 in FIG. 1 are only used to explain the fatigue driving monitoring method provided by the embodiments of the present application and do not constitute a limitation on its application scenarios.
  • FIG. 2 is a schematic flowchart of a fatigue driving monitoring method provided by an embodiment of the present application.
  • the method can be used in the traffic equipment provided in the above-mentioned embodiment, and can also be used in other equipment including a fatigue driving monitoring device, and the application scenario of the method is not limited in this application. Based on the fatigue driving monitoring method, the accuracy of fatigue driving detection can be improved, thereby improving driving safety.
  • the fatigue driving monitoring method specifically includes steps S101 to S103.
  • S101. Acquire the driver's current face information.
  • a camera device is installed in front of the driver, and the camera device can collect the driver's face image and/or video, and obtain the driver's face information by performing a face information extraction operation on the face image and/or video.
  • the face information includes, but is not limited to, the driver's eye information, head posture information, mouth information, and the like.
  • S102. Control the electronic device to output a stimulation signal that stimulates the driver, and acquire stress information of the stress response that the driver produces based on the stimulation signal.
  • generally, the degree of the stress response that a driver produces to a stimulus differs at different fatigue levels. Based on this, in addition to acquiring the driver's face information, the electronic device is controlled to output a stimulation signal, wherein the electronic device includes but is not limited to wearable devices such as smart bracelets and smart watches; the driver will generate a stress response based on the stimulus signal, and the stress information corresponding to the driver's stress response is acquired.
  • the stress information includes at least one of stress response time and stress response intensity.
  • controlling the electronic device to output the stimulation signal to stimulate the driver may include: sending a control signal to the electronic device, so that the electronic device outputs the stimulation signal when receiving the control signal.
  • a control signal is sent to the electronic device based on the communication channel established with the electronic device, and when the electronic device receives the control signal, it outputs a corresponding stimulation signal.
  • the driver will produce a stress response to the stimulus signal, and the electronic device can collect the stress information corresponding to the driver's stress response, and obtain the stress information collected by the electronic device.
  • the control signal is transmitted to the smart bracelet worn by the driver through WiFi, Bluetooth and other wireless communication methods.
  • after receiving the control signal, the smart bracelet outputs a vibration or other stimulation signal, for example a high-frequency, low-peak-value, short-duration vibration.
  • in addition, the smart bracelet collects stress information such as the stress response intensity of the driver's stress response to the stimulus signal (the peak value max of the stress response after stimulation) and the stress response time (the time t from being stimulated until the response reaches the peak value max), and feeds back this stress information, which is then received from the smart bracelet.
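  • as a purely illustrative sketch of how the stress response intensity (the peak value max) and the stress response time (the time t from the stimulus to the peak) described above could be extracted from a sampled signal, the following Python snippet is given; the sampling rate, the signal source and the assumption that the samples start at the stimulation moment are not details specified in this application.
    # Minimal sketch (assumptions: uniformly sampled, baseline-corrected signal
    # whose first sample coincides with the moment the stimulus is output).
    def measure_stress_response(samples, sample_rate_hz):
        """Return the stress response intensity (peak) and response time (s)."""
        if not samples:
            return None
        peak_index = max(range(len(samples)), key=lambda i: abs(samples[i]))
        peak_value = abs(samples[peak_index])          # stress response intensity (max)
        response_time_s = peak_index / sample_rate_hz  # time t from stimulus to peak
        return {"intensity": peak_value, "response_time_s": response_time_s}

    # Example: 50 Hz samples recorded right after a vibration stimulus.
    print(measure_stress_response([0.0, 0.1, 0.4, 0.9, 0.6, 0.2], sample_rate_hz=50))
    # {'intensity': 0.9, 'response_time_s': 0.06}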
  • sending the control signal to the electronic device may include: periodically sending the control signal to the electronic device according to a preset cycle time.
  • the preset cycle time is 5 minutes.
  • a control signal is sent to the electronic device every 5 minutes.
  • whenever the electronic device receives the control signal, it outputs a vibration or other stimulation signal and collects the stress information of the stress response that the driver produces based on the stimulation signal. It can be understood that the cycle time can be flexibly set according to the actual situation, which is not limited here.
  • further, since a driver generally only starts to become fatigued after driving for a period of time, timing can be started after driving starts; when the timed duration reaches a preset duration, fatigue driving monitoring is enabled and, according to the cycle time, control signals are periodically sent to electronic devices such as a smart bracelet worn by the driver.
  • the preset duration can be flexibly set according to the actual situation, for example, it can be set to 1 hour, or it can be other duration, which is not limited here.
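  • a minimal sketch of the timing logic just described (start monitoring only after a preset driving duration, then trigger the wearable once per cycle) is shown below; the 1-hour start delay and the 5-minute cycle are the example values from this application, while send_control_signal is a hypothetical callback standing in for the actual WiFi/Bluetooth transmission.
    import time

    def run_periodic_stimulation(send_control_signal, start_delay_s=3600, cycle_s=300):
        """Sketch: wait a preset duration after driving starts, then periodically
        ask the wearable to output a stimulation signal and sample the response.
        send_control_signal is a hypothetical callback (e.g. WiFi/Bluetooth send)."""
        time.sleep(start_delay_s)      # e.g. 1 hour after driving starts
        while True:
            send_control_signal()      # wearable vibrates and collects stress information
            time.sleep(cycle_s)        # e.g. every 5 minutes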
  • S103. Determine the driver's current fatigue level according to the face information and the stress information. After the driver's face information and the stress information of the stress response produced based on the stimulation signal are obtained, the face information and the stress information are combined to determine the driver's current fatigue level.
  • exemplarily, according to the acquired face information, the key points of the human eye are detected, and the opening degree of each eye is calculated from the key points as follows:
    left ratio = abs|left top - left down|
    right ratio = abs|right top - right down|
  • where left ratio is the opening of the left eye, left top is the highest position of the open left eye, left down is the lowest position of the open left eye, right ratio is the opening of the right eye, right top is the highest position of the open right eye, right down is the lowest position of the open right eye, and abs is the absolute value of the difference between the two.
  • the perclos score is then calculated from the left-eye and right-eye openings left ratio and right ratio; indicators such as P70, P80 or EM can be used, and other methods, for example predicting the eye state with a CNN or with conventional image processing, can also be used, which is not specifically limited in this embodiment.
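  • to make the above concrete, the following sketch computes the eye-opening values and a simple windowed perclos (the fraction of frames in which the eyes are considered closed, in the spirit of the P70/P80 indicators); the 0.2 closed-eye threshold and the normalization of the opening values are illustrative assumptions, not values given in this application.
    def eye_opening(top_y, down_y):
        # opening degree = absolute difference between the highest and lowest
        # positions of the open eye (left ratio / right ratio above)
        return abs(top_y - down_y)

    def perclos(openings, closed_threshold=0.2):
        """Fraction of frames in which the eye is considered closed; openings are
        assumed to be normalized to [0, 1], and the threshold is illustrative."""
        if not openings:
            return 0.0
        closed_frames = sum(1 for o in openings if o < closed_threshold)
        return closed_frames / len(openings)

    # Example over a short window of normalized opening values:
    print(perclos([0.9, 0.8, 0.1, 0.05, 0.7, 0.1]))  # 0.5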
  • the score corresponding to the stress information is determined according to stress information such as stress response intensity and stress response time, wherein different stress response intensity and stress response time correspond to different score values.
  • the obtained perclos score and the score corresponding to the stress information are weighted and summed to obtain a weighted score.
  • the mapping relationship between the preset fatigue degree and the score is queried, and the fatigue degree corresponding to the weighted score is determined. That is, the driver's face information and stress information are integrated to determine the driver's fatigue level.
  • in some embodiments, as shown in FIG. 3, step S104 may be included before step S103, and step S103 may include sub-step S1031.
  • S104. Acquire the driver's body state information.
  • the driver's body state information includes at least one of heart rate information, IMU (Inertial Measurement Unit, inertial measurement unit) information, and temperature information. It can be understood that the body state information may also include other information besides heart rate information, IMU information, and temperature information.
  • acquiring the body state information of the driver may include: receiving the body state information collected by a sensor, wherein the sensor includes at least one of a pulse sensor, an IMU sensor, and a temperature sensor.
  • exemplarily, the sensors used to collect the driver's body state information, such as a pulse sensor, an IMU sensor and a temperature sensor, may be provided in a wearable device such as a smart bracelet or a smart watch, or in a part of the cockpit that contacts the driver's body. For example, a sensor may be embedded inside the steering wheel of the car so that, when the driver holds the steering wheel, body state information such as the driver's heart rate information, IMU information and temperature information can be monitored and collected; or it may be installed in the backrest of the seat, in direct contact with the driver's cervical vertebrae, to monitor and collect the driver's body state information.
  • S1031. Determine the current fatigue level of the driver according to the face information, the stress information and the body state information.
  • a person's body state information also differs between the awake and fatigued states. Taking heart rate information as an example, the heart rate of an awake person is generally between 60 and 100, while in the sleep state it is between 40 and 60. Therefore, combining the obtained face information and stress information with the body state information to determine the driver's current fatigue level can further improve the accuracy of fatigue driving detection.
  • the step S1031 may include sub-steps S10311 to S10315.
  • S10311. Perform a perclos calculation according to the face information to obtain a first score. The perclos calculation can be performed as described in the above embodiment and is not repeated here; for ease of description, the score obtained by the perclos calculation is referred to below as the first score score_1.
  • S10312. Perform quantization and grading processing on the body state information to obtain a second score corresponding to the body state information. In some embodiments, when the body state information includes heart rate information, obtaining the second score corresponding to the body state information may include: determining, according to a preset mapping relationship between heart rate and quantization level, the heart rate level corresponding to the heart rate information; and determining, according to a preset mapping relationship between heart rate level and score, the score corresponding to the heart rate level as the second score.
  • exemplarily, a mapping relationship between heart rate heart_rate and quantization level grade (heart rate level) and a mapping relationship between heart rate level grade and score are preset.
  • after the driver's heart rate information is obtained, the preset mapping relationship between heart rate heart_rate and quantization level grade is queried to determine which of levels 1 to 6 the driver's heart rate corresponds to. For example, if the obtained heart rate information of the driver is 63, the heart rate level grade is determined to be 2. This is then substituted into the above mapping relationship between heart rate level grade and score, the corresponding score is determined to be 0.1, and the obtained score of 0.1 is used as the second score score_2.
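  • the quantization-and-grading lookup for heart rate can be sketched as follows; the actual band edges and scores are defined by the application's preset tables (provided only as figures), so the values below are illustrative assumptions chosen merely so that the worked example (heart rate 63, grade 2, score 0.1) holds.
    # Hypothetical heart-rate bands and per-grade scores (the real preset tables
    # are given as figures in this application and are not reproduced here).
    HEART_RATE_BANDS = [   # (low inclusive, high exclusive, grade)
        (70, 999, 1), (62, 70, 2), (60, 62, 3), (55, 60, 4), (45, 55, 5), (0, 45, 6),
    ]
    HEART_GRADE_SCORES = {1: 0.0, 2: 0.1, 3: 0.2, 4: 0.4, 5: 0.6, 6: 0.8}

    def heart_rate_score(heart_rate):
        for low, high, grade in HEART_RATE_BANDS:
            if low <= heart_rate < high:
                return HEART_GRADE_SCORES[grade]
        return 0.0

    print(heart_rate_score(63))  # 0.1 (grade 2), matching the worked example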
  • in some embodiments, when the body state information includes IMU information, performing quantization and grading processing on the body state information to obtain the second score corresponding to the body state information may include: obtaining, according to the IMU information, a first acceleration, a second acceleration and a third acceleration corresponding to three directions; calculating the sum of the absolute values of the first acceleration, the second acceleration and the third acceleration to obtain an acceleration sum value; determining, according to a preset mapping relationship between acceleration and quantization level, the IMU level corresponding to the acceleration sum value; and determining, according to a preset mapping relationship between IMU level and score, the score corresponding to the IMU level as the second score.
  • if the first, second and third accelerations in the three directions are r_x, r_y and r_z, the acceleration sum value is r_all = abs|r_x| + abs|r_y| + abs|r_z|.
  • in some embodiments, after the first acceleration, the second acceleration and the third acceleration are obtained according to the IMU information, the method may include: performing correction processing on the three accelerations to remove the influence of gravitational acceleration, obtaining a first corrected acceleration r_x', a second corrected acceleration r_y' and a third corrected acceleration r_z'; in this case, calculating the sum of the absolute values of the first acceleration, the second acceleration and the third acceleration to obtain the acceleration sum value may include: determining the sum of the absolute values of the first corrected acceleration, the second corrected acceleration and the third corrected acceleration as the acceleration sum value, that is, r_all = abs|r_x'| + abs|r_y'| + abs|r_z'|. This further improves the accuracy of fatigue driving detection.
  • exemplarily, a mapping relationship between the acceleration sum value r_all and quantization level grade (IMU level) and a mapping relationship between IMU level grade and score are preset.
  • after the acceleration sum value r_all is calculated, the preset mapping relationship between acceleration r_all and quantization level grade is queried to determine which of levels 1 to 6 the acceleration sum value r_all corresponds to. For example, if the obtained acceleration sum value r_all is 0.44, the IMU level grade is determined to be 4. This is then substituted into the above mapping relationship between IMU level grade and score, the corresponding score is determined to be 0.8, and the obtained score of 0.8 is used as the second score score_2.
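  • the acceleration-sum step can be sketched as below; subtracting a per-axis gravity estimate is only one way to remove the influence of gravitational acceleration and is an assumption here, and the r_all bands and scores are again illustrative, chosen so that r_all = 0.44 maps to grade 4 and score 0.8 as in the worked example.
    def corrected_acceleration_sum(r_x, r_y, r_z, g_x=0.0, g_y=0.0, g_z=1.0):
        """r_all = |r_x'| + |r_y'| + |r_z'| after subtracting an estimated gravity
        component (in g units) from each axis; the application only states that
        the influence of gravitational acceleration is removed."""
        return abs(r_x - g_x) + abs(r_y - g_y) + abs(r_z - g_z)

    IMU_BANDS = [   # (low inclusive, high exclusive, grade, score) - illustrative
        (0.0, 0.1, 1, 0.0), (0.1, 0.2, 2, 0.2), (0.2, 0.3, 3, 0.5),
        (0.3, 0.5, 4, 0.8), (0.5, 0.8, 5, 0.9), (0.8, 999, 6, 1.0),
    ]

    def imu_score(r_all):
        for low, high, grade, score in IMU_BANDS:
            if low <= r_all < high:
                return score
        return 0.0

    print(imu_score(0.44))  # 0.8 (grade 4), matching the worked example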
  • in some embodiments, when the body state information includes temperature information, performing quantization and grading processing on the body state information to obtain the second score corresponding to the body state information may include: determining, according to a preset mapping relationship between temperature and quantization level, the temperature level corresponding to the temperature information; and determining, according to a preset mapping relationship between temperature level and score, the score corresponding to the temperature level as the second score.
  • exemplarily, a mapping relationship between temperature and quantization level grade (temperature level) and a mapping relationship between temperature level grade and score are preset.
  • after the driver's temperature is obtained, the preset mapping relationship between temperature and quantization level grade is queried to determine which of levels 1 to 4 the driver's temperature corresponds to. For example, if the obtained temperature of the driver is 36.6, the temperature level grade is determined to be 2. This is then substituted into the above mapping relationship between temperature level grade and score, the corresponding score is determined to be 0.4, and the obtained score of 0.4 is used as the second score score_2.
  • in some embodiments, when the body state information includes multiple types of information, multiple corresponding scores are determined according to the multiple types of body state information, and the second score score_2 is then determined from these multiple scores, for example by performing a weighted sum calculation on them.
  • for example, when the body state information includes heart rate information, IMU information and temperature information, three corresponding scores score_heart, score_r and score_tem are determined from the heart rate information, IMU information and temperature information respectively, and the second score score_2 is obtained as a weighted sum of score_heart, score_r and score_tem, where the weighting coefficients can be flexibly set according to the actual situation and are not specifically limited here.
  • S10313. Perform quantization and grading processing on the stress information to obtain a third score corresponding to the stress information. Exemplarily, the stress information includes stress response time and stress response intensity, and the corresponding third score score_3 is obtained according to the stress response time and the stress response intensity.
  • in some embodiments, performing quantization and grading processing on the stress information to obtain the third score corresponding to the stress information may include: determining, according to a preset mapping relationship between reaction time and quantization level, the stress level corresponding to the stress response time; determining, according to a preset mapping relationship between response intensity and quantization coefficient, the stress coefficient corresponding to the stress response intensity; and determining the third score according to the stress level and the stress coefficient.
  • exemplarily, taking the IMU signal as the stress signal of the driver's stress response, the stress response time t of the driver's stress response and the measured values a, b and c corresponding to the three axis directions of the IMU sensor at time t are obtained.
  • exemplarily, a mapping relationship between reaction time t and quantization level grade (stress level) is preset.
  • according to the obtained stress response time t of the driver's stress response, the preset mapping relationship between reaction time t and quantization level grade is queried to determine which of levels 1 to 6 the stress response time t corresponds to. For example, if the obtained stress response time t is 0.5, the stress level grade is determined to be 3.
  • exemplarily, a mapping relationship between the response intensities a, b and c and the quantization coefficient alpha (stress coefficient) is preset.
  • according to the measured values a, b and c corresponding to the three axis directions at time t, the value of |a| + |b| + |c| is calculated, and the preset mapping relationship between response intensity and quantization coefficient alpha is queried to determine the stress coefficient alpha corresponding to a, b and c. For example, if the calculated value of |a| + |b| + |c| is 0.6, the corresponding stress coefficient alpha is determined to be 0.6.
  • then, according to the obtained stress level grade and stress coefficient alpha, the corresponding third score score_3 is determined by substituting them into the formula score_3 = (1/grade) * alpha. For example, with the grade of 3 and the alpha of 0.6 obtained above, the third score score_3 is calculated to be 0.2.
  • the stress signal of the driver's stress response is not limited to the IMU signal, and the stress information may also be a measurement value obtained by collecting other physiological signals.
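  • a minimal sketch of the third-score computation score_3 = (1/grade) * alpha is given below; the reaction-time bands and the mapping from |a| + |b| + |c| to alpha are illustrative assumptions (the application's tables are given only as figures), chosen so that t = 0.5 gives grade 3, |a| + |b| + |c| = 0.6 gives alpha = 0.6, and score_3 = 0.2 as in the worked example.
    def stress_grade(reaction_time_s):
        # Illustrative bands: a slower reaction maps to a higher grade.
        bands = [(0.0, 0.2, 1), (0.2, 0.35, 2), (0.35, 0.6, 3),
                 (0.6, 0.9, 4), (0.9, 1.5, 5)]
        for low, high, grade in bands:
            if low <= reaction_time_s < high:
                return grade
        return 6

    def stress_alpha(a, b, c):
        # Illustrative: the worked example maps |a|+|b|+|c| = 0.6 to alpha = 0.6,
        # so a simple capped identity is assumed here.
        return min(abs(a) + abs(b) + abs(c), 1.0)

    def stress_score(reaction_time_s, a, b, c):
        return (1.0 / stress_grade(reaction_time_s)) * stress_alpha(a, b, c)

    print(round(stress_score(0.5, 0.2, 0.3, 0.1), 2))  # 0.2 (grade 3, alpha 0.6)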
  • S10314. Perform a weighted sum calculation on the first score, the second score and the third score to obtain a comprehensive score. After the first score score_1, the second score score_2 and the third score score_3 are obtained as described above, they are weighted and summed to obtain the comprehensive score score, where the weighting coefficients can be flexibly set according to the actual situation and are not specifically limited here.
  • exemplarily, if the second score score_2 is calculated from the multiple scores score_heart, score_r and score_tem, the weighted sum is calculated over score_1, score_heart, score_r, score_tem and score_3 as score = α·score_1 + β·score_heart + γ·score_r + δ·score_tem + θ·score_3, where α, β, γ, δ and θ are the corresponding weighting coefficients. For example, if α, β, γ, δ and θ are set to 0.4, 0.2, 0.2, 0.1 and 0.1 respectively, the weighted sum gives score = 0.4·score_1 + 0.2·score_heart + 0.2·score_r + 0.1·score_tem + 0.1·score_3.
  • the mapping relationship between the fatigue level and the score includes, but is not limited to, a mapping table.
  • for example, the mapping relationship between fatigue degree and score is preset as shown in Table 1.
    Table 1
    Score                 Fatigue degree
    > 0.7                 Asleep
    > 0.5 and <= 0.7      Severe fatigue
    > 0.25 and <= 0.5     Moderate fatigue
    > 0.15 and <= 0.25    Mild fatigue
    <= 0.15               Awake
  • for example, if the calculated score is 0.3, Table 1 is queried and the fatigue degree corresponding to a score of 0.3 is found to be moderate fatigue, that is, the driver's current fatigue level is determined to be moderate fatigue.
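  • putting the pieces together, the sketch below combines the sub-scores with the example weights (0.4, 0.2, 0.2, 0.1, 0.1) and looks up the fatigue level in Table 1; only the weights and the Table 1 thresholds come from this application, and the individual sub-scores would be produced by quantization functions such as the illustrative ones sketched earlier.
    def overall_score(score_1, score_heart, score_r, score_tem, score_3,
                      weights=(0.4, 0.2, 0.2, 0.1, 0.1)):
        # weights correspond to the example coefficients alpha..theta above
        w1, w2, w3, w4, w5 = weights
        return w1*score_1 + w2*score_heart + w3*score_r + w4*score_tem + w5*score_3

    def fatigue_level(score):
        # Thresholds taken from Table 1.
        if score > 0.7:
            return "asleep"
        if score > 0.5:
            return "severe fatigue"
        if score > 0.25:
            return "moderate fatigue"
        if score > 0.15:
            return "mild fatigue"
        return "awake"

    s = overall_score(score_1=0.5, score_heart=0.1, score_r=0.8, score_tem=0.4, score_3=0.2)
    print(round(s, 2), fatigue_level(s))  # 0.44 moderate fatigue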
  • in some embodiments, as shown in FIG. 5, step S105 may be included after step S103.
  • S105. Output corresponding alarm information according to the fatigue degree, wherein different fatigue degrees correspond to different alarm information.
  • the alarm information includes at least one of vibration prompt information and voice prompt information. That is, according to the fatigue level of the driver, different vibration prompts and/or voice prompts are performed. For example, the audio equipment in the car center console is used for voice prompts.
  • in some embodiments, outputting corresponding alarm information may include: sending an alarm instruction to a wearable device, so that the wearable device outputs the alarm information according to the alarm instruction.
  • corresponding alarm instructions are sent to wearable devices such as smart bracelets and smart watches, and the wearable devices give corresponding vibration prompts and/or voice prompts.
  • different degrees of fatigue correspond to different prompt parameters of the vibration prompt information and/or voice prompt information, wherein the prompt parameters include but are not limited to vibration frequency, vibration amplitude, vibration duration, voice volume, and the like.
  • exemplarily, a mapping relationship among fatigue degree, score and alarm information is preset (Table 2 in the description). For example, if the driver's current fatigue level is determined to be moderate fatigue, the corresponding alarm information is determined by querying this mapping, and the smart bracelet (watch) is controlled to give a short-duration, large-amplitude vibration prompt.
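  • as an illustration of this alarm step, the sketch below maps a fatigue level to prompt parameters and hands them to a hypothetical send_alarm_instruction callback for the wearable; the parameter values are assumptions standing in for the application's Table 2, which is given only as a figure.
    # Hypothetical prompt parameters per fatigue level (stronger prompts for
    # deeper fatigue); Table 2 itself is not reproduced in this application text.
    ALARM_PARAMS = {
        "mild fatigue":     {"vibration_amplitude": 0.3, "vibration_duration_s": 1.0, "voice_volume": 0.3},
        "moderate fatigue": {"vibration_amplitude": 0.9, "vibration_duration_s": 1.0, "voice_volume": 0.6},
        "severe fatigue":   {"vibration_amplitude": 1.0, "vibration_duration_s": 3.0, "voice_volume": 1.0},
        "asleep":           {"vibration_amplitude": 1.0, "vibration_duration_s": 5.0, "voice_volume": 1.0},
    }

    def alert_driver(fatigue, send_alarm_instruction):
        """send_alarm_instruction is a hypothetical callback that forwards the
        alarm instruction to the wearable device (e.g. over Bluetooth)."""
        params = ALARM_PARAMS.get(fatigue)
        if params is not None:          # no alarm while the driver is awake
            send_alarm_instruction(params)

    alert_driver("moderate fatigue", send_alarm_instruction=print)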
  • in the above embodiments, the driver's current face information is acquired, the electronic device is controlled to output a stimulus signal that stimulates the driver, and the stress information of the stress response that the driver produces based on the stimulus signal is obtained. Since the stress information corresponding to the driver differs at different fatigue levels, combining the face information and the stress information to determine the driver's current fatigue level improves the accuracy of fatigue driving detection compared with relying only on face or eye detection results, thereby improving driving safety.
  • FIG. 6 is a schematic block diagram of a fatigue driving monitoring device provided by an embodiment of the present application.
  • the fatigue driving monitoring device 200 may include a processor 211 and a memory 212, and the processor 211 and the memory 212 are connected through a bus, such as an I2C (Inter-integrated Circuit) bus.
  • the processor 211 may be a micro-controller unit (Micro-controller Unit, MCU), a central processing unit (Central Processing Unit, CPU), or a digital signal processor (Digital Signal Processor, DSP) or the like.
  • the memory 212 may be a Flash chip, a read-only memory (ROM) disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • Various computer programs to be executed by the processor 211 are stored in the memory 212 .
  • the processor is used for running the computer program stored in the memory, and implements the following steps when executing the computer program:
  • acquiring the driver's current face information;
  • controlling the electronic device to output a stimulation signal that stimulates the driver, and acquiring stress information of the stress response that the driver produces based on the stimulation signal;
  • determining the current fatigue level of the driver according to the face information and the stress information.
  • the stress information includes at least one of stress response time and stress response intensity.
  • in some embodiments, when implementing the controlling of the electronic device to output the stimulation signal that stimulates the driver, the processor specifically implements: sending a control signal to the electronic device, so that the electronic device outputs the stimulation signal when receiving the control signal.
  • in some embodiments, when implementing the sending of the control signal to the electronic device, the processor specifically implements: periodically sending the control signal to the electronic device according to a preset cycle time.
  • in some embodiments, before implementing the determining of the driver's current fatigue level according to the face information and the stress information, the processor further implements: acquiring the driver's body state information.
  • when implementing the determining of the driver's current fatigue level according to the face information and the stress information, the processor specifically implements: determining the current fatigue level of the driver according to the face information, the stress information and the body state information.
  • the body state information includes at least one of heart rate information, IMU information, and temperature information.
  • in some embodiments, when implementing the acquiring of the driver's body state information, the processor specifically implements:
  • the body state information collected by a sensor is received, wherein the sensor includes at least one of a pulse sensor, an IMU sensor, and a temperature sensor.
  • the sensor is provided in a wearable device, or the sensor is provided in a part of the cockpit that contacts the driver's body.
  • in some embodiments, when determining the current fatigue level of the driver according to the face information, the stress information and the body state information, the processor specifically implements: performing a perclos calculation according to the face information to obtain a first score; performing quantization and grading processing on the body state information to obtain a second score corresponding to the body state information; performing quantization and grading processing on the stress information to obtain a third score corresponding to the stress information; performing a weighted sum calculation on the first score, the second score and the third score to obtain a comprehensive score; and determining, according to a preset mapping relationship between fatigue degree and score, the fatigue degree corresponding to the comprehensive score.
  • in some embodiments, the body state information includes heart rate information, and when implementing the quantization and grading processing on the body state information to obtain the second score corresponding to the body state information, the processor specifically implements: determining, according to a preset mapping relationship between heart rate and quantization level, the heart rate level corresponding to the heart rate information; and determining, according to a preset mapping relationship between heart rate level and score, the score corresponding to the heart rate level as the second score.
  • in some embodiments, the body state information includes IMU information, and when implementing the quantization and grading processing on the body state information to obtain the second score corresponding to the body state information, the processor specifically implements: obtaining, according to the IMU information, a first acceleration, a second acceleration and a third acceleration corresponding to three directions; calculating the sum of the absolute values of the first acceleration, the second acceleration and the third acceleration to obtain an acceleration sum value; determining, according to a preset mapping relationship between acceleration and quantization level, the IMU level corresponding to the acceleration sum value; and determining, according to a preset mapping relationship between IMU level and score, the score corresponding to the IMU level as the second score.
  • in some embodiments, after obtaining the first acceleration, the second acceleration and the third acceleration corresponding to the three directions according to the IMU information, the processor further implements: performing correction processing on the first acceleration, the second acceleration and the third acceleration to obtain a first corrected acceleration, a second corrected acceleration and a third corrected acceleration; and when implementing the calculating of the sum of the absolute values of the first acceleration, the second acceleration and the third acceleration to obtain the acceleration sum value, the processor specifically implements: determining the sum of the absolute values of the first corrected acceleration, the second corrected acceleration and the third corrected acceleration as the acceleration sum value.
  • in some embodiments, the body state information includes temperature information, and when implementing the quantization and grading processing on the body state information to obtain the second score corresponding to the body state information, the processor specifically implements: determining, according to a preset mapping relationship between temperature and quantization level, the temperature level corresponding to the temperature information; and determining, according to a preset mapping relationship between temperature level and score, the score corresponding to the temperature level as the second score.
  • in some embodiments, the stress information includes stress response time and stress response intensity, and when implementing the quantization and grading processing on the stress information to obtain the third score corresponding to the stress information, the processor specifically implements: determining, according to a preset mapping relationship between reaction time and quantization level, the stress level corresponding to the stress response time; determining, according to a preset mapping relationship between response intensity and quantization coefficient, the stress coefficient corresponding to the stress response intensity; and determining the third score according to the stress level and the stress coefficient.
  • in some embodiments, after implementing the determining of the driver's current fatigue level, the processor further implements: outputting corresponding alarm information according to the fatigue degree, wherein different fatigue degrees correspond to different alarm information.
  • in some embodiments, when implementing the outputting of the corresponding alarm information, the processor specifically implements: sending an alarm instruction to a wearable device, so that the wearable device outputs the alarm information according to the alarm instruction.
  • the alarm information includes at least one of vibration prompt information and voice prompt information.
  • different degrees of fatigue correspond to different prompt parameters of the vibration prompt information and/or the voice prompt information
  • the prompt parameters include vibration amplitude, vibration duration, and voice volume.
  • the embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program, the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the fatigue driving monitoring method provided by the embodiments of the present application.
  • the computer-readable storage medium may be the internal storage unit of the traffic equipment or the fatigue driving monitoring device described in the foregoing embodiments, such as a hard disk or memory of the traffic device or the fatigue driving monitoring device.
  • the computer-readable storage medium may also be an external storage device of the traffic equipment or the fatigue driving monitoring device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card, or the like, equipped on the traffic equipment or the fatigue driving monitoring device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

一种疲劳驾驶监控方法、装置、交通设备及存储介质,该方法包括:获取驾驶员当前的人脸信息(S101);控制电子设备输出刺激所述驾驶员的刺激信号,并获取所述驾驶员基于所述刺激信号产生应激反应的应激信息(S102);根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度(S103)。

Description

疲劳驾驶监控方法、装置、交通设备及存储介质 技术领域
本申请涉及交通安全技术领域,尤其涉及一种疲劳驾驶监控方法、装置、交通设备及存储介质。
背景技术
现今,车辆已经成为人们出行中必不可少的交通工具,安全驾驶尤其重要,当驾驶员处于疲劳状态时,对周围环境的感知能力、形势判断能力和对车辆的操控能力都有不同程度的下降,容易发生交通事故。因此,及时发现驾驶员疲劳驾驶有着非同一般的重要意义。
目前,一般是通过对驾驶员的图像进行分析,进行头部姿态估计,人眼闭合检测和嘴巴闭合检测等,进行疲劳驾驶行为的判断。由于依赖于人脸或人眼的检测结果,一旦驾驶员的人脸或人眼信息不可观时,比如驾驶员佩戴不透光墨镜,观测不到眼睛时,疲劳驾驶检测的结果就不够准确。
因此,如何提高疲劳驾驶检测的准确性成为亟待解决的问题。
发明内容
基于此,本申请提供了一种疲劳驾驶监控方法、装置、交通设备及存储介质,以实现提高疲劳驾驶检测的准确性。
第一方面,本申请提供了一种疲劳驾驶监控方法,包括:
获取驾驶员当前的人脸信息;
控制电子设备输出刺激所述驾驶员的刺激信号,并获取所述驾驶员基于所述刺激信号产生应激反应的应激信息;
根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度。
第二方面,本申请还提供了一种疲劳驾驶监控装置,所述疲劳驾驶监控装置包括存储器和处理器;
所述存储器用于存储计算机程序;
所述处理器,用于执行所述计算机程序并在执行所述计算机程序时,实现如下步骤:
获取驾驶员当前的人脸信息;
控制电子设备输出刺激所述驾驶员的刺激信号,并获取所述驾驶员基于所述刺激信号产生应激反应的应激信息;
根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度。
第三方面,本申请还提供了一种交通设备,所述交通设备包括如上述的疲劳驾驶监控装置。
第四方面,本申请还提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如上述的疲劳驾驶监控方法。
本申请公开的疲劳驾驶监控方法、装置、交通设备及存储介质,通过获取驾驶员当前的人脸信息,并控制电子设备输出刺激驾驶员的刺激信号,获取驾驶员基于刺激信号产生应激反应的应激信息,驾驶员在不同疲劳程度下对应的应激信息是不同的,将人脸信息和应激信息结合起来确定驾驶员当前的疲劳程度,相比于依赖于人脸或人眼的检测结果,提高了疲劳驾驶检测的准确性。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本申请。
附图说明
为了更清楚地说明本申请实施例技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请实施例提供的一种交通设备的结构示意图;
图2是本申请实施例提供的一种疲劳驾驶监控方法的步骤示意流程图;
图3是本申请实施例提供的另一种疲劳驾驶监控方法的步骤示意流程图;
图4是本申请实施例提供的一种根据所述人脸信息、所述应激信息和所述身体状态信息,确定所述驾驶员当前的疲劳程度的步骤示意流程图;
图5是本申请实施例提供的另一种疲劳驾驶监控方法的步骤示意流程图;
图6是本申请实施例提供的一种疲劳驾驶监控装置的示意性框图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
附图中所示的流程图仅是示例说明,不是必须包括所有的内容和操作/步骤,也不是必须按所描述的顺序执行。例如,有的操作/步骤还可以分解、组合或部分合并,因此实际执行的顺序有可能根据实际情况改变。
应当理解,在此本申请说明书中所使用的术语仅仅是出于描述特定实施例的目的而并不意在限制本申请。如在本申请说明书和所附权利要求书中所使用的那样,除非上下文清楚地指明其它情况,否则单数形式的“一”、“一个”及“该”意在包括复数形式。
还应当进一步理解,在本申请说明书和所附权利要求书中使用的术语“和/或”是指相关联列出的项中的一个或多个的任何组合以及所有可能组合,并且包括这些组合。
下面结合附图,对本申请的一些实施方式作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。
本申请的实施例提供了一种疲劳驾驶监控方法、装置、交通设备及存储介质,用于实现提高疲劳驾驶检测的准确性。
请参阅图1,图1为本申请实施例提供的一种交通设备的结构示意图。如图1所示,交通设备1000可以包括动力装置100和疲劳驾驶监控装置200。其中,动力装置100用于驱动牵引交通设备1000移动,疲劳驾驶监控装置200用于检测交通设备1000的驾驶员的疲劳程度。
该交通设备1000可以包括汽车、电车、货车等。当然,交通设备1000还可以是其他类型的如船只等移动交通工具,本申请实施例不限于此。
疲劳驾驶监控装置200用于获取驾驶员当前的人脸信息,并控制智能手环 等电子设备输出刺激驾驶员的刺激信号,获取驾驶员基于刺激信号产生应激反应的应激信息,将人脸信息和应激信息结合起来确定驾驶员当前的疲劳程度,因此,提高了疲劳驾驶检测的准确性。
可以理解的是,上述对于交通设备1000各部件的命名仅仅出于标识的目的,并不因此对本申请实施例进行限制。
以下将基于交通设备1000、以及疲劳驾驶监控装置200对本申请的实施例提供的疲劳驾驶监控方法进行详细介绍。需知,图1中的交通设备1000、疲劳驾驶监控装置200仅用于解释本申请实施例提供的疲劳驾驶监控方法,但并不构成对本申请实施例提供的疲劳驾驶监控方法的应用场景的限定。
请参阅图2,图2是本申请的实施例提供的一种疲劳驾驶监控方法的示意流程图。该方法可以用于上述实施例提供的交通设备中,也可以用于其他包含有疲劳驾驶监控装置的设备中,本申请中对该方法的应用场景不做限定。基于该疲劳驾驶监控方法以实现提高疲劳驾驶检测的准确性,进而提高驾驶安全性。
如图2所示,该疲劳驾驶监控方法具体包括步骤S101至步骤S103。
S101、获取驾驶员当前的人脸信息。
示例性的,在驾驶员前方安装摄像装置,摄像装置可以采集驾驶员的人脸图像和/或视频,通过对人脸图像和/或视频进行人脸信息提取操作,获取驾驶员的人脸信息。示例性的,人脸信息中包括但不限于驾驶员的眼部信息、头部姿态信息、嘴部信息等。
S102、控制电子设备输出刺激所述驾驶员的刺激信号,并获取所述驾驶员基于所述刺激信号产生应激反应的应激信息。
一般,驾驶员在不同疲劳程度下对刺激所产生的应激反应程度是不同的,基于这点,除了获取驾驶员的人脸信息以外,控制电子设备输出刺激信号,其中,电子设备包括但不限于智能手环、智能手表等可穿戴设备,驾驶员基于刺激信号会产生应激反应,获取驾驶员应激反应对应的应激信息。其中,应激信息包括应激反应时间、应激反应强度中至少一种。
示例性的,控制电子设备输出刺激所述驾驶员的刺激信号可以包括:发送控制信号至所述电子设备,以供所述电子设备在接收到所述控制信号时,输出所述刺激信号。
通过与电子设备建立通信连接,如WiFi、蓝牙等无线通信连接。在驾驶员驾驶过程中,基于与电子设备建立的通信信道发送控制信号至电子设备,电子设备接收到该控制信号时,输出相应的刺激信号。驾驶员针对刺激信号会产生应激反应,电子设备可以采集驾驶员应激反应对应的应激信息,获取电子设备采集到的应激信息。
例如,通过WiFi、蓝牙等无线通讯方式传输控制信号至驾驶员佩戴的智能手环,智能手环在接收到该控制信号后,输出震动等刺激信号,如智能手环进行高频低峰值短时间震动。并且,智能手环采集驾驶员针对刺激信号产生应激反应的应激反应强度(受到刺激后应激反应的峰值max)、应激反应时间(从受到刺激到反应达到峰值max的时间t)等应激信息,并将应激信息进行反馈,接收获取智能手环反馈的应激信息。
在一些实施例中,发送控制信号至所述电子设备可以包括:按照预设的周期时间,周期性发送所述控制信号至所述电子设备。
例如,预先设置周期时间为5分钟,在驾驶过程中,根据该周期时间,每间隔5分钟向电子设备发送控制信号,每当电子设备接收到控制信号后,输出震动等刺激信号,并采集驾驶员基于刺激信号产生应激反应的应激信息。可以理解的是,该周期时间可根据实际情况进行灵活设置,在此不做限制。
进一步地,由于驾驶员驾驶过程中,驾驶员一般都是驾驶一段时间之后才会开始疲劳,因此,可以在启动驾驶后进行计时,当计时达到预设时长后,开启疲劳驾驶监控,根据周期时间,周期性发送控制信号至驾驶员佩戴的智能手环等电子设备。其中,预设时长可根据实际情况进行灵活设置,例如,设置为1小时,也可以为其他时长,在此不做限制。
S103、根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度。
获得驾驶员的人脸信息和基于刺激信号产生应激反应的应激信息后,结合人脸信息和应激信息,确定驾驶员当前的疲劳程度。
示例性的,根据获取的人脸信息,对人眼关键点进行检测,根据人眼关键点计算人眼的开度,计算方法如下所示:
left ratio=abs|left top-left down|
right ratio=abs|right top-right down|
其中,left ratio是左眼开度,left top是左眼睁开的最高位置,left down是左眼睁开的最低位置,right ratio是右眼开度,right top是右眼睁开的最高位置,right down是右眼睁开的最低位置,abs是两者之差的绝对值。
根据左右两只眼睛的开度left ratio、right ratio,计算perclos分值。在计算perclos时可以采用P70,P80,EM三种指标进行计算。
需要说明的是,上述是列举的其中一种计算perclos的方法,还可以采用其他的方法计算perclos,比如,采用CNN预测眼部状态进而计算perclos,或是,基于传统图像处理的方法进行眼部状态预测进而计算perclos,本实施例中对计算perclos的方法不作具体限定。
示例性的,根据应激反应强度、应激反应时间等应激信息,确定应激信息对应的分值,其中,不同应激反应强度、应激反应时间,对应不同分值。
之后,将获得的perclos分值和应激信息对应的分值进行加权求和,得到一个加权分值。根据该加权分值,查询预设的疲劳程度与分值的映射关系,确定该加权分值对应的疲劳程度。也即,实现综合驾驶员的人脸信息和应激信息,确定驾驶员的疲劳程度。
在一些实施例中,如图3所示,所述步骤S103之前可以包括步骤S104,步骤S103可以包括子步骤S1031。
S104、获取所述驾驶员的身体状态信息。
其中,驾驶员的身体状态信息包括心率信息、IMU(Inertial Measurement Unit,惯性测量单元)信息、温度信息中至少一种。可以理解的是,身体状态信息还可以包括除心率信息、IMU信息、温度信息以外的其他信息。
在一些实施例中,获取所述驾驶员的身体状态信息可以包括:接收传感器采集的所述身体状态信息,其中,所述传感器包括脉搏传感器、IMU传感器、温度传感器中至少一种。
示例性的,脉搏传感器、IMU传感器、温度传感器等用于采集驾驶员的身体状态信息的各传感器可以设置于智能手环、智能手表等可穿戴设备内,或设置于驾驶员的身体接触的驾驶舱部位。比如,通过内嵌的方式加装在汽车方向盘内部,当驾驶员手握方向盘时,就可以监控采集驾驶员的心率信息、IMU信 息、温度信息等各种身体状态信息;或是加装在驾驶舱靠背位置,可直接与驾驶员的颈椎部分接触,监控采集驾驶员的各类身体状态信息。
S1031、根据所述人脸信息、所述应激信息和所述身体状态信息,确定所述驾驶员当前的疲劳程度。
人在清醒情况下和疲劳情况下的身体状态信息也会有所不同,例如,以心率信息为例,人在处于清醒的正常情况下,一般心率在60-100之间,而处于睡眠状态下,心率在40-60之间。因此,将获得的人脸信息、应激信息,再结合上身体状态信息,来确定驾驶员当前的疲劳程度,可以进一步提高疲劳驾驶检测的准确性。
在一些实施例中,如图4所示,所述步骤S1031可以包括子步骤S10311至子步骤S10315。
S10311、根据所述人脸信息进行perclos计算,获得第一分值。
根据人脸信息进行perclos计算可参考上面实施例中所述,在此不再赘述。为了便于区分描述,下文将perclos计算获得的分值称为第一分值score_1。
S10312、对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值。
在一些实施例中,当身体状态信息包括心率信息时,获得所述身体状态信息对应的第二分值可以包括:根据预设的心率与量化级别的映射关系,确定所述心率信息对应的心率级别;根据预设的心率级别与分值的映射关系,将所述心率级别对应的分值确定为所述第二分值。
示例性的,预先设置心率heart_rate与量化级别grade(心率级别)的映射关系如下:
Figure PCTCN2021090319-appb-000001
心率级别grade与分值score的映射关系如下:
Figure PCTCN2021090319-appb-000002
在获得驾驶员的心率信息后,查询预设的心率heart_rate与量化级别grade的映射关系,确定驾驶员的心率信息对应映射关系中1至6级别里哪个级别。比如若获取驾驶员的心率信息为63,则确定心率级别grade为2级。之后,代入上述心率级别grade与分值score的映射关系中,确定对应的分值为0.1,将获得的分值0.1作为第二分值score_2。
在一些实施例中,当身体状态信息包括IMU信息时,对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值可以包括:根据所述IMU信息,获得三个方向上对应的第一加速度、第二加速度和第三加速度;计算所述第一加速度、所述第二加速度和所述第三加速度的绝对值之和,获得加速度和值;根据预设的加速度与量化级别的映射关系,确定所述加速度和值对应的IMU级别;根据预设的IMU级别与分值的映射关系,将所述IMU级别对应的分值确定为所述第二分值。
若三个方向上对应的第一加速度为r_x、第二加速度为r_y、第三加速度为r_z,计算第一加速度r_x、第二加速度r_y和第三加速度r_z的绝对值之和,获得加速度和值r_all为:r_all=abs|r_x|+abs|r_y|+abs|r_z|。
在一些实施例中,根据所述IMU信息,获得三个方向上对应的第一加速度、第二加速度和第三加速度之后可以包括:对所述第一加速度、所述第二加速度和所述第三加速度进行校正处理,获得第一校正加速度、第二校正加速度和第三校正加速度;所述计算所述第一加速度、所述第二加速度和所述第三加速度的绝对值之和,获得加速度和值可以包括:将所述第一校正加速度、所述第二校正加速度和所述第三校正加速度的绝对值之和,确定为所述加速度和值。
为了更进一步提高疲劳驾驶检测的准确性,对第一加速度r_x、第二加速度r_y和第三加速度r_z进行校正处理,除去重力加速度的影响,获得第一校正加速度r_x'、第二校正加速度r_y'和第三校正加速度r_z'。然后,计算第一校正加速度r_x'、第二校正加速度r_y'和第三校正加速度r_z'的绝对值之和,获得加速度和值r_all为:r_all=abs|r_x'|+abs|r_y'|+abs|r_z'|。
示例性的,预先设置加速度r_all与量化级别grade(IMU级别)的映射关系如下:
Figure PCTCN2021090319-appb-000003
IMU级别grade与分值score的映射关系如下:
Figure PCTCN2021090319-appb-000004
在计算获得加速度和值r_all后,查询预设的加速度r_all与量化级别grade的映射关系,确定加速度和值r_all对应映射关系中1至6级别里哪个级别。比如若获取加速度和值r_all为0.44,则确定IMU级别grade为4级。之后,代入上述IMU级别grade与分值score的映射关系中,确定对应的分值为0.8,将获得的分值0.8作为第二分值score_2。
在一些实施例中,当身体状态信息包括温度信息时,对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值可以包括:根据预设的温度与量化级别的映射关系,确定所述温度信息对应的温度级别;根据预设的温度级别与分值的映射关系,将所述温度级别对应的分值确定为所述第二分值。
示例性的,预先设置温度temperature与量化级别grade(温度级别)的映射关系如下:
Figure PCTCN2021090319-appb-000005
温度级别grade与分值score的映射关系如下:
Figure PCTCN2021090319-appb-000006
在获得驾驶员的温度temperature后,查询预设的温度temperature与量化级别grade的映射关系,确定驾驶员的温度temperature对应映射关系中1至4级别里哪个级别。比如若获取驾驶员的温度temperature为36.6,则确定温度级别grade为2级。之后,代入上述温度级别grade与分值score的映射关系中,确定对应的分值为0.4,将获得的分值0.4作为第二分值score_2。
在一些实施例中,当身体状态信息包括多种时,根据多种身体状态信息确定对应的多个分值。然后根据多个分值确定第二分值score_2。示例性的,将多个分值进行加权求和计算,获得第二分值score_2。
例如,当身体状态信息包括心率信息、IMU信息、温度信息时,则分别根据心率信息、IMU信息、温度信息确定对应的三个分值,若分别为score heart、score r、score tem,将这三个分值score heart、score r、score tem进行加权求和计算,获得第二分值score_2为:
Figure PCTCN2021090319-appb-000007
其中,
Figure PCTCN2021090319-appb-000008
为对应的加权系数,其具体数值可根据实际情况进行灵活设置,在此不作具体限制。
S10313、对所述应激信息进行量化分级处理,获得所述应激信息对应的第三分值。
示例性的,应激信息包括应激反应时间和应激反应强度,根据应激反应时间和应激反应强度,获得对应的第三分值score_3。
在一些实施例中,对所述应激信息进行量化分级处理,获得所述应激信息对应的第三分值可以包括:根据预设的反应时间与量化级别的映射关系,确定所述应激反应时间对应的应激级别;根据预设的反应强度与量化系数的映射关系,确定所述应激反应强度对应的应激系数;根据所述应激级别和所述应激系数,确定所述第三分值。
示例性的,以IMU信号为驾驶员应激反应的应激信号为例,获取驾驶员应激反应的应激反应时间t、以及IMU传感器在t时刻三轴方向上对应的测量值a、b、c。
示例性的,预先设置反应时间t与量化级别grade(应激级别)的映射关系如下:
Figure PCTCN2021090319-appb-000009
根据获取到的驾驶员应激反应的应激反应时间t,查询预设的反应时间t与量化级别grade映射关系,确定驾驶员应激反应的应激反应时间t对应映射关系中1至6级别里哪个级别。比如若获取应激反应时间t为0.5,则确定应激级别grade为3级。
示例性的,预先设置反应强度a、b、c与量化系数alpha(应激系数)的映射关系如下:
Figure PCTCN2021090319-appb-000010
根据获取到的驾驶员在t时刻三轴方向上对应的测量值a、b、c,计算|a|+|b|+|c|的值,并查询预设的反应强度a、b、c与量化系数alpha的映射关系,确定驾驶员在t时刻三轴方向上对应的测量值a、b、c对应的应激系数alpha。比如若计算获得|a|+|b|+|c|的值为0.6,则确定对应的应激系数alpha为0.6。
之后,根据获得的应激级别grade和应激系数alpha,确定对应的第三分值score_3。示例性的,代入公式score_3=(1/grade)*alpha,计算获得第三分值score_3。例如,上述举例中获得grade为3级,alpha为0.6,代入公式score_3=(1/grade)*alpha中,计算获得第三分值score_3为0.2。
需要说明的是,驾驶员应激反应的应激信号不限于IMU信号,应激信息还可以为采集其它生理信号获得的测量值。
S10314、对所述第一分值、所述第二分值和所述第三分值进行加权求和计算,获得综合分值。
通过上述获得第一分值score_1、第二分值score_2和第三分值score_3后,将第 一分值score_1、第二分值score_2和第三分值score_3进行加权求和计算,获得综合分值score为:
Figure PCTCN2021090319-appb-000011
其中,
Figure PCTCN2021090319-appb-000012
为对应的加权系数,其具体数值可根据实际情况进行灵活设置,在此不作具体限制。
示例性的,若第二分值score_2是由score heart、score r、score tem多个分值计算得到的,那根据score_1、score heart、score r、score tem、score_3进行加权求和计算,获得score为:score=α·score_1+β·score heart+γ·score r+δ·score tem+θ·score_3。
其中,α、β、γ、δ、θ为对应的加权系数,例如,设置α、β、γ、δ、θ分别为0.4、0.2、0.2、0.1、0.1,则加权求和计算获得score为:
score=0.4score_1+0.2score heart+0.2score r+0.1score tem+0.1score_3
需要说明的是,α、β、γ、δ、θ的具体数值可根据实际情况进行灵活设置,在此不作具体限制。
S10315、根据预设的疲劳程度与分值的映射关系,确定所述综合分值对应的疲劳程度。
示例性的,疲劳程度与分值的映射关系包括但不限于映射表。例如,预先设置疲劳程度与分值的映射关系如表1所示。
表1
分值(Score) 疲劳程度
>0.7 已睡着
>0.5&&<=0.7 严重疲劳
>0.25&&<=0.5 中度疲劳
>0.15&&<=0.25 轻度疲劳
<=0.15 清醒
例如,若计算获得score为0.3,通过查询表1确定score为0.3对应的疲劳程度是中度疲劳,也即确定驾驶员当前的疲劳程度是中度疲劳。
在一些实施例中,如图5所示,所述步骤S103之后可以包括步骤S105。
S105、根据所述疲劳程度,输出对应的报警信息,其中,不同的疲劳程度对应不同的报警信息。
其中,报警信息包括震动提示信息、语音提示信息中至少一种。也即根据驾驶员的疲劳程度,进行不同的震动提示和/或语音提示。例如,采用汽车中控 台的音响设备进行语音提示。
在一些实施例中,输出对应的报警信息可以包括:发送报警指令至可穿戴设备,以供所述可穿戴设备根据所述报警指令输出所述报警信息。
例如,根据驾驶员的疲劳程度,发送相应的报警指令至智能手环、智能手表等可穿戴设备,可穿戴设备进行相应的震动提示和/或语音提示。
示例性的,不同的疲劳程度对应震动提示信息和/或语音提示信息的不同提示参数,其中,提示参数包括但不限于震动频率、震动幅度、震动时长、语音音量等。
示例性的,预先设置疲劳程度与分值、报警信息的映射关系如表2所示。
表2
Figure PCTCN2021090319-appb-000013
例如,若确定驾驶员当前的疲劳程度是中度疲劳,通过查询表2确定相应的报警信息,控制智能手环(手表)以短时大幅度的震动方式进行震动提示。
上述实施例通过获取驾驶员当前的人脸信息,并控制电子设备输出刺激驾驶员的刺激信号,获取驾驶员基于刺激信号产生应激反应的应激信息,驾驶员在不同疲劳程度下对应的应激信息是不同的,将人脸信息和应激信息结合起来确定驾驶员当前的疲劳程度,相比于依赖于人脸或人眼的检测结果,提高了疲劳驾驶检测的准确性,进而提高驾驶员驾驶的安全性。
请参阅图6,图6是本申请一实施例提供的疲劳驾驶监控装置的示意性框图。
如图6所示,该疲劳驾驶监控装置200可以包括处理器211和存储器212,处理器211和存储器212通过总线连接,该总线比如为I2C(Inter-integrated Circuit)总线。
具体地,处理器211可以是微控制单元(Micro-controller Unit,MCU)、中央处理单元(Central Processing Unit,CPU)或数字信号处理器(Digital Signal Processor,DSP)等。
具体地,存储器212可以是Flash芯片、只读存储器(ROM,Read-Only Memory)磁盘、光盘、U盘或移动硬盘等。存储器212中存储有供处理器211执行的各种计算机程序。
其中,所述处理器用于运行存储在存储器中的计算机程序,并在执行所述计算机程序时实现如下步骤:
获取驾驶员当前的人脸信息;
控制电子设备输出刺激所述驾驶员的刺激信号,并获取所述驾驶员基于所述刺激信号产生应激反应的应激信息;
根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度。
在一些实施例中,所述应激信息包括应激反应时间、应激反应强度中至少一种。
在一些实施例中,所述处理器在实现所述控制电子设备输出刺激所述驾驶员的刺激信号时,具体实现:
发送控制信号至所述电子设备,以供所述电子设备在接收到所述控制信号时,输出所述刺激信号。
在一些实施例中,所述处理器在实现所述发送控制信号至所述电子设备时,具体实现:
按照预设的周期时间,周期性发送所述控制信号至所述电子设备。
在一些实施例中,所述处理器在实现所述根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度之前,还实现:
获取所述驾驶员的身体状态信息;
所述处理器在实现所述根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度时,具体实现:
根据所述人脸信息、所述应激信息和所述身体状态信息,确定所述驾驶员当前的疲劳程度。
在一些实施例中,所述身体状态信息包括心率信息、IMU信息、温度信息中至少一种。
在一些实施例中,所述处理器在实现所述获取所述驾驶员的身体状态信息时,具体实现:
接收传感器采集的所述身体状态信息,其中,所述传感器包括脉搏传感器、IMU传感器、温度传感器中至少一种。
在一些实施例中,所述传感器设置于可穿戴设备内,或所述传感器设置于所述驾驶员的身体接触的驾驶舱部位。
在一些实施例中,所述处理器在实现所述根据所述人脸信息、所述应激信息和所述身体状态信息,确定所述驾驶员当前的疲劳程度时,具体实现:
根据所述人脸信息进行perclos计算,获得第一分值;
对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值;
对所述应激信息进行量化分级处理,获得所述应激信息对应的第三分值;
对所述第一分值、所述第二分值和所述第三分值进行加权求和计算,获得综合分值;
根据预设的疲劳程度与分值的映射关系,确定所述综合分值对应的疲劳程度。
在一些实施例中,所述身体状态信息包括心率信息,所述处理器在实现所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值时,具体实现:
根据预设的心率与量化级别的映射关系,确定所述心率信息对应的心率级别;
根据预设的心率级别与分值的映射关系,将所述心率级别对应的分值确定为所述第二分值。
在一些实施例中,所述身体状态信息包括IMU信息,所述处理器在实现所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值时,具体实现:
根据所述IMU信息,获得三个方向上对应的第一加速度、第二加速度和第三加速度;
计算所述第一加速度、所述第二加速度和所述第三加速度的绝对值之和,获得加速度和值;
根据预设的加速度与量化级别的映射关系,确定所述加速度和值对应的IMU级别;
根据预设的IMU级别与分值的映射关系,将所述IMU级别对应的分值确定为所述第二分值。
在一些实施例中,所述处理器在实现所述根据所述IMU信息,获得三个方向上对应的第一加速度、第二加速度和第三加速度之后,还实现:
对所述第一加速度、所述第二加速度和所述第三加速度进行校正处理,获得第一校正加速度、第二校正加速度和第三校正加速度;
所述处理器在实现所述计算所述第一加速度、所述第二加速度和所述第三加速度的绝对值之和,获得加速度和值时,具体实现:
将所述第一校正加速度、所述第二校正加速度和所述第三校正加速度的绝对值之和,确定为所述加速度和值。
在一些实施例中,所述身体状态信息包括温度信息,所述处理器在实现所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值时,具体实现:
根据预设的温度与量化级别的映射关系,确定所述温度信息对应的温度级别;
根据预设的温度级别与分值的映射关系,将所述温度级别对应的分值确定为所述第二分值。
在一些实施例中,所述应激信息包括应激反应时间和应激反应强度,所述处理器在实现所述对所述应激信息进行量化分级处理,获得所述应激信息对应的第三分值时,具体实现:
根据预设的反应时间与量化级别的映射关系,确定所述应激反应时间对应的应激级别;
根据预设的反应强度与量化系数的映射关系,确定所述应激反应强度对应的应激系数;
根据所述应激级别和所述应激系数,确定所述第三分值。
在一些实施例中,所述处理器在实现所述确定所述驾驶员当前的疲劳程度之后,还实现:
根据所述疲劳程度,输出对应的报警信息,其中,不同的疲劳程度对应不同的报警信息。
在一些实施例中,所述处理器在实现所述输出对应的报警信息时,具体实现:
发送报警指令至可穿戴设备,以供所述可穿戴设备根据所述报警指令输出所述报警信息。
在一些实施例中,所述报警信息包括震动提示信息、语音提示信息中至少一种。
在一些实施例中,不同的疲劳程度对应所述震动提示信息和/或所述语音提示信息的不同提示参数,所述提示参数包括震动幅度、震动时长、语音音量。
本申请的实施例中还提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序中包括程序指令,所述处理器执行所述程序指令,实现本申请实施例提供的疲劳驾驶监控方法的步骤。
其中,所述计算机可读存储介质可以是前述实施例所述的交通设备或疲劳驾驶监控装置的内部存储单元,例如所述交通设备或疲劳驾驶监控装置的硬盘或内存。所述计算机可读存储介质也可以是所述交通设备或疲劳驾驶监控装置的外部存储设备,例如所述交通设备或疲劳驾驶监控装置上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到各种等效的修改或替换,这些修改或替换都应涵盖在本申请的保护范围之内。 因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (38)

  1. 一种疲劳驾驶监控方法,其特征在于,包括:
    获取驾驶员当前的人脸信息;
    控制电子设备输出刺激所述驾驶员的刺激信号,并获取所述驾驶员基于所述刺激信号产生应激反应的应激信息;
    根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度。
  2. 根据权利要求1所述的方法,其特征在于,所述应激信息包括应激反应时间、应激反应强度中至少一种。
  3. 根据权利要求1所述的方法,其特征在于,所述控制电子设备输出刺激所述驾驶员的刺激信号,包括:
    发送控制信号至所述电子设备,以供所述电子设备在接收到所述控制信号时,输出所述刺激信号。
  4. 根据权利要求3所述的方法,其特征在于,所述发送控制信号至所述电子设备,包括:
    按照预设的周期时间,周期性发送所述控制信号至所述电子设备。
  5. 根据权利要求1所述的方法,其特征在于,所述根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度之前,包括:
    获取所述驾驶员的身体状态信息;
    所述根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度,包括:
    根据所述人脸信息、所述应激信息和所述身体状态信息,确定所述驾驶员当前的疲劳程度。
  6. 根据权利要求5所述的方法,其特征在于,所述身体状态信息包括心率信息、IMU信息、温度信息中至少一种。
  7. 根据权利要求5所述的方法,其特征在于,所述获取所述驾驶员的身体状态信息,包括:
    接收传感器采集的所述身体状态信息,其中,所述传感器包括脉搏传感器、 IMU传感器、温度传感器中至少一种。
  8. 根据权利要求7所述的方法,其特征在于,所述传感器设置于可穿戴设备内,或所述传感器设置于所述驾驶员的身体接触的驾驶舱部位。
  9. 根据权利要求5所述的方法,其特征在于,所述根据所述人脸信息、所述应激信息和所述身体状态信息,确定所述驾驶员当前的疲劳程度,包括:
    根据所述人脸信息进行perclos计算,获得第一分值;
    对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值;
    对所述应激信息进行量化分级处理,获得所述应激信息对应的第三分值;
    对所述第一分值、所述第二分值和所述第三分值进行加权求和计算,获得综合分值;
    根据预设的疲劳程度与分值的映射关系,确定所述综合分值对应的疲劳程度。
  10. 根据权利要求9所述的方法,其特征在于,所述身体状态信息包括心率信息,所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值,包括:
    根据预设的心率与量化级别的映射关系,确定所述心率信息对应的心率级别;
    根据预设的心率级别与分值的映射关系,将所述心率级别对应的分值确定为所述第二分值。
  11. 根据权利要求9所述的方法,其特征在于,所述身体状态信息包括IMU信息,所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值,包括:
    根据所述IMU信息,获得三个方向上对应的第一加速度、第二加速度和第三加速度;
    计算所述第一加速度、所述第二加速度和所述第三加速度的绝对值之和,获得加速度和值;
    根据预设的加速度与量化级别的映射关系,确定所述加速度和值对应的IMU级别;
    根据预设的IMU级别与分值的映射关系,将所述IMU级别对应的分值确定为所述第二分值。
  12. 根据权利要求11所述的方法,其特征在于,所述根据所述IMU信息,获得三个方向上对应的第一加速度、第二加速度和第三加速度之后,包括:
    对所述第一加速度、所述第二加速度和所述第三加速度进行校正处理,获得第一校正加速度、第二校正加速度和第三校正加速度;
    所述计算所述第一加速度、所述第二加速度和所述第三加速度的绝对值之和,获得加速度和值,包括:
    将所述第一校正加速度、所述第二校正加速度和所述第三校正加速度的绝对值之和,确定为所述加速度和值。
  13. 根据权利要求9所述的方法,其特征在于,所述身体状态信息包括温度信息,所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值,包括:
    根据预设的温度与量化级别的映射关系,确定所述温度信息对应的温度级别;
    根据预设的温度级别与分值的映射关系,将所述温度级别对应的分值确定为所述第二分值。
  14. 根据权利要求9所述的方法,其特征在于,所述应激信息包括应激反应时间和应激反应强度,所述对所述应激信息进行量化分级处理,获得所述应激信息对应的第三分值,包括:
    根据预设的反应时间与量化级别的映射关系,确定所述应激反应时间对应的应激级别;
    根据预设的反应强度与量化系数的映射关系,确定所述应激反应强度对应的应激系数;
    根据所述应激级别和所述应激系数,确定所述第三分值。
  15. 根据权利要求1至14任一项所述的方法,其特征在于,所述确定所述驾驶员当前的疲劳程度之后,包括:
    根据所述疲劳程度,输出对应的报警信息,其中,不同的疲劳程度对应不同的报警信息。
  16. 根据权利要求15所述的方法,其特征在于,所述输出对应的报警信息,包括:
    发送报警指令至可穿戴设备,以供所述可穿戴设备根据所述报警指令输出所述报警信息。
  17. 根据权利要求15所述的方法,其特征在于,所述报警信息包括震动提示信息、语音提示信息中至少一种。
  18. 根据权利要求17所述的方法,其特征在于,不同的疲劳程度对应所述震动提示信息和/或所述语音提示信息的不同提示参数,所述提示参数包括震动幅度、震动时长、语音音量。
  19. 一种疲劳驾驶监控装置,其特征在于,所述疲劳驾驶监控装置包括存储器和处理器;
    所述存储器用于存储计算机程序;
    所述处理器,用于执行所述计算机程序并在执行所述计算机程序时,实现如下步骤:
    获取驾驶员当前的人脸信息;
    控制电子设备输出刺激所述驾驶员的刺激信号,并获取所述驾驶员基于所述刺激信号产生应激反应的应激信息;
    根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度。
  20. 根据权利要求19所述的装置,其特征在于,所述应激信息包括应激反应时间、应激反应强度中至少一种。
  21. 根据权利要求19所述的装置,其特征在于,所述处理器在实现所述控制电子设备输出刺激所述驾驶员的刺激信号时,具体实现:
    发送控制信号至所述电子设备,以供所述电子设备在接收到所述控制信号时,输出所述刺激信号。
  22. 根据权利要求21所述的装置,其特征在于,所述处理器在实现所述发送控制信号至所述电子设备时,具体实现:
    按照预设的周期时间,周期性发送所述控制信号至所述电子设备。
  23. 根据权利要求19所述的装置,其特征在于,所述处理器在实现所述根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度之前,还实 现:
    获取所述驾驶员的身体状态信息;
    所述处理器在实现所述根据所述人脸信息和所述应激信息,确定所述驾驶员当前的疲劳程度时,具体实现:
    根据所述人脸信息、所述应激信息和所述身体状态信息,确定所述驾驶员当前的疲劳程度。
  24. 根据权利要求23所述的装置,其特征在于,所述身体状态信息包括心率信息、IMU信息、温度信息中至少一种。
  25. 根据权利要求23所述的装置,其特征在于,所述处理器在实现所述获取所述驾驶员的身体状态信息时,具体实现:
    接收传感器采集的所述身体状态信息,其中,所述传感器包括脉搏传感器、IMU传感器、温度传感器中至少一种。
  26. 根据权利要求25所述的装置,其特征在于,所述传感器设置于可穿戴设备内,或所述传感器设置于所述驾驶员的身体接触的驾驶舱部位。
  27. 根据权利要求23所述的装置,其特征在于,所述处理器在实现所述根据所述人脸信息、所述应激信息和所述身体状态信息,确定所述驾驶员当前的疲劳程度时,具体实现:
    根据所述人脸信息进行perclos计算,获得第一分值;
    对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值;
    对所述应激信息进行量化分级处理,获得所述应激信息对应的第三分值;
    对所述第一分值、所述第二分值和所述第三分值进行加权求和计算,获得综合分值;
    根据预设的疲劳程度与分值的映射关系,确定所述综合分值对应的疲劳程度。
  28. 根据权利要求27所述的装置,其特征在于,所述身体状态信息包括心率信息,所述处理器在实现所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值时,具体实现:
    根据预设的心率与量化级别的映射关系,确定所述心率信息对应的心率级 别;
    根据预设的心率级别与分值的映射关系,将所述心率级别对应的分值确定为所述第二分值。
  29. 根据权利要求27所述的装置,其特征在于,所述身体状态信息包括IMU信息,所述处理器在实现所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值时,具体实现:
    根据所述IMU信息,获得三个方向上对应的第一加速度、第二加速度和第三加速度;
    计算所述第一加速度、所述第二加速度和所述第三加速度的绝对值之和,获得加速度和值;
    根据预设的加速度与量化级别的映射关系,确定所述加速度和值对应的IMU级别;
    根据预设的IMU级别与分值的映射关系,将所述IMU级别对应的分值确定为所述第二分值。
  30. 根据权利要求29所述的装置,其特征在于,所述处理器在实现所述根据所述IMU信息,获得三个方向上对应的第一加速度、第二加速度和第三加速度之后,还实现:
    对所述第一加速度、所述第二加速度和所述第三加速度进行校正处理,获得第一校正加速度、第二校正加速度和第三校正加速度;
    所述处理器在实现所述计算所述第一加速度、所述第二加速度和所述第三加速度的绝对值之和,获得加速度和值时,具体实现:
    将所述第一校正加速度、所述第二校正加速度和所述第三校正加速度的绝对值之和,确定为所述加速度和值。
  31. 根据权利要求27所述的装置,其特征在于,所述身体状态信息包括温度信息,所述处理器在实现所述对所述身体状态信息进行量化分级处理,获得所述身体状态信息对应的第二分值时,具体实现:
    根据预设的温度与量化级别的映射关系,确定所述温度信息对应的温度级别;
    根据预设的温度级别与分值的映射关系,将所述温度级别对应的分值确定 为所述第二分值。
  32. 根据权利要求27所述的装置,其特征在于,所述应激信息包括应激反应时间和应激反应强度,所述处理器在实现所述对所述应激信息进行量化分级处理,获得所述应激信息对应的第三分值时,具体实现:
    根据预设的反应时间与量化级别的映射关系,确定所述应激反应时间对应的应激级别;
    根据预设的反应强度与量化系数的映射关系,确定所述应激反应强度对应的应激系数;
    根据所述应激级别和所述应激系数,确定所述第三分值。
  33. 根据权利要求19至32任一项所述的装置,其特征在于,所述处理器在实现所述确定所述驾驶员当前的疲劳程度之后,还实现:
    根据所述疲劳程度,输出对应的报警信息,其中,不同的疲劳程度对应不同的报警信息。
  34. 根据权利要求33所述的装置,其特征在于,所述处理器在实现所述输出对应的报警信息时,具体实现:
    发送报警指令至可穿戴设备,以供所述可穿戴设备根据所述报警指令输出所述报警信息。
  35. 根据权利要求33所述的装置,其特征在于,所述报警信息包括震动提示信息、语音提示信息中至少一种。
  36. 根据权利要求35所述的装置,其特征在于,不同的疲劳程度对应所述震动提示信息和/或所述语音提示信息的不同提示参数,所述提示参数包括震动幅度、震动时长、语音音量。
  37. 一种交通设备,其特征在于,所述交通设备包括如权利要求19至36中任一项所述的疲劳驾驶监控装置。
  38. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如权利要求1至18中任一项所述的疲劳驾驶监控方法。
PCT/CN2021/090319 2021-04-27 2021-04-27 疲劳驾驶监控方法、装置、交通设备及存储介质 WO2022226799A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/090319 WO2022226799A1 (zh) 2021-04-27 2021-04-27 疲劳驾驶监控方法、装置、交通设备及存储介质
CN202180087989.2A CN116724339A (zh) 2021-04-27 2021-04-27 疲劳驾驶监控方法、装置、交通设备及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/090319 WO2022226799A1 (zh) 2021-04-27 2021-04-27 疲劳驾驶监控方法、装置、交通设备及存储介质

Publications (1)

Publication Number Publication Date
WO2022226799A1 true WO2022226799A1 (zh) 2022-11-03

Family

ID=83847691

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/090319 WO2022226799A1 (zh) 2021-04-27 2021-04-27 疲劳驾驶监控方法、装置、交通设备及存储介质

Country Status (2)

Country Link
CN (1) CN116724339A (zh)
WO (1) WO2022226799A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190126820A1 (en) * 2017-10-26 2019-05-02 Lattice Energy Technology Corporation Fatigue alarm apparatus and method
CN109835331A (zh) * 2017-11-28 2019-06-04 现代自动车株式会社 用于调整车辆的驾驶控制权限的方法及系统
CN111231969A (zh) * 2020-02-14 2020-06-05 开沃新能源汽车集团有限公司 一种汽车行驶状态检测方法
WO2020152678A1 (en) * 2019-01-22 2020-07-30 Adam Cogtech Ltd. Detection of cognitive state of a driver
CN112258790A (zh) * 2020-10-26 2021-01-22 上海工程技术大学 轨道交通列车驾驶员岗前疲劳等级及身体状态检测系统


Also Published As

Publication number Publication date
CN116724339A (zh) 2023-09-08

Similar Documents

Publication Publication Date Title
EP3809227B1 (en) Driving assistance apparatus and driving assistance method
US20190161091A1 (en) Vehicle and method for supporting driving safety thereof
JP4867215B2 (ja) 生理・心理状態判定装置、生理・心理状態判定方法、リファレンスデータ生成装置、及びリファレンスデータ生成方法。
EP3889740A1 (en) Affective-cognitive load based digital assistant
WO2019208450A1 (ja) 運転支援装置、運転支援方法及びプログラム
JP2015033457A (ja) 運転状態推定装置及び運転状態推定方法
CN113439049A (zh) 交通工具晕动症推测装置、交通工具晕动症抑制装置以及交通工具晕动症推测方法
KR101988426B1 (ko) 졸음방지용 손목형 전기충격기
WO2022226799A1 (zh) 疲劳驾驶监控方法、装置、交通设备及存储介质
JP2020103462A (ja) 感情推定装置、環境提供システム、車両、感情推定方法、および情報処理プログラム
JP6379656B2 (ja) 眠気検知装置、眠気検知方法および眠気検知プログラム
CN108242131A (zh) 一种脊椎智能保护方法及装置
JP5340660B2 (ja) 車両用乗員覚醒装置
CN206421551U (zh) 一种预防疲劳驾驶的安全辅助系统
JP2021053240A (ja) 眠気又は覚醒レベル評価装置及び評価プログラム
US11383640B2 (en) Techniques for automatically reducing annoyance levels of drivers when using driver monitoring systems
US9820687B2 (en) Method for determining drowsiness
CN114435373A (zh) 疲劳驾驶检测方法、装置、计算机设备和存储介质
JP5751162B2 (ja) 呼吸検出装置
JP2018192128A (ja) 眠気判定装置及びプログラム
JP2017134688A (ja) 振動発生装置、振動パターン設定方法、および振動パターン設定プログラム
JP2020170471A (ja) 運転支援装置
JP2015161542A (ja) 報知装置及び制御方法
Kono et al. Suppression of Vestibulo-Ocular Reflex with Increased Mental Workload While Driving
JPH08196636A (ja) 居眠り警告装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21938286

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180087989.2

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21938286

Country of ref document: EP

Kind code of ref document: A1