WO2019095733A1 - Method and apparatus for issuing an alarm signal - Google Patents

Method and apparatus for issuing an alarm signal

Info

Publication number
WO2019095733A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
fatigue
value
attention
type
Prior art date
Application number
PCT/CN2018/099158
Other languages
English (en)
French (fr)
Inventor
李宏言
拓冰
商兴奇
蔡浚宇
贾巍
李时聪
李鹏
贺诚
Original Assignee
百度在线网络技术(北京)有限公司
Priority date
Filing date
Publication date
Application filed by 百度在线网络技术(北京)有限公司
Publication of WO2019095733A1

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/06 - Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 - Determining posture transitions
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 - Determining activity level

Definitions

  • Embodiments of the present application relate to the field of computer technology, specifically to the technical field of vehicle control, and in particular to a method and apparatus for issuing an alarm signal.
  • Cars have expanded the range of people's travel, brought convenience to people's travel, and improved people's quality of life.
  • With the development and progress of science and technology, unmanned vehicles controlled by intelligent systems can obtain more driving information than manned vehicles and offer higher safety, which has become an important trend in the future development of automobiles.
  • While a vehicle is running, a driver who drives while fatigued is prone to traffic accidents.
  • To reduce the occurrence of traffic accidents, the prior art determines whether the driver is fatigued by monitoring the driver's eye movements and the like.
  • the purpose of the embodiments of the present application is to provide a method and apparatus for issuing an alarm signal.
  • In a first aspect, an embodiment of the present application provides a method for issuing an alarm signal, the method comprising: acquiring driving state data in real time, where the driving state data includes driver state data, driving environment data, and vehicle state data, the driver state data includes hand motion data, mouth motion data, head motion data, and eye motion data, the driving environment data includes lane line data and vehicle distance data, and the vehicle state data includes speed data and direction data;
  • the driving state data is imported into the pre-trained fatigue recognition model to obtain the fatigue type and the fatigue value of the corresponding fatigue type.
  • the fatigue recognition model is used to determine the fatigue type and the fatigue value by the driving state data, and the fatigue value is used to characterize the degree of the corresponding fatigue type.
  • the fatigue type and the fatigue value are imported into the pre-trained attention value calculation model to obtain the attention value, where the attention value calculation model is used to calculate the attention value from the fatigue type and the fatigue value, and the attention value is used to characterize the driver's degree of fatigue;
  • the attention threshold range in which the attention value is located is determined, and an alarm signal is issued according to the attention threshold range, where the attention threshold range is determined by preset attention thresholds.
  • In some embodiments, the method includes the step of constructing a fatigue recognition model, the step comprising: extracting temporally synchronized driver state data, driving environment data, and vehicle state data from a driver state data set, a driving environment data set, and a vehicle state data set, respectively; performing feature extraction on the driver state data, the driving environment data, and the vehicle state data, respectively, to obtain driver state feature data, driving environment feature data, and vehicle state feature data;
  • determining a fatigue type according to the driver state feature data, where the fatigue type includes at least one of the following: a closed-eye type, a yawn type, a line-of-sight offset type, and a phone-call type; and calculating a fatigue value from the driver state feature data, the driving environment feature data, and the vehicle state feature data;
  • using a machine learning method, taking the driver state feature data, the driving environment feature data, and the vehicle state feature data as inputs and the fatigue type and the fatigue value of the corresponding fatigue type as outputs, and training to obtain the fatigue recognition model.
  • In some embodiments, calculating the fatigue value from the driver state feature data, the driving environment feature data, and the vehicle state feature data includes: acquiring a time threshold range, a distance threshold range, and a data change amount threshold range, where the time threshold range includes a plurality of time threshold sub-ranges formed by preset time thresholds, the distance threshold range includes a plurality of distance threshold sub-ranges formed by preset distance thresholds, and the data change amount threshold range includes a plurality of data change amount threshold sub-ranges formed by preset data change amount thresholds, each time threshold sub-range corresponding to a time weight value, each distance threshold sub-range corresponding to a distance weight value, and each data change amount threshold sub-range corresponding to a data change weight value;
  • measuring, respectively, the fatigue type duration of each fatigue type corresponding to the driver state feature data, the distance change value corresponding to the driving environment feature data, and the data change value corresponding to the vehicle state feature data, where the fatigue type duration is characterized by the duration of the action corresponding to the specified fatigue type in the driver state feature data, the distance change value is characterized by the amount of change of the driving environment feature data per unit time, and the data change value is characterized by the amount of change of the vehicle state feature data per unit time;
  • determining the time weight value of the time threshold sub-range, the distance weight value of the distance threshold sub-range, and the data change weight value of the data change amount threshold sub-range that correspond, respectively, to the fatigue type duration, the distance change value, and the data change value; and
  • performing a weighted summation of the time weight value, the distance weight value, and the data change weight value to obtain the fatigue value of the corresponding fatigue type.
  • In some embodiments, the method includes the step of constructing an attention value calculation model, the step comprising: dividing the fatigue values in a fatigue value set into fatigue value subsets corresponding to the fatigue types; setting a type weight value for each fatigue type, and taking the product of the type weight value and each fatigue value in the fatigue value subset corresponding to that type weight value as the attention component of the corresponding fatigue type when it takes that fatigue value; and, using a machine learning method, taking the fatigue type and the fatigue value as inputs and the attention component as the output, training to obtain the attention value calculation model.
  • In some embodiments, determining the attention threshold range in which the attention value is located and issuing the alarm signal according to the attention threshold range includes: acquiring the attention threshold range, where the attention threshold range includes a plurality of attention threshold sub-ranges formed by preset attention thresholds, each attention threshold sub-range corresponds to an alarm signal, and the alarm signal includes an audible alarm signal and an image alarm signal; and, in response to the attention threshold sub-range to which the attention value corresponds, issuing the alarm signal corresponding to that attention threshold sub-range.
  • In some embodiments, determining the attention threshold range in which the attention value is located and issuing the alarm signal according to the attention threshold range further includes: starting the automatic driving mode in response to the attention value not being within the attention threshold range.
  • In a second aspect, an embodiment of the present application provides an apparatus for issuing an alarm signal, the apparatus including: a driving state data acquiring unit, configured to acquire driving state data in real time, where the driving state data includes driver state data, driving environment data, and vehicle state data, the driver state data includes hand motion data, mouth motion data, head motion data, and eye motion data, the driving environment data includes lane line data and vehicle distance data, and the vehicle state data includes speed data and direction data;
  • a fatigue value acquiring unit, configured to import the driving state data into a pre-trained fatigue recognition model to obtain a fatigue type and a fatigue value of the corresponding fatigue type, where the fatigue recognition model is used to determine the fatigue type and the fatigue value from the driving state data, and the fatigue value is used to characterize the degree of the corresponding fatigue type;
  • an attention value acquiring unit, configured to import the fatigue type and the fatigue value into a pre-trained attention value calculation model to obtain an attention value, where the attention value calculation model is used to calculate the attention value from the fatigue type and the fatigue value, and the attention value is used to characterize the driver's degree of fatigue;
  • an alarm unit, configured to determine the attention threshold range in which the attention value is located and issue an alarm signal according to the attention threshold range, where the attention threshold range is determined by preset attention thresholds.
  • In some embodiments, the apparatus includes a fatigue recognition model construction unit for constructing the fatigue recognition model, and the fatigue recognition model construction unit includes: a data extraction subunit, configured to extract temporally synchronized driver state data, driving environment data, and vehicle state data from a driver state data set, a driving environment data set, and a vehicle state data set, respectively; a feature extraction subunit, configured to perform feature extraction on the driver state data, the driving environment data, and the vehicle state data, respectively, to obtain driver state feature data, driving environment feature data, and vehicle state feature data; a fatigue type determining subunit, configured to determine a fatigue type according to the driver state feature data, where the fatigue type includes at least one of the following: a closed-eye type, a yawn type, a line-of-sight offset type, and a phone-call type; a fatigue value calculation subunit, configured to calculate a fatigue value from the driver state feature data, the driving environment feature data, and the vehicle state feature data; and a fatigue recognition model construction subunit, configured to use a machine learning method, taking the driver state feature data, the driving environment feature data, and the vehicle state feature data as inputs and the fatigue type and the fatigue value of the corresponding fatigue type as outputs, to train and obtain the fatigue recognition model.
  • In some embodiments, the fatigue value calculation subunit includes: a threshold range acquisition module, configured to acquire a time threshold range, a distance threshold range, and a data change amount threshold range, where the time threshold range includes a plurality of time threshold sub-ranges formed by preset time thresholds, the distance threshold range includes a plurality of distance threshold sub-ranges formed by preset distance thresholds, and the data change amount threshold range includes a plurality of data change amount threshold sub-ranges formed by preset data change amount thresholds, each time threshold sub-range corresponding to a time weight value, each distance threshold sub-range corresponding to a distance weight value, and each data change amount threshold sub-range corresponding to a data change weight value; a data measurement module, configured to measure, respectively, the fatigue type duration of each fatigue type corresponding to the driver state feature data, the distance change value corresponding to the driving environment feature data, and the data change value corresponding to the vehicle state feature data, where the fatigue type duration is characterized by the duration of the action corresponding to the specified fatigue type in the driver state feature data, the distance change value is characterized by the amount of change of the driving environment feature data per unit time, and the data change value is characterized by the amount of change of the vehicle state feature data per unit time; a weight value determining module, configured to determine the time weight value, the distance weight value, and the data change weight value of the sub-ranges corresponding, respectively, to the fatigue type duration, the distance change value, and the data change value; and a fatigue value calculation module, configured to perform a weighted summation of the time weight value, the distance weight value, and the data change weight value to obtain the fatigue value of the corresponding fatigue type.
  • In some embodiments, the apparatus includes an attention value calculation model construction unit for constructing the attention value calculation model, and the attention value calculation model construction unit includes: a data division subunit, configured to divide the fatigue values in a fatigue value set into fatigue value subsets corresponding to the fatigue types; an attention component calculation subunit, configured to set a type weight value for each fatigue type and to take the product of the type weight value and each fatigue value in the fatigue value subset corresponding to that type weight value as the attention component of the corresponding fatigue type when it takes that fatigue value; and an attention value calculation model construction subunit, configured to use a machine learning method, taking the fatigue type and the fatigue value as inputs and the attention component as the output, to train and obtain the attention value calculation model.
  • In some embodiments, the alarm unit includes: an attention threshold range acquisition subunit, configured to acquire the attention threshold range, where the attention threshold range includes a plurality of attention threshold sub-ranges formed by preset attention thresholds, each attention threshold sub-range corresponds to an alarm signal, and the alarm signal includes an audible alarm signal and an image alarm signal; and an alarm subunit, configured to issue, in response to the attention threshold sub-range to which the attention value corresponds, the alarm signal corresponding to that attention threshold sub-range.
  • In some embodiments, the alarm unit is further configured to start the automatic driving mode in response to the attention value not being within the attention threshold range.
  • In a third aspect, an embodiment of the present application provides a server, including: one or more processors; and a memory configured to store one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the method for issuing an alarm signal of the first aspect described above.
  • In a fourth aspect, an embodiment of the present application provides a computer readable storage medium on which a computer program is stored, where the program, when executed by a processor, implements the method for issuing an alarm signal of the first aspect.
  • The method and apparatus for issuing an alarm signal provided by the embodiments of the present application acquire driving state data in real time, where the driving state data includes driver state data, driving environment data, and vehicle state data, so that the driver, the driving environment, and the vehicle are considered together; the driving state data is then imported into the fatigue recognition model to obtain a fatigue type and a fatigue value of the corresponding fatigue type, the fatigue type and the fatigue value are imported into the pre-trained attention value calculation model to obtain an attention value, and finally a corresponding alarm signal is issued according to the attention threshold range in which the attention value is located, which improves the accuracy of recognizing the driver's degree of fatigue.
  • FIG. 1 is an exemplary system architecture diagram to which the present application can be applied;
  • FIG. 2 is a flow diagram of one embodiment of a method for issuing an alarm signal in accordance with the present application;
  • FIG. 3 is a schematic diagram of an application scenario of a method for issuing an alarm signal according to the present application;
  • FIG. 4 is a schematic structural diagram of an embodiment of an apparatus for issuing an alarm signal according to the present application;
  • FIG. 5 is a schematic structural diagram of a computer system suitable for implementing a server of an embodiment of the present application.
  • FIG. 1 illustrates an exemplary system architecture 100 to which an embodiment of the method or apparatus for issuing an alarm signal of the present application may be applied.
  • the system architecture 100 can include a first camera 101, a second camera 102, an in-vehicle terminal 103, a network 104, and a server 105.
  • the network 104 is used to provide a medium for communication links between the in-vehicle terminal 103 and the server 105.
  • Network 104 may include various types of connections, such as wired, wireless communication links, fiber optic cables, and the like.
  • The first camera 101 may be disposed outside the vehicle to collect images while the current vehicle is running, from which the distance between the current vehicle and other vehicles or pedestrians on the road and the distance between the current vehicle and the lane line are calculated; the second camera 102 may be disposed inside the vehicle to monitor various actions of the driver, including hand movements, mouth movements, head movements, and eye movements, and to convert these actions into corresponding data; the in-vehicle terminal 103 can receive data from the first camera 101 and the second camera 102 as well as current state data of the vehicle (for example, speed data and direction data) and process the received data, so that a corresponding alarm signal is issued when the driver is driving while fatigued.
  • the server 105 may be a server that provides various services, such as a server that processes data sent from the in-vehicle terminal 103.
  • The server 105 may perform data processing on the data sent from the in-vehicle terminal 103 and transmit the obtained data, such as the attention value, back to the in-vehicle terminal 103.
  • the method for issuing an alarm signal provided by the embodiment of the present application is generally performed by the vehicle-mounted terminal 103. Accordingly, the device for issuing an alarm signal is generally disposed in the vehicle-mounted terminal 103.
  • It should be understood that the numbers of first cameras 101, second cameras 102, in-vehicle terminals 103, networks 104, and servers 105 in FIG. 1 are merely illustrative. Any number of first cameras 101, second cameras 102, in-vehicle terminals 103, networks 104, and servers 105 may be provided as needed for the implementation.
  • the method for issuing an alert signal includes the following steps:
  • Step 201: the driving state data is acquired in real time.
  • the electronic device (for example, the in-vehicle terminal 103 shown in FIG. 1) on which the method for issuing an alarm signal is executed can receive the driving state data through a wired connection or a wireless connection.
  • the driving state data may include driver state data, driving environment data, and vehicle state data
  • the driver state data may include hand motion data, mouth motion data, head motion data, and eye motion data
  • the driving environment data may include lane line data and vehicle distance data
  • the vehicle status data may include speed data and direction data.
  • The wireless connection manner may include, but is not limited to, a 3G/4G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a Zigbee connection, a UWB (ultra wideband) connection, and other wireless connection methods that are now known or developed in the future.
  • the driver status data may be collected by the second camera 102 disposed in the vehicle; the driving environment data may be collected by the first camera 101 disposed outside the vehicle; the vehicle status data may be directly acquired from the vehicle.
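
As a rough illustration of the three categories of driving state data described above, the sketch below groups them into simple Python data classes. The field names and units are illustrative assumptions, not names taken from the embodiment.

```python
from dataclasses import dataclass

@dataclass
class DriverStateData:
    """Collected by the in-cabin (second) camera 102."""
    hand_motion: list    # e.g. per-frame hand action labels
    mouth_motion: list   # e.g. mouth-opening degree per frame
    head_motion: list    # e.g. head pose angles per frame
    eye_motion: list     # e.g. eye-openness values per frame

@dataclass
class DrivingEnvironmentData:
    """Collected by the exterior (first) camera 101."""
    lane_line_offset_m: float   # lateral offset from the lane line
    vehicle_distance_m: float   # distance to the vehicle ahead

@dataclass
class VehicleStateData:
    """Read directly from the vehicle."""
    speed_kmh: float
    heading_deg: float          # direction data

@dataclass
class DrivingStateData:
    timestamp_s: float
    driver: DriverStateData
    environment: DrivingEnvironmentData
    vehicle: VehicleStateData
```
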
  • Step 202: import the driving state data into a pre-trained fatigue recognition model to obtain a fatigue type and a fatigue value of the corresponding fatigue type.
  • the driving state data can be imported into the fatigue recognition model to obtain the fatigue type (which can be expressed in the form of information) and the fatigue value of the corresponding fatigue type.
  • the fatigue recognition model is used to determine the fatigue type and the fatigue value by the driving state data, and the fatigue value is used to characterize the degree of the corresponding fatigue type.
  • Fatigue values can be expressed in a variety of forms. For example, the fatigue value can range from 0 to 1. When the fatigue value is 0, the driver is considered to be not fatigued; when the fatigue value is 1, the driver is considered to be in severe fatigue. The fatigue value can also be expressed in other forms such as 0 to 100, and will not be repeated here.
  • the method of the embodiment may include the step of constructing a fatigue recognition model, and the step of constructing the fatigue recognition model may include the following steps:
  • temporally synchronized driver state data, driving environment data, and vehicle state data are extracted from the driver state data set, the driving environment data set, and the vehicle state data set, respectively.
  • When the driver is driving while fatigued, not only may the driver himself exhibit some fatigue characteristics, but the vehicle may also be driven abnormally because of the driver's fatigue. Therefore, when determining whether the driver is fatigued, time-synchronized driver state data, driving environment data, and vehicle state data may be extracted from the driver state data set, the driving environment data set, and the vehicle state data set, respectively. It should be noted that the driver state data, the driving environment data, and the vehicle state data must be synchronized in time in order to make an accurate judgment on whether the driver is fatigued by combining the various data, as sketched below.
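
A minimal sketch of the time alignment just described, assuming each sample carries a timestamp and that nearest-neighbour matching within a small tolerance is acceptable (both are assumptions, not requirements stated in the embodiment):

```python
def synchronize(driver_set, environment_set, vehicle_set, tolerance_s=0.05):
    """For every driver-state sample, pick the environment and vehicle samples
    closest in time, keeping only triples whose timestamps differ by less than
    `tolerance_s`. Each input is a list of (timestamp_s, data) tuples."""
    def nearest(samples, t):
        return min(samples, key=lambda s: abs(s[0] - t))

    synced = []
    for t, driver in driver_set:
        t_env, env = nearest(environment_set, t)
        t_veh, veh = nearest(vehicle_set, t)
        if abs(t_env - t) < tolerance_s and abs(t_veh - t) < tolerance_s:
            synced.append((t, driver, env, veh))
    return synced
```
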
  • feature extraction is performed on the driver state data, the driving environment data, and the vehicle state data, respectively, to obtain driver state feature data, driving environment feature data, and vehicle state feature data.
  • Driver status data, driving environment data, and vehicle status data are typically collected directly. In order to judge whether the driver is fatigued or not, it is necessary to perform feature extraction on the driver state data, the driving environment data, and the vehicle state data, respectively, to obtain driver state feature data, driving environment feature data, and vehicle state feature data. That is, the driver state feature data, the driving environment feature data, and the vehicle state feature data are data related to driver fatigue driving.
  • the driver state feature data may be a time value of the eye closure; the driving environment feature data may be a change value of the vehicle distance; the vehicle state feature data may be a change value of the speed.
  • the driver state feature data, the driving environment feature data, and the vehicle state feature data may also be other types of data, depending on actual needs.
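
The sketch below illustrates the kind of feature extraction mentioned above: eye-closure duration from the driver state data, the change in following distance from the driving environment data, and the change in speed from the vehicle state data. The eye-closure threshold and argument layout are assumptions made only for this illustration.

```python
def extract_features(timestamps, eye_openness, vehicle_distance_m, speed_kmh):
    """All arguments are equal-length, time-synchronized lists covering one
    short observation window."""
    # Driver state feature: total time the eyes are closed in the window.
    eye_closed_s = sum(
        t2 - t1
        for t1, t2, openness in zip(timestamps, timestamps[1:], eye_openness)
        if openness < 0.2                  # assumed "eyes closed" threshold
    )
    # Driving environment feature: change of the following distance.
    distance_change_m = vehicle_distance_m[-1] - vehicle_distance_m[0]
    # Vehicle state feature: change of the speed.
    speed_change_kmh = speed_kmh[-1] - speed_kmh[0]
    return eye_closed_s, distance_change_m, speed_change_kmh
```
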
  • the fatigue type is determined based on the driver state characteristic data described above.
  • the driver status data may include data such as driver's hand motion data, mouth motion data, head motion data, and eye motion data.
  • The corresponding motion data can be expressed by the driver state feature data; if a certain type or types of such data appear, the driver exhibits the corresponding type of fatigue.
  • the above fatigue types include at least one of the following: a closed eye type, a yawn type, a line of sight offset type, and a call type.
  • the fatigue value is calculated from the driver state feature data, the driving environment feature data, and the vehicle state feature data.
  • the fatigue value corresponding to the fatigue type can be obtained by a certain data processing method.
  • For example, a value corresponding to a certain fatigue type may be calculated from each of the driver state feature data, the driving environment feature data, and the vehicle state feature data, and these values may then be weighted to obtain the fatigue value of that fatigue type.
  • Finally, a machine learning method is used, with the driver state feature data, the driving environment feature data, and the vehicle state feature data as inputs and the fatigue type and the fatigue value of the corresponding fatigue type as outputs, to train and obtain the fatigue recognition model.
  • The fatigue recognition model may be a correspondence table that is pre-defined by a technician based on statistics over a large amount of driver state feature data, driving environment feature data, vehicle state feature data, fatigue types, and fatigue values, and that stores correspondences between the driver state feature data, the driving environment feature data, the vehicle state feature data, the fatigue type, and the fatigue value; or it may be a calculation formula that is preset by the technician based on statistics over a large amount of data and stored in the electronic device, which performs numerical calculation on the driver state feature data, the driving environment feature data, and the vehicle state feature data to obtain a result characterizing the fatigue value of the fatigue type.
  • For example, the calculation formula may be a formula that weights the driver state feature data, the driving environment feature data, and the vehicle state feature data, and the obtained result may be used to characterize the fatigue value of the corresponding fatigue type.
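
The embodiment leaves the concrete machine learning method open (a correspondence table or a formula is equally allowed), so the following is only one possible sketch, assuming scikit-learn is available, the fatigue type is treated as a classification target, and the fatigue value as a regression target:

```python
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

def train_fatigue_recognition_model(features, fatigue_types, fatigue_values):
    """`features` is an (n_samples, 3) array of [driver, environment, vehicle]
    feature values; `fatigue_types` are labels such as "closed_eye" or "yawn";
    `fatigue_values` are numbers in [0, 1]."""
    type_model = RandomForestClassifier().fit(features, fatigue_types)
    value_model = RandomForestRegressor().fit(features, fatigue_values)
    return type_model, value_model

def predict_fatigue(models, feature_row):
    """Return (fatigue_type, fatigue_value) for one feature row."""
    type_model, value_model = models
    fatigue_type = type_model.predict([feature_row])[0]
    fatigue_value = float(value_model.predict([feature_row])[0])
    return fatigue_type, fatigue_value
```
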
  • the calculating the fatigue value by using the driver state feature data, the driving environment feature data, and the vehicle state feature data may include the following steps:
  • the first step is to obtain a time threshold range, a distance threshold range, and a data change threshold range.
  • Each time threshold sub-range corresponds to a time weight value; each distance threshold sub-range corresponds to a distance weight value; each data change threshold sub-range corresponds to a data change weight value.
  • the fatigue type duration of each fatigue type corresponding to the driver state characteristic data, the distance change value corresponding to the driving environment characteristic data, and the data change value corresponding to the vehicle state characteristic data are separately measured.
  • In practice, the fatigue type duration of each fatigue type corresponding to the driver state feature data, the distance change value corresponding to the driving environment feature data, and the data change value corresponding to the vehicle state feature data may be measured separately.
  • The fatigue type duration may be characterized by the duration of the action corresponding to the specified fatigue type in the driver state feature data, the distance change value may be characterized by the amount of change of the driving environment feature data per unit time, and the data change value may be characterized by the amount of change of the vehicle state feature data per unit time.
  • In the third step, the time weight value of the time threshold sub-range corresponding to the fatigue type duration, the distance weight value of the distance threshold sub-range corresponding to the distance change value, and the data change weight value of the data change amount threshold sub-range corresponding to the data change value are determined.
  • In practice, the time weight value of the time threshold sub-range in which the fatigue type duration falls, the distance weight value of the distance threshold sub-range in which the distance change value falls, and the data change weight value of the data change amount threshold sub-range in which the data change value falls can each be obtained by data comparison.
  • In the fourth step, the time weight value, the distance weight value, and the data change weight value are weighted and summed to obtain the fatigue value of the corresponding fatigue type.
  • The time weight value, the distance weight value, and the data change weight value may differ for different fatigue types.
  • Therefore, different weighting coefficients can be set for the time weight value, the distance weight value, and the data change weight value according to the specific fatigue type; the products of the time weight value, the distance weight value, and the data change weight value with their respective coefficients are then summed to obtain the fatigue value of the corresponding fatigue type, as in the sketch below.
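
A minimal sketch of the weighted-sum calculation described in these four steps. The sub-range boundaries, the weight values, and the per-type coefficients are illustrative assumptions; the embodiment only requires that each sub-range map to a weight and that the three weights be combined by a weighted sum whose coefficients may depend on the fatigue type.

```python
def lookup_weight(value, sub_ranges):
    """`sub_ranges` is a list of (lower, upper, weight) tuples; return the
    weight of the sub-range containing `value` (0.0 if none matches)."""
    for lower, upper, weight in sub_ranges:
        if lower <= value < upper:
            return weight
    return 0.0

# Illustrative sub-ranges and weights (assumed, not taken from the embodiment).
TIME_SUB_RANGES = [(0.0, 1.0, 0.1), (1.0, 2.0, 0.5), (2.0, float("inf"), 0.9)]
DISTANCE_SUB_RANGES = [(0.0, 5.0, 0.2), (5.0, 15.0, 0.6), (15.0, float("inf"), 0.9)]
DATA_CHANGE_SUB_RANGES = [(0.0, 10.0, 0.2), (10.0, 30.0, 0.6), (30.0, float("inf"), 0.9)]

# Per-fatigue-type coefficients for the final weighted sum (also assumed).
COEFFICIENTS = {
    "closed_eye": (0.6, 0.2, 0.2),
    "yawn": (0.5, 0.3, 0.2),
    "line_of_sight_offset": (0.4, 0.4, 0.2),
    "phone_call": (0.5, 0.2, 0.3),
}

def fatigue_value(fatigue_type, duration_s, distance_change_m, data_change):
    """Weighted sum of the three looked-up weights for one fatigue type."""
    time_w = lookup_weight(duration_s, TIME_SUB_RANGES)
    dist_w = lookup_weight(abs(distance_change_m), DISTANCE_SUB_RANGES)
    data_w = lookup_weight(abs(data_change), DATA_CHANGE_SUB_RANGES)
    a, b, c = COEFFICIENTS[fatigue_type]
    return a * time_w + b * dist_w + c * data_w
```
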
  • Step 203: the fatigue type and the fatigue value are imported into the pre-trained attention value calculation model to obtain the attention value.
  • the fatigue type and fatigue value can be imported into the attention value calculation model to obtain the attention value.
  • the above attention value calculation model is used to calculate the attention value by the fatigue type and the fatigue value, and the above attention value is used to characterize the driver's fatigue level. Similar to the fatigue value, the attention value can also range from 0 to 1. When the attention value is 0, the driver can be considered to be severely fatigued; when the attention value is 1, the driver can be considered not to be fatigued.
  • the attention value can also be expressed in other forms, and will not be repeated here.
  • the method of the embodiment may include the step of constructing a attention value calculation model, and the step of constructing the attention value calculation model may include the following steps:
  • the fatigue value in the fatigue value set is divided into the fatigue value sub-set corresponding to the fatigue type according to the fatigue type.
  • the fatigue value in the fatigue value set can be divided into the fatigue value sub-set corresponding to the fatigue type according to the fatigue type.
  • a type weight value is set for each fatigue type, and the product of the type weight value and each fatigue value in the fatigue value subset corresponding to the type weight value is used as the corresponding fatigue type when the fatigue value is taken. The amount of attention.
  • a type weight value can be set for each type of fatigue in order to make a judgment on whether the driver is fatigued or not according to the characteristics of the driver.
  • In practice, the product of the type weight value of a fatigue type and each fatigue value in the fatigue value subset corresponding to that type weight value may be taken as the attention component of the corresponding fatigue type when it takes that fatigue value.
  • That is, each fatigue value has a corresponding attention component.
  • the attention component corresponding to each fatigue type can be summed to obtain the driver's attention value.
  • the driver has fatigue types A, B, and C, and the corresponding fatigue value subsets have fatigue values of 0.1, 0.2, and 0.3, respectively, and the corresponding type weight values may be 0.2, 0.2, and 0.6, respectively.
  • the fatigue type and the fatigue value are taken as inputs, the attention component is taken as the output, and the attention value calculation model is trained.
  • The attention value calculation model may be a correspondence table that is pre-defined by a technician based on statistics over a large number of fatigue types and fatigue values and that stores correspondences between the fatigue type, the fatigue value, and the attention component; or it may be a calculation formula that is preset by the technician based on statistics over a large amount of data and stored in the electronic device, which performs numerical calculation on the fatigue value to obtain a result characterizing the attention component of the fatigue type.
  • For example, the calculation formula may be a formula that weights the fatigue values, and the obtained result may be used to characterize the attention component of the corresponding fatigue type. The attention components of the respective fatigue types are then summed to obtain the attention value, as in the sketch below.
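
Using the numbers from the example above (fatigue types A, B, and C with fatigue values 0.1, 0.2, and 0.3 and type weights 0.2, 0.2, and 0.6), the sketch below computes the attention component of each fatigue type and their sum. How that raw sum is mapped onto the 0-to-1 attention scale on which 1 means fully attentive is not spelled out here, so the sketch stops at the raw sum.

```python
TYPE_WEIGHTS = {"A": 0.2, "B": 0.2, "C": 0.6}    # type weight per fatigue type
fatigue_values = {"A": 0.1, "B": 0.2, "C": 0.3}  # fatigue value per fatigue type

# Attention component = type weight x fatigue value, per fatigue type.
attention_components = {
    fatigue_type: TYPE_WEIGHTS[fatigue_type] * value
    for fatigue_type, value in fatigue_values.items()
}
# roughly {'A': 0.02, 'B': 0.04, 'C': 0.18}

raw_sum = sum(attention_components.values())      # roughly 0.24
print(attention_components, raw_sum)
```
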
  • the fatigue recognition model and the attention value calculation model described above may be trained in advance by the server 105.
  • Step 204: determine the attention threshold range in which the attention value is located, and issue an alarm signal according to the attention threshold range.
  • a plurality of attention thresholds may be preset in the vehicle-mounted terminal 103, and the attention threshold range is formed by the plurality of attention thresholds, that is, the attention threshold range is determined by the preset attention threshold. Different alarm signals can then be set for each attention threshold range, and different alarm signals can reflect the driver's current attention. When the attention value belongs to a certain attention threshold range, an alarm signal corresponding to the attention threshold range is issued.
  • the foregoing attention threshold range in which the attention value is determined, and the sending the alarm signal according to the attention threshold range may include the following steps:
  • the first step is to obtain a range of attention thresholds.
  • The attention threshold range includes a plurality of attention threshold sub-ranges formed by preset attention thresholds, each attention threshold sub-range corresponds to an alarm signal, and the alarm signal may include an audible alarm signal, an image alarm signal, and the like.
  • In the second step, the alarm signal corresponding to the attention threshold sub-range in which the attention value falls is issued.
  • For example, the attention threshold range can be constructed from the attention thresholds 0, 0.5, and 1.
  • When the attention value falls in the sub-range from 0.5 to 1, the driver's attention can be considered concentrated; the driver is not in a fatigued driving state, and the alarm signal can simply prompt the driver to pay attention to driving safety.
  • When the attention value falls in the sub-range from 0 to 0.5, the driver's attention can be considered not concentrated, and the alarm signal can be a high-volume voice and image signal prompting the driver to pay attention to driving safety.
  • In some optional implementations of this embodiment, determining the attention threshold range in which the attention value is located and issuing the alarm signal according to the attention threshold range may further include: starting the automatic driving mode when the attention value is not within the attention threshold range.
  • In practice, the in-vehicle terminal 103 can also automatically activate the automatic driving mode in response to the vehicle having an automatic driving function and the attention value not being within the attention threshold range. For example, when the attention value is 0, the driver can be considered distracted; at this time, the in-vehicle terminal 103 can directly activate the automatic driving mode of the vehicle to improve driving safety, as in the sketch below.
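
A minimal sketch of step 204 under the illustrative thresholds 0, 0.5, and 1 mentioned above, assuming that a lower attention value means a less attentive driver and that an attention value outside the configured range triggers the automatic driving mode. The alarm texts and the `start_autopilot` hook are placeholders, not names from the embodiment.

```python
# Attention threshold sub-ranges built from the thresholds 0, 0.5, and 1,
# each mapped to an alarm signal (sound and/or image).
ATTENTION_SUB_RANGES = [
    (0.5, 1.0, {"sound": "low-volume reminder",
                "image": "Please pay attention to driving safety"}),
    (0.0, 0.5, {"sound": "high-volume alarm",
                "image": "Warning: driver fatigue detected"}),
]

def issue_alarm(attention_value, start_autopilot):
    """Issue the alarm of the sub-range containing `attention_value`; if the
    value falls outside the whole attention threshold range, start autopilot."""
    for lower, upper, alarm in ATTENTION_SUB_RANGES:
        if lower < attention_value <= upper:
            print("sound:", alarm["sound"])
            print("image:", alarm["image"])
            return alarm
    # Not within the attention threshold range at all (e.g. exactly 0 here).
    start_autopilot()
    return None
```
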
  • FIG. 3 is a schematic diagram of an application scenario of a method for issuing an alert signal according to the present embodiment.
  • The in-vehicle terminal 103 acquires driving state data such as the vehicle state data of the vehicle, the driving environment data collected by the first camera 101, and the driver state data collected by the second camera 102; the driving state data is imported into the fatigue recognition model to obtain the driver's current fatigue type and the fatigue value of the corresponding fatigue type; the fatigue type and the fatigue value are then imported into the attention value calculation model to obtain the attention value; finally, the attention threshold range in which the attention value is located is determined, and the alarm signal "Please pay attention to driving safety" is issued according to that attention threshold range.
  • The method provided by the above embodiment of the present application acquires driving state data in real time, where the driving state data includes driver state data, driving environment data, and vehicle state data, so that the driver, the driving environment, and the vehicle are considered together; the driving state data is then imported into the fatigue recognition model to obtain the fatigue type and the fatigue value of the corresponding fatigue type, the fatigue type and the fatigue value are imported into the pre-trained attention value calculation model to obtain the attention value, and finally a corresponding alarm signal is issued according to the attention threshold range in which the attention value is located, which improves the accuracy of recognizing the driver's degree of fatigue.
  • As an implementation of the above method, the present application provides an embodiment of an apparatus for issuing an alarm signal, and this apparatus embodiment corresponds to the method embodiment shown in FIG. 2.
  • the device can be specifically applied to various electronic devices.
  • the apparatus 400 for issuing an alarm signal of the present embodiment may include a driving state data acquiring unit 401, a fatigue value acquiring unit 402, an attention value obtaining unit 403, and an alarm unit 404.
  • The driving state data acquiring unit 401 is configured to acquire driving state data in real time, where the driving state data includes driver state data, driving environment data, and vehicle state data, the driver state data includes hand motion data, mouth motion data, head motion data, and eye motion data, the driving environment data includes lane line data and vehicle distance data, and the vehicle state data includes speed data and direction data. The fatigue value acquiring unit 402 is configured to import the driving state data into the pre-trained fatigue recognition model to obtain the fatigue type and the fatigue value of the corresponding fatigue type, where the fatigue recognition model is used to determine the fatigue type and the fatigue value from the driving state data, and the fatigue value is used to characterize the degree of the corresponding fatigue type. The attention value acquiring unit 403 is configured to import the fatigue type and the fatigue value into the pre-trained attention value calculation model to obtain the attention value, where the attention value calculation model is used to calculate the attention value from the fatigue type and the fatigue value, and the attention value is used to characterize the driver's degree of fatigue. The alarm unit 404 is configured to determine the attention threshold range in which the attention value is located and issue an alarm signal according to the attention threshold range, where the attention threshold range is determined by preset attention thresholds.
  • In some optional implementations of this embodiment, the apparatus 400 for issuing an alarm signal may include a fatigue recognition model construction unit (not shown) for constructing the fatigue recognition model, and the fatigue recognition model construction unit may include: a data extraction subunit (not shown), a feature extraction subunit (not shown), a fatigue type determining subunit (not shown), a fatigue value calculation subunit (not shown), and a fatigue recognition model construction subunit (not shown).
  • The data extraction subunit is configured to extract temporally synchronized driver state data, driving environment data, and vehicle state data from the driver state data set, the driving environment data set, and the vehicle state data set, respectively; the feature extraction subunit is configured to perform feature extraction on the driver state data, the driving environment data, and the vehicle state data, respectively, to obtain driver state feature data, driving environment feature data, and vehicle state feature data; the fatigue type determining subunit is configured to determine the fatigue type according to the driver state feature data, where the fatigue type includes at least one of the following: a closed-eye type, a yawn type, a line-of-sight offset type, and a phone-call type; the fatigue value calculation subunit is configured to calculate the fatigue value from the driver state feature data, the driving environment feature data, and the vehicle state feature data; and the fatigue recognition model construction subunit is configured to use a machine learning method, taking the driver state feature data, the driving environment feature data, and the vehicle state feature data as inputs and the fatigue type and the fatigue value of the corresponding fatigue type as outputs, to train and obtain the fatigue recognition model.
  • In some optional implementations of this embodiment, the fatigue value calculation subunit may include: a threshold range acquisition module (not shown), a data measurement module (not shown), a weight value determining module (not shown), and a fatigue value calculation module (not shown).
  • The threshold range acquisition module is configured to acquire a time threshold range, a distance threshold range, and a data change amount threshold range, where the time threshold range includes a plurality of time threshold sub-ranges formed by preset time thresholds, the distance threshold range includes a plurality of distance threshold sub-ranges formed by preset distance thresholds, and the data change amount threshold range includes a plurality of data change amount threshold sub-ranges formed by preset data change amount thresholds, each time threshold sub-range corresponding to a time weight value, each distance threshold sub-range corresponding to a distance weight value, and each data change amount threshold sub-range corresponding to a data change weight value. The data measurement module is configured to measure, respectively, the fatigue type duration of each fatigue type corresponding to the driver state feature data, the distance change value corresponding to the driving environment feature data, and the data change value corresponding to the vehicle state feature data, where the fatigue type duration is characterized by the duration of the action corresponding to the specified fatigue type in the driver state feature data, the distance change value is characterized by the amount of change of the driving environment feature data per unit time, and the data change value is characterized by the amount of change of the vehicle state feature data per unit time. The weight value determining module is configured to determine the time weight value, the distance weight value, and the data change weight value of the sub-ranges corresponding, respectively, to the fatigue type duration, the distance change value, and the data change value. The fatigue value calculation module is configured to perform a weighted summation of the time weight value, the distance weight value, and the data change weight value to obtain the fatigue value of the corresponding fatigue type.
  • the apparatus 400 for issuing an alarm signal of the present embodiment may include a attention value calculation model construction unit (not shown) for constructing a attention value calculation model.
  • the attention value calculation model construction unit may include: a data division subunit (not shown), a attention component calculation subunit (not shown), and a attention value calculation model construction subunit (not shown in the figure) show).
  • The data division subunit is configured to divide the fatigue values in the fatigue value set into fatigue value subsets corresponding to the fatigue types; the attention component calculation subunit is configured to set a type weight value for each fatigue type and to take the product of the type weight value and each fatigue value in the fatigue value subset corresponding to that type weight value as the attention component of the corresponding fatigue type when it takes that fatigue value; and the attention value calculation model construction subunit is configured to use a machine learning method, taking the fatigue type and the fatigue value as inputs and the attention component as the output, to train and obtain the attention value calculation model.
  • the alarm unit 404 may include: an attention threshold range acquisition subunit (not shown) and an alarm subunit (not shown).
  • The attention threshold range acquisition subunit is configured to acquire the attention threshold range, where the attention threshold range includes a plurality of attention threshold sub-ranges formed by preset attention thresholds, each attention threshold sub-range corresponds to an alarm signal, and the alarm signal includes an audible alarm signal and an image alarm signal; the alarm subunit is configured to issue, in response to the attention threshold sub-range to which the attention value corresponds, the alarm signal corresponding to that attention threshold sub-range.
  • In some optional implementations of this embodiment, the alarm unit 404 may be further configured to start the automatic driving mode when the attention value is not within the attention threshold range.
  • the embodiment further provides a server comprising: one or more processors; a memory for storing one or more programs, when the one or more programs are executed by the one or more processors, One or more processors perform the method described above for issuing an alert signal.
  • the embodiment further provides a computer readable storage medium having stored thereon a computer program that, when executed by the processor, implements the above-described method for issuing an alert signal.
  • FIG. 5 there is shown a block diagram of a computer system 500 suitable for use in implementing the server of the embodiments of the present application.
  • the server shown in FIG. 5 is merely an example, and should not impose any limitation on the function and scope of use of the embodiments of the present application.
  • As shown in FIG. 5, the computer system 500 includes a central processing unit (CPU) 501, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage portion 508 into a random access memory (RAM) 503.
  • In the RAM 503, various programs and data required for the operation of the system 500 are also stored.
  • the CPU 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504.
  • An input/output (I/O) interface 505 is also coupled to bus 504.
  • The following components are connected to the I/O interface 505: an input portion 506 including a keyboard, a mouse, and the like; an output portion 507 including, for example, a cathode ray tube (CRT), a liquid crystal display (LCD), and the like; a storage portion 508 including a hard disk and the like; and a communication portion 509 including a network interface card such as a LAN card or a modem. The communication portion 509 performs communication processing via a network such as the Internet.
  • A drive 510 is also coupled to the I/O interface 505 as needed.
  • a removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory or the like is mounted on the drive 510 as needed so that a computer program read therefrom is installed into the storage portion 508 as needed.
  • an embodiment of the present disclosure includes a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for executing the method illustrated in the flowchart.
  • the computer program can be downloaded and installed from the network via the communication portion 509, and/or installed from the removable medium 511.
  • the computer readable medium described above may be a computer readable signal medium or a computer readable storage medium or any combination of the two.
  • the computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer readable storage media may include, but are not limited to, electrical connections having one or more wires, portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable Programmable read only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain or store a program, which can be used by or in connection with an instruction execution system, apparatus or device.
  • a computer readable signal medium may include a data signal that is propagated in the baseband or as part of a carrier, carrying computer readable program code. Such propagated data signals can take a variety of forms including, but not limited to, electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer readable signal medium can also be any computer readable medium other than a computer readable storage medium, which can transmit, propagate, or transport a program for use by or in connection with the instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium can be transmitted by any suitable medium, including but not limited to wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
  • Each block of the flowcharts or block diagrams can represent a module, a program segment, or a portion of code that contains one or more executable instructions for implementing the specified logical functions.
  • It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that shown in the drawings. For example, two successively represented blocks may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts can be implemented in a dedicated hardware-based system that performs the specified function or operation. Or it can be implemented by a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments of the present application may be implemented by software or by hardware.
  • The described units may also be provided in a processor, which may, for example, be described as a processor including a driving state data acquiring unit, a fatigue value acquiring unit, an attention value acquiring unit, and an alarm unit.
  • The names of these units do not constitute a limitation on the units themselves under certain circumstances; for example, the alarm unit may also be described as "a unit for issuing an alarm signal".
  • the present application also provides a computer readable medium, which may be included in the apparatus described in the above embodiments, or may be separately present and not incorporated into the apparatus.
  • The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire driving state data in real time, where the driving state data includes driver state data, driving environment data, and vehicle state data, the driver state data includes hand motion data, mouth motion data, head motion data, and eye motion data, the driving environment data includes lane line data and vehicle distance data, and the vehicle state data includes speed data and direction data; import the driving state data into the pre-trained fatigue recognition model to obtain the fatigue type and the fatigue value of the corresponding fatigue type, where the fatigue recognition model is used to determine the fatigue type and the fatigue value from the driving state data, and the fatigue value is used to characterize the degree of the corresponding fatigue type; import the fatigue type and the fatigue value into the pre-trained attention value calculation model to obtain the attention value, where the attention value calculation model is used to calculate the attention value from the fatigue type and the fatigue value, and the attention value is used to characterize the driver's degree of fatigue; and determine the attention threshold range in which the attention value is located and issue an alarm signal according to the attention threshold range, where the attention threshold range is determined by preset attention thresholds.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Surgery (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Embodiments of the present application disclose a method and apparatus for issuing an alarm signal. A specific implementation of the method includes: acquiring driving state data in real time; importing the driving state data into a pre-trained fatigue recognition model to obtain a fatigue type and a fatigue value of the corresponding fatigue type; importing the fatigue type and the fatigue value into a pre-trained attention value calculation model to obtain an attention value; and determining the attention threshold range in which the attention value is located and issuing an alarm signal according to the attention threshold range. This implementation improves the accuracy of recognizing the driver's degree of fatigue.

Description

Method and apparatus for issuing an alarm signal
Cross-reference to related applications
This patent application claims priority to the Chinese patent application No. 201711140773.X, filed on November 16, 2017 by 百度在线网络技术(北京)有限公司 and entitled "Method and apparatus for issuing an alarm signal", the entire contents of which are incorporated herein by reference.
Technical field
Embodiments of the present application relate to the field of computer technology, specifically to the technical field of vehicle control, and in particular to a method and apparatus for issuing an alarm signal.
Background
Cars have expanded the range of people's travel, brought convenience to people's travel, and improved people's quality of life. With the development and progress of science and technology, unmanned vehicles controlled by intelligent systems can obtain more driving information than manned vehicles and offer higher safety, which has become an important trend in the future development of automobiles.
While a vehicle is running, a driver who drives while fatigued is prone to traffic accidents. To reduce the occurrence of traffic accidents, the prior art determines whether the driver is driving while fatigued by monitoring the driver's eye movements and the like.
Summary of the invention
The purpose of the embodiments of the present application is to provide a method and apparatus for issuing an alarm signal.
In a first aspect, an embodiment of the present application provides a method for issuing an alarm signal, the method including: acquiring driving state data in real time, where the driving state data includes driver state data, driving environment data, and vehicle state data, the driver state data includes hand motion data, mouth motion data, head motion data, and eye motion data, the driving environment data includes lane line data and vehicle distance data, and the vehicle state data includes speed data and direction data; importing the driving state data into a pre-trained fatigue recognition model to obtain a fatigue type and a fatigue value of the corresponding fatigue type, where the fatigue recognition model is used to determine the fatigue type and the fatigue value from the driving state data, and the fatigue value is used to characterize the degree of the corresponding fatigue type; importing the fatigue type and the fatigue value into a pre-trained attention value calculation model to obtain an attention value, where the attention value calculation model is used to calculate the attention value from the fatigue type and the fatigue value, and the attention value is used to characterize the driver's degree of fatigue; and determining the attention threshold range in which the attention value is located and issuing an alarm signal according to the attention threshold range, where the attention threshold range is determined by preset attention thresholds.
In some embodiments, the method includes the step of constructing the fatigue recognition model, which includes: extracting temporally synchronized driver state data, driving environment data, and vehicle state data from a driver state data set, a driving environment data set, and a vehicle state data set, respectively; performing feature extraction on the driver state data, the driving environment data, and the vehicle state data, respectively, to obtain driver state feature data, driving environment feature data, and vehicle state feature data; determining a fatigue type according to the driver state feature data, where the fatigue type includes at least one of the following: a closed-eye type, a yawn type, a line-of-sight offset type, and a phone-call type; calculating a fatigue value from the driver state feature data, the driving environment feature data, and the vehicle state feature data; and, using a machine learning method, taking the driver state feature data, the driving environment feature data, and the vehicle state feature data as inputs and the fatigue type and the fatigue value of the corresponding fatigue type as outputs, training to obtain the fatigue recognition model.
In some embodiments, calculating the fatigue value from the driver state feature data, the driving environment feature data, and the vehicle state feature data includes: acquiring a time threshold range, a distance threshold range, and a data change amount threshold range, where the time threshold range includes a plurality of time threshold sub-ranges formed by preset time thresholds, the distance threshold range includes a plurality of distance threshold sub-ranges formed by preset distance thresholds, the data change amount threshold range includes a plurality of data change amount threshold sub-ranges formed by preset data change amount thresholds, each time threshold sub-range corresponds to a time weight value, each distance threshold sub-range corresponds to a distance weight value, and each data change amount threshold sub-range corresponds to a data change weight value; measuring, respectively, the fatigue type duration of each fatigue type corresponding to the driver state feature data, the distance change value corresponding to the driving environment feature data, and the data change value corresponding to the vehicle state feature data, where the fatigue type duration is characterized by the duration of the action corresponding to the specified fatigue type in the driver state feature data, the distance change value is characterized by the amount of change of the driving environment feature data per unit time, and the data change value is characterized by the amount of change of the vehicle state feature data per unit time; determining the time weight value of the time threshold sub-range, the distance weight value of the distance threshold sub-range, and the data change weight value of the data change amount threshold sub-range that correspond, respectively, to the fatigue type duration, the distance change value, and the data change value; and performing a weighted summation of the time weight value, the distance weight value, and the data change weight value to obtain the fatigue value of the corresponding fatigue type.
In some embodiments, the method includes the step of constructing the attention value calculation model, which includes: dividing the fatigue values in a fatigue value set into fatigue value subsets corresponding to the fatigue types; setting a type weight value for each fatigue type, and taking the product of the type weight value and each fatigue value in the fatigue value subset corresponding to that type weight value as the attention component of the corresponding fatigue type when it takes that fatigue value; and, using a machine learning method, taking the fatigue type and the fatigue value as inputs and the attention component as the output, training to obtain the attention value calculation model.
In some embodiments, determining the attention threshold range in which the attention value is located and issuing the alarm signal according to the attention threshold range includes: acquiring the attention threshold range, where the attention threshold range includes a plurality of attention threshold sub-ranges formed by preset attention thresholds, each attention threshold sub-range corresponds to an alarm signal, and the alarm signal includes an audible alarm signal and an image alarm signal; and, in response to the attention threshold sub-range to which the attention value corresponds, issuing the alarm signal corresponding to that attention threshold sub-range.
In some embodiments, determining the attention threshold range in which the attention value is located and issuing the alarm signal according to the attention threshold range further includes: starting the automatic driving mode in response to the attention value not being within the attention threshold range.
第二方面,本申请实施例提供了一种用于发出告警信号的装置,上述该装置包括:行驶状态数据获取单元,用于实时获取行驶状态数据,上述行驶状态数据包括驾驶员状态数据、驾驶环境数据和车辆状态数据,上述驾驶员状态数据包括手部动作数据、嘴部动作数据、头 部动作数据和眼部动作数据,上述驾驶环境数据包括车道线数据、车辆距离数据,上述车辆状态数据包括速度数据、方向数据;疲劳数值获取单元,用于将上述行驶状态数据导入预先训练的疲劳识别模型,得到疲劳类型和对应疲劳类型的疲劳数值,上述疲劳识别模型用于通过行驶状态数据确定疲劳类型和疲劳数值,上述疲劳数值用于表征对应疲劳类型的程度;注意力数值获取单元,用于将上述疲劳类型和疲劳数值导入预先训练的注意力值计算模型,得到注意力数值,上述注意力值计算模型用于通过疲劳类型和疲劳数值计算注意力数值,上述注意力数值用于表征驾驶员的疲劳程度;告警单元,用于判断注意力数值所在的注意力阈值范围,根据上述注意力阈值范围发出告警信号,上述注意力阈值范围通过预设的注意力阈值确定。
在一些实施例中,上述装置包括疲劳识别模型构建单元,用于构建疲劳识别模型,上述疲劳识别模型构建单元包括:数据提取子单元,用于分别从驾驶员状态数据集合、驾驶环境数据集合和车辆状态数据集合中提取时间上同步的驾驶员状态数据、驾驶环境数据和车辆状态数据;特征提取子单元,用于分别对驾驶员状态数据、驾驶环境数据和车辆状态数据进行特征提取,得到驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据;疲劳类型确定子单元,用于根据上述驾驶员状态特征数据确定疲劳类型,上述疲劳类型包括以下至少一项:闭眼类型、打哈欠类型、视线偏移类型、打电话类型;疲劳数值计算子单元,用于通过驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据计算疲劳数值;疲劳识别模型构建子单元,用于利用机器学习方法,将驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据作为输入,将疲劳类型和对应的疲劳类型的疲劳数值作为输出,训练得到疲劳识别模型。
在一些实施例中,上述疲劳数值计算子单元包括:阈值范围获取模块,用于获取时间阈值范围、距离阈值范围和数据变化量阈值范围,其中,时间阈值范围包括由预先设置的时间阈值构成的多个时间阈值子范围,距离阈值范围包括由预先设置的距离阈值构成的多个距离阈值子范围,数据变化量阈值范围包括由预先设置的数据变化量阈值构 成的多个数据变化量阈值子范围,每个时间阈值子范围对应一个时间权重值,每个距离阈值子范围对应一个距离权重值,每个数据变化量阈值子范围对应一个数据变化权重值;数据测量模块,用于分别测量驾驶员状态特征数据对应的每种疲劳类型的疲劳类型持续时间、驾驶环境特征数据对应的距离变化值和车辆状态特征数据对应的数据变化值,其中,疲劳类型持续时间通过驾驶员状态特征数据对应的指定疲劳类型对应的动作的持续时间来表征,距离变化值通过驾驶环境特征数据在单位时间内的变化量来表征,数据变化值通过车辆状态特征数据在单位时间内的变化量来表征;权重值确定模块,用于确定疲劳类型持续时间、距离变化值和数据变化值分别对应的时间阈值范围的时间阈值子范围的时间权重值、距离阈值范围的距离阈值子范围的距离权重值和数据变化量阈值范围的数据变化量阈值子范围的数据变化权重值;疲劳数值计算模块,用于对时间权重值、距离权重值和数据变化权重值加权求和得到对应疲劳类型的疲劳数值。
在一些实施例中,上述装置包括注意力值计算模型构建单元,用于构建注意力值计算模型,上述注意力值计算模型构建单元包括:数据划分子单元,用于按照疲劳类型将疲劳数值集合中的疲劳数值划分为对应疲劳类型的疲劳数值子集合;注意力分量计算子单元,用于分别为每种疲劳类型设置类型权重值,将类型权重值与该类型权重值对应的疲劳数值子集合中的每个疲劳数值的乘积作为对应疲劳类型在取值为该疲劳数值时的注意力分量;注意力值计算模型构建子单元,用于利用机器学习方法,将疲劳类型和疲劳数值作为输入,将注意力分量作为输出,训练得到注意力值计算模型。
在一些实施例中,上述告警单元包括:注意力阈值范围获取子单元,用于获取注意力阈值范围,上述注意力阈值范围包括由预先设置的注意力阈值构成的多个注意力阈值子范围,每个注意力阈值子范围对应一个告警信号,上述告警信号包括:声音告警信号、图像告警信号;告警子单元,用于响应于注意力数值对应的注意力阈值子范围,发出对应该注意力阈值子范围的告警信号。
在一些实施例中,上述告警单元还包括:响应于注意力数值不在 上述注意力阈值范围内时,启动自动驾驶模式。
第三方面,本申请实施例提供了一种服务器,包括:一个或多个处理器;存储器,用于存储一个或多个程序,当上述一个或多个程序被上述一个或多个处理器执行时,使得上述一个或多个处理器执行上述第一方面的用于发出告警信号的方法。
第四方面,本申请实施例提供了一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现上述第一方面的用于发出告警信号的方法。
本申请实施例提供的用于发出告警信号的方法及装置,实时获取行驶状态数据,上述行驶状态数据包括驾驶员状态数据、驾驶环境数据和车辆状态数据,实现了对驾驶员、行驶环境和车辆数据的综合考虑,然后将行驶状态数据导入疲劳识别模型得到疲劳类型和对应疲劳类型的疲劳数值,并将疲劳类型和疲劳数值导入预先训练的注意力值计算模型,得到注意力数值,最后根据注意力数值所在的注意力阈值范围发出对应的告警信号,提高了对驾驶员的疲劳程度识别的准确率。
附图说明
通过阅读参照以下附图所作的对非限制性实施例所作的详细描述,本申请的其它特征、目的和优点将会变得更明显:
图1是本申请可以应用于其中的示例性系统架构图;
图2是根据本申请的用于发出告警信号的方法的一个实施例的流程图;
图3是根据本申请的用于发出告警信号的方法的一个应用场景的示意图;
图4是根据本申请的用于发出告警信号的装置的一个实施例的结构示意图;
图5是适于用来实现本申请实施例的服务器的计算机系统的结构示意图。
具体实施方式
下面结合附图和实施例对本申请作进一步的详细说明。可以理解的是,此处所描述的具体实施例仅仅用于解释相关发明,而非对该发明的限定。另外还需要说明的是,为了便于描述,附图中仅示出了与有关发明相关的部分。
需要说明的是,在不冲突的情况下,本申请中的实施例及实施例中的特征可以相互组合。下面将参考附图并结合实施例来详细说明本申请。
图1示出了可以应用本申请的用于发出告警信号的方法或用于发出告警信号的装置的实施例的示例性系统架构100。
如图1所示,系统架构100可以包括第一摄像头101、第二摄像头102、车载终端103、网络104和服务器105。网络104用以在车载终端103和服务器105之间提供通信链路的介质。网络104可以包括各种连接类型,例如有线、无线通信链路或者光纤电缆等等。
第一摄像头101可以设置在车辆外部,用于采集当前车辆行驶过程中的图像,然后根据采集的图像计算出当前车辆与行驶道路上的其他车辆、行人之间的距离,还可以计算出当前车辆与车道线之间的距离;第二摄像头102可以设置在车辆内部,用于监测驾驶员的各种动作,包括手部动作、嘴部动作、头部动作和眼部动作等,并将这些动作转换为对应的数据;车载终端103可以接收第一摄像头101、第二摄像头102以及当前车辆的状态数据(例如可以是速度数据、方向数据等),并对接收的数据进行处理,进而在驾驶员疲劳驾驶时发出对应的告警信号。
服务器105可以是提供各种服务的服务器,例如对车载终端103发来的数据进行处理的服务器,服务器105可以对车载终端103发来的数据进行数据处理,并将得到的注意力数值等数据发送给车载终端103。
需要说明的是,本申请实施例所提供的用于发出告警信号的方法一般由车载终端103执行,相应地,用于发出告警信号的装置一般设置于车载终端103中。
应该理解,图1中的第一摄像头101、第二摄像头102、车载终端103、网络104和服务器105的数目仅仅是示意性的。根据实现需要,可以具有任意数目的第一摄像头101、第二摄像头102、车载终端103、网络104和服务器105。
继续参考图2,示出了根据本申请的用于发出告警信号的方法的一个实施例的流程200。该用于发出告警信号的方法包括以下步骤:
步骤201,实时获取行驶状态数据。
在本实施例中,用于发出告警信号的方法运行于其上的电子设备(例如图1所示的车载终端103)可以通过有线连接方式或者无线连接方式接收行驶状态数据。其中,上述行驶状态数据可以包括驾驶员状态数据、驾驶环境数据和车辆状态数据,上述驾驶员状态数据可以包括手部动作数据、嘴部动作数据、头部动作数据和眼部动作数据,上述驾驶环境数据可以包括车道线数据、车辆距离数据,上述车辆状态数据可以包括速度数据、方向数据。需要指出的是,上述无线连接方式可以包括但不限于3G/4G连接、WiFi连接、蓝牙连接、WiMAX连接、Zigbee连接、UWB(ultra wideband)连接、以及其他现在已知或将来开发的无线连接方式。
当驾驶员驾驶车辆行驶时,可以实时获取驾驶员状态数据、驾驶环境数据和车辆状态数据。其中,驾驶员状态数据可以由设置在车内的第二摄像头102采集;驾驶环境数据可以由设置在车外的第一摄像头101采集;车辆状态数据可以直接从车辆获取。
步骤202,将上述行驶状态数据导入预先训练的疲劳识别模型,得到疲劳类型和对应疲劳类型的疲劳数值。
获取到行驶状态数据后,可以将行驶状态数据导入疲劳识别模型,得到疲劳类型(可以通过信息的形式表示)和对应疲劳类型的疲劳数值。其中,上述疲劳识别模型用于通过行驶状态数据确定疲劳类型和疲劳数值,上述疲劳数值用于表征对应疲劳类型的程度。疲劳数值可以通过多种形式表示。例如,疲劳数值的取值范围可以是0到1。当疲劳数值为0时,可以认为驾驶员不疲劳;当疲劳数值为1时,可以认为驾驶员处于严重疲劳。疲劳数值还可以通过0到100等其他形式 表示,此处不再一一赘述。
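下面用一小段 Python 代码示意该步骤的输入输出形式(其中 fatigue_model、recognize_fatigue 等名称以及数据组织方式均为便于说明而作的假设,并非对实现方式的限定):
```python
def recognize_fatigue(fatigue_model, driving_state):
    """将行驶状态数据导入疲劳识别模型,返回疲劳类型和对应的疲劳数值(示意实现)。

    fatigue_model 为预先训练好、可直接调用的模型对象(假设),
    driving_state 为包含驾驶员状态数据、驾驶环境数据和车辆状态数据的字典。
    """
    fatigue_type, fatigue_value = fatigue_model(driving_state)
    # 疲劳数值的取值范围为 0 到 1:0 表示不疲劳,1 表示严重疲劳
    return fatigue_type, min(max(float(fatigue_value), 0.0), 1.0)
```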
在本实施例的一些可选的实现方式中,本实施例方法可以包括构建疲劳识别模型的步骤,上述构建疲劳识别模型的步骤可以包括以下步骤:
第一步,分别从驾驶员状态数据集合、驾驶环境数据集合和车辆状态数据集合中提取时间上同步的驾驶员状态数据、驾驶环境数据和车辆状态数据。
当驾驶员处于疲劳驾驶时,不仅驾驶员自身会表现出一些疲劳特征,同时,车辆也会因驾驶员的疲劳驾驶而出现行驶异常。因此,在判断驾驶员是否疲劳时,可以分别从驾驶员状态数据集合、驾驶环境数据集合和车辆状态数据集合中提取时间上同步的驾驶员状态数据、驾驶环境数据和车辆状态数据。需要说明的是,驾驶员状态数据、驾驶环境数据和车辆状态数据在时间上必须是同步的(即时间上相同),以便综合多种数据对驾驶员是否疲劳驾驶做出准确判断。
第二步,分别对驾驶员状态数据、驾驶环境数据和车辆状态数据进行特征提取,得到驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据。
驾驶员状态数据、驾驶环境数据和车辆状态数据通常是直接采集的。为了对驾驶员是否疲劳驾驶进行判断,需要分别对驾驶员状态数据、驾驶环境数据和车辆状态数据进行特征提取,得到驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据。即,驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据为与驾驶员疲劳驾驶相关的数据。例如,驾驶员状态特征数据可以是眼部闭合的时间值;驾驶环境特征数据可以是车辆距离的变化值;车辆状态特征数据可以是速度的变化值。驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据还可以是其他类型的数据,具体视实际需要而定。
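针对上述第二步,下面给出一个特征提取的极简 Python 示意(函数名、字段含义和采样间隔 dt 均为假设,仅说明从原始数据得到特征数据这一过程):
```python
def extract_features(eye_closed_flags, distances, speeds, dt=0.1):
    """从原始采集数据中提取与疲劳驾驶相关的特征数据(极简示意)。

    eye_closed_flags: 按时间顺序排列的布尔序列,表示每一帧驾驶员是否闭眼
    distances:        按时间顺序排列的车辆距离序列(米)
    speeds:           按时间顺序排列的速度序列(千米/小时)
    dt:               相邻两帧之间的时间间隔(秒),为假设的采样间隔
    """
    # 驾驶员状态特征数据:最长连续闭眼时间(秒)
    longest, current = 0.0, 0.0
    for closed in eye_closed_flags:
        current = current + dt if closed else 0.0
        longest = max(longest, current)
    duration = len(distances) * dt
    # 驾驶环境特征数据:单位时间内车辆距离的变化量
    distance_change = abs(distances[-1] - distances[0]) / duration
    # 车辆状态特征数据:单位时间内速度的变化量
    speed_change = abs(speeds[-1] - speeds[0]) / duration
    return {"eye_closed_time": longest,
            "distance_change": distance_change,
            "speed_change": speed_change}
```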
第三步,根据上述驾驶员状态特征数据确定疲劳类型。
驾驶员状态数据可以包括驾驶员的手部动作数据、嘴部动作数据、头部动作数据和眼部动作数据等数据。当驾驶员出现了其中的一类数据或几类数据时,对应的动作数据可以通过驾驶员状态特征数据来表 现出来。如果出现了某一类或几类数据,则说明驾驶员表现出了对应的疲劳类型。上述疲劳类型包括以下至少一项:闭眼类型、打哈欠类型、视线偏移类型、打电话类型。
第四步,通过驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据计算疲劳数值。
获取到驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据后,通过一定的数据处理方式可以得到对应疲劳类型的疲劳数值。例如,可以分别计算驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据对应某一疲劳类型的数值,然后加权得到该疲劳类型的疲劳数值。
第五步,利用机器学习方法,将驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据作为输入,将疲劳类型和对应的疲劳类型的疲劳数值作为输出,训练得到疲劳识别模型。
疲劳识别模型可以是技术人员基于对大量的驾驶员状态特征数据、驾驶环境特征数据、车辆状态特征数据、疲劳类型和疲劳数值进行统计而预先制定的、存储有驾驶员状态特征数据、驾驶环境特征数据、车辆状态特征数据、疲劳类型和疲劳数值的对应关系的对应关系表;也可以是技术人员基于对大量数据的统计而预先设置并存储在上述电子设备中的,对驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据进行数值计算以得到用于表征疲劳类型的疲劳数值的计算结果的计算公式。例如,该计算公式可以是对驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据进行加权求和的公式,得到的结果可以用于表征对应疲劳类型的疲劳数值。
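下面给出利用机器学习方法训练疲劳识别模型的一个极简示意(假设使用 scikit-learn,训练数据为随手构造的示例,仅用于说明输入与输出的组织方式,并非本申请限定的模型结构):
```python
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

# 每行特征依次为:[最长闭眼时间, 车距变化值, 速度变化值](数值为示例)
X = [
    [0.0, 0.2, 0.5],
    [1.5, 2.0, 3.0],
    [3.0, 4.5, 6.0],
]
y_type = ["无疲劳", "闭眼类型", "闭眼类型"]   # 疲劳类型标签(示例)
y_value = [0.0, 0.4, 0.9]                      # 对应疲劳类型的疲劳数值(示例)

type_model = DecisionTreeClassifier(random_state=0).fit(X, y_type)    # 学习疲劳类型
value_model = DecisionTreeRegressor(random_state=0).fit(X, y_value)   # 学习疲劳数值

sample = [[2.0, 3.0, 4.0]]
print(type_model.predict(sample), value_model.predict(sample))
```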
在本实施例的一些可选的实现方式中,上述通过驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据计算疲劳数值可以包括以下步骤:
第一步,获取时间阈值范围、距离阈值范围和数据变化量阈值范围。
通常,时间阈值范围、距离阈值范围和数据变化量阈值范围可以是预先设置好的。其中,时间阈值范围可以包括由预先设置的时间阈 值构成的多个时间阈值子范围;距离阈值范围可以包括由预先设置的距离阈值构成的多个距离阈值子范围;数据变化量阈值范围可以包括由预先设置的数据变化量阈值构成的多个数据变化量阈值子范围。每个时间阈值子范围对应一个时间权重值;每个距离阈值子范围对应一个距离权重值;每个数据变化量阈值子范围对应一个数据变化权重值。
第二步,分别测量驾驶员状态特征数据对应的每种疲劳类型的疲劳类型持续时间、驾驶环境特征数据对应的距离变化值和车辆状态特征数据对应的数据变化值。
为了通过量化的方式确定各种数据与疲劳驾驶的相关性,可以分别测量驾驶员状态特征数据对应的每种疲劳类型的疲劳类型持续时间、驾驶环境特征数据对应的距离变化值和车辆状态特征数据对应的数据变化值。其中,疲劳类型持续时间可以通过驾驶员状态特征数据对应的指定疲劳类型对应的动作的持续时间来表征,距离变化值可以通过驾驶环境特征数据在单位时间内的变化量来表征,数据变化值可以通过车辆状态特征数据在单位时间内的变化量来表征。
第三步,确定疲劳类型持续时间、距离变化值和数据变化值分别对应的时间阈值范围的时间阈值子范围的时间权重值、距离阈值范围的距离阈值子范围的距离权重值和数据变化量阈值范围的数据变化量阈值子范围的数据变化权重值。
得到疲劳类型持续时间、距离变化值和数据变化值后,通过数据对比,可以分别得到疲劳类型持续时间在时间阈值范围对应的时间阈值子范围的时间权重值、距离变化值在距离阈值范围对应的距离阈值子范围的距离权重值和数据变化值在数据变化量阈值范围对应的数据变化量阈值子范围的数据变化权重值。
第四步,对时间权重值、距离权重值和数据变化权重值加权求和得到对应疲劳类型的疲劳数值。
时间权重值、距离权重值和数据变化权重值与不同的疲劳类型的相关性可以不同。为此,可以根据具体的疲劳类型为时间权重值、距离权重值和数据变化权重值设置不同的权值,在得到时间权重值、距离权重值和数据变化权重值与各自权值的乘积后再求和,得到对应疲 劳类型的疲劳数值。例如,时间权重值、距离权重值和数据变化权重值可以分别是0.1、0.3和0.5,对应某一疲劳类型的权值可以分别是0.2、0.1和0.4,则对应该疲劳类型的疲劳数值可以是0.1×0.2+0.3×0.1+0.5×0.4=0.25。
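下面的 Python 片段按上述步骤复现了该数值示例(阈值子范围和各项权值均为假设的预设值):
```python
def weight_for(value, sub_ranges):
    """根据取值落在哪个阈值子范围内,返回该子范围对应的权重值(示意)。

    sub_ranges 为 [(下限, 上限, 权重值), ...] 形式的列表,数值均为假设。
    """
    for low, high, weight in sub_ranges:
        if low <= value < high:
            return weight
    return 0.0

# 假设的时间阈值子范围:疲劳类型持续时间 1.8 秒落在 [1, 3) 内,对应时间权重值 0.3
time_sub_ranges = [(0.0, 1.0, 0.1), (1.0, 3.0, 0.3), (3.0, float("inf"), 0.5)]
print(weight_for(1.8, time_sub_ranges))   # 0.3

# 复现上文示例:时间、距离、数据变化权重值分别为 0.1、0.3、0.5,
# 对应某一疲劳类型的权值分别为 0.2、0.1、0.4
weights = {"time": 0.1, "distance": 0.3, "data_change": 0.5}
coefficients = {"time": 0.2, "distance": 0.1, "data_change": 0.4}
fatigue_value = sum(weights[k] * coefficients[k] for k in weights)
print(round(fatigue_value, 2))   # 0.25
```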
步骤203,将上述疲劳类型和疲劳数值导入预先训练的注意力值计算模型,得到注意力数值。
得到疲劳类型和疲劳数值后,可以将疲劳类型和疲劳数值导入注意力值计算模型得到注意力数值。其中,上述注意力值计算模型用于通过疲劳类型和疲劳数值计算注意力数值,上述注意力数值用于表征驾驶员的疲劳程度。与疲劳数值类似,注意力数值的取值范围也可以是0到1。当注意力数值为0时,可以认为驾驶员严重疲劳;当注意力数值为1时,可以认为驾驶员不疲劳。注意力数值还可以通过其他形式表示,此处不再一一赘述。
在本实施例的一些可选的实现方式中,本实施例方法可以包括构建注意力值计算模型的步骤,上述构建注意力值计算模型的步骤可以包括以下步骤:
第一步,按照疲劳类型将疲劳数值集合中的疲劳数值划分为对应疲劳类型的疲劳数值子集合。
由上述描述可知,不同的疲劳类型对应各自的疲劳数值。因此,可以按照疲劳类型将疲劳数值集合中的疲劳数值划分为对应疲劳类型的疲劳数值子集合。
第二步,分别为每种疲劳类型设置类型权重值,将类型权重值与该类型权重值对应的疲劳数值子集合中的每个疲劳数值的乘积作为对应疲劳类型在取值为该疲劳数值时的注意力分量。
根据不同驾驶员的驾驶习惯和驾驶行为,可以为每种疲劳类型设置类型权重值,以便根据驾驶员的特征针对性地对驾驶员是否疲劳驾驶做出判断。当需要获取某一疲劳类型的注意力分量时,可以将该疲劳类型的类型权重值与该类型权重值对应的疲劳数值子集合中的每个疲劳数值的乘积作为对应疲劳类型在取值为该疲劳数值时的注意力分量。即,有多少个疲劳数值就有多少个对应的注意力分量。当驾驶员出现了多种疲劳类型时,可以将对应各个疲劳类型的注意力分量求和得到驾驶员的注意力数值。例如,驾驶员出现了疲劳类型A、B、C,对应的疲劳数值子集合中分别具有疲劳数值0.1、0.2和0.3,对应的类型权重值可以分别为0.2、0.2和0.6,则该驾驶员的注意力数值在疲劳数值分别为0.1、0.2和0.3时,可以是0.1×0.2+0.2×0.2+0.3×0.6=0.24。
第三步,利用机器学习方法,将疲劳类型和疲劳数值作为输入,将注意力分量作为输出,训练得到注意力值计算模型。
注意力值计算模型可以是技术人员基于对大量的疲劳类型和疲劳数值进行统计而预先制定的、存储有疲劳类型、疲劳数值和注意力分量的对应关系的对应关系表;也可以是技术人员基于对大量数据的统计而预先设置并存储在上述电子设备中的,对疲劳数值进行数值计算以得到用于表征疲劳类型的注意力分量的计算结果的计算公式。例如,该计算公式可以是对疲劳数值进行加权求和的公式,得到的结果可以用于表征对应疲劳类型的注意力分量。之后,将各个疲劳类型的注意力分量求和得到注意力数值。
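下面的 Python 片段复现了上文注意力数值的计算示例(类型权重值为假设的预设值):
```python
# 上文示例:疲劳类型 A、B、C 的疲劳数值分别为 0.1、0.2、0.3,
# 类型权重值分别为 0.2、0.2、0.6
fatigue_values = {"A": 0.1, "B": 0.2, "C": 0.3}
type_weights = {"A": 0.2, "B": 0.2, "C": 0.6}

# 每种疲劳类型的注意力分量 = 类型权重值 × 该类型的疲劳数值
attention_components = {t: type_weights[t] * fatigue_values[t] for t in fatigue_values}

# 将各个疲劳类型的注意力分量求和得到注意力数值
attention_value = sum(attention_components.values())
print(round(attention_value, 2))   # 0.24
```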
需要说明的是,上述的疲劳识别模型和注意力值计算模型可以预先通过服务器105训练得到。
步骤204,判断注意力数值所在的注意力阈值范围,根据上述注意力阈值范围发出告警信号。
车载终端103上可以预先设置有多个注意力阈值,通过多个注意力阈值构成注意力阈值范围,即,上述注意力阈值范围通过预设的注意力阈值确定。然后,可以为每个注意力阈值范围设置不同的告警信号,不同的告警信号可以体现驾驶员当前的注意力。当注意力数值属于某个注意力阈值范围时,发出该注意力阈值范围对应的告警信号。
在本实施例的一些可选的实现方式中,上述判断注意力数值所在的注意力阈值范围,根据上述注意力阈值范围发出告警信号可以包括以下步骤:
第一步,获取注意力阈值范围。
在发出告警信号前,需要首先获取到注意力阈值范围。其中,上述注意力阈值范围包括由预先设置的注意力阈值构成的多个注意力阈 值子范围,每个注意力阈值子范围对应一个告警信号,上述告警信号可以包括:声音告警信号、图像告警信号等。
第二步,响应于注意力数值对应的注意力阈值子范围,发出对应该注意力阈值子范围的告警信号。
例如,注意力阈值范围可以由注意力阈值0、0.5和1构成。当注意力数值所在的注意力阈值范围为0.5到1之间时,可以认为驾驶员的注意力集中,对应的,驾驶员没有处于疲劳驾驶状态,告警信号可以为语音提示驾驶员注意行驶安全;当注意力数值所在的注意力阈值范围为0到0.5之间时,可以认为驾驶员的注意力不集中,对应的,驾驶员处于轻微疲劳驾驶状态,告警信号可以为高音量语音、图像信号提示驾驶员注意行驶安全。
在本实施例的一些可选的实现方式中,上述判断上述注意力数值所在的注意力阈值范围,根据上述注意力阈值范围发出告警信号还可以包括:当注意力数值不在上述注意力阈值范围内时,启动自动驾驶模式。
响应于车辆具有自动驾驶功能,且注意力数值不在上述注意力阈值范围内时,车载终端103还可以自动启动自动驾驶模式。例如,当注意力数值为0(不在上述注意力阈值范围内)时,可以认为驾驶员丧失注意力,此时,车载终端103就可以直接启动车辆的自动驾驶模式,以提高行驶的安全性。
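结合上述两段,下面给出根据注意力数值选择告警方式的一个 Python 示意(阈值0、0.5、1与各范围的处理方式沿用上文示例,函数名为假设):
```python
def issue_alarm(attention_value):
    """根据注意力数值所在的注意力阈值子范围返回对应的处理方式(示意实现)。"""
    if 0.5 <= attention_value <= 1.0:
        return "语音提示:请注意行驶安全"                # 注意力集中,仅作轻度提醒
    if 0.0 < attention_value < 0.5:
        return "高音量语音 + 图像信号:请注意行驶安全"    # 轻微疲劳驾驶
    # 注意力数值为 0(不在上述注意力阈值范围内),认为驾驶员丧失注意力
    return "启动自动驾驶模式"

print(issue_alarm(0.24))   # 高音量语音 + 图像信号:请注意行驶安全
```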
继续参见图3,图3是根据本实施例的用于发出告警信号的方法的应用场景的一个示意图。在图3的应用场景中,车载终端103获取到车辆的车辆状态数据、第一摄像头101采集的驾驶环境数据和第二摄像头102采集的驾驶员状态数据等行驶状态数据后,将行驶状态数据导入疲劳识别模型,得到驾驶员当前的疲劳类型和对应疲劳类型的疲劳数值;然后,将疲劳类型和疲劳数值导入注意力值计算模型,得到注意力数值;最后,判断注意力数值所在的注意力阈值范围,根据上述注意力阈值范围发出告警信号“请注意行驶安全”。
本申请的上述实施例提供的方法能够实时获取行驶状态数据,上述行驶状态数据包括驾驶员状态数据、驾驶环境数据和车辆状态数据,实现了对驾驶员、行驶环境和车辆数据的综合考虑,然后将行驶状态数据导入疲劳识别模型得到疲劳类型和对应疲劳类型的疲劳数值,并将疲劳类型和疲劳数值导入预先训练的注意力值计算模型,得到注意力数值,最后根据注意力数值所在的注意力阈值范围发出对应的告警信号,提高了对驾驶员的疲劳程度识别的准确率。
进一步参考图4,作为对上述各图所示方法的实现,本申请提供了一种用于发出告警信号的装置的一个实施例,该装置实施例与图2所示的方法实施例相对应,该装置具体可以应用于各种电子设备中。
如图4所示,本实施例的用于发出告警信号的装置400可以包括:行驶状态数据获取单元401、疲劳数值获取单元402、注意力数值获取单元403和告警单元404。其中,行驶状态数据获取单元401用于实时获取行驶状态数据,上述行驶状态数据包括驾驶员状态数据、驾驶环境数据和车辆状态数据,上述驾驶员状态数据包括手部动作数据、嘴部动作数据、头部动作数据和眼部动作数据,上述驾驶环境数据包括车道线数据、车辆距离数据,上述车辆状态数据包括速度数据、方向数据;疲劳数值获取单元402用于将上述行驶状态数据导入预先训练的疲劳识别模型,得到疲劳类型和对应疲劳类型的疲劳数值,上述疲劳识别模型用于通过行驶状态数据确定疲劳类型和疲劳数值,上述疲劳数值用于表征对应疲劳类型的程度;注意力数值获取单元403用于将上述疲劳类型和疲劳数值导入预先训练的注意力值计算模型,得到注意力数值,上述注意力值计算模型用于通过疲劳类型和疲劳数值计算注意力数值,上述注意力数值用于表征驾驶员的疲劳程度;告警单元404用于判断注意力数值所在的注意力阈值范围,根据上述注意力阈值范围发出告警信号,上述注意力阈值范围通过预设的注意力阈值确定。
在本实施例的一些可选的实现方式中,本实施例的用于发出告警信号的装置400可以包括疲劳识别模型构建单元(图中未示出),用于构建疲劳识别模型,上述疲劳识别模型构建单元可以包括:数据提取子单元(图中未示出)、特征提取子单元(图中未示出)、疲劳类型确定子单元(图中未示出)、疲劳数值计算子单元(图中未示出)和疲劳 识别模型构建子单元(图中未示出)。其中,数据提取子单元用于分别从驾驶员状态数据集合、驾驶环境数据集合和车辆状态数据集合中提取时间上同步的驾驶员状态数据、驾驶环境数据和车辆状态数据;特征提取子单元用于分别对驾驶员状态数据、驾驶环境数据和车辆状态数据进行特征提取,得到驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据;疲劳类型确定子单元用于根据上述驾驶员状态特征数据确定疲劳类型,上述疲劳类型包括以下至少一项:闭眼类型、打哈欠类型、视线偏移类型、打电话类型;疲劳数值计算子单元用于通过驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据计算疲劳数值;疲劳识别模型构建子单元用于利用机器学习方法,将驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据作为输入,将疲劳类型和对应的疲劳类型的疲劳数值作为输出,训练得到疲劳识别模型。
在本实施例的一些可选的实现方式中,上述疲劳数值计算子单元可以包括:阈值范围获取模块(图中未示出)、数据测量模块(图中未示出)、权重值确定模块(图中未示出)和疲劳数值计算模块(图中未示出)。其中,阈值范围获取模块用于获取时间阈值范围、距离阈值范围和数据变化量阈值范围,其中,时间阈值范围包括由预先设置的时间阈值构成的多个时间阈值子范围,距离阈值范围包括由预先设置的距离阈值构成的多个距离阈值子范围,数据变化量阈值范围包括由预先设置的数据变化量阈值构成的多个数据变化量阈值子范围,每个时间阈值子范围对应一个时间权重值,每个距离阈值子范围对应一个距离权重值,每个数据变化量阈值子范围对应一个数据变化权重值;数据测量模块用于分别测量驾驶员状态特征数据对应的每种疲劳类型的疲劳类型持续时间、驾驶环境特征数据对应的距离变化值和车辆状态特征数据对应的数据变化值,其中,疲劳类型持续时间通过驾驶员状态特征数据对应的指定疲劳类型对应的动作的持续时间来表征,距离变化值通过驾驶环境特征数据在单位时间内的变化量来表征,数据变化值通过车辆状态特征数据在单位时间内的变化量来表征;权重值确定模块用于确定疲劳类型持续时间、距离变化值和数据变化值分别对 应的时间阈值范围的时间阈值子范围的时间权重值、距离阈值范围的距离阈值子范围的距离权重值和数据变化量阈值范围的数据变化量阈值子范围的数据变化权重值;疲劳数值计算模块用于对时间权重值、距离权重值和数据变化权重值加权求和得到对应疲劳类型的疲劳数值。
在本实施例的一些可选的实现方式中,本实施例的用于发出告警信号的装置400可以包括注意力值计算模型构建单元(图中未示出),用于构建注意力值计算模型,上述注意力值计算模型构建单元可以包括:数据划分子单元(图中未示出)、注意力分量计算子单元(图中未示出)和注意力值计算模型构建子单元(图中未示出)。其中,数据划分子单元用于按照疲劳类型将疲劳数值集合中的疲劳数值划分为对应疲劳类型的疲劳数值子集合;注意力分量计算子单元用于分别为每种疲劳类型设置类型权重值,将类型权重值与该类型权重值对应的疲劳数值子集合中的每个疲劳数值的乘积作为对应疲劳类型在取值为该疲劳数值时的注意力分量;注意力值计算模型构建子单元用于利用机器学习方法,将疲劳类型和疲劳数值作为输入,将注意力分量作为输出,训练得到注意力值计算模型。
在本实施例的一些可选的实现方式中,上述告警单元404可以包括:注意力阈值范围获取子单元(图中未示出)和告警子单元(图中未示出)。其中,注意力阈值范围获取子单元用于获取注意力阈值范围,上述注意力阈值范围包括由预先设置的注意力阈值构成的多个注意力阈值子范围,每个注意力阈值子范围对应一个告警信号,上述告警信号包括:声音告警信号、图像告警信号;告警子单元用于响应于注意力数值对应的注意力阈值子范围,发出对应该注意力阈值子范围的告警信号。
在本实施例的一些可选的实现方式中,上述告警单元404还可以包括:响应于注意力数值不在上述注意力阈值范围内时,启动自动驾驶模式。
本实施例还提供了一种服务器,包括:一个或多个处理器;存储器,用于存储一个或多个程序,当上述一个或多个程序被上述一个或多个处理器执行时,使得上述一个或多个处理器执行上述的用于发出 告警信号的方法。
本实施例还提供了一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现上述的用于发出告警信号的方法。
下面参考图5,其示出了适于用来实现本申请实施例的服务器的计算机系统500的结构示意图。图5示出的服务器仅仅是一个示例,不应对本申请实施例的功能和使用范围带来任何限制。
如图5所示,计算机系统500包括中央处理单元(CPU)501,其可以根据存储在只读存储器(ROM)502中的程序或者从存储部分508加载到随机访问存储器(RAM)503中的程序而执行各种适当的动作和处理。在RAM 503中,还存储有系统500操作所需的各种程序和数据。CPU 501、ROM 502以及RAM 503通过总线504彼此相连。输入/输出(I/O)接口505也连接至总线504。
以下部件连接至I/O接口505:包括键盘、鼠标等的输入部分506;包括诸如阴极射线管(CRT)、液晶显示器(LCD)等以及扬声器等的输出部分507;包括硬盘等的存储部分508;以及包括诸如LAN卡、调制解调器等的网络接口卡的通信部分509。通信部分509经由诸如因特网的网络执行通信处理。驱动器510也根据需要连接至I/O接口505。可拆卸介质511,诸如磁盘、光盘、磁光盘、半导体存储器等等,根据需要安装在驱动器510上,以便于从其上读出的计算机程序根据需要被安装入存储部分508。
特别地,根据本公开的实施例,上文参考流程图描述的过程可以被实现为计算机软件程序。例如,本公开的实施例包括一种计算机程序产品,其包括承载在计算机可读介质上的计算机程序,该计算机程序包含用于执行流程图所示的方法的程序代码。在这样的实施例中,该计算机程序可以通过通信部分509从网络上被下载和安装,和/或从可拆卸介质511被安装。在该计算机程序被中央处理单元(CPU)501执行时,执行本申请的方法中限定的上述功能。
需要说明的是,本申请上述的计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质或者是上述两者的任意组合。计算机可读存储介质例如可以是——但不限于——电、磁、光、电磁、红 外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子可以包括但不限于:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机访问存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。在本申请中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。而在本申请中,计算机可读的信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读的信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于:无线、电线、光缆、RF等等,或者上述的任意合适的组合。
附图中的流程图和框图,图示了按照本申请各种实施例的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段、或代码的一部分,该模块、程序段、或代码的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。也应当注意,在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个接连地表示的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或操作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。
描述于本申请实施例中所涉及到的单元可以通过软件的方式实现,也可以通过硬件的方式来实现。所描述的单元也可以设置在处理器中,例如,可以描述为:一种处理器包括行驶状态数据获取单元、疲劳数 值获取单元、注意力数值获取单元和告警单元。其中,这些单元的名称在某种情况下并不构成对该单元本身的限定,例如,告警单元还可以被描述为“用于发出告警信号的单元”。
作为另一方面,本申请还提供了一种计算机可读介质,该计算机可读介质可以是上述实施例中描述的装置中所包含的;也可以是单独存在,而未装配入该装置中。上述计算机可读介质承载有一个或者多个程序,当上述一个或者多个程序被该装置执行时,使得该装置:实时获取行驶状态数据,上述行驶状态数据包括驾驶员状态数据、驾驶环境数据和车辆状态数据,上述驾驶员状态数据包括手部动作数据、嘴部动作数据、头部动作数据和眼部动作数据,上述驾驶环境数据包括车道线数据、车辆距离数据,上述车辆状态数据包括速度数据、方向数据;将上述行驶状态数据导入预先训练的疲劳识别模型,得到疲劳类型和对应疲劳类型的疲劳数值,上述疲劳识别模型用于通过行驶状态数据确定疲劳类型和疲劳数值,上述疲劳数值用于表征对应疲劳类型的程度;将上述疲劳类型和疲劳数值导入预先训练的注意力值计算模型,得到注意力数值,上述注意力值计算模型用于通过疲劳类型和疲劳数值计算注意力数值,上述注意力数值用于表征驾驶员的疲劳程度;判断注意力数值所在的注意力阈值范围,根据上述注意力阈值范围发出告警信号,上述注意力阈值范围通过预设的注意力阈值确定。
以上描述仅为本申请的较佳实施例以及对所运用技术原理的说明。本领域技术人员应当理解,本申请中所涉及的发明范围,并不限于上述技术特征的特定组合而成的技术方案,同时也应涵盖在不脱离上述发明构思的情况下,由上述技术特征或其等同特征进行任意组合而形成的其它技术方案。例如上述特征与本申请中公开的(但不限于)具有类似功能的技术特征进行互相替换而形成的技术方案。

Claims (14)

  1. 一种用于发出告警信号的方法,其特征在于,所述方法包括:
    实时获取行驶状态数据,所述行驶状态数据包括驾驶员状态数据、驾驶环境数据和车辆状态数据,所述驾驶员状态数据包括手部动作数据、嘴部动作数据、头部动作数据和眼部动作数据,所述驾驶环境数据包括车道线数据、车辆距离数据,所述车辆状态数据包括速度数据、方向数据;
    将所述行驶状态数据导入预先训练的疲劳识别模型,得到疲劳类型和对应疲劳类型的疲劳数值,所述疲劳识别模型用于通过行驶状态数据确定疲劳类型和疲劳数值,所述疲劳数值用于表征对应疲劳类型的程度;
    将所述疲劳类型和疲劳数值导入预先训练的注意力值计算模型,得到注意力数值,所述注意力值计算模型用于通过疲劳类型和疲劳数值计算注意力数值,所述注意力数值用于表征驾驶员的疲劳程度;
    判断注意力数值所在的注意力阈值范围,根据所述注意力阈值范围发出告警信号,所述注意力阈值范围通过预设的注意力阈值确定。
  2. 根据权利要求1所述的方法,其特征在于,所述方法包括构建疲劳识别模型的步骤,所述构建疲劳识别模型的步骤包括:
    分别从驾驶员状态数据集合、驾驶环境数据集合和车辆状态数据集合中提取时间上同步的驾驶员状态数据、驾驶环境数据和车辆状态数据;
    分别对驾驶员状态数据、驾驶环境数据和车辆状态数据进行特征提取,得到驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据;
    根据所述驾驶员状态特征数据确定疲劳类型,所述疲劳类型包括以下至少一项:闭眼类型、打哈欠类型、视线偏移类型、打电话类型;
    通过驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据计算疲劳数值;
    利用机器学习方法,将驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据作为输入,将疲劳类型和对应的疲劳类型的疲劳数值作为输出,训练得到疲劳识别模型。
  3. 根据权利要求2所述的方法,其特征在于,所述通过驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据计算疲劳数值包括:
    获取时间阈值范围、距离阈值范围和数据变化量阈值范围,其中,时间阈值范围包括由预先设置的时间阈值构成的多个时间阈值子范围,距离阈值范围包括由预先设置的距离阈值构成的多个距离阈值子范围,数据变化量阈值范围包括由预先设置的数据变化量阈值构成的多个数据变化量阈值子范围,每个时间阈值子范围对应一个时间权重值,每个距离阈值子范围对应一个距离权重值,每个数据变化量阈值子范围对应一个数据变化权重值;
    分别测量驾驶员状态特征数据对应的每种疲劳类型的疲劳类型持续时间、驾驶环境特征数据对应的距离变化值和车辆状态特征数据对应的数据变化值,其中,疲劳类型持续时间通过驾驶员状态特征数据对应的指定疲劳类型对应的动作的持续时间来表征,距离变化值通过驾驶环境特征数据在单位时间内的变化量来表征,数据变化值通过车辆状态特征数据在单位时间内的变化量来表征;
    确定疲劳类型持续时间、距离变化值和数据变化值分别对应的时间阈值范围的时间阈值子范围的时间权重值、距离阈值范围的距离阈值子范围的距离权重值和数据变化量阈值范围的数据变化量阈值子范围的数据变化权重值;
    对时间权重值、距离权重值和数据变化权重值加权求和得到对应疲劳类型的疲劳数值。
  4. 根据权利要求1所述的方法,其特征在于,所述方法包括构建注意力值计算模型的步骤,所述构建注意力值计算模型的步骤包括:
    按照疲劳类型将疲劳数值集合中的疲劳数值划分为对应疲劳类型 的疲劳数值子集合;
    分别为每种疲劳类型设置类型权重值,将类型权重值与该类型权重值对应的疲劳数值子集合中的每个疲劳数值的乘积作为对应疲劳类型在取值为该疲劳数值时的注意力分量;
    利用机器学习方法,将疲劳类型和疲劳数值作为输入,将注意力分量作为输出,训练得到注意力值计算模型。
  5. 根据权利要求1所述的方法,其特征在于,所述判断注意力数值所在的注意力阈值范围,根据所述注意力阈值范围发出告警信号包括:
    获取注意力阈值范围,所述注意力阈值范围包括由预先设置的注意力阈值构成的多个注意力阈值子范围,每个注意力阈值子范围对应一个告警信号,所述告警信号包括:声音告警信号、图像告警信号;
    响应于注意力数值对应的注意力阈值子范围,发出对应该注意力阈值子范围的告警信号。
  6. 根据权利要求5所述的方法,其特征在于,所述判断所述注意力数值所在的注意力阈值范围,根据所述注意力阈值范围发出告警信号还包括:
    响应于注意力数值不在所述注意力阈值范围内时,启动自动驾驶模式。
  7. 一种用于发出告警信号的装置,其特征在于,所述装置包括:
    行驶状态数据获取单元,用于实时获取行驶状态数据,所述行驶状态数据包括驾驶员状态数据、驾驶环境数据和车辆状态数据,所述驾驶员状态数据包括手部动作数据、嘴部动作数据、头部动作数据和眼部动作数据,所述驾驶环境数据包括车道线数据、车辆距离数据,所述车辆状态数据包括速度数据、方向数据;
    疲劳数值获取单元,用于将所述行驶状态数据导入预先训练的疲劳识别模型,得到疲劳类型和对应疲劳类型的疲劳数值,所述疲劳识 别模型用于通过行驶状态数据确定疲劳类型和疲劳数值,所述疲劳数值用于表征对应疲劳类型的程度;
    注意力数值获取单元,用于将所述疲劳类型和疲劳数值导入预先训练的注意力值计算模型,得到注意力数值,所述注意力值计算模型用于通过疲劳类型和疲劳数值计算注意力数值,所述注意力数值用于表征驾驶员的疲劳程度;
    告警单元,用于判断注意力数值所在的注意力阈值范围,根据所述注意力阈值范围发出告警信号,所述注意力阈值范围通过预设的注意力阈值确定。
  8. 根据权利要求7所述的装置,其特征在于,所述装置包括疲劳识别模型构建单元,用于构建疲劳识别模型,所述疲劳识别模型构建单元包括:
    数据提取子单元,用于分别从驾驶员状态数据集合、驾驶环境数据集合和车辆状态数据集合中提取时间上同步的驾驶员状态数据、驾驶环境数据和车辆状态数据;
    特征提取子单元,用于分别对驾驶员状态数据、驾驶环境数据和车辆状态数据进行特征提取,得到驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据;
    疲劳类型确定子单元,用于根据所述驾驶员状态特征数据确定疲劳类型,所述疲劳类型包括以下至少一项:闭眼类型、打哈欠类型、视线偏移类型、打电话类型;
    疲劳数值计算子单元,用于通过驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据计算疲劳数值;
    疲劳识别模型构建子单元,用于利用机器学习方法,将驾驶员状态特征数据、驾驶环境特征数据和车辆状态特征数据作为输入,将疲劳类型和对应的疲劳类型的疲劳数值作为输出,训练得到疲劳识别模型。
  9. 根据权利要求8所述的装置,其特征在于,所述疲劳数值计算 子单元包括:
    阈值范围获取模块,用于获取时间阈值范围、距离阈值范围和数据变化量阈值范围,其中,时间阈值范围包括由预先设置的时间阈值构成的多个时间阈值子范围,距离阈值范围包括由预先设置的距离阈值构成的多个距离阈值子范围,数据变化量阈值范围包括由预先设置的数据变化量阈值构成的多个数据变化量阈值子范围,每个时间阈值子范围对应一个时间权重值,每个距离阈值子范围对应一个距离权重值,每个数据变化量阈值子范围对应一个数据变化权重值;
    数据测量模块,用于分别测量驾驶员状态特征数据对应的每种疲劳类型的疲劳类型持续时间、驾驶环境特征数据对应的距离变化值和车辆状态特征数据对应的数据变化值,其中,疲劳类型持续时间通过驾驶员状态特征数据对应的指定疲劳类型对应的动作的持续时间来表征,距离变化值通过驾驶环境特征数据在单位时间内的变化量来表征,数据变化值通过车辆状态特征数据在单位时间内的变化量来表征;
    权重值确定模块,用于确定疲劳类型持续时间、距离变化值和数据变化值分别对应的时间阈值范围的时间阈值子范围的时间权重值、距离阈值范围的距离阈值子范围的距离权重值和数据变化量阈值范围的数据变化量阈值子范围的数据变化权重值;
    疲劳数值计算模块,用于对时间权重值、距离权重值和数据变化权重值加权求和得到对应疲劳类型的疲劳数值。
  10. 根据权利要求7所述的装置,其特征在于,所述装置包括注意力值计算模型构建单元,用于构建注意力值计算模型,所述注意力值计算模型构建单元包括:
    数据划分子单元,用于按照疲劳类型将疲劳数值集合中的疲劳数值划分为对应疲劳类型的疲劳数值子集合;
    注意力分量计算子单元,用于分别为每种疲劳类型设置类型权重值,将类型权重值与该类型权重值对应的疲劳数值子集合中的每个疲劳数值的乘积作为对应疲劳类型在取值为该疲劳数值时的注意力分量;
    注意力值计算模型构建子单元,用于利用机器学习方法,将疲劳 类型和疲劳数值作为输入,将注意力分量作为输出,训练得到注意力值计算模型。
  11. 根据权利要求7所述的装置,其特征在于,所述告警单元包括:
    注意力阈值范围获取子单元,用于获取注意力阈值范围,所述注意力阈值范围包括由预先设置的注意力阈值构成的多个注意力阈值子范围,每个注意力阈值子范围对应一个告警信号,所述告警信号包括:声音告警信号、图像告警信号;
    告警子单元,用于响应于注意力数值对应的注意力阈值子范围,发出对应该注意力阈值子范围的告警信号。
  12. 根据权利要求11所述的装置,其特征在于,所述告警单元还包括:
    响应于注意力数值不在所述注意力阈值范围内时,启动自动驾驶模式。
  13. 一种服务器,包括:
    一个或多个处理器;
    存储器,用于存储一个或多个程序,
    当所述一个或多个程序被所述一个或多个处理器执行时,使得所述一个或多个处理器执行权利要求1至6中任一所述的方法。
  14. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现如权利要求1至6中任一所述的方法。
PCT/CN2018/099158 2017-11-16 2018-08-07 用于发出告警信号的方法及装置 WO2019095733A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711140773.X 2017-11-16
CN201711140773.XA CN107742399B (zh) 2017-11-16 2017-11-16 用于发出告警信号的方法及装置

Publications (1)

Publication Number Publication Date
WO2019095733A1 true WO2019095733A1 (zh) 2019-05-23

Family

ID=61234884

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/099158 WO2019095733A1 (zh) 2017-11-16 2018-08-07 用于发出告警信号的方法及装置

Country Status (2)

Country Link
CN (1) CN107742399B (zh)
WO (1) WO2019095733A1 (zh)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107742399B (zh) * 2017-11-16 2022-02-22 百度在线网络技术(北京)有限公司 用于发出告警信号的方法及装置
CN109582529A (zh) * 2018-09-29 2019-04-05 阿里巴巴集团控股有限公司 一种报警阈值的设置方法及装置
CN109191789A (zh) * 2018-10-18 2019-01-11 斑马网络技术有限公司 疲劳驾驶检测方法、装置、终端和存储介质
CN110191011A (zh) * 2019-04-15 2019-08-30 厦门科灿信息技术有限公司 基于数据中心监控系统的智能设备监测方法、装置及设备
CN110458191B (zh) * 2019-07-05 2024-04-12 中国平安财产保险股份有限公司 疲劳状态判断方法、装置、计算机设备及存储介质
CN111062300A (zh) * 2019-12-11 2020-04-24 深圳市赛梅斯凯科技有限公司 驾驶状态检测方法、装置、设备及计算机可读存储介质
CN112365680A (zh) * 2020-10-29 2021-02-12 福信富通科技股份有限公司 一种基于ai识别的主动安全预警方法及系统
CN113744499B (zh) * 2021-08-12 2023-05-30 科大讯飞股份有限公司 一种疲劳预警方法、眼镜、系统和计算机可读存储介质
CN113460060B (zh) * 2021-08-20 2022-12-09 武汉霖汐科技有限公司 驾驶员疲劳程度评估系统、控制方法及存储介质

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509418B (zh) * 2011-10-11 2013-11-13 东华大学 一种多传感信息融合的疲劳驾驶评估预警方法及装置
CN104408879B (zh) * 2014-11-19 2017-02-01 湖南工学院 疲劳驾驶预警处理方法、装置及系统
JP6154411B2 (ja) * 2015-02-27 2017-06-28 本田技研工業株式会社 車両の注意喚起装置
US10354539B2 (en) * 2015-06-08 2019-07-16 Biofli Technologies, Inc. Situational awareness analysis and fatigue management system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6927694B1 (en) * 2001-08-20 2005-08-09 Research Foundation Of The University Of Central Florida Algorithm for monitoring head/eye motion for driver alertness with one camera
CN104112334A (zh) * 2013-04-16 2014-10-22 百度在线网络技术(北京)有限公司 疲劳驾驶预警方法及系统
CN103606245A (zh) * 2013-11-08 2014-02-26 北京工业大学 基于蓝牙脑电耳机和安卓手机的疲劳驾驶检测预警系统
CN104794855A (zh) * 2014-01-22 2015-07-22 径卫视觉科技(上海)有限公司 驾驶员注意力综合评估装置
CN107742399A (zh) * 2017-11-16 2018-02-27 百度在线网络技术(北京)有限公司 用于发出告警信号的方法及装置

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110705502A (zh) * 2019-10-14 2020-01-17 首约科技(北京)有限公司 一种驾驶员监控设备优化方法
CN113034895A (zh) * 2021-02-04 2021-06-25 招商局公路网络科技控股股份有限公司 一种etc门架系统、高速公路疲劳驾驶预警方法及装置
CN114312800A (zh) * 2022-02-14 2022-04-12 深圳市发掘科技有限公司 车辆安全驾驶方法、装置、计算机设备及存储介质

Also Published As

Publication number Publication date
CN107742399B (zh) 2022-02-22
CN107742399A (zh) 2018-02-27

Similar Documents

Publication Publication Date Title
WO2019095733A1 (zh) 用于发出告警信号的方法及装置
JP6953464B2 (ja) 情報プッシュ方法及び装置
WO2020107974A1 (zh) 用于无人驾驶车的避障方法和装置
US10391406B2 (en) Apparatus and method for safe drive inducing game
CN109455180B (zh) 用于控制无人车的方法和装置
US20210132614A1 (en) Control method and apparatus for autonomous vehicle
EP3675121A2 (en) Two-way in-vehicle virtual personal assistant
US11003926B2 (en) Method and apparatus for recognizing boundary of traffic sign
CN110119725B (zh) 用于检测信号灯的方法及装置
US11017270B2 (en) Method and apparatus for image processing for vehicle
US11151880B1 (en) Systems and methods for providing guidance to vehicle drivers regarding predicted lane-change behavior of other vehicle drivers
CN112630799B (zh) 用于输出信息的方法和装置
CN116453679A (zh) 基于混合测试识别痴呆症的技术
CN110254442B (zh) 用于控制车辆显示的方法和装置
CN110853364B (zh) 数据监控方法和装置
CN110097600B (zh) 用于识别交通标志牌的方法及装置
CN110660217B (zh) 用于检测信息安全的方法及装置
CN117261930B (zh) 一种疲劳驾驶的预警方法和装置
CN110838027A (zh) 车辆使用满意度的确定方法及装置、存储介质、计算设备
EP4102481A1 (en) Method and apparatus for controlling vehicle, device, medium, and program product
CN111866056B (zh) 信息推送方法、装置、电子设备和存储介质
CN115817459B (zh) 车辆控制方法、装置、电子设备和计算机可读介质
EP3965017A1 (en) Knowledge distillation for autonomous vehicles
EP4009251B1 (en) Information output device and information output method
EP3892958A1 (en) Method and apparatus for generating position information, device, medium and computer program product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18877754

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07.10.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18877754

Country of ref document: EP

Kind code of ref document: A1