CN110638474A - Method, system and equipment for detecting driving state and readable storage medium - Google Patents

Method, system and equipment for detecting driving state and readable storage medium

Info

Publication number
CN110638474A
Authority
CN
China
Prior art keywords
driver
driving state
information
gaze
driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910913808.1A
Other languages
Chinese (zh)
Inventor
仲崇亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZKTeco Co Ltd
Original Assignee
ZKTeco Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZKTeco Co Ltd filed Critical ZKTeco Co Ltd
Priority to CN201910913808.1A priority Critical patent/CN110638474A/en
Publication of CN110638474A publication Critical patent/CN110638474A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1103 Detecting eye twinkling
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Developmental Disabilities (AREA)
  • Physiology (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • General Physics & Mathematics (AREA)
  • Child & Adolescent Psychology (AREA)
  • Dentistry (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a method for detecting a driving state, which comprises the following steps: acquiring three-dimensional face image information in a designated area, and establishing a 3D face model from the three-dimensional face image information; determining the gaze position of the driver's sight line using the 3D face model; judging whether the gaze position is within a preset range; if not, judging whether the gaze time exceeds a first threshold, the gaze time being the time for which the driver's sight line stays at the gaze position; and if the gaze time exceeds the first threshold, determining that the driver is in a fatigue driving state. Because the three-dimensional facial image information is acquired with a 3D image acquisition technique and the established 3D face model is used to detect whether the driver is in a fatigue driving state, the accuracy of driving state detection is improved and misjudgments of fatigue driving are reduced. The application also provides a system and a device for detecting the driving state, and a computer-readable storage medium, which have the same beneficial effects.

Description

Method, system and equipment for detecting driving state and readable storage medium
Technical Field
The present disclosure relates to the field of driving state detection, and more particularly, to a method, a system, a device, and a computer-readable storage medium for driving state detection.
Background
The development of society has brought a rapid increase in the number of vehicles, and driving safety has become something people must consider when travelling. For the driver, safe driving is not only a responsibility to oneself but also a matter of respect for the lives of others. However, driving a vehicle is not a simple task, and driving fatigue occurs frequently, especially on long-distance trips.
Driving fatigue affects the driver's attention, sensation, perception, thinking, judgment, consciousness, decision-making and movement. A driver who continues to drive after becoming fatigued may feel sleepy, have weak limbs and reduced awareness and judgment, and may even become absent-minded or suffer momentary memory lapses; actions are delayed or premature, and operations are paused or corrected at improper moments. These unsafe factors make road traffic accidents likely.
In the prior art, whether a driver is in a fatigue driving state is judged by detecting facial expressions and body movements, for example by detecting how long the driver's eyes remain closed, the frequency of yawning, the frequency of head drooping, and whether the body leans forward. However, conventional face recognition is based on 2D image acquisition, so its recognition accuracy is low and serious misjudgments occur.
Therefore, how to improve the accuracy of the driving state detection is a technical problem that needs to be solved by those skilled in the art at present.
Disclosure of Invention
An object of the present application is to provide a method, system, device and computer-readable storage medium for driving state detection, which are used to improve the accuracy of driving state detection.
In order to solve the above technical problem, the present application provides a method for detecting a driving state, including:
acquiring three-dimensional face image information in a designated area, and establishing a 3D face model by using the three-dimensional face image information;
determining a gaze location of a driver's gaze using the 3D face model;
judging whether the gaze position is within a preset range;
if not, judging whether the gaze time exceeds a first threshold value; wherein the gaze time is a time the driver's gaze stays at the gaze location;
determining that the driver is in a fatigue driving state if the gaze time exceeds the first threshold.
Optionally, after acquiring the three-dimensional face image information in the designated area, the method further includes:
determining position information of each preset feature point in the three-dimensional face image information;
determining facial motion data from the location information;
detecting whether the driver is in the fatigue driving state according to the facial motion data.
Optionally, when the gaze position is within the preset range, the method further includes:
acquiring running information of a current vehicle and vehicle distance information between the current vehicle and surrounding vehicles;
detecting whether the driver is in a fatigue driving state according to the driving information;
if not, judging whether the vehicle distance information is smaller than a second threshold value;
if the vehicle distance information is smaller than the second threshold value, determining that the driver is in the fatigue driving state;
the driving information comprises a time interval of lane departure of the current vehicle, a speed change value of the current vehicle in a preset time interval and a vehicle speed when the current vehicle turns.
Optionally, the method further includes:
recording driving behavior data of the driver;
establishing an exclusive driving state detection model of the driver according to the driving behavior data, the driving information and the vehicle distance information;
and detecting whether the driver is in the fatigue driving state or not by using the exclusive driving state detection model.
Optionally, the method further includes:
uploading the driving behavior data, the driving information and the vehicle distance information to a cloud platform so that the cloud platform records the driving behavior data, the driving information and the vehicle distance information of the driver.
Optionally, after determining that the driver is in a fatigue driving state, the method further includes:
and sending out an alarm signal.
Optionally, the sending the alarm signal includes:
acquiring running information and vehicle distance information, and calculating the danger level of the current vehicle according to the running information and the vehicle distance information;
and sending out an alarm signal corresponding to the danger level.
The present application further provides a system for driving state detection, the system comprising:
the first model establishing module is used for acquiring three-dimensional face image information in a designated area and establishing a 3D face model by using the three-dimensional face image information;
the sight line simulation module is used for simulating the gaze position of the sight line of the driver by utilizing the 3D face model;
the first judgment module is used for judging whether the gaze position is within a preset range;
the second judging module is used for judging whether the gaze time exceeds a first threshold value or not when the gaze position is judged not to be in the preset range;
a first determination module to determine that the driver is in a fatigue driving state if the gaze time exceeds the first threshold.
The present application also provides a driving state detection device, which includes:
a memory for storing a computer program;
a processor for implementing the steps of the method of driving state detection as claimed in any one of the above when said computer program is executed.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the method of driving status detection as defined in any one of the preceding claims.
The application provides a method for detecting a driving state, which comprises the following steps: acquiring three-dimensional face image information in a designated area, and establishing a 3D face model from the three-dimensional face image information; determining the gaze position of the driver's sight line using the 3D face model; judging whether the gaze position is within a preset range; if not, judging whether the gaze time exceeds a first threshold, the gaze time being the time for which the driver's sight line stays at the gaze position; and if the gaze time exceeds the first threshold, determining that the driver is in a fatigue driving state.
According to the above technical solution, a 3D face model is established from the three-dimensional face image information acquired in the designated area, the gaze position of the driver's sight line is then determined from the 3D face model, and it is judged whether that gaze position is within a preset range determined from normal driving conditions; if the gaze position is not within that range and the gaze time is too long, the driver is determined to be in a fatigue driving state. Because the three-dimensional facial image information is acquired with a 3D image acquisition technique and the established 3D face model is used to detect whether the driver is in a fatigue driving state, the accuracy of driving state detection is improved and misjudgments of fatigue driving are reduced. The application also provides a system, a device and a computer-readable storage medium for driving state detection, which have the same beneficial effects and are not repeated here.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art will be briefly introduced below. It is obvious that the drawings in the following description are only embodiments of the present application, and for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a flowchart of a method for detecting a driving state according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of another driving state detection method provided by an embodiment of the present application;
fig. 3 is a structural diagram of a driving state detection system according to an embodiment of the present application;
FIG. 4 is a block diagram of another driving status detection system provided in an embodiment of the present application;
fig. 5 is a structural diagram of a driving state detection device according to an embodiment of the present application.
Detailed Description
The core of the application is to provide a method, a system, equipment and a computer readable storage medium for detecting the driving state, which are used for improving the precision of the driving state detection.
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The factors causing fatigue driving are various. A driver's fatigue is mainly fatigue of the nerves and sense organs, together with fatigue of the limbs caused by poor blood circulation from keeping a fixed posture for a long time. When a driver sits in a fixed seat for a long time, movement is restricted, attention is highly concentrated on judging stimuli outside the vehicle, and the mental state is highly tense, so that driving fatigue phenomena such as blurred vision, back soreness, slow reaction and inflexible operation appear. When slightly fatigued, a driver may shift gears late or inaccurately; when moderately fatigued, the driver's operations become sluggish and an operation is sometimes even forgotten; when severely fatigued, the driver often operates subconsciously or dozes off briefly, and in serious cases loses control of the vehicle. A fatigued driver may experience blurred vision, back pain, stiff movements, swollen hands and feet, poor concentration, slow reaction, muddled thinking, distraction, anxiety and impatience. If the driver insists on continuing to drive in this state, a traffic accident may occur.
In the prior art, whether a driver is in a fatigue driving state is judged by detecting facial expressions and body movements. Because conventional face recognition is based on 2D image acquisition, its recognition accuracy is low and serious misjudgments occur. The driving state detection method provided by the application is used to solve this technical problem.
referring to fig. 1, fig. 1 is a flowchart illustrating a driving state detection method according to an embodiment of the present disclosure.
The method specifically comprises the following steps:
s101: acquiring three-dimensional face image information in a designated area, and establishing a 3D face model by using the three-dimensional face image information;
Because face recognition is easily affected by factors such as environment, posture and expression, and three-dimensional face image information provides more complete and richer recognition information than two-dimensional face image information, three-dimensional face recognition is more robust to illumination, posture, expression and similar factors. The application therefore uses the acquired three-dimensional face image information to detect the driver's driving state, so as to improve the detection accuracy;
in this step, the face information acquisition device may specifically be a depth camera: the depth camera measures the depth distance of the three-dimensional face image information, and the system then establishes a 3D face model from the three-dimensional face image information;
optionally, the three-dimensional facial image information in the designated area may be acquired through a binocular vision face recognition algorithm;
The binocular vision face recognition algorithm is based on a binocular stereoscopic vision system with a simple structure: two-dimensional images of the face are captured by adjusting the relative positions of the left and right cameras and the face, so that image acquisition is completed economically and efficiently. In the recognition process, two-dimensional feature points in the images are located automatically with the Active Shape Model (ASM) technique, and the three-dimensional coordinates of the feature points are obtained by combining the intrinsic and extrinsic parameters of the cameras, which avoids a complicated full three-dimensional reconstruction of the face. Finally, a Back Propagation (BP) neural network is used for recognition, giving a higher face recognition rate.
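As an illustrative sketch only (not part of the patent text), recovering the 3D coordinates of ASM feature points from a calibrated stereo pair could be done by triangulation, for example with OpenCV's cv2.triangulatePoints; the function and variable names below are assumptions.

```python
import numpy as np
import cv2

def triangulate_feature_points(pts_left, pts_right, P_left, P_right):
    """Recover 3D facial feature points from a calibrated stereo pair.

    pts_left, pts_right: (N, 2) arrays of matching 2D feature points (e.g. from an ASM fit).
    P_left, P_right:     (3, 4) projection matrices (intrinsics x extrinsics) of the two cameras.
    Returns an (N, 3) array of 3D points expressed in the reference camera frame.
    """
    pts_l = np.asarray(pts_left, dtype=np.float64).T    # shape (2, N)
    pts_r = np.asarray(pts_right, dtype=np.float64).T
    pts_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # homogeneous, (4, N)
    return (pts_h[:3] / pts_h[3]).T                      # dehomogenize -> (N, 3)
```

The same feature-point coordinates could then feed the BP-network classifier mentioned above.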
Preferably, after the three-dimensional face image information within the specified area is acquired, the following steps may also be performed:
determining position information of each preset feature point in the three-dimensional face image information;
determining facial motion data from the location information;
detecting whether the driver is in a fatigue driving state according to the facial motion data;
according to this embodiment of the application, after the three-dimensional face image information is acquired, the facial motion data of the driver is determined directly from the three-dimensional face image information, and whether the driver is in a fatigue driving state is detected from the facial motion data; for example, the blink frequency of the driver can be determined from the facial image information, and when the blink frequency falls outside a preset threshold range, the driver can be determined to be in a fatigue driving state.
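As a hedged illustration of the blink-frequency idea above (one common technique, not the patent's prescribed implementation), blinks can be detected from eye feature points with an eye-aspect-ratio measure; the landmark ordering and thresholds below are assumptions.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """eye: (6, 3) array of eye-contour feature points, ordered
    [outer corner, upper-1, upper-2, inner corner, lower-2, lower-1]."""
    v1 = np.linalg.norm(eye[1] - eye[5])   # vertical openings
    v2 = np.linalg.norm(eye[2] - eye[4])
    h = np.linalg.norm(eye[0] - eye[3])    # horizontal eye width
    return (v1 + v2) / (2.0 * h)

def blink_frequency(ear_series, fps, ear_threshold=0.2):
    """Count blinks per minute from a sequence of per-frame EAR values."""
    closed = np.asarray(ear_series) < ear_threshold
    blinks = np.count_nonzero(~closed[:-1] & closed[1:])   # open -> closed transitions
    minutes = len(ear_series) / (fps * 60.0)
    return blinks / minutes if minutes > 0 else 0.0
```

A blink rate outside the driver's preset threshold range would then flag possible fatigue, as described above.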
S102: determining a gaze location of a driver's gaze using a 3D face model;
optionally, the pupil positions of the driver's two eyes can be determined from the 3D face model, the sight-line direction of each eye can be determined from its pupil position, and the intersection of the two sight lines is then taken as the gaze position of the driver's sight line.
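In practice the two sight-line rays rarely intersect exactly in 3D, so one reasonable reading of this step is to take the midpoint of their common perpendicular as the gaze position; the sketch below assumes that interpretation and uses hypothetical names.

```python
import numpy as np

def gaze_point(origin_l, dir_l, origin_r, dir_r):
    """Approximate the intersection of two sight-line rays as the midpoint of the
    shortest segment between them.

    origin_*: 3D pupil (or eyeball-center) positions; dir_*: gaze direction vectors."""
    d1 = dir_l / np.linalg.norm(dir_l)
    d2 = dir_r / np.linalg.norm(dir_r)
    w0 = origin_l - origin_r
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:                 # sight lines nearly parallel
        t1, t2 = 0.0, e / c
    else:
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
    p1 = origin_l + t1 * d1               # closest point on the left sight line
    p2 = origin_r + t2 * d2               # closest point on the right sight line
    return (p1 + p2) / 2.0
```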
S103: judging whether the gaze position is within a preset range;
if not, the step S104 is executed;
in this step, after the gaze position of the driver's sight line has been determined from the 3D face model, it is judged whether the gaze position is within a preset range. The preset range is the range in which the sight line falls when the driver drives the vehicle normally, typically a circular area on the windshield directly in front of the driver. If the driver is in a fatigue driving state, the sight line may stay at an abnormal position for a long time; therefore, when the gaze position is detected not to be within the preset range, it is judged whether the gaze time exceeds a first threshold, and if the gaze time exceeds the first threshold the driver is considered to be in a fatigue driving state;
in a specific embodiment, a corresponding threshold range can be set according to different personal information of the driver, so that the detection precision of the driving state is further improved;
for example, the preset range may be shifted slightly upward for a taller driver and slightly downward for a shorter driver;
furthermore, a respective preset range can be set for each driver according to that driver's driving habits, further improving the driving state detection accuracy;
optionally, when the gaze position is within the preset range, it is proved that the sight line of the driver is normal at this time, and at this time, it may be determined whether the driver is in a fatigue driving state according to the driving information of the vehicle.
S104: determining whether the gaze time exceeds a first threshold;
if yes, go to step S105;
The gaze time is the time for which the driver's sight line stays at the gaze position. When the gaze time exceeds the first threshold, this indicates that the driver's sight line has deviated from the normal driving position for a prolonged time, and the driver is therefore determined to be in a fatigue driving state;
the first threshold mentioned here is the maximum time for which the sight line is allowed to deviate from the normal position in a normal driving state. The first threshold may be set by the manufacturer or determined by a skilled technician according to a specific calculation method, and the present application does not specifically limit how the first threshold is determined;
optionally, when the gaze time does not exceed the first threshold, the driver's sight line has only briefly left the normal driving position, for example to check the surroundings through the rear-view mirror or a side mirror, so the driver cannot be determined to be in a fatigue driving state. In this case, a prompt about the sight-line deviation can be issued so that the driver returns his or her gaze in time and maintains a normal driving state.
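The preset-range and gaze-time logic of steps S103 and S104 could be sketched as follows; this is only an illustration under the assumption that the preset range is a circle on the windshield plane and that gaze samples arrive with timestamps, with all names being hypothetical.

```python
import math
import time

class GazeMonitor:
    """Track how long the driver's sight line dwells outside a circular preset range."""

    def __init__(self, center_xy, radius, first_threshold_s):
        self.center_xy = center_xy              # circle center on the windshield plane
        self.radius = radius
        self.first_threshold_s = first_threshold_s
        self._outside_since = None

    def update(self, gaze_xy, now=None):
        """Return True if the driver should be judged fatigued at this sample."""
        now = time.monotonic() if now is None else now
        dx = gaze_xy[0] - self.center_xy[0]
        dy = gaze_xy[1] - self.center_xy[1]
        inside = math.hypot(dx, dy) <= self.radius
        if inside:                              # S103: gaze back within the preset range
            self._outside_since = None
            return False
        if self._outside_since is None:         # gaze has just left the preset range
            self._outside_since = now
        dwell = now - self._outside_since       # S104: gaze time at the abnormal position
        return dwell > self.first_threshold_s
```

A per-driver preset range, as suggested above, would simply mean constructing the monitor with driver-specific center and radius values.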
S105: it is determined that the driver is in a fatigue driving state.
Preferably, after the driver is determined to be in a fatigue driving state, an alarm signal can be issued so that the driver is alerted in time and accidents are avoided;
further, the sending of the alarm signal mentioned herein may specifically be:
acquiring the driving information and the vehicle distance information, and calculating the danger level of the current vehicle according to the driving information and the vehicle distance information;
and sending out an alarm signal corresponding to the danger level.
In this step, the application couples the vehicle's danger level with the alarm signal: an alarm corresponding to the current danger level is issued, so that the driver notices the dangerous condition of the vehicle in time, adjusts promptly, and avoids an accident.
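One hedged way to read the coupling between danger level and alarm signal is a simple scoring function over the driving information and vehicle distance information; the weights, levels and field names below are illustrative assumptions, not values taken from the patent.

```python
def danger_level(lane_departure_interval_s, speed_change_kmh, turn_speed_kmh, gap_m,
                 min_gap_m=20.0):
    """Map driving information and vehicle distance to a danger level 0 (safe) .. 3 (critical)."""
    score = 0
    if lane_departure_interval_s < 10.0:   # lane departures happening frequently
        score += 1
    if abs(speed_change_kmh) > 30.0:       # abrupt acceleration or braking
        score += 1
    if turn_speed_kmh > 40.0:              # turning at high speed
        score += 1
    if gap_m < min_gap_m:                  # following distance below the second threshold
        score += 1
    return min(score, 3)

def alarm_signal(level):
    """Choose an alarm action matched to the danger level (placeholder actions)."""
    actions = {0: None, 1: "chime", 2: "voice warning", 3: "continuous buzzer"}
    return actions[level]
```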
Based on the above technical solution, the driving state detection method provided by the application establishes a 3D face model from the three-dimensional face image information acquired in the designated area, determines the gaze position of the driver's sight line from the 3D face model, and judges whether that gaze position is within a preset range determined from normal driving conditions; if the gaze position is not within that range and the gaze time is too long, the driver is determined to be in a fatigue driving state. Because the three-dimensional facial image information is acquired with a 3D image acquisition technique and the established 3D face model is used to detect whether the driver is in a fatigue driving state, the accuracy of driving state detection is improved and misjudgments of fatigue driving are reduced.
With respect to step S103 of the previous embodiment, when the gaze position of the driver's sight line is within the preset range, the steps shown in fig. 2 may be further performed, as described below with reference to fig. 2.
Referring to fig. 2, fig. 2 is a flowchart illustrating another driving state detection method according to an embodiment of the present disclosure.
The method specifically comprises the following steps:
s201: acquiring running information of a current vehicle and vehicle distance information between the current vehicle and surrounding vehicles;
the travel information mentioned here includes a time interval at which the current vehicle makes a lane departure, a speed change value of the current vehicle within a preset time interval, and a vehicle speed at the time of turning of the current vehicle.
S202: detecting whether the driver is in a fatigue driving state according to the driving information;
if not, go to step S203;
in this step, whether the driver is in a fatigue driving state is detected according to the driving information; by capturing improper driving behaviors of the driver, such as lane departure, rapid acceleration and high-speed turning, whether the driver is fatigued can be judged indirectly from how well the driver is controlling the vehicle;
optionally, when the driver is detected to be in a fatigue driving state, a warning signal is sent out.
S203: judging whether the vehicle distance information is smaller than a second threshold value;
if yes, the process proceeds to step S204.
The second threshold mentioned here is the minimum distance allowed between the current vehicle and surrounding vehicles in a normal driving state. The second threshold may be set by the manufacturer or determined by a skilled technician according to a specific calculation method, and the present application does not specifically limit how the second threshold is determined;
in this step, when the vehicle distance information is smaller than the second threshold, this indicates that the current vehicle is too close to a surrounding vehicle and the driver has not noticed, and it is therefore determined that the driver is in a fatigue driving state.
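The decision flow of steps S201 to S204 could be sketched roughly as follows, assuming the driving information is available as the three quantities named in the text; the thresholds and field names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class DrivingInfo:
    lane_departure_interval_s: float   # time interval between lane departures of the current vehicle
    speed_change_kmh: float            # speed change within the preset time interval
    turn_speed_kmh: float              # vehicle speed while turning

def fatigued_by_driving_info(info: DrivingInfo,
                             departure_limit_s: float = 10.0,
                             speed_change_limit_kmh: float = 30.0,
                             turn_speed_limit_kmh: float = 40.0) -> bool:
    """S202: judge fatigue indirectly from how the vehicle is being controlled."""
    return (info.lane_departure_interval_s < departure_limit_s
            or abs(info.speed_change_kmh) > speed_change_limit_kmh
            or info.turn_speed_kmh > turn_speed_limit_kmh)

def fatigued_when_gaze_normal(info: DrivingInfo, gap_m: float, second_threshold_m: float) -> bool:
    """S202-S204: if the driving information looks normal, fall back to checking the
    vehicle distance against the second threshold."""
    if fatigued_by_driving_info(info):
        return True                       # S202 already indicates fatigue
    return gap_m < second_threshold_m     # S203/S204: too close to a surrounding vehicle
```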
S204: determining that a driver is in a fatigue driving state;
as a preferred embodiment, the method can also record driving behavior data of the driver, establish an exclusive driving state detection model of the driver according to the driving behavior data, the driving information and the vehicle distance information, and then detect whether the driver is in a fatigue driving state by using the exclusive driving state detection model;
furthermore, the driving behavior data, the driving information and the vehicle distance information can be uploaded to the cloud platform, so that the cloud platform can record the driving behavior data, the driving information and the vehicle distance information of the driver;
since different drivers may have different driving habits, the application records the driver's driving behavior data, establishes an exclusive driving state detection model of the driver from the driving behavior data, the driving information and the vehicle distance information, and then uses this exclusive model to detect whether the driver is in a fatigue driving state, thereby further improving the driving state detection accuracy.
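As an illustrative sketch only, the exclusive (per-driver) detection model could, for instance, be a per-driver baseline over the same features, flagging fatigue when current behavior deviates strongly from that driver's own history; the statistics and threshold here are assumptions, not the patent's specification.

```python
import numpy as np

class ExclusiveDrivingStateModel:
    """Per-driver fatigue model: learn the driver's normal feature distribution,
    then flag samples that deviate strongly from it."""

    def __init__(self, z_threshold=3.0):
        self.z_threshold = z_threshold
        self.mean = None
        self.std = None

    def fit(self, history):
        """history: (N, F) array of past feature vectors recorded for this driver
        (e.g. lane-departure interval, speed change, turning speed, vehicle gap)."""
        history = np.asarray(history, dtype=float)
        self.mean = history.mean(axis=0)
        self.std = history.std(axis=0) + 1e-6   # avoid division by zero
        return self

    def is_fatigued(self, features):
        """True if the current sample deviates markedly from the driver's own baseline."""
        z = np.abs((np.asarray(features, dtype=float) - self.mean) / self.std)
        return bool(np.max(z) > self.z_threshold)
```

The recorded history could equally be the data uploaded to the cloud platform mentioned above, with the model refitted there.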
Referring to fig. 3, fig. 3 is a structural diagram of a driving state detection system according to an embodiment of the present disclosure.
The system may include:
a first model building module 100, configured to obtain three-dimensional face image information in a specified region, and build a 3D face model using the three-dimensional face image information;
a sight line simulation module 200 for simulating a gaze position of a driver's sight line using the 3D face model;
the first judging module 300 is configured to judge whether the gaze position is within a preset range;
a second judging module 400, configured to, when the gaze fixation position is not within the preset range, judge whether the gaze fixation time exceeds a first threshold;
a first determination module 500 for determining that the driver is in a fatigue driving state if the gaze time exceeds a first threshold.
Referring to fig. 4, fig. 4 is a structural diagram of another driving state detection system according to an embodiment of the present disclosure.
The system may also include:
the second determining module is used for determining the position information of each preset feature point in the three-dimensional face image information;
a third determining module for determining face movement data according to the position information;
the first detection module is used for detecting whether the driver is in a fatigue driving state or not according to the facial motion data.
The system may also include:
the information acquisition module is used for acquiring the running information of the current vehicle and the distance information between the current vehicle and the surrounding vehicles;
the second detection module is used for detecting whether the driver is in a fatigue driving state or not according to the driving information;
the third judging module is used for judging whether the vehicle distance information is smaller than a second threshold value when the driver is not in a fatigue driving state;
the fourth determining module is used for determining that the driver is in a fatigue driving state when the distance information is smaller than the second threshold value;
the driving information comprises a time interval of lane departure of the current vehicle, a speed change value of the current vehicle in a preset time interval and a vehicle speed when the current vehicle turns.
The system may also include:
the recording module is used for recording driving behavior data of a driver;
the second model establishing module is used for establishing an exclusive driving state detection model of the driver according to the driving behavior data, the driving information and the vehicle distance information;
and the third detection module is used for detecting whether the driver is in a fatigue driving state by utilizing the exclusive driving state detection model.
The system may also include:
and the uploading module is used for uploading the driving behavior data, the driving information and the vehicle distance information to the cloud platform so that the cloud platform records the driving behavior data, the driving information and the vehicle distance information of the driver.
The system may also include:
and the alarm module is used for sending out an alarm signal.
Further, the alarm module may include:
the calculation submodule is used for acquiring the driving information and the distance information and calculating the danger level of the current vehicle according to the driving information and the distance information;
and the alarm submodule is used for sending out an alarm signal corresponding to the danger level.
Since the embodiment of the system part corresponds to the embodiment of the method part, the embodiment of the system part is described with reference to the embodiment of the method part, and is not repeated here.
Referring to fig. 5, fig. 5 is a structural diagram of a driving state detection device according to an embodiment of the present application.
The driving state detection apparatus 600 may have relatively large differences due to different configurations or performances, and may include one or more processors (CPUs) 622 (e.g., one or more processors) and a memory 632, one or more storage media 630 (e.g., one or more mass storage devices) storing applications 642 or data 644. Memory 632 and storage medium 630 may be, among other things, transient or persistent storage. The program stored in the storage medium 630 may include one or more modules (not shown), each of which may include a sequence of instructions operating on the device. Further, the central processor 622 may be provided in communication with the storage medium 630, and execute a series of instruction operations in the storage medium 630 on the driving state detection apparatus 600.
The driving state detection apparatus 600 may also include one or more power supplies 626, one or more wired or wireless network interfaces 650, one or more input-output interfaces 658, and/or one or more operating systems 641, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, etc.
The steps in the driving state detection method described above with reference to fig. 1 to 2 are implemented by the driving state detection apparatus based on the structure shown in fig. 5.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the module described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus, device and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of modules is merely a division of logical functions, and an actual implementation may have another division, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes over the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a function calling device, or a network device) to execute all or part of the steps of the method of the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
A method, system, device and computer readable storage medium for driving state detection provided by the present application are described in detail above. The principles and embodiments of the present application are explained herein using specific examples, which are provided only to help understand the method and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.
It is further noted that, in the present specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. A method of driving state detection, comprising:
acquiring three-dimensional face image information in a designated area, and establishing a 3D face model by using the three-dimensional face image information;
determining a gaze location of a driver's gaze using the 3D face model;
judging whether the gaze position is within a preset range;
if not, judging whether the gaze time exceeds a first threshold value; wherein the gaze time is a time the driver's gaze stays at the gaze location;
determining that the driver is in a fatigue driving state if the gaze time exceeds the first threshold.
2. The method according to claim 1, further comprising, after acquiring three-dimensional face image information within a specified area:
determining position information of each preset feature point in the three-dimensional face image information;
determining facial motion data from the location information;
detecting whether the driver is in the fatigue driving state according to the facial motion data.
3. The method of claim 1, further comprising, when the gaze location is within the preset range:
acquiring running information of a current vehicle and vehicle distance information between the current vehicle and surrounding vehicles;
detecting whether the driver is in a fatigue driving state according to the driving information;
if not, judging whether the vehicle distance information is smaller than a second threshold value;
if the vehicle distance information is smaller than the second threshold value, determining that the driver is in the fatigue driving state;
the driving information comprises a time interval of lane departure of the current vehicle, a speed change value of the current vehicle in a preset time interval and a vehicle speed when the current vehicle turns.
4. The method of claim 3, further comprising:
recording driving behavior data of the driver;
establishing an exclusive driving state detection model of the driver according to the driving behavior data, the driving information and the vehicle distance information;
and detecting whether the driver is in the fatigue driving state or not by using the exclusive driving state detection model.
5. The method of claim 4, further comprising:
uploading the driving behavior data, the driving information and the vehicle distance information to a cloud platform so that the cloud platform records the driving behavior data, the driving information and the vehicle distance information of the driver.
6. The method of claim 1, after determining that the driver is in a fatigue driving state, further comprising:
and sending out an alarm signal.
7. The method of claim 6, wherein said issuing an alarm signal comprises:
acquiring running information and vehicle distance information, and calculating the danger level of the current vehicle according to the running information and the vehicle distance information;
and sending out an alarm signal corresponding to the danger level.
8. A system for driving condition detection, comprising:
the first model establishing module is used for acquiring three-dimensional face image information in a designated area and establishing a 3D face model by using the three-dimensional face image information;
the sight line simulation module is used for simulating the gaze position of the sight line of the driver by utilizing the 3D face model;
the first judgment module is used for judging whether the gaze position is within a preset range;
the second judging module is used for judging whether the gaze time exceeds a first threshold value or not when the gaze position is judged not to be in the preset range;
a first determination module to determine that the driver is in a fatigue driving state if the gaze time exceeds the first threshold.
9. A driving state detection apparatus characterized by comprising:
a memory for storing a computer program;
a processor for implementing the steps of the method of driving state detection according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the method of driving state detection according to any one of claims 1 to 7.
CN201910913808.1A 2019-09-25 2019-09-25 Method, system and equipment for detecting driving state and readable storage medium Pending CN110638474A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910913808.1A CN110638474A (en) 2019-09-25 2019-09-25 Method, system and equipment for detecting driving state and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910913808.1A CN110638474A (en) 2019-09-25 2019-09-25 Method, system and equipment for detecting driving state and readable storage medium

Publications (1)

Publication Number Publication Date
CN110638474A (en) 2020-01-03

Family

ID=69011375

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910913808.1A Pending CN110638474A (en) 2019-09-25 2019-09-25 Method, system and equipment for detecting driving state and readable storage medium

Country Status (1)

Country Link
CN (1) CN110638474A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6927694B1 (en) * 2001-08-20 2005-08-09 Research Foundation Of The University Of Central Florida Algorithm for monitoring head/eye motion for driver alertness with one camera
CN101032405A (en) * 2007-03-21 2007-09-12 汤一平 Safe driving auxiliary device based on omnidirectional computer vision
CN101565036A (en) * 2008-04-21 2009-10-28 上海汽车集团股份有限公司 Device and method for preventing fatigue driving
CN104224204A (en) * 2013-12-24 2014-12-24 烟台通用照明有限公司 Driver fatigue detection system on basis of infrared detection technology
CN105354987A (en) * 2015-11-26 2016-02-24 南京工程学院 Vehicle fatigue driving detection and identity authentication apparatus, and detection method thereof
CN108154101A (en) * 2017-12-21 2018-06-12 江苏东洲物联科技有限公司 The fatigue driving detecting system and method for a kind of multi-parameter fusion
CN109215293A (en) * 2018-11-23 2019-01-15 深圳市元征科技股份有限公司 A kind of method for detecting fatigue driving, device and vehicle-mounted terminal equipment

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111267865A (en) * 2020-02-11 2020-06-12 矩阵元技术(深圳)有限公司 Vision-based safe driving early warning method and system and storage medium
CN111267865B (en) * 2020-02-11 2021-07-16 矩阵元技术(深圳)有限公司 Vision-based safe driving early warning method and system and storage medium
WO2021159269A1 (en) * 2020-02-11 2021-08-19 云图技术有限公司 Vision-based safe driving early warning method and system, and storage medium
CN111325172A (en) * 2020-02-28 2020-06-23 英华达(上海)科技有限公司 Device and method for achieving driving habit formation by sight line tracking
CN113753057A (en) * 2020-06-01 2021-12-07 丰田自动车株式会社 State determination device and state determination method
CN111738126A (en) * 2020-06-16 2020-10-02 湖南警察学院 Driver fatigue detection method and device based on Bayesian network and HMM
CN111738126B (en) * 2020-06-16 2023-04-07 湖南警察学院 Driver fatigue detection method and device based on Bayesian network and HMM
CN112654547A (en) * 2020-09-25 2021-04-13 华为技术有限公司 Driving reminding method, device and system
CN112754498A (en) * 2021-01-11 2021-05-07 一汽解放汽车有限公司 Driver fatigue detection method, device, equipment and storage medium
CN112754498B (en) * 2021-01-11 2023-05-26 一汽解放汽车有限公司 Driver fatigue detection method, device, equipment and storage medium
CN112883834A (en) * 2021-01-29 2021-06-01 重庆长安汽车股份有限公司 DMS system distraction detection method, DMS system distraction detection system, DMS vehicle, and storage medium
WO2022206727A1 (en) * 2021-04-01 2022-10-06 华为技术有限公司 Driving reminder method and device
CN113119983A (en) * 2021-05-07 2021-07-16 恒大新能源汽车投资控股集团有限公司 Vehicle safety control method and device and vehicle
CN113460060A (en) * 2021-08-20 2021-10-01 武汉霖汐科技有限公司 Driver fatigue degree evaluation system, control method, and storage medium
CN114241719A (en) * 2021-12-03 2022-03-25 广州宏途教育网络科技有限公司 Visual fatigue state monitoring method and device in student learning and storage medium
CN114241719B (en) * 2021-12-03 2023-10-31 广州宏途数字科技有限公司 Visual fatigue state monitoring method, device and storage medium in student learning
CN117197786A (en) * 2023-11-02 2023-12-08 安徽蔚来智驾科技有限公司 Driving behavior detection method, control device and storage medium
CN117197786B (en) * 2023-11-02 2024-02-02 安徽蔚来智驾科技有限公司 Driving behavior detection method, control device and storage medium
CN117237926A (en) * 2023-11-14 2023-12-15 广州斯沃德科技有限公司 Vehicle driving state determining method, system, equipment and readable storage medium
CN117237926B (en) * 2023-11-14 2024-02-06 广州斯沃德科技有限公司 Vehicle driving state determining method, system, equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN110638474A (en) Method, system and equipment for detecting driving state and readable storage medium
JP6998564B2 (en) Arousal level estimation device and arousal level estimation method
JP5326521B2 (en) Arousal state determination device and arousal state determination method
JP6699831B2 (en) Driving awareness estimation device
JP6113440B2 (en) Visual input of vehicle operator
KR20200030049A (en) Vehicle control device and vehicle control method
Varma et al. Accident prevention using eye blinking and head movement
JP7357006B2 (en) Information processing device, mobile device, method, and program
CN110155072B (en) Carsickness prevention method and carsickness prevention device
US20150173665A1 (en) State estimation device and state estimation program
AU2019261701B2 (en) Method, apparatus and system for determining line of sight, and wearable eye movement device
JP2019195377A (en) Data processing device, monitoring system, awakening system, data processing method, and data processing program
CN113491519A (en) Digital assistant based on emotion-cognitive load
CN113525389B (en) Driver alertness detection method, apparatus and system
WO2019176492A1 (en) Calculation system, information processing device, driving assistance system, index calculation method, computer program, and storage medium
JP2018008575A (en) Vehicle control device
KR102433668B1 (en) Driver Monitoring System and method thereof
Bergasa et al. Visual monitoring of driver inattention
JP2017068761A (en) Information presentation device, driving assistance system, and information presentation method
US11383640B2 (en) Techniques for automatically reducing annoyance levels of drivers when using driver monitoring systems
JP7342636B2 (en) Vehicle control device and driver condition determination method
JP7342637B2 (en) Vehicle control device and driver condition determination method
US20240051585A1 (en) Information processing apparatus, information processing method, and information processing program
JP6658095B2 (en) Awakening maintenance device and program
CN111267865B (en) Vision-based safe driving early warning method and system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 523710, 26, 188 Industrial Road, Pingshan Town, Guangdong, Dongguan, Tangxia

Applicant after: Entropy Technology Co.,Ltd.

Address before: 523710, 26, 188 Industrial Road, Pingshan Town, Guangdong, Dongguan, Tangxia

Applicant before: ZKTECO Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200103