CN111476122A - Driving state monitoring method and device and storage medium - Google Patents
- Publication number: CN111476122A
- Application number: CN202010224077.2A
- Authority
- CN
- China
- Prior art keywords
- driving state
- driver
- size
- eyes
- time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
Abstract
An embodiment of the invention provides a driving state monitoring method, a device, and a storage medium. The method comprises: tracking the position of the eyes in real time and updating eye movement information in real time according to the eye position, the eye movement information comprising eyeball movement information and blink information; judging, according to the eyeball movement information, a first driving state reflecting whether the driver's attention is focused, and judging, according to the blink information, a second driving state reflecting whether the driver is drowsy; and judging in real time, according to the first driving state and the second driving state, a third driving state reflecting whether the driver's driving state is safe. The driving state monitoring method provided by the embodiment of the invention is accurate and easy to implement, avoids the misjudgment possible when recognizing the driver's behavior and actions, monitors the driver's state efficiently and accurately, and can detect an unsafe driving state in time.
Description
Technical Field
The invention relates to the technical field of vehicles, in particular to a driving state monitoring method, driving state monitoring equipment and a storage medium.
Background
With the rapid development of the transportation industry in China, automobile ownership increases year by year; while the wide use of automobiles brings convenience to people's lives, the frequency of traffic accidents also rises year by year. According to World Health Organization statistics, about 3,000 people currently die in traffic accidents every day, and the global road traffic safety situation is severe. According to research and analysis of accident causes by national authorities, traffic accidents caused by factors such as driver fatigue and inattention account for a high proportion of all traffic accidents in China. According to incomplete statistics, 20% of drivers experience dozing off, falling asleep, inattention, or looking down at a mobile phone at least once a year; the traffic accidents caused by these factors account for 33% of all traffic accidents, with a fatality rate above 80%. Therefore, installing a device in a vehicle that can promptly alert the driver to fatigued driving or inattention helps reduce such traffic accidents.
At present, most methods train artificial-intelligence models on large amounts of data to accurately recognize driver actions such as yawning, rubbing the eyes, and lowering the head, and thereby identify whether the driver's driving behavior is standard. However, these existing methods recognize the driver's behavior and actions and carry a certain possibility of misjudgment.

Therefore, providing an accurate and easily implemented driving state monitoring method has become a problem that urgently needs to be solved.
Disclosure of Invention
In order to solve the existing problems, embodiments of the present invention provide a driving state monitoring method, a device, and a storage medium.
In a first aspect, an embodiment of the present invention provides a driving state monitoring method, including:
tracking the position of the eyes in real time, and updating the eye movement information in real time according to the position of the eyes; the eye movement information comprises eyeball movement information and blink information;
judging a first driving state for reflecting whether the attention of the driver is concentrated or not according to the eyeball motion information, and judging a second driving state for reflecting whether the driver is drowsy or not according to the blink information;
and judging a third driving state for reflecting whether the driving state of the driver is safe or not in real time according to the first driving state and the second driving state.
Optionally, the determining, in real time, a third driving state for reflecting whether the driving state of the driver is safe according to the first driving state and the second driving state specifically includes:
if the first driving state is abnormal and/or the second driving state is abnormal, determining that the third driving state is abnormal;
and if the first driving state is normal and the second driving state is normal, determining that the third driving state is normal.
Optionally, the determining, according to the eye movement information, a first driving state used for reflecting whether the attention of the driver is focused includes:
if the eye movement information is not obtained within a first time threshold, determining that the attention of the driver is not concentrated, namely the first driving state is abnormal;
if the eyeball of the driver faces the front of the driving route according to the eyeball motion information, determining that the attention of the driver is concentrated, namely the first driving state is normal, otherwise, determining that the attention of the driver is not concentrated, namely the first driving state is not normal;
and if the eyeball of the driver is moved within a second time threshold value according to the eyeball motion information, determining that the attention of the driver is concentrated, namely the first driving state is normal, otherwise, determining that the attention of the driver is not concentrated, namely the first driving state is not normal.
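The three rules above can be sketched as a small decision function. This is only an illustrative sketch; the function name and the boolean inputs are hypothetical names, not terms from the patent:

```python
def first_driving_state(eye_detected, gaze_forward, eyeball_moved_recently):
    """Return 'normal' if the driver appears attentive, else 'abnormal'.

    Hypothetical inputs, one per rule in the text:
      eye_detected            -- eyes found within the first time threshold
      gaze_forward            -- eyeballs oriented toward the road ahead
      eyeball_moved_recently  -- eyeball movement seen within the second time threshold
    """
    if not eye_detected:            # head turned or lowered
        return "abnormal"
    if not gaze_forward:            # looking away from the driving route
        return "abnormal"
    if not eyeball_moved_recently:  # eyes frozen: staring blankly
        return "abnormal"
    return "normal"
```

The first driving state is normal only when all three checks pass, matching the conjunction of the three optional conditions above.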
Optionally, the determining, according to the blink information, a second driving state for reflecting whether the driver is drowsy includes:
determining, according to the blink information, the duration taken for the driver's eyes to close from the maximum size to the first size, the duration taken to close from the first size to the second size, the duration from closing past the second size until reopening to the second size, and the duration taken to open from the second size to the first size, so as to obtain the percentage of the eye-closure duration in the unit duration; wherein the first size is greater than the second size;
and judging the second driving state according to the percentage and the blink information.
Optionally, the determining, according to the blink information, the duration taken for the driver's eyes to close from the maximum size to the first size, the duration taken to close from the first size to the second size, the duration from closing past the second size until reopening to the second size, and the duration taken to open from the second size to the first size, so as to obtain the percentage of the eye-closure duration in the unit duration, specifically includes:
the percentage of the eye-closure duration in the unit duration is obtained by the following formula:

f = t3 / (t1 + t2 + t3 + t4) × 100%

where f is the percentage of the eye-closure duration in the unit duration, t1 is the duration taken for the eye to close from the maximum size to the first size, t2 is the duration taken to close from the first size to the second size, t3 is the duration from closing past the second size until reopening to the second size, and t4 is the duration taken to open from the second size to the first size.
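As a sketch, and only under the assumption that t3 (the time the eye spends below the second size) counts as closed time over the whole blink cycle t1 + t2 + t3 + t4, the percentage could be computed as:

```python
def perclos_fraction(t1, t2, t3, t4):
    """Percentage of the blink cycle the eye spends closed.

    ASSUMPTION: t3 (time below the second size) is taken as the closed
    time, and the sum of all four durations as the unit duration; this is
    a common PERCLOS-style reading, not a formula quoted from the patent.
    """
    total = t1 + t2 + t3 + t4
    return 100.0 * t3 / total
```

For example, durations of 1, 1, 3, and 1 time units give f = 50%, i.e., the eye is effectively closed for half of the blink cycle.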
Optionally, the determining the second driving state according to the percentage and the blink information specifically includes:
and if the percentage f is greater than a preset threshold value and the eye closing duration is greater than a third time threshold value, determining that the second driving state is abnormal, otherwise, determining that the second driving state is normal, wherein the eye closing duration is obtained according to the blink information.
Optionally, the method further comprises any one or combination of the following:
before the eyes are tracked in real time, acquiring a face image in real time, preprocessing the face image, and identifying the positions of the eyes from the preprocessed face image;
and if the third driving state is abnormal, giving a warning.
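The patent does not specify what the preprocessing consists of; as a minimal sketch, one typical first step before locating the eyes is converting the acquired face image to grayscale (the function below and its luma weights are an illustrative assumption, not the patent's method):

```python
def to_grayscale(frame_rgb):
    """Minimal preprocessing sketch: RGB frame -> grayscale image.

    Uses ITU-R BT.601 luma weights; the frame is a nested list of
    (r, g, b) tuples. The patent leaves the preprocessing unspecified,
    so this is one plausible step, not the prescribed one.
    """
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in frame_rgb]
```

A real system would follow this with face detection and eye localisation on the grayscale frame before tracking begins.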
In a second aspect, an embodiment of the present invention provides a driving state monitoring apparatus, including:
the information acquisition module is used for tracking the eye position in real time and updating the eye movement information in real time according to the eye position; the eye movement information comprises eyeball movement information and blink information;
the first judging module is used for judging a first driving state for reflecting whether the attention of the driver is concentrated or not according to the eyeball motion information and judging a second driving state for reflecting whether the driver is drowsy or not according to the blink information;
and the second judgment module is used for judging a third driving state for reflecting whether the driving state of the driver is safe or not in real time according to the first driving state and the second driving state.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the steps of the driving state monitoring method according to the first aspect are implemented.
In a fourth aspect, an embodiment of the present invention provides a non-transitory computer readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the driving state monitoring method as provided in the first aspect.
According to the driving state monitoring method, device, and storage medium provided by the embodiments of the invention, the eye position is tracked in real time, the eye movement information is obtained in real time, and the first and second driving states are judged from the eye movement information in real time. The method is accurate and easy to implement; it replaces the prior-art approach of recognizing whether the driver's driving behavior is standard through massive data training and complex algorithms, avoids the misjudgment possible when recognizing the driver's behavior and actions, monitors the driving state efficiently and accurately, and can detect an unsafe driving state in time.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a flowchart of a driving state monitoring method according to an embodiment of the present invention;
FIG. 2 is a flow chart of a driving state monitoring method according to another embodiment of the present invention;
fig. 3 is a schematic structural diagram of a driving state monitoring device according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a driving state monitoring device according to another embodiment of the present invention;
fig. 5 is a schematic structural diagram of a driving state monitoring device according to another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart of a driving state monitoring method according to an embodiment of the present invention, and as shown in fig. 1, the method includes:
101, tracking the position of the eyes in real time, and updating the eye movement information in real time according to the position of the eyes; the eye movement information comprises eyeball movement information and blink information.

Specifically, to implement real-time eye tracking, an image acquisition device first acquires a face image of the driver, and the eyes are identified from the face image and tracked. The positions of the driver's eyes are obtained by tracking them in real time, and the eye movement information is updated in real time according to the obtained eye positions. The eye movement information includes eyeball movement information and blink information.
Specifically, when the eye position is tracked, the eyeball of the driver is further tracked, and then the position of the eyeball is updated in real time to obtain the eyeball motion information.
It can be understood that the eyeball movement information reflects how the eyeball moves and the direction in which it is oriented, i.e., the specific direction in which the eyeball is looking.
Meanwhile, blink information of the eyes of the driver is acquired in real time while tracking the eye position.
It is understood that the blink information may be embodied as the percentage of the eye covered by the eyelid when the eye blinks, i.e., closes or opens; in other words, the apparent size of the eye opening while the eye is closing or blinking.
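One common way to quantify "how open the eye is", used here only as a stand-in for the patent's unspecified size measure, is the eye aspect ratio (EAR) computed from six eye landmarks; EAR is a well-known proxy in blink detection and is not named in the patent:

```python
import math

def eye_aspect_ratio(landmarks):
    """Eye aspect ratio (EAR) from six eye landmarks p1..p6.

    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it shrinks toward 0 as
    the eyelid covers the eye. EAR is a common openness proxy and is
    NOT the patent's own measure -- it stands in for the eye 'size'.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = landmarks
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))
```

Thresholds on such a ratio (e.g., the 80% "first size" and 20% "second size" used later in this description) can then delimit the blink phases whose durations feed the PERCLOS calculation.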
Specifically, after the driver's eyeball movement information and blink information are acquired in real time, whether the first and second driving states are normal can be judged. The first driving state reflects whether the driver's attention is focused: if it is normal, the driver is determined to be focused while driving; if it is abnormal, the driver is determined to be inattentive, for example showing an irregular, inattentive driving condition such as staring blankly. The second driving state reflects whether the driver is drowsy: if it is normal, the driver is determined not to be drowsy while driving; if it is abnormal, the driver is determined to be drowsy and fatigued while driving.
102, judging a third driving state for reflecting whether the driving state of the driver is safe in real time according to the first driving state and the second driving state.
Specifically, the third driving state reflects whether the driving state is safe. If the first driving state is abnormal, the driver is determined to be inattentive while driving; if the second driving state is normal, the driver is determined not to be drowsy while driving, and if it is abnormal, the driver is determined to be drowsy and fatigued. Therefore, when the driver is inattentive while driving, or drowsy while driving, or both inattentive and drowsy, the driving state is determined to be unsafe.
According to the driving state monitoring method provided by the embodiment of the invention, the eye position is tracked in real time, the eye movement information is obtained in real time, and the first and second driving states are judged from the eye movement information in real time. The method is accurate and easy to implement; it replaces the prior-art approach of recognizing whether the driver's driving behavior is standard through massive data training and complex algorithms, avoids the misjudgment possible when recognizing the driver's behavior and actions, monitors the driver's state efficiently and accurately, and can detect an unsafe driving state in time.
Optionally, on the basis of the above embodiment, the determining, in real time according to the first driving state and the second driving state, a third driving state used for reflecting whether the driving state of the driver is safe includes:
if the first driving state is abnormal and/or the second driving state is abnormal, determining that the third driving state is abnormal;
Specifically, if the first driving state is normal, the driver is determined to be attentive while driving; if it is abnormal, the driver is determined to be inattentive. If the second driving state is normal, the driver is determined not to be drowsy while driving; if it is abnormal, the driver is determined to be drowsy and fatigued. The driving state is unsafe when the driver is inattentive, or drowsy, or both; therefore, if the first driving state is abnormal and/or the second driving state is abnormal, the third driving state is determined to be abnormal.
And if the first driving state is normal and the second driving state is normal, determining that the third driving state is normal.
Specifically, if the first driving state is normal, the driver is determined to be attentive while driving; if it is abnormal, the driver is determined to be inattentive. If the second driving state is normal, the driver is determined not to be drowsy while driving; if it is abnormal, the driver is determined to be drowsy and fatigued. Since the current driving state is determined to be safe only when the driver is both attentive and not drowsy while driving, the third driving state is determined to be normal only if both the first driving state and the second driving state are normal.
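The combination rule above amounts to a simple AND over the two sub-states; a minimal sketch (the function name is an assumption for illustration):

```python
def third_driving_state(first_state, second_state):
    """Safe ('normal') only when both sub-states are normal; any abnormal
    sub-state (inattention and/or drowsiness) makes driving unsafe."""
    if first_state == "normal" and second_state == "normal":
        return "normal"
    return "abnormal"
```

When the result is "abnormal", the method's optional warning step would be triggered.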
Optionally, on the basis of the foregoing embodiment, the determining, according to the eye movement information, the first driving state for reflecting whether the attention of the driver is focused includes:
if the eye movement information is not obtained within a first time threshold, determining that the attention of the driver is not concentrated, namely the first driving state is abnormal;
specifically, when the first driving state is determined according to the eye movement information, if the eyes of the driver cannot be detected within the first time threshold, it is indicated that the driver has a head twisting or head lowering behavior at the time, that is, it is determined that the driver is not attentive when driving, and it is determined that the first driving state is abnormal.
For example, when the first driving state is determined according to the eye movement information, if the eyes of the driver cannot be detected within 3 seconds, it is indicated that the driver has a head twisting or head lowering behavior at the time, that is, it is determined that the driver is not attentive when driving, and it is determined that the first driving state is abnormal.
If the eyeball of the driver faces the front of the driving route according to the eyeball motion information, determining that the attention of the driver is concentrated, namely the first driving state is normal, otherwise, determining that the attention of the driver is not concentrated, namely the first driving state is not normal;
specifically, if the driver is attentively driving, the eyeball should be oriented to the front of the driving route; therefore, after the eyeball motion information is obtained, whether the eyeball of the driver looks ahead of the driving route or not is judged according to the eyeball motion information, if yes, the attention of the driver is concentrated, namely the first driving state is normal, and if the eyeball of the driver does not look ahead of the driving route, the attention of the driver is not concentrated, namely the first driving state is abnormal.
And if the eyeball of the driver is moved within a second time threshold value according to the eyeball motion information, determining that the attention of the driver is concentrated, namely the first driving state is normal, otherwise, determining that the attention of the driver is not concentrated, namely the first driving state is not normal.
Specifically, if the driver's eyeball does not move within the second time threshold, this indicates an irregular, inattentive driving condition such as staring blankly, and the first driving state is determined to be abnormal; if the eyeball keeps making small, uninterrupted movements, the driver is not in a daze but is focused on driving, and the first driving state is determined to be normal.

For example, taking the second time threshold as 3 seconds: if the driver's eyeball does not move within 3 seconds, this indicates an irregular, inattentive driving condition such as staring blankly, and the first driving state is determined to be abnormal; if the eyeball keeps making small, uninterrupted movements, the driver is focused on driving, and the first driving state is determined to be normal.
Optionally, on the basis of the foregoing embodiment, the determining, according to the blink information, the second driving state for reflecting whether the driver is drowsy specifically includes:
determining the time length of the eyes of the driver from the maximum size to the first size, the time length of the eyes of the driver from the first size to the second size, the time length of the eyes of the driver from the second size to the second size after the eyes of the driver are closed, and the time length of the eyes of the driver from the second size to the first size according to the blink information to obtain the percentage of the eye closing time length in the unit time length; wherein the first dimension is greater than the second dimension;
specifically, while eyes are tracked in real time, blink information is obtained, it is determined from the blink information that, when the driver blinks, the length of time it takes for the eyes to close from the maximum size to the first size, that is, the length of time it takes for the eyes to close from the first size to the second size, that it takes for the eyes to open from the second size to the second size after closing from the second size, and that it takes for the eyes to open from the second size to the first size, and then calculation is performed based on the obtained four lengths of time to obtain the percentage of the length of time the eyes closed in unit.
It will be appreciated that the first size is greater than the second size; the duration taken to close from the first size to the second size is the time over which the eye narrows from the first size to the second size, i.e., the process of the visible pupil slowly diminishing.
And judging the second driving state according to the percentage and the blink information.
In particular, when the driver becomes fatigued, the blinking behavior changes significantly: the blink frequency increases and the percentage of the eye covered by the eyelids also increases. Therefore, after the percentage of the eye-closure duration in the unit duration is calculated, the second driving state, i.e., whether the driver is drowsy, can be judged from the percentage together with the blink information.
Optionally, on the basis of the above embodiment, the determining, according to the blink information, the duration taken for the driver's eyes to close from the maximum size to the first size, the duration taken to close from the first size to the second size, the duration from closing past the second size until reopening to the second size, and the duration taken to open from the second size to the first size, so as to obtain the percentage of the eye-closure duration in the unit duration, specifically includes:
the percentage of the eye-closure duration in the unit duration is obtained by the following formula:

f = t3 / (t1 + t2 + t3 + t4) × 100%

where f is the percentage of the eye-closure duration in the unit duration, t1 is the duration taken for the eye to close from the maximum size to the first size, t2 is the duration taken to close from the first size to the second size, t3 is the duration from closing past the second size until reopening to the second size, and t4 is the duration taken to open from the second size to the first size.
Specifically, after the blink information is acquired from the eye movement information, whether the driver is driving while fatigued can be determined by calculating the eye-closure time with the PERCLOS method, where the duration t1 taken for the eye to close from the maximum size to the first size, the duration t2 taken to close from the first size to the second size, the duration t3 from closing past the second size until reopening to the second size, and the duration t4 taken to open from the second size to the first size can all be obtained from the blink information.
For example, calculating the eye-closing time by the PERC L OS method to determine whether the driver is in fatigue driving, acquiring the blink information in real time while tracking the eyes in real time, and acquiring the time period t taken for the eyes of the driver to close from the maximum size to 80% of the first size in real time based on the blink information1A time period t from 80% closure of the first dimension to 20% closure of the second dimension2From the second size, 20% closed and then opened to the second ruler20% of the length of time t3A length t of time for opening from the second size of 20% to the first size of 80%4The percentage of the eye closure time length in the unit time length is obtained by the following formula:
for example, calculating the eye-closing time by the PERC L OS method to determine whether the driver is in fatigue driving, acquiring the blink information in real time while tracking the eyes in real time, and acquiring the time period t taken for the eyes of the driver to close from the maximum size to 75% of the first size in real time based on the blink information1For a time t from 75% closure of the first dimension to 25% closure of the second dimension2For a period t of 25% closure from the second size and then open to 25% of the second size3For a period t of 25% opening from the second dimension to 75% opening from the first dimension4The percentage of the eye closure time length in the unit time length is obtained by the following formula:
optionally, on the basis of the above embodiment, the determining the second driving state according to the percentage and the blink information specifically includes:
and if the percentage f is greater than a preset threshold value and the eye closing duration is greater than a third time threshold value, determining that the second driving state is abnormal, otherwise, determining that the second driving state is normal, wherein the eye closing duration is obtained according to the blink information.
Specifically, when the driver becomes fatigued, his blinking behavior changes significantly: the blink frequency increases, and the percentage of the eye covered by the eyelid also increases. If the percentage of the eye covered by the eyelid during blinking rises above the preset threshold and the eye-closure duration is greater than the third time threshold, it is determined that the driver is drowsy or even dozing, and the second driving state is determined to be abnormal. It is understood that the eye-closure duration can be obtained from the blink information.
For example, taking the preset threshold as 35% and the third time threshold as 2 seconds, if the percentage of the eye covered by the eyelid during blinking rises above 35% and the eye-closure duration is longer than 2 seconds, it is determined that the driver is drowsy, and the second driving state is determined to be abnormal.

For another example, taking the preset threshold as 40% and the third time threshold as 3 seconds, if the percentage of the eye covered by the eyelid during blinking rises above 40% and the eye-closure duration is longer than 3 seconds, it is determined that the driver is drowsy or even dozing, and the second driving state is determined to be abnormal.
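The two-condition rule above — both the percentage and the continuous closure duration must exceed their thresholds — can be sketched as follows; the function and parameter names are illustrative, not from the patent:

```python
def second_driving_state(f: float, eye_closed_s: float,
                         f_threshold: float = 35.0,
                         third_time_threshold_s: float = 2.0) -> str:
    """Abnormal only when BOTH the eye-closure percentage f and the
    continuous eye-closure duration exceed their thresholds
    (35% and 2 seconds in the first example above)."""
    if f > f_threshold and eye_closed_s > third_time_threshold_s:
        return "abnormal"   # driver judged drowsy
    return "normal"

print(second_driving_state(40.0, 2.5))  # abnormal
print(second_driving_state(40.0, 1.0))  # normal: closure too short
```

Requiring both conditions avoids flagging a driver who blinks heavily but never keeps the eyes closed for long.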
Optionally, on the basis of the above embodiment, the method further includes any one or a combination of the following:
before the eyes are tracked in real time, acquiring a face image in real time, preprocessing the face image, and identifying the positions of the eyes from the preprocessed face image;
specifically, when the human face image is obtained in real time, the image acquisition device can be installed under the sun shield in front of the driver, the face of the driver can be shot, and normal driving of the driver is not affected. The image acquisition module is connected with the information processing module, acquires image information of a driver in real time, and transmits the acquired image to the information processing module.
It can be understood that the image acquisition device is composed of a camera and is responsible for acquiring the face image of the driver. To eliminate interference from ambient light sources around the driver, infrared light source imaging may be employed.
After the image data are collected, the images can be preprocessed. The preprocessing mainly enhances the contrast of the image and the face contour, and checks the image for noise, removing noise that would affect the detection of the face and eyes.
It can be understood that, when the image data are preprocessed, the OTSU method or the KITTLE method can be used to binarize the image, converting the face image into binary image data.
It is to be understood that any method similar to the OTSU method or the KITTLE method that can convert the face image into binary image data can be used to implement the binarization in this embodiment.
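The OTSU method picks the grey level that maximizes the between-class variance of the image histogram. A minimal NumPy sketch — illustrative only, since the patent does not specify an implementation — is:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the grey level that maximizes between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0        # class means
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2                  # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Two well-separated grey populations: the threshold falls between them.
img = np.concatenate([np.full(100, 50, np.uint8), np.full(100, 200, np.uint8)])
t = otsu_threshold(img)
binary = (img >= t).astype(np.uint8)   # binarized "face image"
print(50 < t <= 200)  # True
```

In practice an optimized library routine would be used; the sketch only shows the criterion being maximized.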
After image data are preprocessed, face positioning can be carried out on the preprocessed images, eyes of a driver can be further positioned on the basis of the face positioning, and the positions of the eyes of the driver are tracked by comprehensively using Kalman filtering and Mean Shift algorithms.
It can be understood that when the human face is located, the AdaBoost method can be used to locate the human face of the preprocessed image;
It is understood that, when the driver's eyes are further located on the basis of the face positioning, edge extraction can be performed on the image data, or the eyes can be located by machine vision; specifically, the Canny operator can be used for the edge extraction or machine-vision positioning of the driver's eyes, and the positions of the eyes are then tracked using Kalman filtering together with the Mean Shift algorithm.
When the meanShift algorithm is used for tracking the image video target, the color histogram of the target can be used as a search feature, and the meanShift vector is iterated continuously to enable the algorithm to be converged at the real position of the target, so that the tracking purpose is achieved.
The MeanShift algorithm belongs to the kernel density estimation methods: it needs no historical data and no sample training to obtain a model, relying entirely on computation over the sample points acquired in real time. Its computational load is small, so real-time tracking is fully achievable once the target region is known. Moreover, the kernel-weighted histogram model is insensitive to edge occlusion, target rotation, deformation, and background motion, so real-time, uninterrupted eye tracking is not affected even if the cab background is complex or vehicle bumps cause occlusion between the face and the image acquisition device.
It can be understood that any method similar to the MeanShift algorithm that achieves target tracking without sample training can serve as the low-computation real-time tracking method of this embodiment.
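A toy version of the MeanShift iteration described above, run on a 2-D weight map such as a histogram back-projection, shows how the search window converges on the mode with no trained model. This is an illustrative sketch; the names, window size, and Gaussian target are assumptions, not the patent's implementation:

```python
import numpy as np

def mean_shift(weights: np.ndarray, window: tuple, iters: int = 20) -> tuple:
    """Shift a (row, col, height, width) window toward the centroid of the
    weight map under it until it stops moving -- no training required."""
    r, c, h, w = window
    for _ in range(iters):
        patch = weights[r:r + h, c:c + w]
        total = patch.sum()
        if total == 0:
            break
        ys, xs = np.mgrid[0:h, 0:w]
        # Mean-shift vector: centroid of the patch minus the window center.
        dr = int(round((ys * patch).sum() / total - (h - 1) / 2))
        dc = int(round((xs * patch).sum() / total - (w - 1) / 2))
        if dr == 0 and dc == 0:
            break  # converged on the mode
        r = min(max(r + dr, 0), weights.shape[0] - h)
        c = min(max(c + dc, 0), weights.shape[1] - w)
    return r, c

# A bright blob at (30, 40) stands in for the eye's back-projected histogram.
ys, xs = np.mgrid[0:64, 0:64]
weights = np.exp(-((ys - 30) ** 2 + (xs - 40) ** 2) / 50.0)
print(mean_shift(weights, (5, 5, 16, 16)))  # window top-left near (22, 32)
```

In the device, the weight map would come from the eye's color histogram, and a Kalman filter would predict the next starting window between frames.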
And if the third driving state is abnormal, giving a warning.
Specifically, if the driver is determined to have irregular driving behaviors, namely the driving state is unsafe, and the third driving state is determined to be abnormal, a warning is sent to warn the driver.
It can be understood that the warning can be issued as a voice warning, a flashing warning light, a voice warning combined with on-screen warning information, or a flashing warning light combined with on-screen warning information, so that the driver can refocus attention and shake off drowsiness for safe driving.
According to the driving state monitoring method provided by the embodiment of the invention, the eye position is tracked in real time, the eye movement information is obtained in real time, and the first and second driving states are judged in real time from the eye movement information. The method is accurate and easy to implement; it replaces the prior-art approach of recognizing whether the driver's behavior is standard with complex algorithms trained on massive data, avoids the risk of misjudging the driver's behavior, monitors the driving state efficiently and accurately, and detects and warns of unsafe driving states in time.
Fig. 2 is a flowchart of a driving state monitoring method according to another embodiment of the present invention, and as shown in fig. 2, the method includes:
specifically, a camera or other similar image acquisition device is used for acquiring a face image of a driver; in order to eliminate the interference of ambient light sources around the driver, the embodiment of the invention adopts an infrared light source for imaging.
specifically, the face image acquired by the image acquisition device is preprocessed: the contrast of the image and the face contour are enhanced, the image is binarized using the OTSU method or the KITTLE method to convert the face image into binary image data, and the image is checked for noise, removing noise that affects the detection of the face and eyes.
specifically, the position of the face is identified from the face image preprocessed in step 201, in this embodiment, the face of the preprocessed image is located by using an AdaBoost method.
specifically, the positions of the human eyes are identified, and in the embodiment, edge extraction is performed by using a canny operator on the basis of face positioning or the driver eyes are positioned by machine vision.
specifically, the position of the human eyes is tracked in real time; in this embodiment, Kalman filtering and the Mean Shift algorithm are used together to achieve real-time tracking of the driver's eyes, thereby tracking the eyeballs and obtaining the eye movement information.
specifically, the eye movement information comprises eyeball movement information and blink information; the eyeball movement information of the driver is obtained from the eye movement information, and whether the first driving state of the driver is normal is judged from the eyeball movement information. If the first driving state is normal, it is determined that the driver is attentive while driving; if it is abnormal, it is determined that the driver is not attentive while driving;
specifically, the blink information of the driver is likewise obtained from the eye movement information, and whether the second driving state of the driver is normal is judged from the blink information; if the second driving state is normal, it is determined that the driver is not drowsy while driving, and if it is abnormal, it is determined that the driver is drowsy and fatigued while driving.
specifically, a third driving state is judged according to the first driving state and the second driving state; only when the driver is attentive and does not get drowsy while driving, the current driving state can be determined to be safe, namely only when the first driving state is normal and the second driving state is normal, the third driving state is determined to be normal, and the driving behavior specification of the driver is determined;
when the driver is not focused when driving, or the driver is drowsy when driving, or the driver is not focused and feels drowsy, the driving state can be determined to be unsafe, namely, if the first driving state is abnormal and/or the second driving state is abnormal, the third driving state is determined to be abnormal, and the driving behavior of the driver is determined to be irregular.
Specifically, if it is determined in step 207 that the driving behavior of the driver is not standard, that is, the third driving state is abnormal, a warning is issued to the driver; in this embodiment, a voice warning is issued to the driver.
According to the driving state monitoring method provided by the embodiment of the invention, the eye position is tracked in real time, the eye movement information is obtained in real time, and the first and second driving states are judged in real time from the eye movement information. The method is accurate and easy to implement; it replaces the prior-art approach of recognizing whether the driver's behavior is standard with complex algorithms trained on massive data, avoids the risk of misjudging the driver's behavior, monitors the driving state efficiently and accurately, and detects and warns of unsafe driving states in time.
Fig. 3 is a schematic structural diagram of a driving state monitoring device according to an embodiment of the present invention. As shown in Fig. 3, the driving state monitoring device includes an information acquisition module 301, a first judging module 302, and a second judging module 303;
the information acquisition module 301 is configured to track the eye position in real time, and update the eye movement information in real time according to the eye position; the eye movement information comprises eyeball movement information and blink information;
a first determining module 302, configured to determine, according to the eye movement information, a first driving state that is used to reflect whether the attention of the driver is focused, and determine, according to the blink information, a second driving state that is used to reflect whether the driver is drowsy;
the second judging module 303 is configured to judge, in real time, a third driving state for reflecting whether the driving state of the driver is safe according to the first driving state and the second driving state.
Specifically, after a face image is acquired in real time and the human eyes are identified, the information acquisition module 301 tracks the positions of the eyes in real time and updates the eye movement information in real time according to the positions of the eyes. The first judging module 302 then judges, from the eyeball movement information in the eye movement information, the first driving state reflecting whether the driver's attention is focused, and judges, from the blink information, the second driving state reflecting whether the driver is drowsy. After the first judging module 302 has judged whether the first and second driving states are normal, the second judging module 303 judges in real time, from the first and second driving states, the third driving state reflecting whether the driving state of the driver is safe.
The driving state monitoring device provided by the embodiment of the invention tracks the positions of the eyes in real time, obtains the eye movement information in real time, and judges the first and second driving states in real time from the eye movement information. The device is accurate and easy to implement; it replaces the prior-art approach of recognizing whether the driver's behavior is standard with complex algorithms trained on massive data, avoids the risk of misjudging the driver's behavior, monitors the driver's state efficiently and accurately, and detects unsafe driving states in time.
Fig. 4 is a schematic structural diagram of a driving state monitoring device according to another embodiment of the present invention, and as shown in fig. 4, the driving state monitoring device includes an image acquisition module 401, an information processing module 402, a warning module 403, a voltage conversion module 404, and a power supply module 405;
the image acquisition module 401 mainly includes a camera and other similar image acquisition devices for acquiring a face image in real time;
specifically, the image capturing module 401 is installed under a sun visor right in front of the driver, and ensures that the face of the driver can be photographed without affecting the normal driving of the driver.
The information processing module 402 mainly includes an information obtaining module 301, a first determining module 302, and a second determining module 303; which is responsible for processing the images collected by the image collecting module 401 and then judging whether the driving behavior of the driver is normal.
Specifically, the image acquisition module 401 is connected with the information processing module 402; the image acquisition module 401 acquires image information of the driver in real time and transmits the acquired images to the information processing module 402. After the information processing module 402 receives the image data acquired by the image acquisition module 401, it preprocesses the images: the preprocessing mainly enhances the contrast of the image and the face contour, binarizes the image using the OTSU method or the KITTLE method to convert the face image into binary image data, and checks the image for noise, removing noise that affects the detection of the face and eyes.
In this embodiment, after the information processing module 402 preprocesses the images, the AdaBoost method is used to locate the face in the preprocessed images, the Canny operator is used on the basis of the face positioning for edge extraction or machine-vision positioning of the driver's eyes, and real-time tracking of the driver's eyes is achieved by using Kalman filtering together with the Mean Shift algorithm. Through the tracking of the eyes, the eyeballs are further tracked and the eyeball movement information is obtained; meanwhile, the information processing module 402 also obtains the blink information through the tracking of the driver's eyes.
In this embodiment, after the information processing module 402 obtains the eye movement information and the blink information, it is determined whether the first driving state and the second driving state are normal according to the eye movement information and the blink information. And whether the third driving state is normal is obtained, namely whether the driving state of the driver is safe is obtained.
The warning module 403 is used for warning the driver to prompt the driver to regulate driving.
Specifically, if the information processing module 402 determines that the driving state of the driver is unsafe, i.e., irregular driving behavior occurs during driving, a warning is issued to the driver through the warning module 403 to prompt the driver to drive regularly.
The voltage conversion module 404 is used for providing working voltages for the image acquisition module 401, the information processing module 402 and the warning module 403.
Specifically, since the image acquisition module 401, the information processing module 402, and the warning module 403 require different operating voltages, the voltage conversion module 404 converts the input voltage of the power supply into voltages at which the image acquisition module 401, the information processing module 402, and the warning module 403 can operate normally.
The power module 405 is used to provide an input voltage for the voltage conversion module 404.
The driving state monitoring device provided by the embodiment of the invention tracks the eye position in real time, obtains the eye movement information in real time, and judges the first and second driving states in real time from the eye movement information. The device is accurate and easy to implement; it replaces the prior-art approach of recognizing whether the driver's behavior is standard with complex algorithms trained on massive data, avoids the risk of misjudging the driver's behavior, monitors the driving state efficiently and accurately, and detects and warns of unsafe driving states in time.
Fig. 5 is a schematic structural diagram of a driving state monitoring device according to another embodiment of the present invention, as shown in fig. 5, the driving state monitoring device includes a memory (memory)501, a processor (processor)502, and a program stored in the memory 501 and capable of being executed on the processor 502, where the memory 501 and the processor 502 complete communication with each other through a communication bus 503, and the processor 502 executes the program to implement the following steps:
tracking the position of the eyes in real time, and updating the eye movement information in real time according to the position of the eyes; the eye movement information comprises eyeball movement information and blink information;
judging a first driving state for reflecting whether the attention of the driver is concentrated or not according to the eyeball motion information, and judging a second driving state for reflecting whether the driver is drowsy or not according to the blink information;
and judging a third driving state for reflecting whether the driving state of the driver is safe or not in real time according to the first driving state and the second driving state.
Further, the computer program in the memory 501 may be implemented in the form of a software functional unit and may be stored in a computer readable storage medium when sold or used as a separate product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The method flow related to the execution of the program by the processor 502 may specifically refer to the above method embodiment, and is not described herein again.
Embodiments of the present invention further provide a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program carries out the driving state monitoring method provided in the foregoing embodiments. The specific functions and processes of the computer program are detailed in the foregoing method embodiments and are not described again here.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (10)
1. A driving state monitoring method, characterized by comprising:
tracking the position of the eyes in real time, and updating the eye movement information in real time according to the position of the eyes; the eye movement information comprises eyeball movement information and blink information;
judging a first driving state for reflecting whether the attention of the driver is concentrated or not according to the eyeball motion information, and judging a second driving state for reflecting whether the driver is drowsy or not according to the blink information;
and judging a third driving state for reflecting whether the driving state of the driver is safe or not in real time according to the first driving state and the second driving state.
2. The driving state monitoring method according to claim 1, wherein the determining, in real time, a third driving state that is used for reflecting whether the driving state of the driver is safe or not according to the first driving state and the second driving state specifically includes:
if the first driving state is abnormal and/or the second driving state is abnormal, determining that the third driving state is abnormal;
and if the first driving state is normal and the second driving state is normal, determining that the third driving state is normal.
3. The driving state monitoring method according to claim 1 or 2, wherein the determining, according to the eye movement information, the first driving state that reflects whether the attention of the driver is focused specifically includes:
if the eye movement information is not obtained within a first time threshold, determining that the attention of the driver is not concentrated, namely the first driving state is abnormal;
if the eyeball of the driver faces the front of the driving route according to the eyeball motion information, determining that the attention of the driver is concentrated, namely the first driving state is normal, otherwise, determining that the attention of the driver is not concentrated, namely the first driving state is not normal;
and if the eyeball of the driver is moved within a second time threshold value according to the eyeball motion information, determining that the attention of the driver is concentrated, namely the first driving state is normal, otherwise, determining that the attention of the driver is not concentrated, namely the first driving state is not normal.
4. The driving state monitoring method according to claim 1, wherein the determining a second driving state reflecting whether the driver is drowsy or not according to the blink information specifically comprises:
determining, according to the blink information, the time length for the eyes of the driver to close from the maximum size to the first size, the time length to close from the first size to the second size, the time length, after closing, to open again to the second size, and the time length to open from the second size to the first size, to obtain the percentage of the eye-closure time length in the unit time length; wherein the first size is greater than the second size;
and judging the second driving state according to the percentage and the blink information.
5. The driving state monitoring method of claim 4, wherein determining, based on the blink information, the length of time for the driver's eyes to close from the maximum size to the first size, the length of time to close from the first size to the second size, the length of time, after closing, to open again to the second size, and the length of time to open from the second size to the first size, comprises:
the percentage of the eye-closure time length in the unit time length is obtained by the following formula:

f = t3 / (t2 + t3 + t4) × 100%

wherein f is the percentage of the eye-closure time length in the unit time length, t1 is the length of time it takes for the eye to close from the maximum size to the first size, t2 is the length of time it takes to close from the first size to the second size, t3 is the length of time from reaching the second size until opening again to the second size, and t4 is the length of time it takes to open from the second size to the first size.
6. The driving state monitoring method according to claim 5, wherein the determining the second driving state according to the percentage and the blink information specifically comprises:
and if the percentage f is greater than a preset threshold value and the eye closing duration is greater than a third time threshold value, determining that the second driving state is abnormal, otherwise, determining that the second driving state is normal, wherein the eye closing duration is obtained according to the blink information.
7. The driving state monitoring method according to claim 1, characterized in that the method further comprises any one or a combination of the following:
before the eyes are tracked in real time, acquiring a face image in real time, preprocessing the face image, and identifying the positions of the eyes from the preprocessed face image;
and if the third driving state is abnormal, giving a warning.
8. A driving state monitoring apparatus, characterized by comprising:
the information acquisition module is used for tracking the eye position in real time and updating the eye movement information in real time according to the eye position; the eye movement information comprises eyeball movement information and blink information;
the first judging module is used for judging a first driving state for reflecting whether the attention of the driver is concentrated or not according to the eyeball motion information and judging a second driving state for reflecting whether the driver is drowsy or not according to the blink information;
and the second judgment module is used for judging a third driving state for reflecting whether the driving state of the driver is safe or not in real time according to the first driving state and the second driving state.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the driving status monitoring method according to any one of claims 1 to 7 are implemented when the program is executed by the processor.
10. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the driving state monitoring method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010224077.2A CN111476122A (en) | 2020-03-26 | 2020-03-26 | Driving state monitoring method and device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010224077.2A CN111476122A (en) | 2020-03-26 | 2020-03-26 | Driving state monitoring method and device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111476122A true CN111476122A (en) | 2020-07-31 |
Family
ID=71748439
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010224077.2A Pending CN111476122A (en) | 2020-03-26 | 2020-03-26 | Driving state monitoring method and device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111476122A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112668548A (en) * | 2021-01-15 | 2021-04-16 | 重庆大学 | Method and system for detecting driver's fool |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050073136A1 (en) * | 2002-10-15 | 2005-04-07 | Volvo Technology Corporation | Method and arrangement for interpreting a subjects head and eye activity |
CN1830389A (en) * | 2006-04-21 | 2006-09-13 | 太原理工大学 | Device for monitoring fatigue driving state and its method |
CN102396009A (en) * | 2009-11-09 | 2012-03-28 | 松下电器产业株式会社 | Alertness assessment device, method, and program |
US20150109429A1 (en) * | 2012-07-06 | 2015-04-23 | Yazaki Corporation | Driver condition detecting device and driver condition informing device |
CN105574487A (en) * | 2015-11-26 | 2016-05-11 | 中国第一汽车股份有限公司 | Facial feature based driver attention state detection method |
CN108545080A (en) * | 2018-03-20 | 2018-09-18 | 北京理工大学 | Driver Fatigue Detection and system |
CN110287916A (en) * | 2019-06-28 | 2019-09-27 | 浙江吉利控股集团有限公司 | It is a kind of for judging the method and system of driver attention |
CN110879973A (en) * | 2019-10-31 | 2020-03-13 | 安徽普华灵动机器人科技有限公司 | Driver fatigue state facial feature recognition and detection method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Hossain et al. | IOT based real-time drowsy driving detection system for the prevention of road accidents | |
Junaedi et al. | Driver drowsiness detection based on face feature and PERCLOS | |
EP1589485B1 (en) | Object tracking and eye state identification method | |
Ahmed et al. | Robust driver fatigue recognition using image processing | |
CN104637246A (en) | Driver multi-behavior early warning system and danger evaluation method | |
KR20130016606A (en) | Advanced driver assistance system for safety driving using driver adaptive irregular behavior detection | |
CN109118714B (en) | Alarming method, device, equipment and storage medium based on eye movement information | |
Lashkov et al. | Driver dangerous state detection based on OpenCV & dlib libraries using mobile video processing | |
Guria et al. | Iot-enabled driver drowsiness detection using machine learning | |
CN117227740A (en) | Multi-mode sensing system and method for intelligent driving vehicle | |
Rani et al. | Development of an Automated Tool for Driver Drowsiness Detection | |
Yogesh et al. | Driver drowsiness detection and alert system using YOLO | |
CN111476122A (en) | Driving state monitoring method and device and storage medium | |
Mašanović et al. | Driver monitoring using the in-vehicle camera | |
CN115880675A (en) | Fatigue driving behavior real-time monitoring analysis method and system and electronic equipment | |
CN111152653A (en) | Fatigue driving detection method based on multi-information fusion | |
Kurian et al. | AI-Based Driver Drowsiness and Distraction Detection in Real-Time | |
Verma et al. | DRIVER DROWSINESS DETECTION | |
Theivadas et al. | VigilEye: Machine learning-powered driver fatigue recognition for safer roads | |
KR102476829B1 (en) | Method for detecting drowsiness using deep learning and system for preventing drowsy driving using thereof | |
Swetha et al. | Vehicle Accident Prevention System Using Artificial Intelligence | |
Doppala et al. | A Machine Intelligence Model to Detect Drowsiness for Preventing Road Accidents | |
Nair et al. | Smart System for Drowsiness and Accident Detection | |
Malathi et al. | A Comprehensive Evaluation of Driver Drowsiness Identification System using Camera based Improved Deep Learning Methodology | |
Gafur et al. | Real-Time Drowsiness Detection Using Fusion of Facial Features |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200731 ||