CN106971177A - Driver fatigue detection method - Google Patents

Driver fatigue detection method

Info

Publication number
CN106971177A
CN106971177A CN201710328421.0A
Authority
CN
China
Prior art keywords
driver
eyes
sample
tired driving
detection method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201710328421.0A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanning Lehongpo Technology Co Ltd
Original Assignee
Nanning Lehongpo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanning Lehongpo Technology Co Ltd filed Critical Nanning Lehongpo Technology Co Ltd
Priority to CN201710328421.0A
Publication of CN106971177A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/285 Selection of pattern recognition techniques, e.g. of classifiers in a multi-classifier system
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/469 Contour-based spatial representations, e.g. vector-coding
    • G06V10/473 Contour-based spatial representations, e.g. vector-coding using gradient analysis

Abstract

The invention discloses a driver fatigue detection method comprising the following steps. S1: acquire driving images with a camera; S2: detect the driver's face using the AdaBoost algorithm; S3: locate the driver's eyes and extract their features; S4: compute the state of the driver's eyes; S5: judge whether the driver is driving while fatigued. The invention detects the driver's face with the AdaBoost algorithm, computes the vertical-direction gradient matrix of the face region image, and projects the gradient matrix horizontally; the relative position of the eyes in the image is obtained from the structural features of the driver's face, and the open or closed state of the eyes is determined according to the measured distance. The parameters of each eye state are then obtained according to the PERCLOS measurement principle, and the driver is finally judged to be fatigued or not by comparing each index with its set threshold. The method achieves high detection accuracy.

Description

Driver fatigue detection method
Technical field
The present invention relates in particular to a driver fatigue detection method.
Background technology
With the gradual development of China's transportation industry, traffic accidents have become increasingly frequent, and driver fatigue has become one of the main causes of such accidents, comparable in severity to drunk driving. Developing a driver fatigue monitoring method is therefore of critical significance for ensuring safe travel; it has become a key subject of study for scholars in the field and has attracted increasingly broad attention.
When driving images are acquired, images with a certain degree of tilt are easily produced. Traditional driver fatigue detection methods that combine electroencephalogram recognition with steering-manipulation features must measure the standard deviation of the steering angle and the zero-speed percentage accurately, and therefore cannot detect driver fatigue effectively.
Summary of the invention
The technical problem to be solved by the present invention is to provide a driver fatigue detection method.
A driver fatigue detection method, characterised in that it comprises the following steps:
S1: acquire driving images with a camera;
S2: detect the driver's face using the AdaBoost algorithm;
S3: locate the driver's eyes and extract their features;
S4: compute the state of the driver's eyes;
S5: judge whether the driver is driving while fatigued.
Further, the steps of the AdaBoost algorithm are as follows:
1) Assume the training samples are (x1, y1), (x2, y2), ..., (xn, yn), where xi denotes a training sample drawn from the training sample set; yi denotes the class label: if yi = 0 the sample is a negative sample, i.e. not a driver face; if yi = 1 it is a positive sample, i.e. a driver face; n denotes the total number of training samples.
2) Initialise the weight vector w1,i, which describes the probability distribution of the training samples.
3) Iterate. In each round t, the weights are first normalised:
wt,i = wt,i / Σj wt,j.
A weak classifier ht is then trained on the weight-normalised training samples by the weak learning algorithm, and its error rate under the current weights is computed:
εt = Σi wt,i |ht(xi) − yi|.
The weak classifier with the smallest error rate is selected and added to the strong classifier.
The weight of each sample is then updated according to the selected optimal classifier:
wt+1,i = wt,i · βt^(1−ei),
where ei = 0 if the i-th sample is classified correctly, ei = 1 if it is misclassified, and βt = εt / (1 − εt).
4) Assume the whole process is repeated T times; the strong classifier finally obtained can be described as:
H(x) = 1 if Σt αt ht(x) ≥ (1/2) Σt αt, and H(x) = 0 otherwise,
where αt = log(1 / βt).
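By way of illustration only, the following is a minimal Python sketch of the discrete AdaBoost training loop described above, using single-feature decision stumps as the weak learners; the stump learner and all function names are assumptions made for the sketch and are not taken from the patent.

import numpy as np

def train_stump(X, y, w):
    """Choose the single-feature threshold classifier with the lowest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = (polarity * (X[:, j] - thr) >= 0).astype(int)
                err = np.sum(w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, j, thr, polarity)
    return best  # (error rate, feature index, threshold, polarity)

def adaboost(X, y, T):
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                    # initial probability distribution over samples
    stumps, alphas = [], []
    for t in range(T):
        w = w / w.sum()                        # normalise the weights
        err, j, thr, pol = train_stump(X, y, w)    # weak classifier with minimum error
        err = max(err, 1e-10)
        beta = err / (1.0 - err)               # beta_t = eps_t / (1 - eps_t)
        pred = (pol * (X[:, j] - thr) >= 0).astype(int)
        e = (pred != y).astype(int)            # e_i = 0 if correctly classified, else 1
        w = w * beta ** (1 - e)                # update the sample weights
        stumps.append((j, thr, pol))
        alphas.append(np.log(1.0 / beta))      # alpha_t = log(1 / beta_t)
    return stumps, np.array(alphas)

def strong_classify(x, stumps, alphas):
    """H(x) = 1 iff the weighted weak votes reach half of the total weight."""
    votes = np.array([float(pol * (x[j] - thr) >= 0) for j, thr, pol in stumps])
    return int(votes @ alphas >= 0.5 * alphas.sum())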
Further, the method for locating the driver's eyes is as follows:
1) Compute the gradient matrix of the driver's face region image in the vertical direction.
2) Project the gradient matrix horizontally; the relative position of the eyes in the image is then determined from the projection together with the structural features of the driver's face.
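As the patent text does not reproduce the gradient and projection formulas, the following Python sketch shows one common realisation under stated assumptions: the vertical gradient is taken as the row-to-row grey-level difference, and the eye line is taken as the strongest horizontal-projection row in the upper part of the face region; the function name locate_eye_rows and the band fraction are illustrative choices, not values from the patent.

import numpy as np

def locate_eye_rows(face_gray, band=0.4):
    """face_gray: 2-D grey-level array of the detected face region."""
    img = face_gray.astype(np.float32)
    grad = np.abs(np.diff(img, axis=0))     # gradient of the image in the vertical direction
    proj = grad.sum(axis=1)                 # horizontal projection of the gradient matrix
    # by the structure of the face, the eyes lie in the upper part of the region,
    # so the strongest projection row inside that band is taken as the eye line
    upper = proj[: int(len(proj) * band)]
    eye_row = int(np.argmax(upper))
    return eye_row, proj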
Further, the method for computing the state of the driver's eyes is as follows:
1) The blink duration is the time required for one blink, i.e. for the eyes to close and then open again.
2) PERCLOS is the percentage of time within a unit interval during which the eyes are closed; the unit interval is taken as 6 s. In the calculation, N denotes the number of valid image frames collected within the 6 s window and P(t) denotes the function describing how the degree of eye opening changes over time; the proportion of eye-closure time within a single blink is then obtained from P(t).
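The following Python sketch illustrates how a PERCLOS statistic and blink statistics over a 6 s window could be computed from a per-frame eye-openness signal P(t); the 20 % closure criterion and the helper names perclos and blink_stats are assumptions made for the sketch, since the patent does not give the exact formulas.

import numpy as np

def perclos(openness, fps):
    """PERCLOS over the last 6 s: fraction of frames in which the eyes count as closed."""
    window = np.asarray(openness[-int(6 * fps):], dtype=float)
    n_valid = len(window)                      # N: valid image frames collected in 6 s
    if n_valid == 0:
        return 0.0
    closed = window < 0.2 * window.max()       # assumed closure criterion on P(t)
    return closed.sum() / n_valid

def blink_stats(closed_flags, fps):
    """Blink frequency (blinks per second) and mean blink duration (s)."""
    flags = np.asarray(closed_flags, dtype=bool)
    starts = np.flatnonzero(~flags[:-1] & flags[1:]) + 1   # open -> closed transitions
    ends = np.flatnonzero(flags[:-1] & ~flags[1:]) + 1     # closed -> open transitions
    durations = []
    for s in starts:
        later = ends[ends > s]
        if len(later):
            durations.append((later[0] - s) / fps)         # one blink: close then reopen
    freq = len(starts) / (len(flags) / fps)
    mean_duration = float(np.mean(durations)) if durations else 0.0
    return freq, mean_duration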
Further, the method for judging driver fatigue is as follows:
1) If the duration D(t) of the eye-closed state exceeds the threshold Th1 = 2.5, the driver is judged to be fatigued.
2) If the blink frequency exceeds the threshold Th2 = 0.6, the driver is judged to be fatigued.
3) If the PERCLOS value F exceeds the threshold Th3 = 4.5, the driver is judged to be fatigued.
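A minimal sketch of the threshold comparison, using the three threshold values stated above; the units of the thresholds are not given in the patent, so the interpretations in the comments are assumptions.

TH1_EYE_CLOSED = 2.5   # D(t): continuous eye-closure duration (assumed to be seconds)
TH2_BLINK_FREQ = 0.6   # blink frequency (assumed to be blinks per second)
TH3_PERCLOS = 4.5      # PERCLOS value F (unit and scale not stated in the patent)

def is_fatigued(eye_closed_duration, blink_frequency, perclos_value):
    """The driver is judged fatigued if any index exceeds its threshold."""
    return (eye_closed_duration > TH1_EYE_CLOSED
            or blink_frequency > TH2_BLINK_FREQ
            or perclos_value > TH3_PERCLOS)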
The beneficial effects of the invention are as follows:
The invention detects the driver's face with the AdaBoost algorithm, computes the vertical-direction gradient matrix of the face region image, and projects the gradient matrix horizontally. The relative position of the eyes in the image is obtained from the structural features of the driver's face, and the open or closed state of the eyes is determined according to the measured distance. The parameters of each eye state are then obtained according to the PERCLOS measurement principle, and the driver is finally judged to be fatigued or not by comparing each index with its set threshold. The method achieves high detection accuracy.
Embodiment
The present invention is further illustrated by the specific embodiment below, which is not to be taken as limiting the invention.
A driver fatigue detection method comprises the following steps:
S1: acquire driving images with a camera;
S2: detect the driver's face using the AdaBoost algorithm;
S3: locate the driver's eyes and extract their features;
S4: compute the state of the driver's eyes;
S5: judge whether the driver is driving while fatigued.
The steps of the AdaBoost algorithm are as follows:
1) Assume the training samples are (x1, y1), (x2, y2), ..., (xn, yn), where xi denotes a training sample drawn from the training sample set; yi denotes the class label: if yi = 0 the sample is a negative sample, i.e. not a driver face; if yi = 1 it is a positive sample, i.e. a driver face; n denotes the total number of training samples.
2) Initialise the weight vector w1,i, which describes the probability distribution of the training samples.
3) Iterate. In each round t, the weights are first normalised:
wt,i = wt,i / Σj wt,j.
A weak classifier ht is then trained on the weight-normalised training samples by the weak learning algorithm, and its error rate under the current weights is computed:
εt = Σi wt,i |ht(xi) − yi|.
The weak classifier with the smallest error rate is selected and added to the strong classifier.
The weight of each sample is then updated according to the selected optimal classifier:
wt+1,i = wt,i · βt^(1−ei),
where ei = 0 if the i-th sample is classified correctly, ei = 1 if it is misclassified, and βt = εt / (1 − εt).
4) Assume the whole process is repeated T times; the strong classifier finally obtained can be described as:
H(x) = 1 if Σt αt ht(x) ≥ (1/2) Σt αt, and H(x) = 0 otherwise,
where αt = log(1 / βt).
The method for locating the driver's eyes is as follows:
1) Compute the gradient matrix of the driver's face region image in the vertical direction.
2) Project the gradient matrix horizontally; the relative position of the eyes in the image is then determined from the projection together with the structural features of the driver's face.
The method for computing the state of the driver's eyes is as follows:
1) The blink duration is the time required for one blink, i.e. for the eyes to close and then open again.
2) PERCLOS is the percentage of time within a unit interval during which the eyes are closed; the unit interval is taken as 6 s. In the calculation, N denotes the number of valid image frames collected within the 6 s window and P(t) denotes the function describing how the degree of eye opening changes over time; the proportion of eye-closure time within a single blink is then obtained from P(t).
The method for judging driver fatigue is as follows:
1) If the duration D(t) of the eye-closed state exceeds the threshold Th1 = 2.5, the driver is judged to be fatigued.
2) If the blink frequency exceeds the threshold Th2 = 0.6, the driver is judged to be fatigued.
3) If the PERCLOS value F exceeds the threshold Th3 = 4.5, the driver is judged to be fatigued.
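To show how steps S1-S5 fit together, the following Python sketch strings the earlier helper sketches into one processing loop; OpenCV's Haar cascade (itself an AdaBoost-based detector) is used here as a stand-in for the patent's own trained classifier, and the mapping of the computed indices to the patent's thresholds is an assumption.

import itertools

import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def process_stream(camera_index=0, fps=25):
    cap = cv2.VideoCapture(camera_index)                      # S1: acquire driving images
    openness = []
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = face_cascade.detectMultiScale(gray, 1.1, 5)   # S2: face detection
            if len(faces) == 0:
                continue
            x, y, w, h = max(faces, key=lambda f: f[2] * f[3])    # keep the largest face
            face = gray[y:y + h, x:x + w]
            eye_row, proj = locate_eye_rows(face)                 # S3: eye localisation
            openness.append(float(proj[eye_row]))                 # crude eye-openness measure
            if len(openness) >= 6 * fps:                          # one 6 s window collected
                window = np.asarray(openness[-6 * fps:])
                closed = window < 0.2 * window.max()              # assumed closure criterion
                freq, _ = blink_stats(closed, fps)                # S4: eye-state statistics
                f_value = perclos(window, fps)
                # longest continuous eye-closure D(t) in seconds
                longest = max((sum(1 for _ in g)
                               for k, g in itertools.groupby(closed) if k), default=0) / fps
                if is_fatigued(longest, freq, f_value * 100):     # S5: fatigue judgement
                    print("fatigue warning")
                openness = openness[-6 * fps:]
    finally:
        cap.release()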

Claims (5)

1. A driver fatigue detection method, characterised in that it comprises the following steps:
S1: acquiring driving images with a camera;
S2: detecting the driver's face using the AdaBoost algorithm;
S3: locating the driver's eyes and extracting their features;
S4: computing the state of the driver's eyes;
S5: judging whether the driver is driving while fatigued.
2. The driver fatigue detection method according to claim 1, characterised in that the steps of the AdaBoost algorithm are as follows:
1) assume the training samples are (x1, y1), (x2, y2), ..., (xn, yn), where xi denotes a training sample drawn from the training sample set; yi denotes the class label: if yi = 0 the sample is a negative sample, i.e. not a driver face; if yi = 1 it is a positive sample, i.e. a driver face; n denotes the total number of training samples;
2) initialise the weight vector w1,i, which describes the probability distribution of the training samples;
3) iterate: in each round t, the weights are first normalised as wt,i = wt,i / Σj wt,j; a weak classifier ht is then trained on the weight-normalised training samples by the weak learning algorithm, and its error rate under the current weights is computed as εt = Σi wt,i |ht(xi) − yi|; the weak classifier with the smallest error rate is selected and added to the strong classifier; the weight of each sample is then updated according to the selected optimal classifier as wt+1,i = wt,i · βt^(1−ei), where ei = 0 if the i-th sample is classified correctly, ei = 1 if it is misclassified, and βt = εt / (1 − εt);
4) assume the whole process is repeated T times; the strong classifier finally obtained is H(x) = 1 if Σt αt ht(x) ≥ (1/2) Σt αt and H(x) = 0 otherwise, where αt = log(1 / βt).
3. The driver fatigue detection method according to claim 1, characterised in that the method for locating the driver's eyes is as follows:
1) compute the gradient matrix of the driver's face region image in the vertical direction;
2) project the gradient matrix horizontally.
4. The driver fatigue detection method according to claim 1, characterised in that the method for computing the state of the driver's eyes is as follows:
1) the blink duration is the time required for one blink, i.e. for the eyes to close and then open again;
2) PERCLOS is the percentage of time within a unit interval during which the eyes are closed, the unit interval being taken as 6 s; in the calculation, N denotes the number of valid image frames collected within the 6 s window and P(t) denotes the function describing how the degree of eye opening changes over time, from which the proportion of eye-closure time within a single blink is obtained.
5. The driver fatigue detection method according to claim 1, characterised in that the method for judging driver fatigue is as follows:
1) if the duration D(t) of the eye-closed state exceeds the threshold Th1 = 2.5, the driver is judged to be fatigued;
2) if the blink frequency exceeds the threshold Th2 = 0.6, the driver is judged to be fatigued;
3) if the PERCLOS value F exceeds the threshold Th3 = 4.5, the driver is judged to be fatigued.
CN201710328421.0A 2017-05-11 2017-05-11 Driver fatigue detection method Withdrawn CN106971177A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710328421.0A CN106971177A (en) 2017-05-11 2017-05-11 Driver fatigue detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710328421.0A CN106971177A (en) 2017-05-11 2017-05-11 Driver fatigue detection method

Publications (1)

Publication Number Publication Date
CN106971177A 2017-07-21

Family

ID=59331498

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710328421.0A Withdrawn CN106971177A (en) 2017-05-11 2017-05-11 A kind of driver tired driving detection method

Country Status (1)

Country Link
CN (1) CN106971177A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101216887A (en) * 2008-01-04 2008-07-09 浙江大学 An automatic computer authentication method for photographic faces and living faces
CN102054163A (en) * 2009-10-27 2011-05-11 南京理工大学 Method for testing driver fatigue based on monocular vision
CN101950355A (en) * 2010-09-08 2011-01-19 中国人民解放军国防科学技术大学 Method for detecting fatigue state of driver based on digital video

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
宫法明 (Gong Faming): "Optimised image recognition of traffic drivers' facial fatigue driving behaviour", 《计算机仿真》 (Computer Simulation) *

Similar Documents

Publication Publication Date Title
CN100592322C (en) An automatic computer authentication method for photographic faces and living faces
CN105426870B (en) A kind of face key independent positioning method and device
CN109359536B (en) Passenger behavior monitoring method based on machine vision
CN107491769A (en) Method for detecting fatigue driving and system based on AdaBoost algorithms
CN105260705B (en) A kind of driver's making and receiving calls behavioral value method suitable under multi-pose
CN109840565A (en) A kind of blink detection method based on eye contour feature point aspect ratio
CN102819733B (en) Rapid detection fuzzy method of face in street view image
CN106295522A (en) A kind of two-stage anti-fraud detection method based on multi-orientation Face and environmental information
CN103488993B (en) A kind of crowd's abnormal behaviour recognition methods based on FAST
CN101908140A (en) Biopsy method for use in human face identification
CN106156688A (en) A kind of dynamic human face recognition methods and system
CN102085099B (en) Method and device for detecting fatigue driving
CN109376608A (en) A kind of human face in-vivo detection method
CN107133568B (en) A kind of speed limit prompt and hypervelocity alarm method based on vehicle-mounted forward sight camera
CN104298963B (en) A kind of multi-pose fatigue monitoring method based on face shape regression model of robust
CN103903004A (en) Method and device for fusing multiple feature weights for face recognition
CN104182729B (en) Pedestrian detection method based on ARM embedded platforms
CN103020596A (en) Method for identifying abnormal human behaviors in power production based on block model
CN106503645A (en) Monocular distance-finding method and system based on Android
CN110226913A (en) A kind of self-service examination machine eyesight detection intelligent processing method and device
CN109353907A (en) The security prompt method and system of elevator operation
CN107169437A (en) The method for detecting fatigue driving of view-based access control model
CN111104817A (en) Fatigue detection method based on deep learning
CN110222608A (en) A kind of self-service examination machine eyesight detection intelligent processing method
CN110176128A (en) A kind of driver tired driving alarming processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20170721