CN104021370B - Driver status monitoring method and system based on visual information fusion - Google Patents

Driver status monitoring method and system based on visual information fusion

Info

Publication number
CN104021370B
CN104021370B CN201410208173.2A
Authority
CN
China
Prior art keywords
human eye
driver
image
vehicle
road
Prior art date
Application number
CN201410208173.2A
Other languages
Chinese (zh)
Other versions
CN104021370A (en)
Inventor
张根源
虞勤
杨思思
周泓
Original Assignee
浙江传媒学院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 浙江传媒学院
Priority to CN201410208173.2A priority Critical patent/CN104021370B/en
Publication of CN104021370A publication Critical patent/CN104021370A/en
Application granted granted Critical
Publication of CN104021370B publication Critical patent/CN104021370B/en


Abstract

The invention discloses a driver status monitoring method and system based on visual information fusion, comprising: capturing the driver's face image and the road image ahead of the vehicle and preprocessing both; performing face localization on the preprocessed face image to obtain the face region; locating the pupil centroid and pupil region within the face region, obtaining eye-state parameters, and tracking the eyes; detecting and identifying lane lines in the preprocessed road image; computing the angle between the vehicle's heading and the detected lane lines to judge whether the vehicle departs from its lane; and judging the driver's driving state from the eye features and the degree of lane departure. The invention fuses two computer-vision modalities, improving detection accuracy and realizing real-time monitoring of, and alarms for, abnormal driving states such as fatigue, drunkenness, and inattention, thereby ensuring safe driving.

Description

Driver status monitoring method and system based on visual information fusion

Technical field

The present invention relates to traffic engineering, and more particularly to a driver status monitoring method and system based on visual information fusion.

Background technology

A driver in an abnormal driving state can easily cause a road traffic accident; a safe and reliable on-board system for real-time monitoring of the driver's state is therefore of great significance.

At present, existing driver status monitoring techniques generally fall into three categories: one, methods based on the driver's physiological phenomena; two, methods based on the driver's control behavior; three, methods based on vehicle behavior. Specifically:

Methods based on physiological phenomena monitor the driver's state mainly by measuring physiological signals such as heart rate, pulse, brain waves, and electromyographic signals, or body-action parameters such as facial expression and blink frequency. These methods are reliable and accurate, but as contact measurements their application scenarios are limited and they are inconvenient to operate, so their practicality is modest.

Methods based on the driver's control behavior infer the driver's state mainly from how the driver operates the steering wheel, accelerator, and so on. Although, compared with the physiological methods, such non-contact methods are more operable, actual driving scenes are complex and changeable, so their accuracy is low and their extensibility and interference resistance are poor; their market prospects are therefore not optimistic.

Methods based on vehicle behavior are indirect monitoring methods: vision sensors, speed sensors, and the like monitor the vehicle's travel route, speed, acceleration, and other information, from which the driver's state is inferred. Such methods are operable and relatively cheap, but their scalability is poor.

Summary of the invention

In view of the above deficiencies in the prior art, it is an object of the present invention to provide a driver status monitoring method and system based on visual information fusion, to solve the problems of existing monitoring methods: not only is their accuracy low, but they easily interfere with normal driving.

The present invention is achieved through the following technical solutions:

A driver status monitoring method based on visual information fusion comprises the following steps:

S10, capturing the driver's face image and the road image ahead of the vehicle and preprocessing both;

S20, performing face localization on the preprocessed driver's face image to obtain the face region;

S30, locating the pupil centroid and pupil region within the face region, obtaining eye-state parameters, and tracking the eyes;

S40, detecting and identifying lane lines in the preprocessed road image;

S50, computing the angle between the vehicle's heading and the detected lane lines to judge whether the vehicle departs from its lane;

S60, judging the driver's driving state from the driver's eye features and the degree of lane departure, and triggering an alarm if the state is abnormal;

wherein step S30 further comprises:

S31, obtaining a binary image of the face region by adaptive-threshold binarization;

S32, performing a horizontal integral projection of gray values on the upper half of the binary image to obtain the horizontal position of the eyes;

S33, finding the maximum-gray-value point within the located eye region, performing region growing around that point, computing the centroid of the grown region as the pupil centroid, and tracking the eyes with a Kalman filter;

S34, computing the pupil area to judge the open/closed state of the eyes and obtain the eye parameters.

Further, step S40 comprises:

S41, applying Canny edge detection to the preprocessed road image to obtain a binary edge map;

S42, applying Hough-transform line detection to the binary image of the road image;

S43, analyzing the lines detected by the Hough transform with a horizontal-line detection method to find their vanishing point; the two lines intersecting at the vanishing point are the left and right boundaries of the lane line.

Further, step S50 comprises:

determining the angle of the road boundaries from the boundaries and vanishing point obtained in step S40, thereby obtaining the lane center line, i.e. the line through the vanishing point at half the angle between the boundaries.

If the angle between the vehicle's heading and the lane center line is within a preset threshold range, the vehicle is judged to be in a normal driving state; otherwise the vehicle is judged to be departing from its lane.

Another object of the present invention is to provide a driver status monitoring system based on visual information fusion, comprising:

an image acquisition and preprocessing unit, for capturing the driver's face image and the road image ahead of the vehicle and preprocessing both;

an eye localization and motion-parameter extraction unit, for performing face localization on the preprocessed driver's face image to obtain the face region, locating the pupil centroid and pupil region within the face region, obtaining eye-state parameters, and tracking the eyes;

a lane line detection and recognition unit, for detecting and identifying lane lines in the preprocessed road image;

a lane departure judging unit, for computing the angle between the vehicle's heading and the detected lane lines to judge whether the vehicle departs from its lane;

a driver status analysis unit, for judging the driver's driving state from the driver's eye features and the degree of lane departure, and triggering an alarm if the state is abnormal.

Further, the eye localization and motion-parameter extraction unit is further configured to:

obtain a binary image of the face region by adaptive-threshold binarization;

perform a horizontal integral projection of gray values on the upper half of the binary image to obtain the horizontal position of the eyes;

find the maximum-gray-value point within the located eye region, perform region growing around that point and compute the centroid of the grown region as the pupil centroid, and track the eyes with a Kalman filter; compute the pupil area to judge the open/closed state of the eyes and obtain the eye parameters.

Further, the lane line detection and recognition unit is specifically configured to:

apply Canny edge detection to the preprocessed road image to obtain a binary edge map;

apply Hough-transform line detection to the binary image of the road image;

analyze the lines detected by the Hough transform with a horizontal-line detection method to find their vanishing point, the two lines intersecting at the vanishing point being the left and right boundaries of the lane line.

Further, the lane departure judging unit is specifically configured to:

determine the angle of the road boundaries from the boundaries and vanishing point obtained by the lane line detection and recognition unit, thereby obtaining the lane center line, i.e. the line through the vanishing point at half the angle; if the angle between the vehicle's heading and the lane center line is within a preset threshold range, judge that the vehicle is in a normal driving state; otherwise judge that the vehicle is departing from its lane.

Compared with the prior art, the beneficial effects of the invention are: the invention fuses two kinds of visual information, from inside and outside the vehicle, realizing real-time monitoring of the driver's state by simultaneously monitoring eye-movement features inside the vehicle and the degree of lane departure outside the vehicle. This improves detection accuracy and thereby driving safety; at the same time the method is computationally fast and memory-light, enabling real-time monitoring on an embedded in-vehicle system.

Brief description of the drawings

Fig. 1 is a flow chart of the driver status monitoring method based on visual information fusion according to an embodiment of the present invention.

Fig. 2 is a flow chart of eye-parameter extraction according to an embodiment of the present invention.

Fig. 3 is a flow chart of lane departure detection according to an embodiment of the present invention.

Fig. 4 is the lane departure detection model of an embodiment of the present invention.

Fig. 5 is the Bayesian network structure fusing eye features and vehicle departure information according to an embodiment of the present invention.

Embodiment

As shown in Fig. 1, the driver status monitoring method based on visual information fusion comprises the following steps:

S10, capturing the driver's face image and the road image ahead of the vehicle and preprocessing both.

Real-time images are captured: an infrared camera inside the vehicle captures the driver's face in real time, while a camera outside the vehicle captures the road ahead. The images are passed to the image processing module for preprocessing, e.g. converting the captured images to single-channel grayscale as the data format requires, and filtering for noise reduction.
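As a concrete illustration of this preprocessing step, the sketch below converts an RGB frame to single-channel grayscale and applies a 3×3 median filter. It is a minimal numpy rendition, not the patent's implementation; the median-filter choice and function names are illustrative assumptions.

```python
import numpy as np

def to_gray(rgb):
    # Weighted-luminance conversion to a single-channel grayscale image
    return (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1]
            + 0.114 * rgb[..., 2]).astype(np.uint8)

def median_filter3(img):
    # 3x3 median filter for impulse-noise reduction
    padded = np.pad(img, 1, mode="edge")
    windows = [padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)]
    return np.median(np.stack(windows), axis=0).astype(np.uint8)

rgb = np.zeros((4, 4, 3), dtype=np.uint8)
rgb[..., 0] = 255                      # a pure-red frame
gray = to_gray(rgb)                    # every pixel becomes 76

noisy = np.zeros((5, 5), dtype=np.uint8)
noisy[2, 2] = 255                      # one impulse-noise pixel
denoised = median_filter3(noisy)       # the impulse is removed
```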

S20, performing face localization on the preprocessed driver's face image to obtain the face region.

As shown in Fig. 2, the face region is obtained, eliminating the influence of parts such as hair on eye detection. The specific method is as follows:

the moments M00, M10, M01, M02, M20 and the centroid (X, Y) of the image are calculated to obtain the width and height of the face region, where the moments of the image are defined as:

M00 = Σ_{x=0}^{m} Σ_{y=0}^{n} f(x, y)

M10 = Σ_{x=0}^{m} Σ_{y=0}^{n} x·f(x, y)

M01 = Σ_{x=0}^{m} Σ_{y=0}^{n} y·f(x, y)

M02 = Σ_{x=0}^{m} Σ_{y=0}^{n} y²·f(x, y)

M20 = Σ_{x=0}^{m} Σ_{y=0}^{n} x²·f(x, y)

where m and n are the width and height of the image respectively, and f(x, y) is the pixel value at image coordinate (x, y).

The centroid is calculated as:

X = M10 / M00

Y = M01 / M00

The width and height of the face region are calculated as:

width = 3·√XX

height = 3·√YY

where XX and YY are computed from the moment data as follows:

XX = M20 / M00 − X²

YY = M02 / M00 − Y²
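Under the definitions above, the moment computation can be sketched in numpy as below. The function name is illustrative, and the square root in the 3·√XX sizing reflects the interpretation that XX and YY are variances, so a root is needed to get a length; treat that as an assumption about the original formula.

```python
import numpy as np

def face_region_from_moments(f):
    # f is a 2-D gray image indexed f[y, x]; returns centroid and region size
    h, w = f.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    M00 = f.sum()
    M10, M01 = (xs * f).sum(), (ys * f).sum()
    M20, M02 = (xs ** 2 * f).sum(), (ys ** 2 * f).sum()
    X, Y = M10 / M00, M01 / M00              # centroid
    XX = M20 / M00 - X * X                   # variance along x
    YY = M02 / M00 - Y * Y                   # variance along y
    return X, Y, 3 * np.sqrt(XX), 3 * np.sqrt(YY)

# A uniform 8x8 block: the centroid lands at its center (3.5, 3.5)
X, Y, width, height = face_region_from_moments(np.ones((8, 8)))
```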

S30, locating the pupil centroid and pupil region within the face region, obtaining eye-state parameters, and tracking the eyes.

Within the face region obtained in step S20, the position of the eyes is coarsely located using adaptive thresholding and horizontal integral projection.

A binary image of the face region is obtained by adaptive-threshold binarization; considering that the eyes lie in the upper half of the face, a horizontal integral projection of gray values is performed on the upper half of the binary image to obtain the horizontal position of the eyes, with the formula:

H(x) = (1 / (y2 − y1)) · Σ_{y=y1}^{y2} I(x, y)

where y1 and y2 are the lower and upper bounds of the abscissa of the face region in the image, x is the ordinate of the image, I(x, y) is the pixel value at coordinate (x, y), and H(x) is the mean value after horizontal integration of row x of pixels.
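A minimal numpy sketch of this projection on a synthetic binarized face (sizes and values are illustrative): the dark eye band shows up as the minimum of H(x).

```python
import numpy as np

def horizontal_projection(img, y1, y2):
    # H(x) = (1/(y2-y1)) * sum_{y=y1..y2} I(x, y): mean gray value of row x
    return img[:, y1:y2 + 1].mean(axis=1)

# Synthetic binarized upper face: white (255) skin, dark (0) eye band at rows 3-4
face = np.full((10, 12), 255, dtype=np.uint8)
face[3:5, 2:10] = 0
H = horizontal_projection(face, 0, 11)
eye_row = int(np.argmin(H))     # the darkest rows locate the eyes
```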

The pupil centroid is located within the located eye region, and the eyes are tracked using a Kalman filter.

Within the located eye region, the maximum-gray-value point is found, region growing is performed around that point, and the centroid of the grown region is then computed as the centroid of the pupil. The centroid is computed in the same way as in face localization. To improve efficiency, once the eye-position information of a frame has been obtained, the eye position is tracked with a Kalman filter.
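The Kalman tracking can be sketched with a standard constant-velocity model over the pupil centroid. The transition and measurement matrices and the noise covariances below are textbook defaults, not values taken from the patent.

```python
import numpy as np

# Constant-velocity Kalman filter over state [px, py, vx, vy]
F = np.array([[1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.]])      # state transition (unit time step)
H = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.]])      # only the pupil centroid is measured
Q = np.eye(4) * 1e-4                  # process noise (assumed)
R = np.eye(2) * 1e-2                  # measurement noise (assumed)

def kalman_step(x, P, z):
    # Predict, then correct with the measured pupil centroid z
    x = F @ x
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.zeros(4), np.eye(4)
for t in range(20):                   # pupil drifting at (1, 2) px per frame
    x, P = kalman_step(x, P, np.array([1.0 * t, 2.0 * t]))
# x[2:] now estimates the velocity, close to (1, 2)
```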

Using the pupil region located in the steps above, the state of the eyes is judged by computing the pupil area, yielding the eye parameters. The open/closed state of the eyes is judged with the P80 criterion, i.e. the eye is judged closed when the eyelid covers 80% of the pupil. The pupil area can be obtained by counting the non-zero pixels of the pupil region.
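The P80 judgement then reduces to a pixel count; a sketch follows, in which reading "80% covered" as "fewer than 20% of fully-open pupil pixels visible" is an assumed operationalization of the criterion.

```python
import numpy as np

def eye_open(pupil_mask, open_area, visible_frac=0.2):
    # P80: the eye counts as closed once the eyelid covers 80% of the pupil,
    # i.e. fewer than 20% of the fully-open pupil pixels remain visible
    return np.count_nonzero(pupil_mask) >= visible_frac * open_area

full = np.zeros((10, 10), dtype=np.uint8)
full[2:8, 2:8] = 1                      # fully open pupil: 36 pixels
sliver = np.zeros_like(full)
sliver[2, 2:7] = 1                      # eyelid almost closed: 5 pixels

is_open = eye_open(full, 36)            # True
is_closed = not eye_open(sliver, 36)    # True: 5/36 < 20%
```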

S40, detecting and identifying lane lines in the preprocessed road image, specifically:

S41, as shown in Fig. 3, applying Canny edge detection to the road image preprocessed in step S10 to obtain a binary edge map.

S42, applying Hough-transform lane line detection and identification to the binary map obtained in step S41.

S43, since the Hough transform detects many lines, including not only lane lines but also stray lines, analyzing them with a horizontal-line detection method to find the vanishing point of the lines; the two lines intersecting at the vanishing point are then the left and right boundaries of the lane line.
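To keep the sketch self-contained, the Hough voting of step S42 is implemented directly in numpy rather than with an imaging library, and a synthetic edge map stands in for the Canny output. The strongest accumulator cell recovers the (ρ, θ) of the dominant line.

```python
import numpy as np

def hough_strongest_line(edges, n_theta=180):
    # Vote in a (rho, theta) accumulator for every edge pixel, then return
    # the (rho, theta) of the strongest line: x*cos(theta) + y*sin(theta) = rho
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    for x, y in zip(xs, ys):
        rhos = (x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1   # one vote per theta per pixel
    r, t = np.unravel_index(np.argmax(acc), acc.shape)
    return int(r) - diag, float(np.rad2deg(thetas[t]))

edges = np.zeros((50, 50), dtype=np.uint8)
np.fill_diagonal(edges, 1)             # synthetic edge map: the line y = x
rho, theta_deg = hough_strongest_line(edges)
# y = x in normal form is x*cos(135 deg) + y*sin(135 deg) = 0
```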

S50, computing the angle between the vehicle's heading and the detected lane lines to judge whether the vehicle departs from its lane:

First, the center line of the lane is calculated: from the road boundaries and vanishing point obtained in step S40, the angle of the road boundaries can be determined, giving the lane center line, i.e. the line through the vanishing point at half the angle between the boundaries.

As shown in Fig. 4, when the vehicle is driving normally, the angle α between the travel direction and the lane center line should lie within a preset threshold αT; when the angle exceeds the preset threshold, the vehicle is considered to be departing from its lane. The preset threshold αT depends on the camera parameters, road conditions, and so on, and is set accordingly; many experiments show the effect is relatively good when the value is 10.

In Fig. 4, Vp denotes the vanishing point, v the speed of the vehicle, and vl the component of the vehicle speed along the center-line direction.
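The threshold test of Fig. 4 is a one-line comparison. The value αT = 10 follows the experiments reported above; the degree-based angle convention is an assumption of the sketch.

```python
# Lane-departure test from Fig. 4: the vehicle departs when the angle alpha
# between its heading and the lane center line exceeds the threshold alpha_T
ALPHA_T = 10.0   # degrees; 10 is the empirically good setting reported above

def lane_departure(heading_deg, centerline_deg, alpha_t=ALPHA_T):
    alpha = abs(heading_deg - centerline_deg)   # angle between the directions
    return alpha > alpha_t                      # True => departing the lane

ok = not lane_departure(92.0, 90.0)    # 2 degrees off: normal driving
bad = lane_departure(75.0, 90.0)       # 15 degrees off: lane departure
```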

S60, judging the driver's driving state from the driver's eye features and the degree of lane departure, and triggering an alarm if the state is abnormal.

In this embodiment, a dynamic Bayesian network fuses the driver's eye-parameter information with the vehicle departure information. The Bayesian network parameters are obtained by learning, and the resulting network finally decides whether the driver is in an abnormal driving state. Its mathematical expression is as follows; the Bayesian network structure can be described by the topology in Fig. 5, where a) represents the initial network and b) the transfer network.

P(F | E, C) = P(F, E, C) / P(E, C) = P(F, E, C) / Σ_{i=0}^{1} P(F_i, E, C)

where the parent node F represents the driver state, the child node E the eye-parameter state, and the child node C the lane-departure state; F, E, and C each have two states, 0 and 1, with 0 representing occurring and 1 not occurring. The edges between parent and child nodes carry the conditional probabilities of each state; t and t+1 denote time, and the edge between a node at time t and at time t+1 carries the state-transition probability.

The Bayesian network parameters are obtained by sample learning, yielding the prior probability of each driving state and the conditional and transition probabilities of each feature. Finally, for the currently available state, the computed result is used to judge the driver's state.
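A static slice of this inference (the posterior over F given E and C, without the time-transition part of the dynamic network) can be sketched as below; the joint-probability table is purely illustrative, standing in for the learned parameters.

```python
import numpy as np

# Joint distribution P(F, E, C); axis order (F, E, C), each variable binary:
# state 0 = occurring (abnormal / eyes closed / lane departed), 1 = not.
# The numbers are illustrative placeholders for the learned parameters.
P = np.array([[[0.20, 0.05],
               [0.05, 0.02]],        # F = 0: abnormal driver state
              [[0.03, 0.10],
               [0.10, 0.45]]])       # F = 1: normal driver state

def posterior_abnormal(e, c):
    # P(F=0 | E=e, C=c) = P(F=0, e, c) / sum_i P(F=i, e, c)
    return P[0, e, c] / P[:, e, c].sum()

p = posterior_abnormal(0, 0)         # eyes closed and lane departed
# p = 0.20 / (0.20 + 0.03): strong evidence of an abnormal state
```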

Corresponding to the above driver status monitoring method based on visual information fusion, the present invention also provides a driver status monitoring system based on visual information fusion; its components and their functions correspond one-to-one with the steps of the method described above and are not discussed again here.

All or part of the above technical solutions provided by the embodiments of the present invention can be implemented by program instructions executed on related hardware; the program may be stored in a readable storage medium, where the storage medium includes ROM, RAM, magnetic disk, optical disk, and other media capable of storing program code.

The foregoing describes only preferred embodiments of the present invention and is not intended to limit the invention; any modification, equivalent substitution, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (6)

1. A driver status monitoring method based on visual information fusion, characterized by comprising the following steps:
S10, capturing the driver's face image and the road image ahead of the vehicle and preprocessing both;
S20, performing face localization on the preprocessed driver's face image to obtain the face region;
S30, locating the pupil centroid and pupil region within the face region, obtaining eye-state parameters, and tracking the eyes;
S40, detecting and identifying lane lines in the preprocessed road image;
S50, computing the angle between the vehicle's heading and the detected lane lines to judge whether the vehicle departs from its lane;
S60, fusing the driver's eye-parameter information with the vehicle departure information, obtaining the Bayesian network parameters by learning, thereby obtaining the prior probability of each driving state and the conditional and transition probabilities of each feature, and judging the driver's state from the resulting Bayesian network using the computed result; if the state is abnormal, triggering an alarm;
wherein step S30 further comprises:
S31, obtaining a binary image of the face region by adaptive-threshold binarization;
S32, performing a horizontal integral projection of gray values on the upper half of the binary image to obtain the horizontal position of the eyes;
S33, finding the maximum-gray-value point within the located eye region, performing region growing around that point, computing the centroid of the grown region as the pupil centroid, and tracking the eyes with a Kalman filter;
S34, computing the pupil area to judge the open/closed state of the eyes and obtain the eye parameters.
2. The driver status monitoring method based on visual information fusion according to claim 1, characterized in that step S40 comprises:
S41, applying Canny edge detection to the preprocessed road image to obtain a binary edge map;
S42, applying Hough-transform line detection to the binary image of the road image;
S43, analyzing the lines detected by the Hough transform with a horizontal-line detection method to find their vanishing point; the two lines intersecting at the vanishing point are the left and right boundaries of the lane line.
3. The driver status monitoring method based on visual information fusion according to claim 2, characterized in that step S50 comprises:
determining the angle of the road boundaries from the boundaries and vanishing point obtained in step S40, thereby obtaining the lane center line, i.e. the line through the vanishing point at half the angle;
if the angle between the vehicle's heading and the lane center line is within a preset threshold range, judging that the vehicle is in a normal driving state; otherwise judging that the vehicle is departing from its lane.
4. A driver status monitoring system based on visual information fusion, characterized by comprising:
an image acquisition and preprocessing unit, for capturing the driver's face image and the road image ahead of the vehicle and preprocessing both;
an eye localization and motion-parameter extraction unit, for performing face localization on the preprocessed driver's face image to obtain the face region, locating the pupil centroid and pupil region within the face region, obtaining eye-state parameters, and tracking the eyes;
a lane line detection and recognition unit, for detecting and identifying lane lines in the preprocessed road image;
a lane departure judging unit, for computing the angle between the vehicle's heading and the detected lane lines to judge whether the vehicle departs from its lane;
a driver status analysis unit, for fusing the driver's eye-parameter information with the vehicle departure information, obtaining the Bayesian network parameters by learning, thereby obtaining the prior probability of each driving state and the conditional and transition probabilities of each feature, and judging the driver's state from the resulting Bayesian network using the computed result; if the state is abnormal, triggering an alarm;
wherein the eye localization and motion-parameter extraction unit is further configured to:
obtain a binary image of the face region by adaptive-threshold binarization;
perform a horizontal integral projection of gray values on the upper half of the binary image to obtain the horizontal position of the eyes;
find the maximum-gray-value point within the located eye region, perform region growing around that point and compute the centroid of the grown region as the pupil centroid, and track the eyes with a Kalman filter;
compute the pupil area to judge the open/closed state of the eyes and obtain the eye parameters.
5. The driver status monitoring system according to claim 4, characterized in that the lane line detection and recognition unit is specifically configured to:
apply Canny edge detection to the preprocessed road image to obtain a binary edge map;
apply Hough-transform line detection to the binary image of the road image;
analyze the lines detected by the Hough transform with a horizontal-line detection method to find their vanishing point, the two lines intersecting at the vanishing point being the left and right boundaries of the lane line.
6. The driver status monitoring system according to claim 5, characterized in that the lane departure judging unit is specifically configured to:
determine the angle of the road boundaries from the boundaries and vanishing point obtained by the lane line detection and recognition unit, thereby obtaining the lane center line, i.e. the line through the vanishing point at half the angle; if the angle between the vehicle's heading and the lane center line is within a preset threshold range, judge that the vehicle is in a normal driving state; otherwise judge that the vehicle is departing from its lane.
CN201410208173.2A 2014-05-16 2014-05-16 Driver status monitoring method and system based on visual information fusion CN104021370B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410208173.2A CN104021370B (en) 2014-05-16 2014-05-16 Driver status monitoring method and system based on visual information fusion


Publications (2)

Publication Number Publication Date
CN104021370A CN104021370A (en) 2014-09-03
CN104021370B true CN104021370B (en) 2018-01-23

Family

ID=51438115

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410208173.2A CN104021370B (en) 2014-05-16 2014-05-16 Driver status monitoring method and system based on visual information fusion

Country Status (1)

Country Link
CN (1) CN104021370B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104332055B (en) * 2014-11-18 2016-04-06 吉林大学 A kind of traffic conflict method for quickly identifying based on driver's pupil diameter
CN105912984A (en) * 2016-03-31 2016-08-31 大连楼兰科技股份有限公司 Auxiliary driving method capable of realizing multi-state information fusion
CN106886217A (en) * 2017-02-24 2017-06-23 安科智慧城市技术(中国)有限公司 Automatic navigation control method and apparatus
CN107959756A (en) * 2017-11-30 2018-04-24 西安科锐盛创新科技有限公司 The system and method for electronic equipment are automatically closed in sleep

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509418A (en) * 2011-10-11 2012-06-20 东华大学 Fatigue driving estimation and early-warning method and device of multi-sensor information fusion
CN103310202A (en) * 2013-06-27 2013-09-18 西安电子科技大学 System and method for guaranteeing driving safety

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6502035B2 (en) * 2000-08-02 2002-12-31 Alfred B. Levine Automotive safety enhansing system


Also Published As

Publication number Publication date
CN104021370A (en) 2014-09-03


Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180123

Termination date: 20180516