CN101877051A - Driver attention state monitoring method and device - Google Patents

Driver attention state monitoring method and device

Info

Publication number
CN101877051A
Authority
CN
China
Prior art keywords
driver
face
eyes
distance
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2009102334491A
Other languages
Chinese (zh)
Inventor
刘志强
汪澎
秦洪懋
宋世亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu University
Original Assignee
Jiangsu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu University filed Critical Jiangsu University
Priority to CN2009102334491A priority Critical patent/CN101877051A/en
Publication of CN101877051A publication Critical patent/CN101877051A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The invention discloses a driver attention state monitoring method and device. The output port of a camera is connected to the input port of a video decoder, and a digital signal processor is externally connected to the video decoder, a memory, a power supply, and an alarm buzzer, respectively. The method comprises the following steps: acquiring a face video detection image of the driver with the camera; taking the transverse width between the driver's eyes and the longitudinal distance from the mouth to the midpoint of the line connecting the eyes, detected under the normal state, as the standard distances for locating the face, and resetting the standard distances every time the face is located; comparing the distances actually detected during driving with the standard distances; and judging whether the driver's attention is dispersed according to the change obtained by comparing the actually detected distances with the standard distances. The invention can monitor the whole process by which driver fatigue and distraction form, judge whether the driver is fatigued or distracted, capture the driver's distraction state in time, warn of dangerous driving behavior such as distraction, and safeguard the driver's safe driving.

Description

Driver attention state monitoring method and device
Technical field
The present invention relates to the technical field of traffic and transportation engineering, and in particular to a method and apparatus for detecting a driver's attention.
Background technology
When a driver is fatigued, distracted by the surroundings, answering a mobile phone, or attending to a child in the car, he or she often loses control of the vehicle's direction, and the driving behavior becomes discontinuous and random; the distinguishing features are phenomena such as lane departure and rear-end collision accidents. Addressing these characteristics, Wang Rongben, Guo Lie et al., in "Research on a machine-vision-based comprehensive traffic safety guarantee system" (Journal of Shandong Jiaotong University, Vol. 14, No. 2, July 2006), introduced the use of machine vision to monitor the motion features of the driver's eyes, mouth, head and other organs, established a comprehensive quantitative evaluation model of driver fatigue and distraction based on the detected visual information, and realized real-time and effective dynamic monitoring and early warning; they also systematically studied forward-vehicle detection methods and the assessment criteria for safe following distance to realize collision-avoidance warning, and analyzed lane-departure warning, safe-following-distance warning, driver fatigue and distraction monitoring, and pedestrian monitoring technologies respectively. There, driver fatigue detection is mainly based on blink frequency, and mouth-feature detection is used to judge whether the driver is yawning because of fatigue.
At present, such driver fatigue detection remains at the research stage, and no specific embodiment has been proposed; it therefore has significant limitations in practical applications. Driving is a process that requires sustained, highly concentrated attention, and a momentary lapse may cause a serious accident. Moreover, the driver's progression from an alert waking state through distraction, listlessness and drowsiness to fatigue and finally dozing off is itself a gradual process. If the driver is detected only when already yawning from fatigue, or even dozing off, it may be too late, an accident may be unavoidable, and the driver's life and property cannot be effectively protected.
Summary of the invention
The objective of the present invention is to overcome the deficiencies of the prior art and to provide a safe, objective, real-time monitoring method and device for ensuring the driver's traffic safety, realizing vehicle-mounted, contactless, real-time, all-weather detection of driver attention, judging in real time whether the driver's attention is dispersed, and warning of distracted dangerous driving behavior in real time.
The method of the present invention adopts the following steps in sequence:
1) acquiring a driver face video detection image with a camera, taking the transverse width between the driver's eyes and the longitudinal distance from the mouth to the midpoint of the line connecting the eyes, detected under the driver's normal state, as the standard distances for locating the face, and resetting the standard distances every time the face is located;
2) comparing, with a digital signal processor, the distances actually detected during driving with said standard distances;
3) judging whether the driver's attention is dispersed according to the change of the actually detected distances relative to the standard distances (a minimal illustrative sketch of this comparison follows the steps).
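As an illustration only, and not as the claimed method itself, a minimal C sketch of steps 1)-3) might keep the standard distances in a small structure, reset them at each face location, and flag a deviation beyond a chosen tolerance; the structure, function names, and tolerance used here are assumptions made for this sketch, not taken from the patent:

/* Illustrative sketch only: all identifiers and the tolerance are assumptions. */
typedef struct {
    double std_pupil_dist;   /* standard transverse width between the eyes   */
    double std_mouth_dist;   /* standard mouth-to-eye-line-midpoint distance */
} StandardDistances;

/* Step 1): reset the standard distances each time the face is located. */
static void reset_standard(StandardDistances *s, double pupil, double mouth)
{
    s->std_pupil_dist = pupil;
    s->std_mouth_dist = mouth;
}

/* Steps 2)-3): compare the currently detected distances with the standard
 * distances and report whether they deviate by more than a relative tolerance. */
static int attention_dispersed(const StandardDistances *s,
                               double pupil, double mouth, double tol)
{
    double dp = pupil - s->std_pupil_dist;
    double dm = mouth - s->std_mouth_dist;
    if (dp < 0) dp = -dp;
    if (dm < 0) dm = -dm;
    return (dp > tol * s->std_pupil_dist) || (dm > tol * s->std_mouth_dist);
}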
The technical scheme adopted by the device of the present invention is: the output of a camera is connected to the input of a video decoder, and a digital signal processor is externally connected to the video decoder, a memory, a power supply, and an alarm buzzer, respectively.
The beneficial effects of the invention are: on the basis of analyzing the mechanism by which driver attention is produced, the present invention proposes an attention detection method and device, analyzes the driver's visual cognitive process, detects the T-shaped line of the driver's face in real time, estimates the distribution of visual attention, and analyzes and evaluates driving safety. Through the detection of attention, the whole process by which driver fatigue and distraction form is monitored, the safety of the driver's driving behavior and driving state is analyzed, whether the driver is fatigued or distracted is judged, the driver's distraction state is captured in time, and distracted dangerous driving behavior is warned of, prompting the driver to pay attention, so as to ensure the driver's safe driving.
Description of drawings
The present invention is described in more detail below in conjunction with the drawings and specific embodiments.
Fig. 1 is the monitoring device structural representation.
Fig. 2 is the alarm module structured flowchart.
Fig. 3 is the monitoring method process flow diagram.
Fig. 4 is a diagram of the driver attention characteristic parameters.
Embodiment:
As shown in Fig. 1, the driver attention state monitoring device is composed of a camera 1, a video decoder 2, a digital signal processor 3, a memory 4, a power supply 5 and an alarm buzzer 6. The output of camera 1 is connected to the input of video decoder 2, and digital signal processor 3 is externally connected to video decoder 2, memory 4, power supply 5 and alarm buzzer 6, respectively. Camera 1 is a CCD camera, video decoder 2 is a TVP5150 video decoder, digital signal processor 3 is a TMS320DM642 DSP chip, and memory 4 is a flash program memory.
Camera 1 is installed in the cab near the instrument panel in front of the driver or at the upper front of the driver's head; camera 1 performs video image acquisition to obtain the driver's facial state information in real time. The acquired video signal is input to video decoder 2, which converts the NTSC or PAL video signal into a digital video signal. In the video image acquisition process, the PAL mode is selected, 30 frames are acquired per second, each frame is 720 x 576 pixels, and the output format is ITU-R BT.656.
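For orientation only, the buffering requirement implied by these acquisition parameters can be estimated as below; this is a back-of-the-envelope sketch assuming the usual 8-bit 4:2:2 sampling (2 bytes per pixel) of ITU-R BT.656, and is not code from the patent:

#include <stdio.h>

/* Rough buffer-size estimate for 720 x 576 pixels at 30 frames per second,
 * assuming 8-bit 4:2:2 BT.656 sampling, i.e. 2 bytes per pixel (assumption). */
int main(void)
{
    const unsigned long width  = 720;
    const unsigned long height = 576;
    const unsigned long bytes_per_pixel = 2;   /* assumed 4:2:2 */
    const unsigned long fps = 30;

    unsigned long frame_bytes = width * height * bytes_per_pixel;
    unsigned long rate_bytes  = frame_bytes * fps;

    printf("bytes per frame:  %lu\n", frame_bytes);   /* 829440   */
    printf("bytes per second: %lu\n", rate_bytes);    /* 24883200 */
    return 0;
}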
The output video signal is first stored. The storage management of the input video data in digital signal processor 3 is initialized by the ICETEKDM642PCIBoardInit() function, which initializes the DM642 hardware, including setting the buffer offsets and initializing the buffer variables, and provides the basis for loading the driver of the video interface. The FVID series of functions FVID_create(), FVID_control(), FVID_alloc() and FVID_exchange() implements the exchange between the application buffers and the driver buffers. When the device driver of the input video interface is configured, screen buffers are set up, and the device driver manages these buffers for the acquisition of real-time video data. The steps comprise extracting video frames, allocating buffers, initializing the video driver, configuring the codec, capturing video, and allocating memory; the implementing code is as follows:
EVMDM642_vCapParamsChan.segId = EXTERNALHEAP;            /* place capture buffers in external memory         */
EVMDM642_vCapParamsTVP5150A.hI2C = EVMDM642_I2C_hI2C;    /* I2C handle used to configure the TVP5150 decoder */
capChan = FVID_create("/VP0CAPTURE/A/0", IOM_INPUT, &status, (Ptr)&EVMDM642_vCapParamsChan, NULL);  /* open the capture channel on video port 0 */
FVID_control(capChan, VPORT_CMD_EDC_BASE + EDC_CONFIG, (Ptr)&EVMDM642_vCapParamsTVP5150A);          /* configure the external decoder (TVP5150) */
FVID_control(disChan, VPORT_CMD_START, NULL);            /* start the display channel                        */
FVID_control(capChan, VPORT_CMD_START, NULL);            /* start video capture                              */
FVID_alloc(disChan, &disFrameBuf);                       /* obtain a display frame buffer                    */
FVID_alloc(capChan, &capFrameBuf);                       /* obtain the first captured frame buffer           */
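The text above also mentions FVID_exchange(); a typical capture loop in this driver model alternately exchanges application and driver buffers, roughly as sketched below. This is a hedged illustration of the usual DSP/BIOS FVID usage pattern, not code reproduced from the patent; process_frame() is a hypothetical placeholder for the face detection described later.

/* Sketch of a typical FVID capture loop (assumption: standard DSP/BIOS FVID usage). */
for (;;) {
    /* Hand the previous buffer back to the driver and receive the newest captured frame. */
    FVID_exchange(capChan, &capFrameBuf);

    /* Run the face / T-shaped-line detection on the captured frame (placeholder). */
    process_frame(capFrameBuf);

    /* Return the processed frame to the display channel, if a display is used. */
    FVID_exchange(disChan, &disFrameBuf);
}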
For video processing, digital signal processor 3 then performs high-speed processing of the various complex computations on the video. The program/data space of the DM642 accesses the external memory (Flash4: M x 8 bits) through the external memory interface.
Once a driver attention dispersion state is found to occur, alarm buzzer 6 is enabled immediately to give an alarm. As shown in Fig. 2, the communication between the ARM module and the DM642 is realized using the external expansion bus of the LPC2214 chip and the HPI interface of the DM642, and 8 I/O pins of the DSP are used to control the ARM so as to control the RS232 serial port. The functions are as follows (a brief usage sketch is given after the list):
ARMInit(): initialization routine for communication between the DSP and the ARM; it is called before any ARM resource is used;
SndByte(int val): sends data through the serial port; the data is transmitted via the ARM's UART0;
RcvByte(void): receives data through the serial port; the data received by the ARM's UART0 is delivered to the DSP;
IOCfg(int val): I/O port configuration routine; 1 configures the port as output, 2 as input;
IOWR(int val): writes data to the I/O port; writing 1 sets a high level, 0 a low level;
IORD(void): reads data from the I/O port.
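Assuming the function list above, a minimal sketch of raising the alarm notification through the ARM side might look as follows; the declared return types and the alarm byte value are assumptions made for illustration, not values given in the patent:

/* Illustrative only: return types and ALARM_BYTE are assumed. */
extern void ARMInit(void);
extern void SndByte(int val);

#define ALARM_BYTE 0x01

void report_distraction(void)
{
    static int initialized = 0;
    if (!initialized) {
        ARMInit();          /* must be called before any ARM resource is used */
        initialized = 1;
    }
    SndByte(ALARM_BYTE);    /* forwarded over RS232 through the ARM's UART0 */
}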
As shown in Fig. 3, the present invention obtains the driver attention state information by monitoring in real time the transverse width between the driver's eyes (pupil distance) and the longitudinal distance from the mouth to the midpoint of the line connecting the eyes. The driver's face image is input through camera 1. At the initial face location, the T-shaped line distances of the face detected under the driver's normal state are preset as the standard distances; said T-shaped line distances are the transverse width between the driver's eyes (pupil distance) and the longitudinal distance from the mouth to the midpoint of the line connecting the eyes, and the standard distances are reset every time the face is located. The T-shaped line distances actually detected during driving are compared with the standard distances, and whether the driver's attention is dispersed is judged according to the change of the T-shaped line distances detected in real time relative to the standard distances. The specific method is as follows:
The video detection image is acquired with camera 1. The smooth(), highlow() and regionfind() functions are used in turn to perform smoothing filtering of the video sequence, threshold-based region segmentation followed by conversion to a binary image, and searching of the eye and mouth regions; noise reduction through a low-pass filter then yields the driver face detection image. The AdaBoost algorithm is used to select extended features from the driver's face image and realize face contour detection, obtaining the number and positions of the detected driver faces; at the same time, the optical-flow method is adopted to make full use of the correlation between successive video frames and estimate the inter-frame continuity of the video image, likewise obtaining the number and positions of the detected driver faces. The AdaBoost detection results are deducted from the optical-flow detection results, and the driver face region is finally located. On the basis of the determined face region, the OTSU algorithm is adopted for binarization segmentation to obtain the feature circles comprising the hair, eyebrows, mouth and other regions that differ greatly from the skin color. Ellipse fitting is then performed on the binarization result to obtain the eye-region ellipses and the mouth-region ellipse, which are screened according to the geometric features of the eye and mouth regions to preliminarily locate the eye and mouth regions. Further verification based on the normalized moment of inertia (NMI) characterization method and the geometric relation of the gradient distribution yields the final positioning result of the eye and mouth regions, from which the driver attention feature values are extracted; the detected driver feature parameters are recorded to obtain the face T-shaped characteristic values in real time.
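Of the steps above, the OTSU binarization can be stated compactly. The sketch below is a generic implementation of Otsu's threshold selection for an 8-bit grayscale image, given here as a standard-algorithm illustration rather than the patent's own code:

/* Generic Otsu threshold selection for an 8-bit grayscale image
 * (standard algorithm; illustrative, not the patent's own code). */
static int otsu_threshold(const unsigned char *img, int n_pixels)
{
    long hist[256] = {0};
    int i, t, best_t = 0;
    double sum = 0.0, sum_b = 0.0, w_b = 0.0, best_var = 0.0;

    for (i = 0; i < n_pixels; i++) hist[img[i]]++;      /* build histogram      */
    for (i = 0; i < 256; i++) sum += (double)i * hist[i];

    for (t = 0; t < 256; t++) {
        double w_f, mean_b, mean_f, var_between;
        w_b += hist[t];                                  /* background weight    */
        if (w_b == 0) continue;
        w_f = (double)n_pixels - w_b;                    /* foreground weight    */
        if (w_f == 0) break;
        sum_b += (double)t * hist[t];
        mean_b = sum_b / w_b;
        mean_f = (sum - sum_b) / w_f;
        var_between = w_b * w_f * (mean_b - mean_f) * (mean_b - mean_f);
        if (var_between > best_var) {                    /* maximize between-class variance */
            best_var = var_between;
            best_t = t;
        }
    }
    return best_t;  /* pixels above best_t are taken as foreground */
}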
The T-shaped line distances actually detected during driving are compared with said standard distances to judge whether the driver's attention is dispersed. Driver attention is a neurophysiological concept expressing a person's ability to concentrate mentally while examining an object. The present invention uses a contactless detection mode and detects without interfering with the driver: through analysis and modeling of the static and dynamic features of the driver's face, the important information containing the distribution of visual attention, namely the T-shaped line, is detected, a saliency-variation characteristic figure of the face T-shaped line is produced, and the degree of driver attention dispersion is judged.
The driver attention evaluation model adopted by the present invention is governed by, and depends on, four kinds of task-driven gaze behavior of the driver: 1. bottom-up visual attention; 2. top-down visual attention; 3. left-to-right visual attention; 4. right-to-left visual attention. Corresponding to these are, respectively, the driver's gaze behaviors toward the left and right rear-view mirrors, the interior rear-view mirror and the instrument panel. In addition, any prolonged fixation during driving is judged to be dangerous gaze behavior.
An embodiment of the present invention is provided below:
Embodiment
As shown in Fig. 4, the pupil distance Si and the eye-mouth axis distance Di form the face T-shaped attention characterization curve. In any time period T, the total number of consecutive pictures processed is n = f*T (f: frame rate of video acquisition; T: detection time). On the basis of picture detection, the driver attention characteristic parameters are extracted from the image region f(x, y): lefteye(x, y) (left-eye coordinates), righteye(x, y) (right-eye coordinates) and mouse(x, y) (mouth coordinates); the characteristic values of the transverse and longitudinal changes of the driver's eyes, mouth and head in the driver attention state monitoring process are then calculated as follows (a direct computation of the two distances is sketched after formulas (6) and (7)):
1. The pupil distance S_i between the eyes is:

S_i = \sqrt{(\text{lefteye}_i.x - \text{righteye}_i.x)^2 + (\text{lefteye}_i.y - \text{righteye}_i.y)^2}    (6)

2. The axis distance D_i between the eyes and the mouth is:

D_i = \sqrt{\left(\text{mouse}_i.x - \frac{\text{lefteye}_i.x + \text{righteye}_i.x}{2}\right)^2 + \left(\text{mouse}_i.y - \frac{\text{lefteye}_i.y + \text{righteye}_i.y}{2}\right)^2}    (7)
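The two distances (6) and (7) are ordinary Euclidean distances and can be computed directly from the located coordinates. The C sketch below assumes a simple point structure introduced here only for illustration:

#include <math.h>

/* Point2D is an assumed helper type for illustration. */
typedef struct { double x, y; } Point2D;

/* Equation (6): pupil distance Si between the two eyes. */
static double pupil_distance(Point2D lefteye, Point2D righteye)
{
    double dx = lefteye.x - righteye.x;
    double dy = lefteye.y - righteye.y;
    return sqrt(dx * dx + dy * dy);
}

/* Equation (7): axis distance Di from the mouth to the midpoint of the
 * line connecting the two eyes. */
static double eye_mouth_distance(Point2D lefteye, Point2D righteye, Point2D mouse)
{
    double mx = (lefteye.x + righteye.x) / 2.0;
    double my = (lefteye.y + righteye.y) / 2.0;
    double dx = mouse.x - mx;
    double dy = mouse.y - my;
    return sqrt(dx * dx + dy * dy);
}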
The driver's face image is input through camera 1, and the driver's pupil distance Si and the eye-mouth axis distance Di are obtained in real time by detection and used as evaluation quantities. The driver's head-shaking (left-right) feature is extracted using the pupil-distance rule, implemented by the function shakehead(). The driver's nodding feature is extracted using the rule on the longitudinal distance from the mouth to the midpoint of the line connecting the eyes, implemented by the function nod(). After feature extraction, the features are tracked by combining dynamic template updating with Kalman filtering, implemented by the function feature_track(). After the eyes and mouth are located, dynamic rectangular frames are defined around them, and their states are extracted directly by template matching within the designated rectangular regions of interest, implemented by the functions rectangle(), mouse_state() and eye_state(). After the eyes and mouth are located, the distances are extracted, and driver attention is detected and evaluated by means of the T-shaped line. At the initial face location, the face T-shaped line distances detected under the driver's normal state are preset as the standard distances, and the standard distances are reset after each face location, because the standard distances vary with the driver: they are set at each start, differ from person to person, and are not fixed values. The T-shaped line distances actually detected during driving are compared with the standard distances, and whether the driver's attention is dispersed is judged according to the result of comparing the T-shaped line distances detected in real time with the standard distances.
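As a hedged illustration of the kind of smoothing the tracking step might perform, a one-dimensional Kalman filter applied to each detected distance between frames is sketched below; the noise parameters are arbitrary placeholders and the sketch is not the patent's feature_track() implementation:

/* One-dimensional Kalman filter, shown only to illustrate per-frame smoothing
 * of a detected distance; q and r are assumed values. */
typedef struct {
    double x;   /* filtered distance estimate  */
    double p;   /* estimate variance           */
    double q;   /* process noise (assumed)     */
    double r;   /* measurement noise (assumed) */
} Kalman1D;

static void kalman_init(Kalman1D *k, double x0)
{
    k->x = x0;
    k->p = 1.0;
    k->q = 0.01;
    k->r = 0.25;
}

static double kalman_update(Kalman1D *k, double measurement)
{
    double gain;
    k->p += k->q;                        /* predict (constant-value model) */
    gain = k->p / (k->p + k->r);         /* Kalman gain                    */
    k->x += gain * (measurement - k->x); /* correct with the new distance  */
    k->p *= (1.0 - gain);
    return k->x;
}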
When the driver's head sways from side to side, the pupil distance Si detected in real time changes; when a changing pupil distance Si is detected continuously for more than 0.5 seconds, driver attention dispersion is judged, the driving behavior at that moment is deemed dangerous, and the system gives an alarm. When the driver's head pitches up and down, the detected longitudinal distance Di from the mouth to the line connecting the eyes changes; when a changing axis distance Di is detected continuously for more than 0.5 seconds, driver attention dispersion is judged, the driving behavior at that moment is deemed dangerous, and the system gives an alarm. When the head is tilted obliquely upward or downward, both components of the T-shaped line, namely the pupil distance and the perpendicular distance from the mouth to the line connecting the eyes, change; when changing distances are detected continuously for more than 0.5 seconds, driver attention dispersion is judged, the driving behavior at that moment is deemed dangerous, and the system gives an alarm.
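With 30 frames acquired per second, a change persisting for more than 0.5 seconds corresponds to roughly 15 consecutive frames. A minimal sketch of that persistence check is given below; the relative deviation threshold and the frame-count derivation are assumptions for illustration, not values specified in the patent:

/* Sketch of the 0.5 s persistence check: at 30 frames per second, 0.5 s is
 * about 15 consecutive frames.  DEV_THRESHOLD is an assumed relative
 * deviation from the standard distance. */
#define FPS             30
#define ALARM_FRAMES    (FPS / 2)      /* ~0.5 seconds */
#define DEV_THRESHOLD   0.15

static int consecutive_changed = 0;

/* Call once per frame with the detected and standard distance; returns 1
 * when the change has persisted for more than 0.5 s and the buzzer should
 * be driven. */
int check_persistent_change(double detected, double standard)
{
    double dev = (detected - standard) / standard;
    if (dev < 0) dev = -dev;

    if (dev > DEV_THRESHOLD)
        consecutive_changed++;
    else
        consecutive_changed = 0;

    return consecutive_changed > ALARM_FRAMES;
}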
According to the characteristics of the driver's activity in the cab, the following assumptions are made in the attention detection process:
(1) Position convention: on the basis of repeated detection tests, the driver's normal operating region, i.e. the detection region, is constructed by statistical analysis. During safe driving the driver's region of activity lies within the detection region; when the driver cannot be detected within the detection region for a continuous period T, abnormal driving is deemed to occur and the abnormal-driving early-warning mechanism is entered.
(2) Interference convention: within a certain time interval T, detecting more than one driver head image in the detection region is a small-probability event, since the detection region belongs to the driver's normal working zone. If more than one head image (for example, a front passenger's) is detected in the region at the same time and this persists for a period T, interference with driving is judged and the abnormal-driving early-warning mechanism must be entered; meanwhile, only the driver's head image is selected for attention detection, and non-driver head images are not monitored.
(3) Speed convention: under normal conditions, the movement speed of the driver's head is not excessively high.
On the basis of the above three assumptions, the driver attention characteristic feature values are detected.

Claims (5)

1. A driver attention state monitoring method, characterized by adopting the following steps in sequence:
1) acquiring a driver face video detection image with a camera (1), taking the transverse width between the driver's eyes and the longitudinal distance from the mouth to the midpoint of the line connecting the eyes, detected under the driver's normal state, as the standard distances for locating the face, and resetting the standard distances every time the face is located;
2) comparing, with a digital signal processor (3), the distances actually detected during driving with said standard distances;
3) judging whether the driver's attention is dispersed according to the change of the actually detected distances relative to the standard distances.
2. The driver attention state monitoring method according to claim 1, characterized in that step 1) specifically comprises: performing smoothing filtering on said video detection image, converting it into a binary image after threshold-based region segmentation, searching the eye and mouth regions, and performing noise reduction through a low-pass filter; using the AdaBoost algorithm to select extended features from the driver's face image, while using the optical-flow method to estimate the inter-frame continuity of the video image, and deducting the AdaBoost detection results from the optical-flow detection results; performing binarization segmentation with the OTSU algorithm to obtain the feature circles, then performing ellipse fitting on the binarization result to obtain the eye-region ellipses and the mouth-region ellipse, and screening them according to the geometric features of the eye and mouth regions to preliminarily locate the eye and mouth regions; and finally verifying on the basis of the normalized moment of inertia characterization method and the geometric relation of the gradient distribution to obtain the final location of the eye and mouth regions.
3. The driver attention state monitoring method according to claim 1, characterized in that the judging method of step 3) is: when the transverse width between the eyes is detected to change continuously for more than 0.5 seconds, or when the longitudinal distance from the driver's mouth to the midpoint of the line connecting the eyes is detected to change continuously for more than 0.5 seconds, or when both the transverse width between the eyes and the longitudinal distance from the mouth to the midpoint of the line connecting the eyes change for more than 0.5 seconds, the device gives an alarm.
4. A driver attention state monitoring device, comprising a camera (1) and a digital signal processor (3), characterized in that: the output of the camera (1) is connected to the input of a video decoder (2), and the digital signal processor (3) is externally connected to the video decoder (2), a memory (4), a power supply (5) and an alarm buzzer (6), respectively.
5. The driver attention state monitoring device according to claim 4, characterized in that: the camera (1) is a CCD camera, the video decoder (2) is a TVP5150 video decoder, the digital signal processor (3) is a TMS320DM642 DSP chip, and the memory (4) is a flash program memory.
CN2009102334491A 2009-10-30 2009-10-30 Driver attention state monitoring method and device Pending CN101877051A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009102334491A CN101877051A (en) 2009-10-30 2009-10-30 Driver attention state monitoring method and device

Publications (1)

Publication Number Publication Date
CN101877051A true CN101877051A (en) 2010-11-03

Family

ID=43019604

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009102334491A Pending CN101877051A (en) 2009-10-30 2009-10-30 Driver attention state monitoring method and device

Country Status (1)

Country Link
CN (1) CN101877051A (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102310771A (en) * 2011-05-26 2012-01-11 臧安迪 Motor vehicle safety control method and system based on driver face identification
CN102310771B (en) * 2011-05-26 2013-05-29 臧安迪 Motor vehicle safety control method and system based on driver face identification
CN102982316A (en) * 2012-11-05 2013-03-20 安维思电子科技(广州)有限公司 Driver abnormal driving behavior recognition device and method thereof
CN103413467A (en) * 2013-08-01 2013-11-27 袁苗达 Controllable compelling guide type self-reliance study system
CN104504378A (en) * 2014-12-29 2015-04-08 北京奇艺世纪科技有限公司 Method and device for detecting image information
CN104751663A (en) * 2015-02-28 2015-07-01 北京壹卡行科技有限公司 Safe driving auxiliary system and safe driving auxiliary method for driver
CN104688251A (en) * 2015-03-02 2015-06-10 西安邦威电子科技有限公司 Method for detecting fatigue driving and driving in abnormal posture under multiple postures
CN105869231A (en) * 2015-12-15 2016-08-17 乐视致新电子科技(天津)有限公司 Driving safety reminding method, device and vehicle
CN105632104A (en) * 2016-03-18 2016-06-01 内蒙古大学 Fatigue driving detection system and method
CN105632104B (en) * 2016-03-18 2019-03-01 内蒙古大学 A kind of fatigue driving detecting system and method
CN105809152A (en) * 2016-04-06 2016-07-27 清华大学 Monitoring method for cognitive distraction of driver on basis of multi-source information fusion
CN105809152B (en) * 2016-04-06 2019-05-21 清华大学 A kind of driver's cognition based on Multi-source Information Fusion is divert one's attention monitoring method
CN109074748A (en) * 2016-05-11 2018-12-21 索尼公司 Image processing equipment, image processing method and movable body
CN106295600A (en) * 2016-08-18 2017-01-04 宁波傲视智绘光电科技有限公司 Driver status real-time detection method and device
CN106650670A (en) * 2016-12-27 2017-05-10 北京邮电大学 Method and device for detection of living body face video
CN109803583A (en) * 2017-08-10 2019-05-24 北京市商汤科技开发有限公司 Driver monitoring method, apparatus and electronic equipment
CN108549838A (en) * 2018-03-13 2018-09-18 苏州奥科德瑞智能科技有限公司 A kind of back-up surveillance method of view-based access control model system
CN108563992A (en) * 2018-03-13 2018-09-21 苏州奥科德瑞智能科技有限公司 A kind of vision survey system based on recognition of face
CN108995650A (en) * 2018-07-04 2018-12-14 惠州市德赛西威汽车电子股份有限公司 Method for controlling driving speed, device and the computer readable storage medium of automobile
CN110765807A (en) * 2018-07-25 2020-02-07 阿里巴巴集团控股有限公司 Driving behavior analysis method, driving behavior processing method, driving behavior analysis device, driving behavior processing device and storage medium
CN110765807B (en) * 2018-07-25 2024-04-05 斑马智行网络(香港)有限公司 Driving behavior analysis and processing method, device, equipment and storage medium
CN109359539A (en) * 2018-09-17 2019-02-19 中国科学院深圳先进技术研究院 Attention appraisal procedure, device, terminal device and computer readable storage medium
CN112150860A (en) * 2019-06-27 2020-12-29 歌乐株式会社 In-vehicle device and control method for in-vehicle device
CN113744498A (en) * 2020-05-29 2021-12-03 杭州海康汽车软件有限公司 System and method for driver attention monitoring
CN113744498B (en) * 2020-05-29 2023-10-27 杭州海康汽车软件有限公司 System and method for driver attention monitoring
CN112215115A (en) * 2020-09-30 2021-01-12 易显智能科技有限责任公司 Method and related device for concentration capability evaluation in driving training
CN113705373A (en) * 2021-08-10 2021-11-26 苏州莱布尼茨智能科技有限公司 Adjustable self-adaptive strong driver facial expression recognition system
CN113705373B (en) * 2021-08-10 2023-12-26 江苏钮玮动力科技有限公司 Driver facial expression recognition system with adjustable self-adaption
CN117576668A (en) * 2024-01-17 2024-02-20 江西科技学院 Multi-feature perception driving fatigue state detection method and system based on video frame
CN117576668B (en) * 2024-01-17 2024-04-05 江西科技学院 Multi-feature perception driving fatigue state detection method and system based on video frame

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20101103