CN101090482A - Driver fatigue monitoring system and method based on image process and information mixing technology - Google Patents
- Publication number
- CN101090482A CN101090482A CN 200610031817 CN200610031817A CN101090482A CN 101090482 A CN101090482 A CN 101090482A CN 200610031817 CN200610031817 CN 200610031817 CN 200610031817 A CN200610031817 A CN 200610031817A CN 101090482 A CN101090482 A CN 101090482A
- Authority
- CN
- China
- Prior art keywords
- driver
- face
- image processing
- fatigue
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
This invention discloses a system and a method for monitoring driver fatigue based on image processing and information fusion technology. First, a camera captures a continuous stream of digital images, the position of the driver's face is detected, and the positions and sizes of the eyes, pupils and mouth are then determined. Next, the PERCLOS value, the yawning frequency, the head-movement pattern, and the gaze direction and duration are measured, and each of the four measurements is converted into a fatigue degree. Finally, information fusion technology combines the information from the four features; when the fused fatigue degree exceeds a set threshold, the alarm is triggered.
Description
Technical field
The present invention relates to a driver fatigue monitoring system and method based on image processing and information fusion technology.
Background technology
At present there is considerable research, both in China and abroad, on driving fatigue detection. Austria has developed a "driver fatigue detector" that accurately records changes in the driver's pupil diameter, from which the tester judges the driver's degree of fatigue and the danger of continuing to drive. Australia has developed "fatigue detection glasses" whose infrared sensor measures eyelid activity and blink frequency to judge whether the driver is already in a fatigued state; two alarms in a unit connected to the glasses remind the driver in time that "you should rest" or "you are in a fatigued state". In 1998 a researcher at a Shenzhen long-distance bus company applied humanistic psychology, modern neurology and electronic engineering to analyze the causes of hidden fatigue-related accidents, proposed that eliminating such accidents requires eliminating abnormal fatigue and drowsiness while driving, and on this basis developed a "driver fatigue accident prevention device" worn on the driver's calf and wrist that can relieve the driver's existing fatigue within tens of minutes to one or two hours and refresh the brain. A research institute funded by the U.S. Federal Highway Administration and an automobile federation used an independently developed special camera, an electroencephalograph and other instruments to accurately measure head movement, pupil-diameter change and blink frequency in order to study driving behavior. The results show that the eyes normally close for 0.2 s to 0.3 s; if eye closure while driving reaches 0.5 s, a traffic accident becomes very likely. The intelligent transportation laboratory of the University of Pennsylvania and the NHTSA adopted PERCLOS (the percentage of a given time interval during which the eyes are closed) as the measure of mental and physiological fatigue. Electroencephalography (EEG) and eye-movement detectors are also used to measure such fatigue, but PERCLOS is generally acknowledged to be the most effective method. Professor Nikolaos P. Papanikolopoulos of the computer science department of the University of Minnesota successfully developed a driver eye tracking and localization system in which a CCD camera placed in the vehicle monitors the driver's face and 1) determines the exact positions of the driver's eyes and other facial features in the face image with a fast, simple algorithm; 2) monitors whether the driver is fatigued by tracking several frontal facial features; and 3) estimates fatigue by tracking several profile facial features. In March 2000 Nikolaos P. Papanikolopoulos improved the system by switching to an infrared color camera and adding filters that remove noise and non-face content from the image, which reduces the number of face-search passes and speeds up image processing. Gray-scale template matching is used to track the input image sequence, to find and confirm the eye region and then to determine whether the eyes are open or closed; if the search fails, the system automatically restarts it.
The various methods and systems described above, both domestic and foreign, represent great progress in monitoring fatigue driving and raising alarms, but all of them suffer, more or less, from the following three problems. 1. They usually consider only a single fatigue feature. In practice, because individual drivers differ greatly and each fatigue assessment method has a different emphasis, any specific method that considers only one fatigue feature is not reliable for all drivers; once the complexity of the in-cab environment is considered (daytime, night, dawn, dusk, direct sunlight, shade and so on), the performance of a single-feature detection method is even worse. 2. They fail to focus on fatigue detection at night, even though night is the period in which fatigue most frequently occurs, and the computer-vision means and methods used at night differ from those used in daytime. 3. They are basically still at the laboratory stage and have not passed testing in the cab environment (heat, cold, humidity, vibration and so on).
Summary of the invention
To address the above technical deficiencies, the invention provides a driver fatigue monitoring system and method based on image processing and information fusion technology. The system and method can monitor the driver's degree of driving fatigue accurately, in real time, around the clock and without contact, judge whether the driver is in a fatigued state, raise a timely alarm when fatigue driving is detected, and thereby effectively reduce the hidden danger of accidents caused by fatigue driving.
To achieve the above purpose, the technical solution adopted by the present invention is a driver fatigue monitoring system based on image processing and information fusion technology, composed of an embedded system and external equipment. The external equipment comprises a camera (1), an illumination source (2) and an alarm (4). The embedded system (3) comprises an embedded microprocessor together with the FLASH and RAM controlled by the CPU; the embedded system (3) is connected to the camera (1) through a USB interface and to the illumination source (2) and the alarm (4) through GPIO. The embedded system further comprises an image analysis system, stored in FLASH and run in RAM, and a computer control and processing system; the image analysis system comprises an image acquisition module, an image processing module and an alarm control module, and the computer control and processing system runs embedded LINUX.
Further, the image acquisition part uses a camera with a CMOS sensor and a matching near-infrared LED as the light source; the embedded system decides on its own, according to the lighting conditions, whether to switch on the infrared light source.
The above camera and infrared light source are installed near the instrument panel in front of the driver in the cab, or above and in front of the driver's head.
A driver fatigue monitoring method based on image processing and information fusion technology, whose monitoring steps comprise:
A. capturing a continuous stream of digital images with the camera, processing the captured images with the image processing module, detecting the position of the face, and then determining the positions and sizes of the eyes, pupils and mouth;
B. determining four fatigue features: the percentage of time the driver's eyes are closed (PERCLOS), the yawning frequency, the head-movement pattern, and the gaze direction and duration, and converting each of the four measurements into a corresponding fatigue degree;
C. using information fusion technology to fuse the information of the four fatigue features and judging the driver's current fatigue state;
D. when the fatigue degree obtained by information fusion exceeds a set threshold, automatically starting the alarm in the system.
Before the face position is detected, Kalman filtering is used to predict the position of the driver's head.
The face position detection first converts the color space of the captured image from RGB to YCbCr, then judges skin-color regions according to the Cb and Cr values of each pixel, applies dilation and erosion, labels the connected regions of the image, and finally determines the face position from the size and boundary shape of each connected region.
The information fusion technology performs decision-level fusion based on Dempster-Shafer (D-S) evidence theory.
The processing speed of the image processing system is at least 20 frames per second.
The operating principle of the invention is as follows. The driver's face is illuminated by natural light or by the infrared light source, a continuous stream of digital images is captured by the CMOS camera, and the image processing module then processes the images, mainly through preprocessing, segmentation, feature extraction and recognition. Once the face position is determined, the positions of the eyes and the mouth can be determined from their relative positions within the face. Because the cornea differs considerably from the rest of the eye in both color and gray level, the position and size of the cornea are also easily determined, from which the two fatigue features PERCLOS and gaze direction/duration are computed. The yawning frequency is determined from the size of the mouth opening, and the head-movement pattern from the affine transformation of the driver's face across the continuous image sequence. Dempster-Shafer (D-S) evidence theory is then used to perform decision-level fusion of the fatigue information carried by the four features: PERCLOS, yawning frequency, head-movement pattern, and gaze direction and duration. D-S theory generalizes the Bayesian criterion in that it can reason without prior probabilities.
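As an illustration of the first of these measurements, the following is a minimal sketch of how a sliding-window PERCLOS value could be computed from per-frame eye-openness estimates. The class name, the 30-second window and the 20% openness threshold are assumptions made for the example, not values taken from the patent.

```python
from collections import deque

class PerclosEstimator:
    """Sliding-window PERCLOS: the fraction of recent frames in which the
    eye is considered closed (openness below a chosen threshold)."""

    def __init__(self, window_frames=600, closed_threshold=0.2):
        # 600 frames at 20 fps is a 30 s window; both values are assumptions.
        self.window = deque(maxlen=window_frames)
        self.closed_threshold = closed_threshold

    def update(self, eye_openness):
        """eye_openness: 0.0 (fully closed) .. 1.0 (fully open), e.g. the
        current eyelid gap divided by the calibrated maximum gap."""
        self.window.append(eye_openness < self.closed_threshold)
        return self.perclos()

    def perclos(self):
        if not self.window:
            return 0.0
        return sum(self.window) / len(self.window)

# Usage: feed one openness measurement per processed frame.
est = PerclosEstimator()
for openness in [1.0, 0.9, 0.1, 0.05, 0.8]:   # toy per-frame measurements
    p = est.update(openness)
print(f"PERCLOS over current window: {p:.2f}")
```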
The fusion process is as follows. First, from domestic and foreign research results, the relationship between the degree of fatigue and the four facial features (PERCLOS, yawning frequency, head-movement pattern, and gaze direction and duration) is established. During detection, the fatigue information is fused in two stages. First, over two consecutive measurement cycles (three or more may also be used, depending on the experimental situation), the four fatigue-related facial features are measured and the fatigue degree each represents is obtained; this yields a measure over the three propositions (fatigued, not fatigued, cannot judge), i.e. a basic belief assignment, which constitutes an item of evidence in D-S theory. The Dempster combination rule is then applied to each fatigue feature separately, fusing its evidence across the different measurement cycles to obtain a new belief for that feature, which yields four new items of evidence. For example, PERCLOS yields two items of evidence with different basic beliefs in the two consecutive measurement cycles; fusing these two gives the current belief for PERCLOS, i.e. the PERCLOS evidence. The four new bodies of evidence are then combined again with the Dempster rule to obtain the fusion result of the whole system. Finally a decision is made according to a decision rule (based on the belief function or on the basic belief assignment), and different actions are taken according to the decision result. If the result is "cannot judge", it may, depending on the situation, be ignored, re-judged with the help of historical information, or resolved by extracting new information from the image.
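Both fusion stages rely on Dempster's combination rule. Below is a minimal sketch of that rule for the frame of discernment used here, with "cannot judge" represented as mass on the whole frame; the masses in the usage example are made-up numbers, not values from the patent.

```python
from itertools import product

# Frame of discernment: F = fatigued, N = not fatigued.
# Masses are assigned to the subsets {F}, {N} and the whole frame {F, N}
# ("cannot judge"), matching the three propositions in the description.

def dempster_combine(m1, m2):
    """Combine two basic belief assignments with Dempster's rule.
    Each assignment maps frozenset subsets of the frame to masses summing to 1."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    # Normalise by 1 - K, where K is the conflict mass.
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

F, N = frozenset({"F"}), frozenset({"N"})
THETA = frozenset({"F", "N"})

# Hypothetical evidence from two consecutive PERCLOS measurement cycles.
cycle1 = {F: 0.6, N: 0.1, THETA: 0.3}
cycle2 = {F: 0.5, N: 0.2, THETA: 0.3}
perclos_evidence = dempster_combine(cycle1, cycle2)   # first fusion stage

# Second stage: fuse the per-feature evidences (a second made-up evidence
# stands in for the other features just to show the call pattern).
system = dempster_combine(perclos_evidence, {F: 0.4, N: 0.3, THETA: 0.3})
print({tuple(sorted(s)): round(m, 3) for s, m in system.items()})
```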
The two-stage fusion prevents various accidental factors from adversely affecting the fatigue judgment. Real-time detection with computer vision in an environment such as a cab is inevitably disturbed by accidental factors that can cause misjudgment. The first fusion stage, performed within each feature, removes part of this influence and improves the accuracy of fatigue-feature identification; fusing the four fatigue features together at the second stage then allows the driver's degree of fatigue to be monitored fairly accurately.
Repeated experiments show that the system can monitor the driver's degree of driving fatigue accurately, in real time, around the clock and without contact, and raise a timely alarm when fatigue driving is detected, effectively reducing the hidden danger of accidents caused by fatigue driving.
Description of drawings
Fig. 1 is the installation schematic of the system;
Fig. 2 is the system principle diagram;
Fig. 3 is the structural connection diagram of the system;
Fig. 4 is the hardware connection diagram of the embedded system.
Specific implementation method
Referring to Fig. 1, a CMOS camera 1 (Modern HY 100A) and a near-infrared light source 2 (Pacific Cybervision CV100D) are installed near the instrument panel 5 in front of the driver in the cab. Visible light and infrared of other wavelengths are filtered out, so the illumination of the captured image can be regarded approximately as containing only the infrared source of the corresponding wavelength (850 nm). Under infrared illumination the driver's face is clearer while objects far from the light source are almost filtered out, so the images are easier to process. Considering the real-time requirements of the image analysis, the captured image size is set to 320*240, and the size of the driver's face in the image is adjusted to roughly 10,000 pixels.
An embedded system 3 is also installed in the vehicle and continuously captures frontal images of the driver's head. The embedded system 3 judges the ambient illumination, automatically decides whether to switch on the infrared illumination source 2, performs the corresponding processing and judges the driver's fatigue state; when the driver is too fatigued, the embedded system 3 activates the alarm 4.
Referring to Fig. 2, under natural illumination the system first locates the face by detecting facial skin in the image analysis system: facial skin color is used to separate face regions from non-face regions, which requires a face model that can adapt to different skin colors and different illumination conditions.
The three primary colors of light (RGB) are red, green and blue; they combine additively and are perceived directly by the human eye. Each component ranges from 0 to 255: the larger the RGB values, the brighter the color, so (255, 255, 255) is white and (0, 0, 0) is black. This analysis shows that RGB space represents not only color but also brightness, so a suitable method must be used to convert RGB into a color space in which chrominance and luminance are separated. The YCbCr model is a color model that achieves this separation well.
The YCbCr model is widely used in video and digital images: Y is the luminance, while Cb and Cr together describe the chrominance of the image, Cb and Cr being the blue-difference and red-difference components relative to a reference value. The transformation from RGB to YCbCr is:
Y = 0.299R + 0.587G + 0.114B
Cr = 0.5(R - Y)/(1 - 0.299) + 128 = 0.500R - 0.4187G - 0.0813B + 128
Cb = 0.5(B - Y)/(1 - 0.114) + 128 = -0.1687R - 0.3313G + 0.500B + 128
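The conversion and a skin-color test in the Cb/Cr plane can be sketched as follows. The fixed Cb/Cr box thresholds used here are commonly quoted values that stand in for the two-dimensional Gaussian skin model described in the next paragraph; the function names and the toy frame are assumptions for the example.

```python
import numpy as np

def rgb_to_ycbcr(img_rgb):
    """Convert an H x W x 3 uint8 RGB image to float Y, Cb, Cr planes
    using the coefficients given above."""
    rgb = img_rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = 0.500 * r - 0.4187 * g - 0.0813 * b + 128.0
    cb = -0.1687 * r - 0.3313 * g + 0.500 * b + 128.0
    return y, cb, cr

def skin_mask(img_rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Binary skin map from fixed Cb/Cr box thresholds; the ranges are
    commonly quoted values standing in for the fitted Gaussian model."""
    _, cb, cr = rgb_to_ycbcr(img_rgb)
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))

# Toy usage on a random 240 x 320 frame (RGB channel order assumed).
frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
mask = skin_mask(frame)
print("skin pixels:", int(mask.sum()))
```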
A two-dimensional Gaussian model can be built on this basis to classify skin pixels. Using the skin-color model thus established, an appropriate threshold is chosen to convert the color image into a binary image, which eases further face localization. After the skin-color separation, the next step is to remove skin-color regions that are clearly not a face. Dilation and erosion are first applied to remove small noise points and holes, and the face region is then approximated by an ellipse, so that faces can be detected in the image by detecting approximate ellipses. Because a face can be regarded as a connected region of skin color, the isolated skin-color pixels are first subjected to a connectivity analysis. In a binary image the pixels have already been divided into black and white regions, but to generate a segmented image a connectivity analysis must be carried out: if the brightness value of a pixel is above the threshold and the pixel is adjacent to a pixel of region i, the pixel is considered to belong to region i. The concept of a label image is first defined; a common algorithm for generating labels is called "region growing". It uses a label memory that corresponds one-to-one with the original image buffer. In the following description the "black" pixels are objects and the "white" pixels are background.
Initially every cell of the label memory L is set to 0, and the image buffer is denoted f, so assigning label number N to a pixel can be written L(x, y) ← N.
A recursive-style region-growing algorithm is used below to generate the label image. The algorithm uses a push-down stack to implement region growing; the stack temporarily stores the coordinates of the pixels belonging to the region. The concrete steps are as follows:
1) Find an unlabeled black pixel (i.e. one with L(x, y) = 0) and choose a new label number N for this region. If all pixels are already labeled, the algorithm stops.
2) L(x, y) ← N.
3) If f(x-1, y) is black and L(x-1, y) = 0, push the coordinate (x-1, y) onto the stack.
If f(x+1, y) is black and L(x+1, y) = 0, push the coordinate (x+1, y) onto the stack.
If f(x, y-1) is black and L(x, y-1) = 0, push the coordinate (x, y-1) onto the stack.
If f(x, y+1) is black and L(x, y+1) = 0, push the coordinate (x, y+1) onto the stack.
4) If the stack is empty, return to step 1.
5) Otherwise pop a coordinate from the stack as the new (x, y) and go to step 2.
The labeling operation then yields a set of connected regions, each assigned a unique label number. For any given pixel, the computer only needs to look up the corresponding position in the L memory and read the region number to find the region to which the pixel belongs. A runnable sketch of this labeling procedure follows.
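The following is a minimal sketch of the stack-based labeling described above, written with 4-connectivity over a boolean image; the function name and the toy image are assumptions for illustration.

```python
import numpy as np

def label_regions(binary):
    """Stack-based region growing over a binary image (True = object pixel),
    4-connectivity, following the steps above. Returns a label image in
    which 0 means background and 1..N are region numbers."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=np.int32)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and labels[sy, sx] == 0:
                next_label += 1                      # step 1: new region number
                stack = [(sy, sx)]
                while stack:                         # steps 2-5
                    y, x = stack.pop()
                    if labels[y, x]:
                        continue
                    labels[y, x] = next_label        # step 2: L(x, y) <- N
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny, nx] and labels[ny, nx] == 0:
                            stack.append((ny, nx))   # step 3: push neighbours
    return labels, next_label

# Toy usage: two separate blobs.
img = np.zeros((5, 8), dtype=bool)
img[1:3, 1:3] = True
img[3:5, 5:8] = True
lab, n = label_regions(img)
print(n, "regions")
print(lab)
```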
Following this method, the skin-color connected regions in the image are obtained; small regions that are clearly not a face are removed first, and the face is then located further.
The ellipse detection uses an edge-based method, specifically the dual-point Hough transform. Finally the face is located with reference to facial features, according to the following principles (a sketch of this candidate filtering follows the list):
1. Because a face contains non-skin-color regions such as the eyebrows, eyes, mouth and nostrils, a real face region contains several holes.
2. The variance of the elliptical region is computed; because a face contains some non-skin-color regions, its variance is larger than that of a uniform skin-color region.
3. The rotation angle of the head is limited, generally taken to lie between 45 and 135 degrees.
4. The usual layout of facial features is exploited: the eyes lie above the nostrils and the nostrils above the lips, so the positions of the holes in the image can be used as a basis for judgment.
A fairly accurate face position is thereby obtained.
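A minimal sketch of how these heuristics could be applied to one skin-color candidate region is given below. The numeric thresholds (minimum hole count, minimum gray variance) are assumptions chosen for illustration, and rule 4 (the vertical ordering of the holes) is only noted in a comment because it needs the hole coordinates.

```python
import numpy as np

def plausible_face(gray_region, hole_count, ellipse_angle_deg,
                   min_holes=2, min_variance=200.0,
                   angle_range=(45.0, 135.0)):
    """Apply the heuristics above to one skin-colour candidate.
    gray_region: 1-D array of gray levels of the pixels inside the fitted
    ellipse; hole_count comes from the connectivity analysis; all numeric
    thresholds are illustrative assumptions."""
    # Rule 1: a real face contains several non-skin holes
    # (eyebrows, eyes, mouth, nostrils).
    if hole_count < min_holes:
        return False
    # Rule 2: a face is less uniform than bare skin, so demand a minimum
    # gray-level variance inside the region.
    if float(np.var(gray_region)) < min_variance:
        return False
    # Rule 3: the head rotation angle is limited (roughly 45..135 degrees).
    if not (angle_range[0] <= ellipse_angle_deg <= angle_range[1]):
        return False
    # Rule 4 (eyes above nostrils above lips) would additionally check the
    # vertical ordering of the hole centroids; omitted here for brevity.
    return True

# Toy usage with synthetic measurements for two candidates.
print(plausible_face(np.random.randint(0, 256, 500), hole_count=3, ellipse_angle_deg=90))
print(plausible_face(np.full(500, 180), hole_count=0, ellipse_angle_deg=10))
```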
After the driver's face is located, the positions of the eyes and the mouth are determined from their relative positions within the face. Because the cornea differs considerably from the rest of the eye in both color and gray level, the position and size of the cornea are easily determined, from which the two fatigue features PERCLOS and gaze direction/duration are computed. The yawning frequency is determined from the size of the mouth opening, and the head-movement pattern from the affine transformation of the driver's face across the continuous image sequence.
Then, according to the relationship between the degree of fatigue and the four facial features (PERCLOS, yawning frequency, head-movement pattern, and gaze direction and duration), the fatigue information is fused in two stages. First, over two consecutive measurement cycles (three or more may also be used, depending on the experimental situation), the four fatigue-related facial features are measured and the fatigue degree each represents is obtained; this yields a measure over the three propositions (fatigued, not fatigued, cannot judge), i.e. a basic belief assignment, which constitutes an item of evidence in D-S theory. The Dempster combination rule is then applied to each fatigue feature separately, fusing its evidence across the different measurement cycles to obtain a new belief for that feature, which yields four new items of evidence. For example, PERCLOS yields two items of evidence with different basic beliefs in the two consecutive measurement cycles; fusing these two gives the current belief for PERCLOS, i.e. the PERCLOS evidence. The four new bodies of evidence are then combined again with the Dempster rule to obtain the fusion result of the whole system. Finally a decision is made according to a decision rule (based on the belief function or on the basic belief assignment), and different actions are taken according to the decision result. If the result is "cannot judge", it may, depending on the situation, be ignored, re-judged with the help of historical information, or resolved by extracting new information from the image.
When the driver is too fatigued, the embedded system activates the alarm.
This embodiment provides an embedded Linux system environment, including a simplified shell, some commonly used utilities, startup configuration scripts, C run-time dynamic libraries, network tools and so on, and uses an ARM-core embedded microprocessor, the EP9312, as the embedded target platform.
The hardware platform comprises: CPU EP9312, 200 MHz core clock, 100 MHz internal bus, with a 16 KB instruction cache and a 16 KB data cache, an integrated LCD controller, 3 USB host controllers, 3 serial port controllers, Ethernet MAC, EIDE and AC'97 interfaces, an MMU and TCP/IP support; 32 MB of SDRAM and 16 MB of Flash; and LCD, touch screen, multiple serial port, parallel port, USB (slave), JTAG, IrDA, keyboard/mouse and SRAM support.
In addition, to guarantee normal operation of the system, the following measures are also taken:
1. To improve the running speed and locate the face more accurately, Kalman filtering is used to predict the driver's head position.
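A minimal sketch of such a predictor follows, assuming a constant-velocity model for the head centre at the 20 fps frame rate mentioned above. The class name and noise parameters are assumptions for illustration; the predicted centre would be used to narrow the face-search window.

```python
import numpy as np

class HeadPositionKF:
    """Constant-velocity Kalman filter predicting the (x, y) centre of the
    driver's head in the next frame; all noise parameters are assumptions."""

    def __init__(self, dt=1.0 / 20.0, process_var=50.0, meas_var=4.0):
        # State: [x, y, vx, vy]
        self.x = np.zeros(4)
        self.P = np.eye(4) * 500.0
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * process_var
        self.R = np.eye(2) * meas_var

    def predict(self):
        """Predicted head centre, used to restrict the face-search window."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, measured_xy):
        """Correct with the face centre actually found by detection."""
        z = np.asarray(measured_xy, dtype=float)
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

# Toy usage: predict, then correct with the detected face centre each frame.
kf = HeadPositionKF()
for detected in [(160, 120), (162, 121), (165, 122)]:
    search_centre = kf.predict()   # centre of the search window this frame
    kf.update(detected)
print("next-frame prediction:", kf.predict().round(1))
```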
2. Under infrared illumination, exploiting the properties of infrared light, part of the background in the cab is suitably arranged so that the background differs markedly from the driver's face image; this reduces background interference and allows the driver's face position to be detected more quickly. In this case the face position can be detected simply by template matching against the face contour.
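A minimal sketch of such a template match, using normalized cross-correlation from OpenCV, is given below. The acceptance score and the synthetic frame are assumptions for illustration; in practice the template would be a face-contour patch captured under the same infrared illumination.

```python
import cv2
import numpy as np

def locate_face_by_template(gray_frame, face_template, min_score=0.6):
    """Normalized cross-correlation template matching for the infrared case,
    where the arranged background leaves the face as the dominant structure.
    Returns the top-left corner of the best match, or None if the score is
    too low. The 0.6 acceptance score is an illustrative assumption."""
    result = cv2.matchTemplate(gray_frame, face_template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc if max_val >= min_score else None

# Toy usage: paste a synthetic "face" patch into a noisy dark frame and find it.
rng = np.random.default_rng(0)
frame = rng.integers(0, 30, (240, 320), dtype=np.uint8)          # dim background
template = np.tile(np.linspace(80, 220, 50, dtype=np.uint8), (60, 1))
frame[90:150, 130:180] = template                                 # place the "face"
print(locate_face_by_template(frame, template))                   # expected: (130, 90)
```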
Claims (8)
1. A driver fatigue monitoring system based on image processing and information fusion technology, composed of an embedded system and external equipment, characterized in that the external equipment comprises a camera (1), an illumination source (2) and an alarm (4); the embedded system (3) comprises an embedded microprocessor together with the FLASH and RAM controlled by the CPU, the embedded system (3) being connected to the camera (1) through a USB interface and to the illumination source (2) and the alarm (4) through GPIO; and the embedded system further comprises an image analysis system, stored in FLASH and run in RAM, and a computer control and processing system, wherein the image analysis system comprises an image acquisition module, an image processing module and an alarm control module, and the computer control and processing system runs embedded LINUX.
2. The driver fatigue monitoring system based on image processing and information fusion technology as claimed in claim 1, characterized in that the image acquisition part uses a camera with a CMOS sensor and a matching near-infrared LED as the light source, and the embedded system decides on its own, according to the ambient lighting conditions, whether to switch on the infrared light source.
3. The driver fatigue monitoring system based on image processing and information fusion technology as claimed in claim 1, characterized in that the camera and the infrared light source are installed near the instrument panel in front of the driver in the cab, or above and in front of the driver's head.
4. A driver fatigue monitoring method based on image processing and information fusion technology, characterized in that the monitoring steps comprise:
A. capturing a continuous stream of digital images with the camera, processing the captured images with the image processing module, detecting the position of the face, and then determining the positions and sizes of the eyes, pupils and mouth;
B. determining four fatigue features: the percentage of time the driver's eyes are closed (PERCLOS), the yawning frequency, the head-movement pattern, and the gaze direction and duration, and converting each of the four measurements into a corresponding fatigue degree;
C. using information fusion technology to fuse the information of the four fatigue features and judging the driver's current fatigue state;
D. when the fatigue degree obtained by information fusion exceeds a set threshold, automatically starting the alarm in the system.
5. The driver fatigue monitoring method based on image processing and information fusion technology as claimed in claim 4, characterized in that, before the face position is detected, Kalman filtering is used to predict the position of the driver's head.
6. The driver fatigue monitoring method based on image processing and information fusion technology as claimed in claim 4, characterized in that, under natural illumination, the face position detection first converts the color space of the captured image from RGB to YCbCr, judges skin-color regions according to the Cb and Cr values of each pixel, applies dilation and erosion, labels the connected regions of the image, and determines the face position from the size and boundary shape of each connected region; and, under infrared illumination, the suitably arranged background leaves no significant content in the image other than the face, and the face position is determined by template matching against the face contour.
7. The driver fatigue monitoring method based on image processing and information fusion technology as claimed in claim 4, characterized in that the information fusion technology adopted performs decision-level fusion based on Dempster-Shafer (D-S) evidence theory.
8. The driver fatigue monitoring method based on image processing and information fusion technology as claimed in claim 4, characterized in that the image processing speed is at least 20 frames per second.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2006100318170A CN101090482B (en) | 2006-06-13 | 2006-06-13 | Driver fatigue monitoring system and method based on image process and information mixing technology |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2006100318170A CN101090482B (en) | 2006-06-13 | 2006-06-13 | Driver fatigue monitoring system and method based on image process and information mixing technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101090482A true CN101090482A (en) | 2007-12-19 |
CN101090482B CN101090482B (en) | 2010-09-08 |
Family
ID=38943612
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2006100318170A Expired - Fee Related CN101090482B (en) | 2006-06-13 | 2006-06-13 | Driver fatigue monitoring system and method based on image process and information mixing technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101090482B (en) |
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101375796B (en) * | 2008-09-18 | 2010-06-02 | 浙江工业大学 | Real-time detection system of fatigue driving |
CN101278839B (en) * | 2008-05-22 | 2010-10-13 | 曹宇 | Method for tracking nighttime drive |
CN101917801A (en) * | 2010-07-30 | 2010-12-15 | 中山大学 | Light regulation method, device and intelligent desk lamp |
CN102097003A (en) * | 2010-12-31 | 2011-06-15 | 北京星河易达科技有限公司 | Intelligent traffic safety system based on human condition recognition |
CN102122357A (en) * | 2011-03-17 | 2011-07-13 | 电子科技大学 | Fatigue detection method based on human eye opening and closure state |
CN102310771A (en) * | 2011-05-26 | 2012-01-11 | 臧安迪 | Motor vehicle safety control method and system based on driver face identification |
CN102673483A (en) * | 2011-03-11 | 2012-09-19 | 纬创资通股份有限公司 | Support device capable of automatically adjusting direction and combination of support device and electronic device |
CN102752458A (en) * | 2012-07-19 | 2012-10-24 | 北京理工大学 | Driver fatigue detection mobile phone and unit |
CN102985277A (en) * | 2010-12-31 | 2013-03-20 | 北京星河易达科技有限公司 | Intelligent traffic safety system based on comprehensive state detection and decision method thereof |
CN103235939A (en) * | 2013-05-08 | 2013-08-07 | 哈尔滨工业大学 | Datum point positioning method based on machine vision |
CN101710427B (en) * | 2008-09-09 | 2013-09-18 | 富士胶片株式会社 | Face detector and face detecting method |
CN103426275A (en) * | 2013-08-01 | 2013-12-04 | 步步高教育电子有限公司 | Device, desk lamp and method for detecting eye fatigue |
CN103748533A (en) * | 2011-06-30 | 2014-04-23 | 约翰逊控股公司 | Apparatus and method for contactlessly detecting objects and/or persons and gestures and/or operating procedures made and/or carried out thereby |
CN104000698A (en) * | 2014-06-04 | 2014-08-27 | 西南交通大学 | Electric bicycle obstacle avoiding method integrating sparse representation and particle filter |
CN104077122A (en) * | 2013-03-29 | 2014-10-01 | 纬创资通股份有限公司 | Computer system for automatically detecting fatigue and method for automatically detecting fatigue |
CN104228838A (en) * | 2014-09-19 | 2014-12-24 | 安徽工程大学 | Early warning system for preventing fatigue driving and method thereof |
CN104269028A (en) * | 2014-10-23 | 2015-01-07 | 深圳大学 | Fatigue driving detection method and system |
CN104573725A (en) * | 2015-01-09 | 2015-04-29 | 安徽清新互联信息科技有限公司 | Blind driving detection method based on look-down characteristics |
CN104732251A (en) * | 2015-04-23 | 2015-06-24 | 郑州畅想高科股份有限公司 | Video-based method of detecting driving state of locomotive driver |
CN104751149A (en) * | 2015-04-16 | 2015-07-01 | 张小磊 | Personnel fatigue degree judging platform based on electronic detection |
CN104757981A (en) * | 2015-03-16 | 2015-07-08 | 于莹光 | Method and device for high-sensitively receiving and transmitting integrated infrared detection of driver's fatigue |
CN104952210A (en) * | 2015-05-15 | 2015-09-30 | 南京邮电大学 | Fatigue driving state detecting system and method based on decision-making level data integration |
CN105354985A (en) * | 2015-11-04 | 2016-02-24 | 中国科学院上海高等研究院 | Fatigue driving monitoring device and method |
CN105764735A (en) * | 2013-10-29 | 2016-07-13 | 金在哲 | Two-step sleepy driving prevention apparatus through recognizing operation, front face, eye, and mouth shape |
CN105956022A (en) * | 2016-04-22 | 2016-09-21 | 腾讯科技(深圳)有限公司 | Method and device for processing electron mirror image, and method and device for processing image |
CN106295600A (en) * | 2016-08-18 | 2017-01-04 | 宁波傲视智绘光电科技有限公司 | Driver status real-time detection method and device |
CN106845396A (en) * | 2017-01-18 | 2017-06-13 | 南京理工大学 | Illegal fishing Activity recognition method based on automated graphics identification |
CN107233103A (en) * | 2017-05-27 | 2017-10-10 | 西南交通大学 | High ferro dispatcher's fatigue state assessment method and system |
CN107248262A (en) * | 2017-06-07 | 2017-10-13 | 上海储翔信息科技有限公司 | Driver management platform for vehicle in use |
CN107301384A (en) * | 2017-06-09 | 2017-10-27 | 湖北天业云商网络科技有限公司 | A kind of driver takes phone behavioral value method and system |
CN107392153A (en) * | 2017-07-24 | 2017-11-24 | 中国科学院苏州生物医学工程技术研究所 | Human-body fatigue degree decision method |
CN107548483A (en) * | 2015-03-27 | 2018-01-05 | 法雷奥舒适驾驶助手公司 | Control method, control device, system and the motor vehicles for including such control device |
CN107657355A (en) * | 2016-07-25 | 2018-02-02 | 合肥美亚光电技术股份有限公司 | The appraisal procedure of screening machine operating personnel's degree of respecting work, apparatus and system |
CN108573210A (en) * | 2018-03-02 | 2018-09-25 | 成都高原汽车工业有限公司 | A kind of alarming method for fatigue drive and device |
CN108875642A (en) * | 2018-06-21 | 2018-11-23 | 长安大学 | A kind of method of the driver fatigue detection of multi-index amalgamation |
CN108985245A (en) * | 2018-07-25 | 2018-12-11 | 深圳市飞瑞斯科技有限公司 | Determination method, apparatus, computer equipment and the storage medium of eye locations |
CN109308445A (en) * | 2018-07-25 | 2019-02-05 | 南京莱斯电子设备有限公司 | A kind of fixation post personnel fatigue detection method based on information fusion |
CN109977930A (en) * | 2019-04-29 | 2019-07-05 | 中国电子信息产业集团有限公司第六研究所 | Method for detecting fatigue driving and device |
CN111104817A (en) * | 2018-10-25 | 2020-05-05 | 中车株洲电力机车研究所有限公司 | Fatigue detection method based on deep learning |
CN113518201A (en) * | 2020-07-14 | 2021-10-19 | 阿里巴巴集团控股有限公司 | Video processing method, device and equipment |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1680779A (en) * | 2005-02-04 | 2005-10-12 | 江苏大学 | Fatigue monitoring method and device for driver |
-
2006
- 2006-06-13 CN CN2006100318170A patent/CN101090482B/en not_active Expired - Fee Related
Cited By (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101278839B (en) * | 2008-05-22 | 2010-10-13 | 曹宇 | Method for tracking nighttime drive |
CN101710427B (en) * | 2008-09-09 | 2013-09-18 | 富士胶片株式会社 | Face detector and face detecting method |
CN101375796B (en) * | 2008-09-18 | 2010-06-02 | 浙江工业大学 | Real-time detection system of fatigue driving |
CN101917801A (en) * | 2010-07-30 | 2010-12-15 | 中山大学 | Light regulation method, device and intelligent desk lamp |
CN102985277A (en) * | 2010-12-31 | 2013-03-20 | 北京星河易达科技有限公司 | Intelligent traffic safety system based on comprehensive state detection and decision method thereof |
CN102985277B (en) * | 2010-12-31 | 2016-05-04 | 北京星河易达科技有限公司 | The intelligent traffic safety system and the decision-making technique thereof that detect based on comprehensive state |
CN102097003A (en) * | 2010-12-31 | 2011-06-15 | 北京星河易达科技有限公司 | Intelligent traffic safety system based on human condition recognition |
CN102097003B (en) * | 2010-12-31 | 2014-03-19 | 北京星河易达科技有限公司 | Intelligent traffic safety system and terminal |
CN102673483A (en) * | 2011-03-11 | 2012-09-19 | 纬创资通股份有限公司 | Support device capable of automatically adjusting direction and combination of support device and electronic device |
CN102673483B (en) * | 2011-03-11 | 2016-02-24 | 纬创资通股份有限公司 | Support device capable of automatically adjusting direction and combination of support device and electronic device |
US8981962B2 (en) | 2011-03-11 | 2015-03-17 | Wistron Corporation | Holder device capable of automatically adjusting orientation of an electronic device placed thereon, and assembly of holder device and electronic device |
CN102122357B (en) * | 2011-03-17 | 2012-09-12 | 电子科技大学 | Fatigue detection method based on human eye opening and closure state |
CN102122357A (en) * | 2011-03-17 | 2011-07-13 | 电子科技大学 | Fatigue detection method based on human eye opening and closure state |
CN102310771A (en) * | 2011-05-26 | 2012-01-11 | 臧安迪 | Motor vehicle safety control method and system based on driver face identification |
CN102310771B (en) * | 2011-05-26 | 2013-05-29 | 臧安迪 | Motor vehicle safety control method and system based on driver face identification |
CN103748533A (en) * | 2011-06-30 | 2014-04-23 | 约翰逊控股公司 | Apparatus and method for contactlessly detecting objects and/or persons and gestures and/or operating procedures made and/or carried out thereby |
CN102752458A (en) * | 2012-07-19 | 2012-10-24 | 北京理工大学 | Driver fatigue detection mobile phone and unit |
CN104077122A (en) * | 2013-03-29 | 2014-10-01 | 纬创资通股份有限公司 | Computer system for automatically detecting fatigue and method for automatically detecting fatigue |
CN103235939A (en) * | 2013-05-08 | 2013-08-07 | 哈尔滨工业大学 | Datum point positioning method based on machine vision |
CN103426275B (en) * | 2013-08-01 | 2016-01-06 | 步步高教育电子有限公司 | The device of detection eye fatigue, desk lamp and method |
CN103426275A (en) * | 2013-08-01 | 2013-12-04 | 步步高教育电子有限公司 | Device, desk lamp and method for detecting eye fatigue |
CN105764735A (en) * | 2013-10-29 | 2016-07-13 | 金在哲 | Two-step sleepy driving prevention apparatus through recognizing operation, front face, eye, and mouth shape |
CN104000698B (en) * | 2014-06-04 | 2016-07-06 | 西南交通大学 | A kind of barrier-avoiding method of the Moped Scooter merging rarefaction representation particle filter |
CN104000698A (en) * | 2014-06-04 | 2014-08-27 | 西南交通大学 | Electric bicycle obstacle avoiding method integrating sparse representation and particle filter |
CN104228838A (en) * | 2014-09-19 | 2014-12-24 | 安徽工程大学 | Early warning system for preventing fatigue driving and method thereof |
CN104269028A (en) * | 2014-10-23 | 2015-01-07 | 深圳大学 | Fatigue driving detection method and system |
CN104269028B (en) * | 2014-10-23 | 2017-02-01 | 深圳大学 | Fatigue driving detection method and system |
CN104573725B (en) * | 2015-01-09 | 2018-02-23 | 安徽清新互联信息科技有限公司 | It is a kind of that detection method is driven based on vertical view the blind of feature |
CN104573725A (en) * | 2015-01-09 | 2015-04-29 | 安徽清新互联信息科技有限公司 | Blind driving detection method based on look-down characteristics |
CN104757981A (en) * | 2015-03-16 | 2015-07-08 | 于莹光 | Method and device for high-sensitively receiving and transmitting integrated infrared detection of driver's fatigue |
CN107548483A (en) * | 2015-03-27 | 2018-01-05 | 法雷奥舒适驾驶助手公司 | Control method, control device, system and the motor vehicles for including such control device |
CN107548483B (en) * | 2015-03-27 | 2021-06-08 | 法雷奥舒适驾驶助手公司 | Control method, control device, system and motor vehicle comprising such a control device |
CN104751149B (en) * | 2015-04-16 | 2018-06-12 | 罗普特(厦门)科技集团有限公司 | Personnel based on detection of electrons degree judgement platform tired out |
CN104751149A (en) * | 2015-04-16 | 2015-07-01 | 张小磊 | Personnel fatigue degree judging platform based on electronic detection |
CN104732251A (en) * | 2015-04-23 | 2015-06-24 | 郑州畅想高科股份有限公司 | Video-based method of detecting driving state of locomotive driver |
CN104732251B (en) * | 2015-04-23 | 2017-12-22 | 郑州畅想高科股份有限公司 | A kind of trainman's driving condition detection method based on video |
CN104952210A (en) * | 2015-05-15 | 2015-09-30 | 南京邮电大学 | Fatigue driving state detecting system and method based on decision-making level data integration |
CN104952210B (en) * | 2015-05-15 | 2018-01-05 | 南京邮电大学 | A kind of fatigue driving state detecting system and method based on decision making level data fusion |
CN105354985A (en) * | 2015-11-04 | 2016-02-24 | 中国科学院上海高等研究院 | Fatigue driving monitoring device and method |
CN105354985B (en) * | 2015-11-04 | 2018-01-12 | 中国科学院上海高等研究院 | Fatigue driving monitoring apparatus and method |
CN105956022A (en) * | 2016-04-22 | 2016-09-21 | 腾讯科技(深圳)有限公司 | Method and device for processing electron mirror image, and method and device for processing image |
CN107657355A (en) * | 2016-07-25 | 2018-02-02 | 合肥美亚光电技术股份有限公司 | The appraisal procedure of screening machine operating personnel's degree of respecting work, apparatus and system |
CN106295600A (en) * | 2016-08-18 | 2017-01-04 | 宁波傲视智绘光电科技有限公司 | Driver status real-time detection method and device |
CN106845396B (en) * | 2017-01-18 | 2020-05-22 | 南京理工大学 | Illegal fishing behavior identification method based on automatic image identification |
CN106845396A (en) * | 2017-01-18 | 2017-06-13 | 南京理工大学 | Illegal fishing Activity recognition method based on automated graphics identification |
CN107233103A (en) * | 2017-05-27 | 2017-10-10 | 西南交通大学 | High ferro dispatcher's fatigue state assessment method and system |
CN107248262A (en) * | 2017-06-07 | 2017-10-13 | 上海储翔信息科技有限公司 | Driver management platform for vehicle in use |
CN107301384A (en) * | 2017-06-09 | 2017-10-27 | 湖北天业云商网络科技有限公司 | A kind of driver takes phone behavioral value method and system |
CN107392153A (en) * | 2017-07-24 | 2017-11-24 | 中国科学院苏州生物医学工程技术研究所 | Human-body fatigue degree decision method |
CN107392153B (en) * | 2017-07-24 | 2020-09-29 | 中国科学院苏州生物医学工程技术研究所 | Human body fatigue degree judging method |
CN108573210A (en) * | 2018-03-02 | 2018-09-25 | 成都高原汽车工业有限公司 | A kind of alarming method for fatigue drive and device |
CN108875642A (en) * | 2018-06-21 | 2018-11-23 | 长安大学 | A kind of method of the driver fatigue detection of multi-index amalgamation |
CN108985245A (en) * | 2018-07-25 | 2018-12-11 | 深圳市飞瑞斯科技有限公司 | Determination method, apparatus, computer equipment and the storage medium of eye locations |
CN109308445A (en) * | 2018-07-25 | 2019-02-05 | 南京莱斯电子设备有限公司 | A kind of fixation post personnel fatigue detection method based on information fusion |
CN109308445B (en) * | 2018-07-25 | 2019-06-25 | 南京莱斯电子设备有限公司 | A kind of fixation post personnel fatigue detection method based on information fusion |
CN111104817A (en) * | 2018-10-25 | 2020-05-05 | 中车株洲电力机车研究所有限公司 | Fatigue detection method based on deep learning |
CN109977930A (en) * | 2019-04-29 | 2019-07-05 | 中国电子信息产业集团有限公司第六研究所 | Method for detecting fatigue driving and device |
CN113518201A (en) * | 2020-07-14 | 2021-10-19 | 阿里巴巴集团控股有限公司 | Video processing method, device and equipment |
Also Published As
Publication number | Publication date |
---|---|
CN101090482B (en) | 2010-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101090482B (en) | Driver fatigue monitoring system and method based on image process and information mixing technology | |
JP4895797B2 (en) | Wrinkle detection device, wrinkle detection method and program | |
CN101593425B (en) | Machine vision based fatigue driving monitoring method and system | |
CN102263937B (en) | Driver's driving behavior monitoring device and monitoring method based on video detection | |
CN104809445B (en) | method for detecting fatigue driving based on eye and mouth state | |
CN101271517B (en) | Face region detecting device and method | |
CN107292251B (en) | Driver fatigue detection method and system based on human eye state | |
JP2021034035A (en) | System, method, and device for intelligent vehicle loaded fatigue detection based on facial discrimination | |
CN109389806A (en) | Fatigue driving detection method for early warning, system and medium based on multi-information fusion | |
CN110728241A (en) | Driver fatigue detection method based on deep learning multi-feature fusion | |
CN107427242A (en) | Pulse wave detection device and pulse wave detection program | |
Liu et al. | Driver fatigue detection through pupil detection and yawing analysis | |
CN101574260A (en) | Vehicle-mounted fatigue early warning device and method thereof | |
EP1868138A2 (en) | Method of tracking a human eye in a video image | |
CN104318237A (en) | Fatigue driving warning method based on face identification | |
CN109602391A (en) | Automatic testing method, device and the computer readable storage medium of fundus hemorrhage point | |
CN109460715A (en) | A kind of traffic lights automatic identification implementation method based on machine learning | |
CN102197418A (en) | Device for monitoring surrounding area of vehicle | |
CN105844245A (en) | Fake face detecting method and system for realizing same | |
CN104881956A (en) | Fatigue driving early warning system | |
CN102662470B (en) | A kind of method and system realizing eye operation | |
CN113435353A (en) | Multi-mode-based in-vivo detection method and device, electronic equipment and storage medium | |
CN107480629A (en) | A kind of method for detecting fatigue driving and device based on depth information | |
CN116503794A (en) | Fatigue detection method for cockpit unit | |
CN117132949A (en) | All-weather fall detection method based on deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C17 | Cessation of patent right | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20100908 Termination date: 20120613 |