CN107985193A - Driver fatigue monitor system - Google Patents
- Publication number
- CN107985193A (application CN201711241785.1A)
- Authority
- CN
- China
- Prior art keywords
- driver
- terminal
- driving
- cloud server
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/06—Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Human Computer Interaction (AREA)
- Mechanical Engineering (AREA)
- Emergency Alarm Devices (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The present invention relates to a driver fatigue monitoring system, comprising: an acquisition device, which collects first driving information of a first driver in a first vehicle, the first driving information including a driving image of the first driver; a sending device, electrically connected to the acquisition device, which sends the first driving information to a cloud server; a first terminal, arranged in the first vehicle; and the cloud server, connected to the first terminal by wireless communication, which processes the driving image and judges whether the first driver is driving while fatigued; if so, the cloud server sends a first alarm signal to the first terminal. The driver fatigue monitoring system of this embodiment analyzes the driving image on the cloud server and reminds both the driver and the drivers nearby when the driver is in a fatigued driving state, which significantly reduces the danger caused by fatigue driving and greatly improves traffic safety.
Description
Technical field
The present invention relates to the field of traffic safety, and more particularly to a driver fatigue monitoring system.
Background technology
In road travel and transport, violations such as speeding and fatigue driving are serious problems. Microwave radar speed-measurement technology has proven effective against speeding, but there is still no effective means of detecting fatigue driving. Fatigue driving is one of the major hidden dangers to traffic safety today: when a driver is fatigued, their perception of the surrounding environment, judgment of the traffic situation, and ability to control the vehicle all decline to varying degrees, and traffic accidents occur easily. Providing a driver fatigue monitoring system that monitors driver fatigue in real time, gives early warning, and thereby reduces the hidden dangers of fatigue driving is therefore of great practical significance.
Summary of the invention
Therefore, to overcome the technical defects and shortcomings of the prior art, the present invention proposes a fatigue driving early warning system, comprising:
an acquisition device, arranged in a first vehicle, for collecting first driving information of a first driver, the first driving information including a driving image of the first driver;
a sending device, arranged in the first vehicle and electrically connected to the acquisition device, for sending the first driving information to a cloud server;
a first terminal, arranged in the first vehicle, for presenting a first alarm signal to the first driver; and
the cloud server, connected to the first terminal by wireless communication, for processing the driving image, judging whether the first driver is in a fatigued driving state, and sending the first alarm signal to the first terminal when it determines that the first driver is in a fatigued driving state.
In an embodiment of the invention, the first driving information further includes location information of the first terminal.
In an embodiment of the invention, the system further includes a base station; when the cloud server judges that the first driver is driving while fatigued, the base station receives a second alarm signal sent by the cloud server and broadcasts the second alarm signal.
In an embodiment of the invention, the system further includes a second terminal, arranged in a second vehicle within the coverage of the base station, for receiving the second alarm signal.
In an embodiment of the invention, the second alarm signal includes the location information of the first terminal.
In an embodiment of the invention, the second terminal includes a distance judgment module for judging the distance between the second terminal and the first terminal; if the distance exceeds a threshold, the second terminal reminds the driver of the second vehicle.
In an embodiment of the invention, the base station, the first terminal, and the second terminal are networked via the ZigBee protocol.
In an embodiment of the invention, the cloud server processes the driving image to form an eye image of the first driver, and judges whether the first driver is in a fatigued driving state from the eye image.
In an embodiment of the invention, the acquisition device includes:
a camera device, arranged in the first vehicle, for photographing the eye of the first driver;
a pan-tilt head, connected to the camera device through a pan-tilt control interface;
an A/D converter, electrically connected to the camera device through an A/D conversion interface, which converts the analog eye image captured by the camera device into a digital eye image; and
a memory, electrically connected to the A/D converter to store the digital eye image, and electrically connected to the sending device so that the sending device can extract the digital eye image and send it to the cloud server.
In an embodiment of the invention, the camera device is an infrared camera.
The driver fatigue monitoring system provided in this embodiment analyzes the driving image on the cloud server and reminds both the driver and the drivers nearby when the driver is in a fatigued driving state, which significantly reduces the danger caused by fatigue driving and greatly improves traffic safety.
Other aspects and features of the present invention will become apparent from the detailed description below with reference to the accompanying drawings. It should be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the invention, for which reference should be made to the appended claims. It should also be noted that, unless otherwise indicated, the drawings need not be to scale; they merely attempt to conceptually illustrate the structures and flows described herein.
Brief description of the drawings
The embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a structural diagram of a driver fatigue monitoring system provided by the invention;
Fig. 2 is a scenario diagram of a second terminal sending a reminder message according to the invention;
Fig. 3 is a structural diagram of an acquisition device provided by this embodiment;
Fig. 4 is a flow diagram of the cloud server's judgment of the driver's eye state according to an embodiment of the invention;
Fig. 5 is a view of open eyes according to an embodiment of the invention;
Fig. 6 is a view of closed eyes according to an embodiment of the invention;
Fig. 7 is a view of narrowed eyes according to an embodiment of the invention.
Detailed description of the embodiments
To make the above objects, features, and advantages of the present invention clearer and easier to understand, the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Embodiment one
Referring to Fig. 1, which is a structural diagram of a driver fatigue monitoring system provided by the invention, the system includes:
an acquisition device, arranged in a first vehicle, for collecting first driving information of a first driver, the first driving information including a driving image of the first driver;
a sending device, arranged in the first vehicle and electrically connected to the acquisition device, for sending the first driving information to a cloud server;
a first terminal, arranged in the first vehicle, for presenting a first alarm signal to the first driver; and
the cloud server, connected to the first terminal by wireless communication, for processing the driving image, judging whether the first driver is in a fatigued driving state, and sending the first alarm signal to the first terminal when it determines that the first driver is in a fatigued driving state.
This embodiment discloses a driver fatigue monitoring system based on a cloud server. The cloud server and the terminals are connected over a wireless network, and the driving image is processed by the cloud server, which offers large storage capacity and strong processing power and also allows the traffic administration to monitor in real time whether a driver is currently in a fatigued driving state.
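For illustration only, the following sketch shows how the sending device might upload one driving image together with the first terminal's location to the cloud server; the endpoint URL, field names, and JSON reply are assumptions and are not part of this disclosure.

```python
import requests  # third-party HTTP client (pip install requests)

CLOUD_URL = "https://cloud.example.com/fatigue/upload"  # hypothetical endpoint, not from the disclosure

def upload_driving_info(image_path, terminal_id, latitude, longitude):
    """Send one driving image plus the first terminal's location to the cloud server."""
    with open(image_path, "rb") as f:
        files = {"driving_image": f}
        data = {"terminal_id": terminal_id, "lat": latitude, "lon": longitude}
        resp = requests.post(CLOUD_URL, files=files, data=data, timeout=10)
    resp.raise_for_status()
    return resp.json()  # assumed reply, e.g. {"fatigued": true}

# The first terminal could poll this result and raise the first alarm signal:
# if upload_driving_info("eye.jpg", "terminal-001", 32.06, 118.80).get("fatigued"):
#     print("first alarm signal")
```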
Further, on the basis of the above embodiment, the first driving information further includes location information of the first terminal.
Specifically, the location of the first terminal can be determined by a positioning module, such as a GPS module. The positioning module can be integrated into the acquisition device or connected to the acquisition device as a separate unit.
Further, on the basis of the above embodiment, the early warning system provided by the invention further includes a base station; when the cloud server judges that the first driver is driving while fatigued, the base station receives a second alarm signal sent by the cloud server and broadcasts the second alarm signal.
The base station referred to in the present invention can be a base station set up by a telecommunications operator or a base station established by the traffic administration; the base station is connected by wireless communication to the terminals on the vehicles.
Further, on the basis of the above embodiment, the early warning system provided by the invention further includes a second terminal, arranged in a second vehicle within the coverage of the base station, for receiving the second alarm signal.
Further, on the basis of the above embodiment, the second alarm signal includes the location information of the first terminal.
Further, on the basis of the above embodiment, the second terminal includes a distance judgment module for judging the distance between the second terminal and the first terminal; if the distance exceeds a threshold, the second terminal reminds the driver of the second vehicle.
For example, the second terminal can include a positioning device to determine its own position and can also include a map program; from the received location information of the first terminal and its own position, the map program can easily determine the distance between the first terminal and the second terminal, as sketched below.
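As an illustration of the distance judgment module, the sketch below computes the great-circle distance between the GPS fixes of the two terminals with the haversine formula and, following the Fig. 2 scenario, issues a reminder when the second vehicle is within a threshold distance; the 500 m threshold is an assumed example, not taken from the disclosure.

```python
import math

THRESHOLD_M = 500.0  # assumed reminder radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes given in degrees."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_remind(first_fix, second_fix):
    """Remind the second driver when the second vehicle is within the threshold distance."""
    lat1, lon1 = first_fix
    lat2, lon2 = second_fix
    return haversine_m(lat1, lon1, lat2, lon2) < THRESHOLD_M
```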
Referring to Fig. 2, which is a scenario diagram of a second terminal sending a reminder message according to the invention: once the cloud server obtains the location information of the first terminal, it immediately knows the position of the first vehicle. Because the fatigued driving state of the first driver may endanger second vehicles around the first vehicle, the drivers of those second vehicles need to be reminded; in this embodiment, the second vehicles covered by the base station corresponding to the first vehicle are reminded. The base station broadcasts the message that the first driver is driving while fatigued, together with the location information of the first vehicle, throughout the area it covers, thereby alerting second vehicles near the first vehicle. A second vehicle obtains its own location through a positioning device, such as a GPS device, and calculates its distance to the first vehicle; if the distance between the two is less than the threshold, the driver of the second vehicle is reminded. The driver of the second vehicle here can be an ordinary car owner or a traffic-control officer on patrol.
Further, on the basis of the above embodiment, the base station, the first terminal, and the second terminal are networked via the ZigBee protocol.
In this embodiment, the wireless network is built with ZigBee, an Internet of Things protocol, because this networking approach is convenient and flexible, nodes can easily be added or removed, and the network has good self-healing ability, is stable and reliable, and offers large capacity.
Further, on the basis of the above embodiment, the cloud server processes the driving image to form an eye image of the first driver, and judges whether the first driver is in a fatigued driving state from the eye image.
The cloud server can process the driving image in many ways; for example, it can judge from the image whether the driver's hands have left the steering wheel or whether the head is tilted. Judging the driver's eye state directly is a particularly convincing approach: a fatigued driver tends to close or narrow the eyes unconsciously, and the degree of eye opening deteriorates noticeably. A later embodiment of the present invention provides a method for evaluating the eye state, which is not repeated here.
Referring to Fig. 3, which is a structural diagram of an acquisition device provided by this embodiment, the acquisition device includes:
a camera device, arranged in the first vehicle, for photographing the eye of the first driver;
a pan-tilt head, connected to the camera device through a pan-tilt control interface;
an A/D converter, electrically connected to the camera device through an A/D conversion interface, which converts the analog eye image captured by the camera device into a digital eye image; and
a memory, electrically connected to the A/D converter for storing the digital eye image, and electrically connected to the sending device so that the sending device can extract the digital eye image and send it to the cloud server.
Preferably, the camera device is an infrared camera. The infrared camera can be mounted on the pan-tilt head, and the pan-tilt head can be mounted on the outer rim of the windshield. An infrared camera does not interfere with the driver's driving behavior while capturing the driving image, which better ensures driving safety.
The driver fatigue monitoring system provided in this embodiment analyzes the driving image on the cloud server and reminds both the driver and the drivers nearby when the driver is in a fatigued driving state, which significantly reduces the danger caused by fatigue driving and greatly improves traffic safety.
Embodiment two
This embodiment provides a method for judging the eye state. The method can be executed by the cloud server, for example by the cloud server's central processing unit. The cloud server can judge the driver's eye state with the following method and then judge whether the driver is in a fatigued driving state.
Referring to Fig. 4, which is a flow diagram of the cloud server's judgment of the driver's eye state according to an embodiment of the invention, the method comprises the following steps:
Step 1: locate a first pupil center point from the eye image;
Step 2: extract first pupil boundary points from the first pupil center point;
Step 3: locate a second pupil center point from the first pupil boundary points;
Step 4: calculate a first pupil area and a second pupil area from the second pupil center point;
Step 5: judge the eye state from the first pupil area and the second pupil area.
For step 1, the cloud server specifically can: estimate the eye central region from the gray values of the eye image, and locate the point with the minimum gray value in the eye central region as the first pupil center point.
For step 2, the cloud server specifically can: taking the first pupil center point as the origin, emit M first rays toward the positive half of the y-axis and M first rays toward the negative half of the y-axis, the first rays being symmetric about the x-axis; calculate the gray gradient along each first ray; and select the point of maximum gray gradient as a first pupil boundary point.
More specifically, the cloud server can: take the pupil center point as the starting point and emit rays toward the upper eyelid, forming M rays; and take the pupil center point as the starting point and emit rays toward the lower eyelid, forming N rays.
For step 3, the cloud server specifically can: fit the first pupil boundary points, obtain the center point of the first pupil boundary points by averaging, and take this center point as the second pupil center point.
For step 4, the cloud server specifically can: extract second pupil boundary points from the second pupil center point, and calculate the first pupil area and the second pupil area from the second pupil boundary points.
For extracting the second pupil boundary points from the second pupil center point in step 4, the cloud server can: taking the second pupil center point as the origin, emit N second rays toward the positive half of the y-axis and N second rays toward the negative half of the y-axis, the second rays being symmetric about the x-axis; calculate the gray gradient along each second ray; and select the point of maximum gray gradient as a second pupil boundary point.
For calculating the first pupil area from the second pupil boundary points in step 4, the cloud server can: fit the second pupil boundary points to a circle and take the area of that circle as the first pupil area.
For calculating the second pupil area in step 4, the cloud server can: connect the second pupil boundary points pairwise to form a polygon and take the area of the polygon as the second pupil area.
For step 5, the cloud server specifically can: determine an eye state value from the first pupil area and the second pupil area using the eye state equation; choose a first eye state threshold and a second eye state threshold; and compare the eye state value with the first eye state threshold and the second eye state threshold to determine the eye state.
Further, the eye state equation in step 5 is θ = 1 − S2/S1, where S1 is the first pupil area and S2 is the second pupil area.
The eye-state judgment method proposed in this embodiment does not require a large number of high-definition image learning templates; it effectively reduces computational complexity, improves real-time performance, and is highly reliable, giving it broad application prospects. In addition, the invention requires no expensive or complex equipment and is low in cost.
Embodiment three
On the basis of the above embodiments, this embodiment further explains the cloud server's eye-state detection flow, which includes:
Step 1: obtain the eye image.
After the eye image is obtained, it is processed so that the eyes are adjusted to a horizontal position. The eye image is converted into an eye gray-scale image, and gray contrast enhancement preprocessing is applied to the gray-scale image as follows:
f = c*log(1 + double(f0))
where f0 is the original image and f is the contrast-enhanced image.
The contrast-enhanced image is then passed through a Laplacian filter.
Gray contrast enhancement of the eye gray-scale image makes the pupil easier to distinguish from the surrounding region, and the Laplacian filter additionally performs non-directional denoising of the eye image in all directions.
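A minimal sketch of this preprocessing, assuming an 8-bit grayscale eye image held in a NumPy array; the constant c and the use of scipy.ndimage.laplace for the Laplacian filter are illustrative choices, not mandated by the disclosure.

```python
import numpy as np
from scipy import ndimage

def preprocess_eye(gray_u8, c=1.0):
    """Contrast enhancement f = c*log(1 + f0), then Laplacian filtering of the result."""
    f0 = gray_u8.astype(np.float64)
    f = c * np.log1p(f0)              # gray contrast enhancement
    f = f / f.max() * 255.0           # rescale back to the 0..255 range
    lap = ndimage.laplace(f)          # non-directional Laplacian response
    return f, lap
```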
Step 2: locate the first pupil center point.
In the eye gray-scale image processed in step 1, estimate the eye central region and search for the point with the minimum gray value in that region. If the point lies approximately at the midpoint of the eye central region, it is taken as the first pupil center point; otherwise, the search continues until a gray-value minimum located approximately near the midpoint of the eye central region is found.
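A minimal sketch of this step, assuming the eye central region can be approximated by a fixed window around the image center; the window fraction is an assumption.

```python
import numpy as np

def first_pupil_center(gray, center_frac=0.5):
    """Locate the darkest point inside an estimated eye central region."""
    h, w = gray.shape
    # assumed eye central region: a window covering the middle of the image
    y0, y1 = int(h * (1 - center_frac) / 2), int(h * (1 + center_frac) / 2)
    x0, x1 = int(w * (1 - center_frac) / 2), int(w * (1 + center_frac) / 2)
    region = gray[y0:y1, x0:x1]
    iy, ix = np.unravel_index(np.argmin(region), region.shape)
    return y0 + iy, x0 + ix  # (row, col) of the first pupil center point
```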
Step 3: extract the first pupil boundary points.
Taking the first pupil center point as the origin, emit M first rays toward the positive half of the y-axis and M first rays toward the negative half of the y-axis, the first rays being symmetric about the x-axis.
The gray gradient along each first ray is calculated as follows:
a) calculate the partial differential of the gray value along the first ray direction, where f(i, j) is the gray value of the eye image at coordinate (i, j);
b) calculate the gray gradient D along the first ray direction.
Extract the point where D is largest, denoted Dmax. If Dmax is greater than the boundary-point threshold, that point is a pupil boundary point. The boundary-point threshold is chosen to be larger than the gray gradient at the pupil/skin interface and smaller than the gray gradient at the pupil/white-of-the-eye interface, and can be customized according to individual differences. Pupil boundary points lie where the pupil meets the white of the eye.
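A minimal sketch of the ray-based boundary extraction and of the averaging used in the next step; the number of rays, maximum radius, and boundary-point threshold are assumed example values.

```python
import numpy as np

def pupil_boundary_points(gray, center, n_rays=16, max_r=40, grad_thresh=15.0):
    """Cast rays up and down from the pupil center and keep the strongest gray-gradient point on each."""
    cy, cx = center
    h, w = gray.shape
    points = []
    # rays toward the upper and lower half-planes, symmetric about the x-axis
    angles = np.concatenate([np.linspace(0.1, np.pi - 0.1, n_rays),
                             -np.linspace(0.1, np.pi - 0.1, n_rays)])
    for theta in angles:
        prev, best_grad, best_pt = None, 0.0, None
        for r in range(1, max_r):
            y = int(round(cy - r * np.sin(theta)))
            x = int(round(cx + r * np.cos(theta)))
            if not (0 <= y < h and 0 <= x < w):
                break
            val = float(gray[y, x])
            if prev is not None:
                grad = abs(val - prev)       # gray gradient sampled along the ray
                if grad > best_grad:
                    best_grad, best_pt = grad, (y, x)
            prev = val
        if best_pt is not None and best_grad > grad_thresh:
            points.append(best_pt)
    return points

def second_pupil_center(points):
    """Average the boundary points to obtain the second pupil center point (step 4)."""
    ys, xs = zip(*points)
    return sum(ys) / len(ys), sum(xs) / len(xs)
```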
Step 4: locate the second pupil center point.
Approximately fit the first pupil boundary points determined in step 3 to a circle and extract the center point of the first pupil boundary points by averaging; take this center point as the second pupil center point.
Step 5: extract the second pupil boundary points.
Taking the second pupil center point as the origin, emit M second rays toward the positive half of the y-axis and M second rays toward the negative half of the y-axis, the second rays being symmetric about the x-axis.
The gray gradient along each second ray is calculated as follows:
a) calculate the partial differential of the gray value along the second ray direction, where f(i, j) is the gray value of the eye image at coordinate (i, j);
b) calculate the gray gradient D along the second ray direction.
Select the point where the gray gradient D is largest as a second pupil boundary point.
Step 6: calculate the first pupil area.
Approximately fit the second pupil boundary points determined in step 5 to a circle, calculate the area of the circle, and take that circular area as the first pupil area.
Step 7: calculate the second pupil area.
Connect the second pupil boundary points pairwise to form a polygon, and calculate the area of the polygon as the second pupil area.
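A minimal sketch of steps 6 and 7, taking the mean distance to the pupil center as the fitted circle radius for S1 and ordering the boundary points by angle before applying the shoelace formula for S2; both choices are illustrative, not mandated by the disclosure.

```python
import math

def pupil_areas(points, center):
    """Return (S1, S2): fitted-circle area and polygon (shoelace) area of the boundary points."""
    cy, cx = center
    # S1: approximate circle fit, taking the mean distance to the pupil center as the radius
    radii = [math.hypot(y - cy, x - cx) for y, x in points]
    r = sum(radii) / len(radii)
    s1 = math.pi * r * r
    # S2: connect the boundary points in angular order and apply the shoelace formula
    ordered = sorted(points, key=lambda p: math.atan2(p[0] - cy, p[1] - cx))
    s2 = 0.0
    for (y1, x1), (y2, x2) in zip(ordered, ordered[1:] + ordered[:1]):
        s2 += x1 * y2 - x2 * y1
    return s1, abs(s2) / 2.0
```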
Step 8: judge the eye state.
The eye state equation is:
θ = 1 − S2/S1
where S1 is the first pupil area and S2 is the second pupil area.
The first pupil area and the second pupil area obtained in steps 6 and 7 are substituted into the eye state equation to solve for the eye state value.
Choose a first eye state threshold cth1 and a second eye state threshold cth2:
when θ > cth1, the eyes are in the closed state;
when θ < cth2, the eyes are in the open state;
when cth2 ≤ θ ≤ cth1, the eyes are in the narrowed state.
A person's pupil is small and dark, and individual physiological factors do not cause the pupil image to be occluded by the eyelid. When the eyes are open normally the pupil is complete, when they are closed the pupil disappears, and when they are between open and closed the upper and lower edges of the pupil are occluded; therefore, detecting the pupil boundary can be used to judge how far the eyes are open or closed.
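A minimal sketch of the step 8 classification, using the eye state value θ = 1 − S2/S1 together with the two thresholds; the default threshold values 0.8 and 0.2 are the examples used in Embodiment four.

```python
def eye_state(s1, s2, cth1=0.8, cth2=0.2):
    """Classify the eye state from the first pupil area s1 and the second pupil area s2."""
    theta = 1.0 - s2 / s1   # eye state (closure degree) value
    if theta > cth1:
        return "closed"
    if theta < cth2:
        return "open"
    return "narrowed"
```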
Embodiment four
On the basis of the above embodiments, this embodiment illustrates the process by which the cloud server judges the eye state.
Referring to Fig. 5, which is a view of open eyes according to an embodiment of the invention: as shown in the figure, the first closure threshold is chosen as 0.8 and the second closure threshold as 0.2. The second pupil boundary points are approximately fitted to a circle, whose area is calculated as 3.14; the second pupil boundary points are then connected pairwise to form a polygon, whose area is calculated as 2.6. Substituting into the eye closure degree formula gives θ = 0.17, which is less than the second closure threshold of 0.2, so the eyes are in the open state.
Continuing with Fig. 6, which is a view of closed eyes according to an embodiment of the invention: as shown in the figure, the first closure threshold is chosen as 0.8 and the second closure threshold as 0.2. The second pupil boundary points are approximately fitted to a circle, whose area is calculated as 3.14; the second pupil boundary points are connected pairwise to form a polygon, whose area is calculated as 0.42. Substituting into the eye closure degree formula gives θ = 0.86, which is greater than the first closure threshold of 0.8, so the eyes are in the closed state.
Referring to Fig. 7, which is a view of narrowed eyes according to an embodiment of the invention: as shown in the figure, the first closure threshold is chosen as 0.8 and the second closure threshold as 0.2. The second pupil boundary points are approximately fitted to a circle, whose area is calculated as 3.14; the second pupil boundary points are connected pairwise to form a polygon, whose area is calculated as 1.7. Substituting into the eye closure degree formula gives θ = 0.46, which is greater than the second closure threshold of 0.2 and less than the first closure threshold of 0.8, so the eyes are in the narrowed state.
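For reference, the three worked examples above can be reproduced with the eye_state sketch given at the end of Embodiment three:

```python
# S1 = 3.14 (fitted-circle area) in all three figures
print(eye_state(3.14, 2.60))   # theta ~= 0.17 < 0.2                -> "open"     (Fig. 5)
print(eye_state(3.14, 0.42))   # theta ~= 0.86 > 0.8                -> "closed"   (Fig. 6)
print(eye_state(3.14, 1.70))   # theta ~= 0.46, between 0.2 and 0.8 -> "narrowed" (Fig. 7)
```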
In conclusion specific case used herein is set forth the present invention, the explanation of above example is
It is used to help understand the method and its core concept of the present invention;Meanwhile for those of ordinary skill in the art, according to the present invention
Thought, there will be changes in specific embodiments and applications, in conclusion this specification content should not be understood
For limitation of the present invention, protection scope of the present invention should be subject to appended claim.
Claims (10)
- 1. A driver fatigue monitoring system, characterized by comprising: an acquisition device, arranged in a first vehicle, for collecting first driving information of a first driver, the first driving information including a driving image of the first driver; a sending device, arranged in the first vehicle and electrically connected to the acquisition device, for sending the first driving information to a cloud server; a first terminal, arranged in the first vehicle, for presenting a first alarm signal to the first driver; and the cloud server, connected to the first terminal by wireless communication, for processing the driving image, judging whether the first driver is in a fatigued driving state, and sending the first alarm signal to the first terminal when it determines that the first driver is in a fatigued driving state.
- 2. The system of claim 1, characterized in that the first driving information further includes location information of the first terminal.
- 3. The system of claim 2, characterized by further comprising a base station which, when the cloud server judges that the first driver is driving while fatigued, receives a second alarm signal sent by the cloud server and broadcasts the second alarm signal.
- 4. The system of claim 3, characterized by further comprising a second terminal, arranged in a second vehicle within the coverage of the base station, for receiving the second alarm signal.
- 5. The system of claim 4, characterized in that the second alarm signal includes the location information of the first terminal.
- 6. The system of claim 5, characterized in that the second terminal includes a distance judgment module for judging the distance between the second terminal and the first terminal; if the distance exceeds a threshold, the second terminal reminds the driver of the second vehicle.
- 7. The system of claim 6, characterized in that the base station, the first terminal, and the second terminal are networked via the ZigBee protocol.
- 8. The system of claim 1, characterized in that the cloud server processes the driving image to form an eye image of the first driver and judges whether the first driver is in a fatigued driving state from the eye image.
- 9. The system of claim 8, characterized in that the acquisition device includes: a camera device, arranged in the first vehicle, for photographing the eye of the first driver; a pan-tilt head, connected to the camera device through a pan-tilt control interface; an A/D converter, electrically connected to the camera device through an A/D conversion interface, which converts the analog eye image captured by the camera device into a digital eye image; and a memory, electrically connected to the A/D converter to store the digital eye image and electrically connected to the sending device, so that the sending device can extract the digital eye image and send it to the cloud server.
- 10. The system of claim 9, characterized in that the camera device is an infrared camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711241785.1A CN107985193B (en) | 2017-11-30 | 2017-11-30 | Fatigue driving early warning system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711241785.1A CN107985193B (en) | 2017-11-30 | 2017-11-30 | Fatigue driving early warning system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107985193A true CN107985193A (en) | 2018-05-04 |
CN107985193B CN107985193B (en) | 2021-06-04 |
Family
ID=62034809
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711241785.1A Active CN107985193B (en) | 2017-11-30 | 2017-11-30 | Fatigue driving early warning system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107985193B (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5570698A (en) * | 1995-06-02 | 1996-11-05 | Siemens Corporate Research, Inc. | System for monitoring eyes for detecting sleep behavior |
JP2002331850A (en) * | 2001-05-07 | 2002-11-19 | Nissan Motor Co Ltd | Driving behavior intention detector |
CN101558998A (en) * | 2009-03-06 | 2009-10-21 | 北京理工大学 | System for remote monitoring and prewarning of driving fatigue state based on multi-element network transmission |
CN101540090A (en) * | 2009-04-14 | 2009-09-23 | 华南理工大学 | Driver fatigue monitoring device based on multivariate information fusion and monitoring method thereof |
CN101673464A (en) * | 2009-09-27 | 2010-03-17 | 上海大学 | Intelligent management system of fatigue driving |
CN101692980A (en) * | 2009-10-30 | 2010-04-14 | 吴泽俊 | Method for detecting fatigue driving |
CN204242345U (en) * | 2014-12-12 | 2015-04-01 | 湖北德强电子科技有限公司 | Fatigue of automobile driver status monitoring and automotive safety status monitoring warning system |
CN105679253A (en) * | 2016-03-30 | 2016-06-15 | 深圳还是威健康科技有限公司 | Terminal backlight adjustment method and device |
CN106203262A (en) * | 2016-06-27 | 2016-12-07 | 辽宁工程技术大学 | A kind of ocular form sorting technique based on eyelid curve similarity Yu ocular form index |
CN106671880A (en) * | 2016-12-30 | 2017-05-17 | 天津云视科技发展有限公司 | Vehicle operation state diagnostic system applying cloud computing and artificial intelligence technology |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109285320A (en) * | 2018-05-28 | 2019-01-29 | 惠州市德赛西威汽车电子股份有限公司 | A kind of cloud processing method for fatigue driving |
Also Published As
Publication number | Publication date |
---|---|
CN107985193B (en) | 2021-06-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20210514 Address after: 14 / F, Beidou building, 6 Huida Road, Jiangbei new district, Nanjing, Jiangsu Province 210000 Applicant after: Jiangsu Zhongtian Anchi Technology Co., Ltd Address before: 710065 Xi'an new hi tech Zone, Shaanxi, No. 86 Gaoxin Road, No. second, 1 units, 22 stories, 12202 rooms, 51, B block. Applicant before: Xi'an Cresun Innovation Technology Co.,Ltd. |
|
GR01 | Patent grant | ||
GR01 | Patent grant |