GB2454916A - Fatigue monitoring using facial images - Google Patents

Fatigue monitoring using facial images

Info

Publication number
GB2454916A
Authority
GB
United Kingdom
Prior art keywords
individual
permitted
fatigue level
identified
permitted individual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0722986A
Other versions
GB0722986D0 (en)
Inventor
Dimuth Jayawarna
Kapila Priyadarshana Malawwethanthri
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MALAWWETHANTHRI KAPILA PRIYADA
Original Assignee
MALAWWETHANTHRI KAPILA PRIYADA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MALAWWETHANTHRI KAPILA PRIYADA filed Critical MALAWWETHANTHRI KAPILA PRIYADA
Priority to GB0722986A priority Critical patent/GB2454916A/en
Publication of GB0722986D0 publication Critical patent/GB0722986D0/en
Priority to PCT/GB2008/051093 priority patent/WO2009066109A1/en
Publication of GB2454916A publication Critical patent/GB2454916A/en
Withdrawn legal-status Critical Current


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/18Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/10Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device
    • B60R25/104Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device characterised by the type of theft warning signal, e.g. visual or audible signals with special characteristics
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20Means to switch the anti-theft system on or off
    • B60R25/25Means to switch the anti-theft system on or off using biometry
    • B60R25/255Eye recognition
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20Means to switch the anti-theft system on or off
    • B60R25/25Means to switch the anti-theft system on or off using biometry
    • B60R25/257Voice recognition
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/30Detection related to theft or to other events relevant to anti-theft systems
    • B60R25/305Detection related to theft or to other events relevant to anti-theft systems using a camera
    • G06K9/00221
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/171Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/1961Movement detection not involving frame subtraction, e.g. motion detection on the basis of luminance changes in the image
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639Details of the system layout
    • G08B13/19641Multiple cameras having overlapping views on a single scene
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639Details of the system layout
    • G08B13/19647Systems specially adapted for intrusion detection in or around a vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/06Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B31/00Predictive alarm systems characterised by extrapolation or other computation using updated historic data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1103Detecting eye twinkling
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention

Abstract

A system for monitoring individuals in a control chamber, such as a power plant control room, or in a vehicle, such as a ship's bridge, comprises means to capture facial images of the individuals and determine whether they are authorised to be present. Further means are provided to analyse the captured facial images to assess the individuals' level of fatigue. The system may predict a safe watch-keeping period for each individual, which can be adjusted as the watch progresses using learnt historical data and ambient factors. Alarm states may be initiated if unidentified individuals are detected or if levels of fatigue or intoxication exceed predetermined levels.

Description

Fatigue Monitoring and Intruder Alert System

The invention relates to systems for monitoring the individuals present, and their fatigue levels, in the control chamber of plants or vessels, particularly on the bridge of a ship. In particular, the invention relates to systems for predicting safe watch keeping periods for individuals on watch. The invention also relates to a system for monitoring the presence of intruders.
Plants, such as power stations or manufacturing plants, or vessels such as ships, aircraft, trains, cars etc. have control chambers where the operation of the plant or vessel is monitored and controlled. It may be that one individual operates the plant or vehicle, or there may be a team of operators.
The control chamber may be, for instance, a control room for a plant, the bridge of a ship, the flight deck of an aircraft or other vessel, the controlling station of an oil platform, a control centre or monitoring and/or controlling station for vessels or aircraft, a CCTV monitoring station, or the driver's seat of a car, lorry or train.
For reasons of safety and security, it is vital that individuals in charge of the operation of plants or vehicles are alert and not excessively fatigued or under the influence of intoxicating substances such as alcohol or drugs. It is also important for reasons of safety and security that unauthorised individuals are not allowed to remain in the control chamber of a plant, station or vessel, should they gain access. It is also important to know how long individuals can safely remain on watch without risk of excessive tiredness by the end of their watch. This enables manning levels to be scheduled.
US 5,745,038 discloses an eye monitor that examines reflected light from the eye to detect blinking behaviour as an indicator of drowsiness of an individual.
US 5,867,587 discloses a system which utilises digital images of the face and eyes of an operator, derives a parameter attributable to an eye blink and compares this to a threshold value of that parameter. A warning signal is given if the parameter falls below the threshold.
WO 98/49028 uses a video image to monitor a range of eye movements and to derive from these a degree of alertness.
US 6,091,334 discloses a system for analysing drowsiness which monitors head movement and gaze stability.
US 6,147,612 discloses a system for preventing sleep which detects eyelid movement and actuates an alarm when the eyelid movement is indicative of drowsiness. US 6,346,887 uses a video-based eye tracking system which tracks eye activity and pupil diameter and position to produce a signal representing eye activity that can be used to estimate alertness.
GB 2 431 495 is concerned with a monitoring system for a ship's navigation bridge, where inactivity of the crew is detected by means of passive infra-red detectors.
DE 102 18676 describes an on-board computer for a vehicle which predicts when a driver or pilot may be tired based on sleep history information, the circadian cycle and ambient conditions (such as bad weather or darkness).
US 5,900,827 is concerned with a method and apparatus for measuring the alertness levels for the flight crew of an aircraft. It operates by monitoring the performance of the flight crew as tasks are performed and sets alarms if there is excessive deviation from optimal behaviour.
There are various problems with the above-mentioned systems. Some fatigue monitoring systems require the individual being monitored to place their face in a particular position to allow eye movement monitoring, restricting free movement. Prior art monitoring systems generally monitor only one individual at a time, which is suitable for a cockpit type arrangement, but not for a control room where several individuals may be present, moving from station to station within the room. Additionally, the prior art may generate warning alarms at the onset of sleep but may not take into account the cognitive deterioration that occurs before the immediate onset of sleep. Also, the accuracy of correlating tiredness with eye-movement monitoring, particularly when eye data is captured from a video image, is highly dependent on the physiognomy and on the characteristic behaviour, when alert, of the individual concerned. For instance, a movement caused by a nervous tic might be assessed as excessive blinking, indicating fatigue. An individual with droopy eyelids might be thought to be fatigued when this is not the case.
With regard to security of access to the control chamber, conventional systems usually operate at entry/exit points but would not be triggered if an individual gained access by unconventional means (such as through the ceiling or floor). Also, cumbersome security measures at entrances and exits can hinder smooth operation (and also hinder evacuation), as individuals have to wait to be identified before they can gain access to the control chamber.
Hence, there is a need for a system for monitoring control chambers which can overcome some or all of the problems of the prior art. In particular, there is a need for a system which allows continuous monitoring and determination at an early stage, when cognitive functions start to deteriorate, of the fatigue levels of each individual present in a group of people.
There is also a need for a system which can predict safe watch keeping periods for each individual on watch, so that watch-keeper changes and rest periods can be scheduled. There is also a need for a system which can identify when intruders are present in a control chamber.
Hence, one object, amongst others, of the present invention is to provide a system which addresses some of the problems of the prior art by identifying individuals and monitoring their individual fatigue levels. Another object is to provide a system capable of predicting safe watch keeping periods for individuals, and also capable of learning how to better predict safe watch keeping periods for individuals.
A first aspect of the invention provides a system for monitoring one or more individuals in a control chamber of a plant or vehicle comprising a means for identifying whether each individual is a permitted individual allowed to access the control chamber, characterised in that the means for identifying each permitted individual is by analysis of one or more captured facial images of each permitted individual's face, and that the system further comprises a means for measuring a fatigue level for each permitted individual using data derived from the captured facial images.
Preferably, the system further comprises a means for predicting a safe watch keeping period from the measured fatigue level for each permitted individual, wherein the measured fatigue level for each permitted individual is predicted to fall to a predetermined unsafe fatigue level for that permitted individual by the end of the safe watch keeping period.
A suitable way for putting the system into effect is to provide the control chamber with one or more cameras, suitably video cameras, which are adapted to capture facial images of the individuals within the control chamber as these individuals move around the chamber carrying out their work tasks. A computer program, running on a computer that may form part of the system, carries out image analysis on the captured images, identifying the various individuals from the facial images, using stored facial images for reference. The program also sequences and analyses the facial images for each individual, allowing fatigue levels to be monitored. Clearly, the facial image capture rate should be adequate to provide sufficient eye and/or eyelid movement data to allow fatigue analysis to be carried out.
Many work tasks in a control chamber require an individual to look at instrument displays in particular places or to make observations through windows or on display screens. Suitable placement of cameras in these locations enables facial images of the individuals to be captured as the individuals go about their normal work tasks, without requiring the individuals to hold their faces still in front of a camera. The images can also be used to provide information on the eye movements of the individuals, such as by monitoring eye and eyelid movement for presence of slow or drifting movement, eyelid closure, absence of saccadic movement, loss of eye co-ordination, wavering eyelids, partial eye closure or drooping eyelids.
As an individual moves from the space covered by one camera to the space monitored by a second camera, it is preferable for the system to correlate the movement of the individual with the transfer between different cameras, such that the captured facial images can be easily linked to the same individual. This enables the eye and eyelid movement behaviour of an individual to be monitored sequentially as the individual moves between different work tasks. Methods for monitoring fatigue levels from such eye movements are known from the prior art, as detailed above.
Systems for facial recognition from captured facial images are available in the prior art. For instance, such systems are sold by Aurora Computer Services Limited, UK, under the trade names ClockFace™ and eGallery™.
Facial recognition software is based on the ability to recognize a face and then measure various features of the face (such as distance between the eyes, width of nose, depth of eye sockets, shape of cheekbones, etc.).
However, there has been no attempt to use such systems for also monitoring fatigue for a particular individual working in a group of other people, such as in a control chamber alongside other individuals.
A preferred system according to a first aspect of the invention comprises i) a database comprising facial image, name data and the predetermined unsafe fatigue level for each permitted individual and optionally baseline eye movement data for each permitted individual, ii) one or more cameras adapted to capture images of the faces of each individual in the control chamber, wherein each captured facial image is provided with a time marker, iii) a means for tracking which captured facial images correspond to which individuals present in the control chamber, iv) a means for comparing captured facial images of individuals present in the control chamber with facial images of permitted individuals from the database whereby each captured facial image can be provided with a name marker identifying each identified permitted individual, or an indication that identification is not possible.
v) a means for sequencing the captured facial images having the same name marker in time order by means of the time markers whereby eye and/or eyelid movement of each identified permitted individual can be measured, and optionally compared with baseline eye data for each identified permitted individual, to provide the measured fatigue level for each identified permitted individual.
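The following is a minimal sketch, in Python, of how components (iii) to (v) above might be organised in software; the class and function names (FrameRecord, sequence_by_individual) and their fields are illustrative assumptions rather than part of the specification.

```python
from dataclasses import dataclass
from collections import defaultdict
from typing import Optional

@dataclass
class FrameRecord:
    """One captured facial image with its time marker and, once identified, a name marker."""
    timestamp: float                     # time marker (seconds since epoch)
    camera_id: str                       # which camera captured the frame
    face_image: bytes                    # encoded facial image crop
    name_marker: Optional[str] = None    # permitted individual's name, or None if unidentified

def sequence_by_individual(frames):
    """Group identified frames by name marker and order each group by time marker,
    so that eye/eyelid movement can be measured per permitted individual."""
    sequences = defaultdict(list)
    for frame in frames:
        if frame.name_marker is not None:
            sequences[frame.name_marker].append(frame)
    for name in sequences:
        sequences[name].sort(key=lambda f: f.timestamp)
    return dict(sequences)
```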
By fatigue level is meant a numerical value or a multidimensional parameter indicating the alertness, or otherwise, of the individual. For each individual, there will be a predetermined fatigue level at which it is considered unsafe for the individual to continue to work, and when this predetermined unsafe fatigue level has been reached, the individual should be relieved from the watch.
The invention may be used to monitor permitted individuals and their respective fatigue levels and to display or sound a warning when the fatigue level of that individual reaches the predetermined unsafe fatigue level.
The predetermined unsafe fatigue level may be the same level for each permitted individual, but is preferably tailored to each permitted individual based on tests carried out on each individual to determine each individual's predetermined unsafe fatigue level, at which their cognitive functions are sufficiently deteriorated for them to be considered unsafe to work.
As explained above, in addition to monitoring the fatigue level for each individual, the system of the invention may also predict a safe watch keeping period for each individual, at the end of which period it is expected that the individual's fatigue level will have reached the predetermined unsafe fatigue level. This calculation will typically be by means of a computer program running on a computer forming part of the system, and may be based on the measured fatigue level for an individual at the start of that individual's watch keeping period and typical fatigue level deterioration data for humans. To improve the accuracy of the prediction of the safe watch keeping period for an individual, personalised fatigue level deterioration data for that individual will preferably be employed.
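As a simple illustration of such a prediction, assuming the fatigue level is expressed as a single number that rises towards the predetermined unsafe fatigue level at an approximately constant rate, the remaining safe period can be estimated as follows; the linear model and parameter names are assumptions made for this sketch only.

```python
def predict_safe_watch_period(current_fatigue, unsafe_fatigue,
                              deterioration_rate_per_hour):
    """Predict hours until the measured fatigue level is expected to reach the
    predetermined unsafe fatigue level, assuming linear deterioration.

    deterioration_rate_per_hour may come from typical human data or from
    personalised (historical) data for the individual.
    """
    if current_fatigue >= unsafe_fatigue:
        return 0.0                      # already at or beyond the unsafe level
    if deterioration_rate_per_hour <= 0:
        return float("inf")             # no measurable deterioration yet
    return (unsafe_fatigue - current_fatigue) / deterioration_rate_per_hour

# Example: fatigue 0.30 now, unsafe at 0.80, rising 0.10 per hour -> 5.0 hours.
print(predict_safe_watch_period(0.30, 0.80, 0.10))
```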
Preferably, the system of the invention further comprises a means for monitoring current fatigue level deterioration data for each permitted individual and wherein the means for predicting a safe watch keeping period takes into account the current fatigue level deterioration data for each permitted individual.
In other words, for each individual on watch, the system monitors not only the fatigue level of each individual but also the changes in fatigue level for that individual as the watch progresses. The resulting fatigue level deterioration observed during the watch for that individual, referred to as the current fatigue level deterioration data, may be extrapolated to provide an updated or revised safe watch keeping period for that individual as the watch progresses. This allows for better planning and also allows the individual to take action, such as taking permitted stimulants, such as caffeine, in order to prolong the safe watch keeping period.
Suitably, the system comprises a means for storing in the database current fatigue level deterioration data for each permitted individual to provide a database of historical fatigue level deterioration data for each permitted individual, and a means for calculating average fatigue level deterioration data for each permitted individual from stored historical fatigue level deterioration data for each permitted individual and the means for predicting a safe watch keeping period takes into account the average fatigue level deterioration data for each permitted individual.
Average fatigue level deterioration data again is either a numerical value or a multidimensional parameter. This may be derived from a regression analysis based method wherein each monitored fatigue level deterioration that occurred in the past for that individual is given its proper relevance in determining the average fatigue deterioration data for that individual.
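The description does not fix a particular regression method; one illustrative possibility, sketched below under that assumption, is an exponentially weighted average over past watches so that more recent deterioration data carries more weight.

```python
def average_deterioration(historical_rates, decay=0.8):
    """Combine historical fatigue level deterioration rates for one individual,
    weighting recent watches more heavily (decay < 1). historical_rates is
    ordered oldest to newest; returns a single averaged rate."""
    if not historical_rates:
        raise ValueError("no historical data for this individual")
    weights = [decay ** (len(historical_rates) - 1 - i)
               for i in range(len(historical_rates))]
    return sum(w * r for w, r in zip(weights, historical_rates)) / sum(weights)

# Older watches (0.08, 0.09 per hour) count less than the most recent (0.12).
print(round(average_deterioration([0.08, 0.09, 0.12]), 3))
```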
Preferably, the system is such that the means for predicting the safe watch keeping period for each permitted individual takes into account both the average fatigue level deterioration data and the current fatigue level deterioration data for each permitted individual.
The weighting of each of the average and current fatigue level data may be chosen appropriately and may be fixed, or may change as the watch keeping period progresses. For instance, the initial calculation of safe watch keeping period may be based on the average fatigue level deterioration data, as the current data will be limited, but once sufficient current data has been gathered, the calculation may be revised to use the current fatigue level deterioration data for each individual. The extrapolation of the data to predict the safe watch keeping period may be a linear extrapolation or may be a multinomial extrapolation.
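Below is a sketch of how the two estimates might be combined, with the weight shifting from the stored average towards the current data as more of the watch is observed, followed by a linear extrapolation of the remaining safe period; the blending function and the two-hour settling constant are assumptions.

```python
def blended_deterioration(average_rate, current_rate, hours_observed,
                          settle_after_hours=2.0):
    """Blend the average (historical) and current deterioration rates.
    Early in the watch the average dominates; once roughly
    settle_after_hours of current data exist, the current rate dominates."""
    w_current = min(1.0, hours_observed / settle_after_hours)
    return (1.0 - w_current) * average_rate + w_current * current_rate

def revised_safe_period(current_fatigue, unsafe_fatigue, rate):
    """Linear extrapolation of the remaining safe watch keeping period (hours)."""
    if rate <= 0:
        return float("inf")
    return max(0.0, (unsafe_fatigue - current_fatigue) / rate)

# After 1 hour of watch, halfway between historical (0.10/h) and current (0.16/h).
rate = blended_deterioration(0.10, 0.16, hours_observed=1.0)
print(round(revised_safe_period(0.45, 0.80, rate), 2))
```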
Preferably, in order to provide rapid recognition of permitted individuals, the database holds a plurality of images for each individual.
The means for tracking which facial images correspond to which individuals present in the control chamber, as well as the means for comparing captured facial images of individuals present in the control chamber with facial images of permitted individuals from the database are both suitably computer programs operating on a computer.
Similarly, the means for sequencing the captured facial images having the same name marker in time order by means of the time markers, whereby eye and/or eyelid movement of each identified permitted individual can be measured, and optionally compared with baseline eye and/or eyelid movement data for each identified permitted individual, to provide a fatigue level for each identified permitted individual, is typically a computer program running on a computer.
Although the fatigue level can be established simply from the eye and eyelid movement data generated for each individual, it is preferable if baseline data are present in the database for the eye and eyelid configuration of the individual gathered when they were in an alert state. By comparison with this baseline data, a more reliable fatigue estimate can be made in order to provide the fatigue level. By "fatigue level" is meant the estimate of fatigue based on the measured eye and/or eyelid movement data for an individual, irrespective of whether this has been modified by comparison with baseline eye and/or eyelid movement data.
Images are preferably captured at frequent intervals in order to allow eye and/or eyelid movement to be monitored and used as a basis for monitoring of fatigue levels.
Preferably, the system further comprises a means for collecting, and storing in the database, fatigue-related diary data for each permitted individual and the means for predicting a safe watch keeping period takes into account the fatigue-related diary data for each permitted individual.
By fatigue-related diary data is meant information such as the sleep history of the permitted individual, changes in sleep patterns caused by time zone crossings (for instance when travelling on a moving vehicle or vessel), and information concerning whether the individual has moved between time zones in the recent past (which could lead to jet-lag problems). Again, such factors have well-known effects on fatigue deterioration for individuals.
The fatigue level and the fatigue level deterioration data may be combined with fatigue related diary data to better predict the safe watch keeping period at the start of an individual's watch.
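As an illustration only, such diary data could be reduced to a simple sleep-debt figure that raises the fatigue level assumed at the start of a watch; the reference sleep duration, the time-zone penalty and the scaling constants below are assumptions, not values taken from the description.

```python
def initial_fatigue_from_diary(hours_slept_last_3_days, time_zones_crossed,
                               reference_sleep=8.0):
    """Crude historical fatigue component from diary data: accumulated sleep
    debt over recent days plus a penalty for recent time-zone changes,
    scaled into a 0..1 contribution to the starting fatigue level."""
    sleep_debt = sum(max(0.0, reference_sleep - h) for h in hours_slept_last_3_days)
    jet_lag = 0.05 * abs(time_zones_crossed)
    return min(1.0, 0.04 * sleep_debt + jet_lag)

# Slept 6, 5 and 7 hours over the last three nights and crossed 3 time zones.
print(round(initial_fatigue_from_diary([6, 5, 7], 3), 2))
```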
The time of day may also be taken into account such that circadian fluctuation in fatigue is included in the prediction of the safe watch keeping period. Hence the means for predicting a safe watch keeping period suitably takes into account time of day at the location where the watch is occurring.
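As an illustration of how time of day could enter the prediction, a simple sinusoidal circadian term, largest in the early-morning hours, can be applied to the deterioration rate; the functional form and constants are assumptions rather than anything specified in the description.

```python
import math

def circadian_factor(hour_of_day, amplitude=0.3, trough_hour=4):
    """Multiplier applied to the fatigue deterioration rate.
    Largest around trough_hour (early-morning low point of alertness),
    smallest roughly twelve hours later."""
    phase = 2 * math.pi * (hour_of_day - trough_hour) / 24.0
    return 1.0 + amplitude * math.cos(phase)

# Deterioration is assumed ~30% faster at 04:00 than the daily average,
# and ~30% slower at 16:00.
print(round(circadian_factor(4), 2), round(circadian_factor(16), 2))
```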
On the assumption that performance-related cognitive functions are influenced by sleep deprivation and circadian processes, it is prudent to model the accumulated fatigue using historical and circadian fatigue elements. Furthermore, the best judgement on an individual's awareness state could be arrived at by observing current behaviour, which is a reflection of past fatigue conditions. This allows the system to effectively have a "learning mechanism", using the current fatigue level and its deterioration behaviour to adjust the predictive model for each individual and to ensure that correct adjustments are made, as each individual's watch progresses, to the initial safe watch keeping period which was set out originally.
Predicting a safe watch keeping period more reliably is the main aim of the learning mechanism implemented by the system. This process evaluates how accurate/inaccurate the previous predictions were and adjusts individual aspects/parameters relevant to the safe watch keeping period accordingly with the ultimate objective of making a more accurate current/future prediction. The learning system may be used to adjust the predetermined unsafe fatigue level for each individual as a way to adjust the safe watch keeping period.
The means for deriving the "predetermined unsafe fatigue level" and the "average fatigue level deterioration data" may be executed as background processes in the computer program running on the computer forming part of the system. As another part of the learning mechanism, these two sets of data may be determined by a processing module whereby parameters relevant to each set of data are adjusted dynamically. This process takes into account how good or bad a result was obtained when the two sets of data were evaluated in the previous instance.
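One way such a background adjustment could work, sketched here as an assumption, is a simple error-driven update after each completed watch: compare the predicted safe period with the time at which the unsafe fatigue level was actually reached, and nudge the individual's stored parameters accordingly.

```python
def update_after_watch(predicted_hours, actual_hours, deterioration_rate,
                       learning_rate=0.2):
    """Adjust an individual's stored deterioration rate after a watch, based on
    how accurate the previous safe-period prediction turned out to be.
    If the unsafe level was reached earlier than predicted, the stored rate was
    too low and is increased; if later, it is decreased."""
    if actual_hours <= 0:
        return deterioration_rate
    # Rate implied by what was actually observed during this watch.
    implied_rate = deterioration_rate * (predicted_hours / actual_hours)
    return deterioration_rate + learning_rate * (implied_rate - deterioration_rate)

# Predicted a 6 h safe period but the unsafe level was reached after 5 h:
# the stored rate (0.10/h) is nudged upwards.
print(round(update_after_watch(6.0, 5.0, 0.10), 3))
```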
The system may make an initial estimate of the predicted safe period that reflects the user-entered sleep/duty patterns and circadian effects. At this early stage of the learning activity, the estimate of the historical fatigue component is purely based on the individual's duty-rest and sleep-wake patterns. The system may then make adjustments to the predicted safe period, based on the observed fatigue deterioration rate for each individual, so as to continually update the individual's varying degree of cognitive awareness and predicted safe period. As a whole, the system represents the relationship that prevails between sleep-related historical data, circadian influence and the effect of the duration/timing of a control room watch shift on the cognitive performance of an individual. Hence the system may be used to provide a predicted safe period for each individual, and to adjust the predicted safe period during an individual's watch by learning from the deterioration rate of the individual's fatigue level. Once the safe watch keeping period for an individual has passed, deterioration of that individual's cognitive function is expected.
An initial safe watch period is provided by the system for each individual at the start of a watch in order to gauge each individual's ability to complete their full shift duration. This initial safe period is useful in preparing rosters for day to day operation of a control chamber. It is therefore important for a management team to know in advance what safe periods could be assigned to each individual on duty as their watch duration. The system could also output probabilistic views on the timings and efficiency of shifts for each individual that will again aid in the process of scheduling man-hours.
This again highlights the fact that the system learns how accurate or inaccurate the previous value assignment was when the initial safe watch period was last determined for the same individual, and adjusts the core influencing factors affecting that determination in a dynamic manner.
There are many physiological changes that may be used as a basis for fatigue sensing. Various studies which have been carried out in the past have shown that PERCLOS (percentage eyelid closure over the pupils) is an excellent determinant in assessing the onset of human sleepiness or drowsiness. This is evaluated by a procedure wherein over a certain period of time, the proportion of time that the eyes were closed is monitored and measured. Eye-blink rate is another sensory measurement that can be combined with PERCLOS to arrive at a drowsiness state. Measurements are taken accurately on eyelid position, pupil size (separate readings for each eye) and vergence (the simultaneous movement of both eyes in opposite directions to obtain or maintain single binocular vision) in real-time and fed into a fatigue modelling engine which determines the relevant fatigue level.
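A minimal sketch of how PERCLOS and eye-blink rate could be computed from a stream of per-frame eyelid openness measurements; the 80% closure criterion corresponds to the commonly used PERCLOS P80 definition, while the function names, the openness scale and the example thresholds are assumptions for illustration.

```python
def perclos(eyelid_openness, closed_threshold=0.2):
    """PERCLOS: fraction of frames in the window in which the eyelid covers at
    least ~80% of the pupil, i.e. openness is at or below closed_threshold.
    eyelid_openness: per-frame values in [0, 1] (1 = fully open)."""
    if not eyelid_openness:
        return 0.0
    closed = sum(1 for o in eyelid_openness if o <= closed_threshold)
    return closed / len(eyelid_openness)

def blink_rate(eyelid_openness, frame_rate_hz, closed_threshold=0.2):
    """Blinks per minute: count transitions from open to closed."""
    blinks = 0
    previously_closed = False
    for o in eyelid_openness:
        closed = o <= closed_threshold
        if closed and not previously_closed:
            blinks += 1
        previously_closed = closed
    minutes = len(eyelid_openness) / frame_rate_hz / 60.0
    return blinks / minutes if minutes > 0 else 0.0

# Ten frames at 5 Hz: the eyes close twice and are closed 40% of the time.
window = [1.0, 0.9, 0.1, 0.1, 0.8, 0.9, 0.1, 0.1, 0.9, 1.0]
print(perclos(window), round(blink_rate(window, frame_rate_hz=5), 1))
```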
Cameras in various positions capture the relevant areas of the individual's face in real-time taking into account the posture changes and/or the movement of an individual from one capture area to another. Various parts of the face, including eye lids and pupils, may be recognised through image analysis software and their relative positions are tracked in real-time with high accuracy. In this way, an individual's interaction with the equipment he is dealing with can be maintained continuously whilst preserving the integrity of the fatigue analysis. No specific action by the individual is required, other than carrying on with assigned activities as normal.
Face tracking and identification is achieved through an image-based face recognition process. In simple terms, this outputs the identity of the subject based upon a repository of known facial images and by a matching process.
This involves comparing characteristics of a monitored face to those of known individuals. Given an image, the first task in the process is face detection, i.e. finding the outline of the face. The cut-out image of the face is then verified as being a valid face. A database of facial images is created initially in such a way that a plurality of facial images from each individual is stored. Images are selected carefully under the same illumination conditions, and a wide variety of expressions are captured. Each individual has his own set of facial features, which are an important factor in the recognition process.
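In outline, the matching step compares a feature vector extracted from the detected face with the stored feature vectors of permitted individuals; the nearest-match-with-threshold logic below is a generic sketch under that assumption and does not describe the commercial products mentioned above.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(face_features, database, max_distance=0.6):
    """Return the name marker of the closest enrolled individual, or None if no
    stored face is close enough (i.e. identification is not possible).

    database maps each permitted individual's name to a list of feature
    vectors, one per stored facial image (several images per individual)."""
    best_name, best_dist = None, float("inf")
    for name, stored_vectors in database.items():
        for vec in stored_vectors:
            d = euclidean(face_features, vec)
            if d < best_dist:
                best_name, best_dist = name, d
    return best_name if best_dist <= max_distance else None

db = {"A. Watchkeeper": [[0.1, 0.9, 0.4]], "B. Officer": [[0.7, 0.2, 0.5]]}
print(identify([0.12, 0.88, 0.41], db))   # -> 'A. Watchkeeper'
print(identify([0.0, 0.0, 0.0], db))      # -> None (treated as unidentified)
```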
Preferably, the system further comprises a means for updating the facial images of permitted individuals and optionally for updating the baseline eye movement data for each permitted individual in the database.
As individuals' facial features may change over time, it is useful if the system includes a facility for updating the database with recent images. The ability to update the database should be restricted, for instance by password security measures, in order to prevent an unauthorised individual from including a false facial image in the database.
A further method for improving the security of the system would require each individual present in the control chamber to have registered their presence in the control chamber before entry, on entry, or within a predetermined registration time period from entry. This simplifies the operation of the system, as individuals present identify themselves, minimising the risk of an error due to the system identifying the wrong permitted individual or failing to identify the individual.
More preferably, the system should be able to identify a permitted individual from the captured facial images within a predetermined identification time period, irrespective of whether that individual has already registered their presence in the control chamber.
Hence it is preferred that each permitted individual present in the control chamber is required to have registered their presence in the control chamber before entry or within a predetermined registration time period.
Suitably, an intruder alarm state is generated when the system cannot identify an individual present in the chamber from the captured facial images within a predetermined identification time period from entry.
A captured facial image of an unidentified individual may be transmitted to a remote location when the intruder alarm state is generated. This can aid with the subsequent identification of terrorists or pirates.
The system may also comprise a voice recording means in the control chamber such that captured voice recordings may be transmitted at regular intervals to a remote location when the intruder alarm state is generated.
This may also aid with the identification of individuals and the assessment of a security breach.
The data, such as image and voice recordings for intruders, is also suitably stored on the system's database.
Hence the system is preferably such that an intruder alarm state is generated when the system cannot identify an individual present in the chamber from the captured facial images within a predetermined identification time period, for instance within 5 minutes or less.
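A sketch of the timing logic, assuming each individual detected in the chamber is tracked together with the time their face was first captured; the data layout is an assumption, and the five-minute limit follows the example given above.

```python
UNIDENTIFIED_LIMIT_SECONDS = 5 * 60   # predetermined identification time period

def check_for_intruders(tracked_people, now):
    """tracked_people: list of dicts with 'first_seen' (seconds) and
    'name_marker' (None while unidentified). Returns the entries for which an
    intruder alarm state should be generated."""
    alarms = []
    for person in tracked_people:
        unidentified = person["name_marker"] is None
        overdue = (now - person["first_seen"]) > UNIDENTIFIED_LIMIT_SECONDS
        if unidentified and overdue:
            alarms.append(person)
    return alarms

people = [
    {"first_seen": 0,  "name_marker": "A. Watchkeeper"},
    {"first_seen": 30, "name_marker": None},            # still unidentified
]
print(len(check_for_intruders(people, now=400)))   # -> 1 after just over 6 minutes
```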
This also has the advantage that if an intruder manages to register as a permitted individual (by stealing security data for instance), the system should still detect them as an intruder if the captured images of their face do not correspond to the data present in the database for the facial images of permitted individuals.
Suitably, the system further comprises a means for collecting and recording an intoxicant level for each permitted individual present in the control chamber.
Preferably, the means for predicting a safe watch-keeping period takes into account the intoxicant level for each identified permitted individual. This may affect the fatigue deterioration behaviour for an individual, and so may be factored into the calculation.
Preferably, the system generates an intoxication alarm state if the intoxicant level of an identified permitted individual exceeds a predetermined maximum intoxicant level.
The means for collecting and recording an intoxicant level could be, for instance, an apparatus for taking a breath sample from an individual. A breathalyser is suitable. Various breathalyser systems are commercially available for estimating blood alcohol levels by analysis of the amount of alcohol present in a breath sample. It could be required that an individual must provide a breath sample at the same time as they register their presence in the control chamber, or the system could request samples from identified permitted individuals at random times when they are in the control room. In the latter case, non-compliance could be linked to an alarm state.
Suitably, a camera could be located to capture facial images of the individual giving the breath sample, to ensure that the sample is not given by a different individual. Suitably, the system may generate a record of the fatigue level and optionally of the intoxicant level for each individual present in the control chamber. This information can be accessed during operation of the control chamber either by the individuals or by a supervisor so that appropriate action can be taken. The record may be accessed via any suitable display means, such as a monitor screen, or a printer, in the control chamber or at a remote location. Additionally, weekly/monthly work-rest reports may be generated for each individual as a reference for future duty scheduling. The system can also be configured to advise on recommended action (such as advising on the hours of sleep required by an individual before that individual should return to the control chamber, or advising on taking stimulants, along with their effective periods, as countermeasures).
Preferably, the system is configured to generate an alarm state if the fatigue level of an identified permitted individual exceeds a predetermined maximum fatigue level.
Alarm states may be indicated in the control chamber only, for instance by means of audible and/or visual signals, or may also be signalled at a remote location. Depending upon whether the alarm state is to indicate fatigue, intoxication or an intruder, the alarm may be switched off or reset by the people in the control chamber, or the resetting of the alarm may have to be carried out by a supervisor from a remote location. The protocol can be pre-set within the system.
Suitably, a current copy of the database is duplicated in a data recorder which forms part of the system, and which is adapted to survive a disaster event. This duplication occurs at frequent intervals, say every minute or less, in order to ensure that a record is kept after any disaster or criminal event such as piracy. Such data recorders are known as "black box recorders".
The database may include data concerning each alarm state and may also include system violation data.
For each permitted individual, from the start of their watch, when an initial prediction of that individual's safe watch keeping period may be made, a reassessment of the safe watch keeping period may be carried out regularly to update the predicted safe period, for instance at intervals of a few minutes. The system may be configured to provide a display for each individual indicating their fatigue level in a simplified and easily recognised form. A fully alert individual may have a green display, changing to amber when cognitive functions first deteriorate, and to red when the individual should be relieved.
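The green/amber/red display could be driven by two thresholds per individual, for example the level at which cognitive functions first deteriorate and the predetermined unsafe fatigue level; the mapping below is a sketch and the threshold values are placeholders.

```python
def fatigue_display_colour(fatigue_level, deterioration_onset=0.5, unsafe_level=0.8):
    """Map a measured fatigue level to the simplified display described above:
    green while fully alert, amber once cognitive functions start to
    deteriorate, red when the individual should be relieved."""
    if fatigue_level >= unsafe_level:
        return "red"
    if fatigue_level >= deterioration_onset:
        return "amber"
    return "green"

print([fatigue_display_colour(f) for f in (0.2, 0.6, 0.85)])
# -> ['green', 'amber', 'red']
```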
The system may also generate an alarm state when, over a predetermined length of time, no permitted person has been identified as being present in the control chamber. This means that when, over any predetermined length of time, the image of the face of a permitted person has not been captured and/or recognised by the system, then an alarm state is generated to indicate that there is no permitted person on watch.
The system is, for instance, of particular use for monitoring the navigation bridge of a ship or other sea vessel. Alarms may be set to signal in the cabin of the vessel's master, or in the cabin of the second in command, when the system generates an alarm state. When the system generates an intruder alarm state, for instance, captured photo frames in addition to a coded signal might also be sent via a telecommunication network to a remote site, such as an on-shore operations centre, for instance to give an alert of a potential terrorist, piracy or sabotage attack. To allow remote monitoring of the operation of the system, some or all events, optionally including facial images and reports on fatigue and/or intoxicant levels of individuals, can be logged and a record of the log stored and/or sent via a telecommunication network to a remote site, such as an onshore site.
A second aspect of the invention provides a method for monitoring one or more individuals present in the control chamber of a plant or vehicle by means of a system according to the first aspect of the invention.
The preferred features of the first aspect of the invention are also applicable to the second aspect of the invention.
The invention will now be further described by way of example only with reference to the accompanying drawings, in which: Figure 1 shows a perspective view of the navigation bridge of a ship using a system according to the first aspect of the invention.
Figure 2 shows a schematic flow chart of the operation of the system.
There are two seating positions 1,2 for watch keepers on the bridge, in front of the radar 11 and automatic radar plotting aid (ARPA) 12 respectively.
There is also a hand steering position 3 as well as a chart table and plotter 10. Observation windows 4 are provided at key locations to allow external observation from the bridge. A control centre for the system of the invention 6 has a system display monitor 13, a keyboard for data input 9, an alarm klaxon 7 and a breathalyser 8 for measuring breath alcohol levels.
Cameras 5 are positioned at locations on the bridge where the watch keepers are likely to carry out their activities when facing the cameras 5, such that facial images can be collected. These locations are over the observation windows 4, over the radar 11 and ARPA 12, at the plotting table 10 and at the hand steering system 3. There is also a camera on the control centre 6 for the system of the invention.
In use, the watch keepers must log in at the control centre 6, typing in their name and a password at the keyboard 9 whilst an image of their face is recorded by the camera 5 on the control centre 6. They must also provide a breathalyser sample. They may also be required to provide data concerning their sleep history over previous days, and time zone change information (for instance if they recently crossed several time zones to join the crew). If a watch keeper does not log in within a permitted time, providing a breath sample at the breathalyser 8, then the system will sound the klaxon 7 and display the name of the watch keeper (if this has been identified using facial image analysis by the system), and the head of the watch will need to take appropriate action.
The watch keeper will be provided with an "Initial Safe Watch Keeping Period" upon logging on and entering sleep related data into the system.
This safe watch period will be determined from the initial fatigue level measured for that individual, as well as diary data, circadian fatigue based on the time of day, and average fatigue deterioration data for that watch keeper.
The cameras 5 monitor the watch keepers as they carry out their tasks, recording facial images and transmitting these to the control centre 6 which will generate a fatigue level for each watch keeper. This fatigue level score will be used to adjust the "Initial Safe Watch Keeping Period" and the updated safe watch period will be displayed. In the event that the fatigue level for a particular watch keeper exceeds a certain value, the system will sound the klaxon 7 and display the name of the relevant offending watch keeper. Again, the head of the watch will need to take appropriate action, such as relieving the fatigued watch keeper, before resetting the alarm.
It will be appreciated that numerous modifications to the above described embodiment may be made without departing from the scope of the invention as defined in the appended claims. For example, the system may simply use current fatigue deterioration data for the prediction of the safe watch keeping period for each individual.
The described and illustrated embodiments are to be considered as illustrative and not restrictive in character, it being understood that only the preferred embodiments have been shown and described and that all changes and modifications that come within the scope of the invention as defined in the claims are desired to be protected. It should be understood that while the use of words such as "preferable", "preferably", "preferred" or "more preferred" in the description suggest that a feature so described may be desirable, it may nevertheless not be necessary and embodiments lacking such a feature may be contemplated as within the scope of the invention as defined in the appended claims. In relation to the claims, it is intended that when words such as "a," "an," "at least one," or "at least one portion" are used to preface a feature there is no intention to limit the claim to only one such feature unless specifically stated to the contrary in the claim. When the language "at least a portion" and/or "a portion" is used the item can include a portion and/or the entire item unless specifically stated to the contrary.

Claims (21)

1. A system for monitoring one or more individuals in a control chamber of a plant or vehicle comprising a means for identifying whether each individual is a permitted individual allowed to access the control chamber characterised in that the means for identifying each permitted individual is by analysis of one or more captured facial images of each permitted individual's face and that the system further comprises a means for measuring a fatigue level for each permitted individual using data derived from the captured facial images.
2. A system according to claim 1 further comprising a means for predicting a safe watch keeping period from the measured fatigue level for each permitted individual, wherein the measured fatigue level for each permitted individual is predicted to fall to a predetermined unsafe fatigue level for that permitted individual by the end of the safe watch keeping period.
3. A system according to claim 1 or claim 2 comprising i) a database comprising facial image, name data and the predetermined unsafe fatigue level for each permitted individual and optionally baseline eye movement data for each permitted individual, ii) one or more cameras adapted to capture images of the faces of each individual in the control chamber, wherein each captured facial image is provided with a time marker, iii) a means for tracking which captured facial images correspond to which individuals present in the control chamber, iv) a means for comparing captured facial images of individuals present in the control chamber with facial images of permitted individuals from the database whereby each captured facial image can be provided with a name marker identifying each identified permitted individual, or an indication that identification is not possible,
    v) a means for sequencing the captured facial images having the same name marker in time order by means of the time markers whereby eye and/or eyelid movement of each identified permitted individual can be measured, and optionally compared with baseline eye data for each identified permitted individual, to provide the measured fatigue level for each identified permitted individual.
4. A system according to claim 3 further comprising a means for monitoring current fatigue level deterioration data for each identified permitted individual and wherein the means for predicting a safe watch keeping period takes into account the current fatigue level deterioration data for each identified permitted individual.
5. A system according to claim 4 comprising a means for storing in the database current fatigue level deterioration data for each identified permitted individual to provide a database of historical fatigue level deterioration data for each permitted individual, and a means for calculating average fatigue level deterioration data for each identified permitted individual from stored historical fatigue level deterioration data for each identified permitted individual wherein the means for predicting a safe watch keeping period takes into account the average fatigue level deterioration data for each identified permitted individual.
6. A system according to claim 5 wherein the means for predicting the safe watch keeping period for each identified permitted individual takes into account both the average fatigue level deterioration data and the current fatigue level deterioration data for each identified permitted individual.
7. A system according to any one of claims 3 to 6 wherein the system further comprises a means for collecting, and storing in the database, fatigue-related diary data for each permitted individual and wherein the means for predicting a safe watch keeping period takes into account the fatigue-related diary data for each identified permitted individual.
8. A system according to any one of claims 2 to 7 wherein the means for predicting a safe watch keeping period takes into account time of day.
9. A system according to any one of claims 3 to 8 wherein the system further comprises a means for updating the facial images of permitted individuals and optionally baseline eye movement data for each permitted individual in the database.
10. A system according to any preceding claim wherein each permitted individual present in the control chamber is required to have registered their presence in the control chamber before entry or within a predetermined registration time period.
11. A system according to any preceding claim wherein an intruder alarm state is generated when the system cannot identify an individual present in the chamber from the captured facial images within a predetermined identification time period from entry.
12. A system according to claim 11 wherein a captured facial image of an unidentified individual is transmitted to a remote location when the intruder alarm state is generated.
13. A system according to claim 11 or claim 12 comprising a voice recording means in the control chamber and wherein captured voice recordings from the voice recording means are transmitted at regular intervals to a remote location when the intruder alarm state is generated.
14. A system according to any one of claims 2 to 13 wherein the system further comprises a means for collecting and recording an intoxicant level for each identified permitted individual present in the control chamber.
15. A system according to claim 14 wherein the means for predicting a safe watch-keeping period takes into account the intoxicant level for each identified permitted individual.
  16. A system according to claim 14 or claim 15 wherein the system generates an intoxication alarm state if the intoxicant level of an identified permitted individual exceeds a predetermined maximum intoxicant level.
  17. A system according to any preceding claim wherein the system generates a fatigue alarm state if the measured fatigue level of an identified permitted individual exceeds a predetermined unsafe fatigue level for that individual.
  18. A system according to any preceding claim wherein the system generates an alarm state when, over a predetermined length of time, no permitted person has been identified as being present in the control chamber.
  19. A system according to any preceding claim wherein the control chamber is the bridge of a ship.
  20. A system according to any preceding claim wherein a current copy of the database is duplicated in a data recorder adapted to survive a disaster event.
  21. A system substantially as hereinbefore described with reference to and as shown in the accompanying figures.
  22. A method for monitoring one or more individuals present in the control chamber of a plant or vehicle by means of a system according to any preceding claim.
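
The identification and fatigue-measurement features recited in claims 1 to 3 describe, in effect, a frame-tagging and sequencing pipeline: each captured image carries a time marker, is matched against the database of permitted faces to gain a name marker, and the frames for one individual are then ordered in time so that eye and/or eyelid movement can be compared with that individual's baseline. The Python sketch below is illustrative only and forms no part of the claimed subject matter; FrameRecord, match_face and eye_closure_ratio are hypothetical stand-ins for the recited "means", and an average eye-closure score relative to baseline is just one plausible way of deriving a measured fatigue level.

# Illustrative sketch only; all names are hypothetical and the eye-closure
# metric is an assumption, not taken from the application.
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class FrameRecord:
    time_marker: float                 # capture timestamp (claim 2, feature ii)
    image: bytes                       # captured facial image
    name_marker: Optional[str] = None  # identified permitted individual, if any

def identify_frames(frames: List[FrameRecord],
                    match_face: Callable[[bytes], Optional[str]]) -> List[FrameRecord]:
    # Feature iv: compare each captured image with database faces and attach a
    # name marker, or leave None when identification is not possible.
    for frame in frames:
        frame.name_marker = match_face(frame.image)
    return frames

def measured_fatigue_level(frames: List[FrameRecord],
                           name: str,
                           eye_closure_ratio: Callable[[bytes], float],
                           baseline_closure: float) -> float:
    # Feature v: sequence the frames carrying the same name marker in time
    # order, then score average eye closure against the individual's baseline.
    own = sorted((f for f in frames if f.name_marker == name),
                 key=lambda f: f.time_marker)
    if not own:
        return 0.0
    mean_closure = sum(eye_closure_ratio(f.image) for f in own) / len(own)
    return max(0.0, mean_closure - baseline_closure)

In use, match_face would wrap whatever face-recognition component implements feature iv and eye_closure_ratio whatever eyelid-tracking component implements feature v; the claims deliberately leave both unspecified.
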
Amendments to the claims have been filed as follows

Claims

  1. A system for monitoring one or more individuals in a control chamber of a plant or vehicle comprising a means for identifying whether each individual is a permitted individual allowed to access the control chamber characterised in that the means for identifying each permitted individual is by analysis of one or more captured facial images of each permitted individual's face and that the system further comprises a means for measuring a fatigue level for each permitted individual using data derived from the captured facial images and a means for predicting a safe watch keeping period from the measured fatigue level for each permitted individual, wherein the measured fatigue level for each permitted individual is predicted to fall to a predetermined unsafe fatigue level for that permitted individual by the end of the safe watch keeping period.
  2. A system according to claim 1 comprising i) a database comprising facial image, name data and the predetermined unsafe fatigue level for each permitted individual and optionally baseline eye movement data for each permitted individual, ii) one or more cameras adapted to capture images of the faces of each individual in the control chamber, wherein each captured facial image is provided with a time marker, iii) a means for tracking which captured facial images correspond to which individuals present in the control chamber, iv) a means for comparing captured facial images of individuals present in the control chamber with facial images of permitted individuals from the database whereby each captured facial image can be provided with a name marker identifying each identified permitted individual, or an indication that identification is not possible.
    v) a means for sequencing the captured facial images having the same name marker in time order by means of the time markers whereby eye and/or eyelid movement of each identified permitted individual can be measured, and optionally compared with baseline eye data for each identified permitted individual, to provide the measured fatigue level for each identified permitted individual.
    3. A system according to claim 2 further comprising a means for monitoring current fatigue level deterioration data for each identified permitted individual and wherein the means for predicting a safe watch keeping period takes into account the current fatigue level deterioration data for each identified permitted individual.
    4. A system according to claim 3 comprising a means for storing in the database current fatigue level deterioration data for each identified permitted individual to provide a database of historical fatigue level deterioration data for each permitted individual, and a means for calculating average fatigue level deterioration data for each identified permitted individual from stored historical fatigue level deterioration data for each identified permitted individual wherein the means for predicting a safe watch keeping period takes into account the average fatigue level deterioration data for each identified permitted individual.
  5. A system according to claim 4 wherein the means for predicting the safe watch keeping period for each identified permitted individual takes into account both the average fatigue level deterioration data and the current fatigue level deterioration data for each identified permitted individual.
    6. A system according to any one of claims 2 to 5 wherein the system further comprises a means for collecting, and storing in the database, fatigue-related diary data for each permitted individual and wherein the means for predicting a safe watch keeping period takes into account the fatigue-related diary data for each identified permitted individual.
    7. A system according to any one of claims 1 to 6 wherein the means for predicting a safe watch keeping period takes into account time of day.
    8. A system according to any one of claims 2 to 7 wherein the system further comprises a means for updating the facial images of permitted individuals and optionally baseline eye movement data for each permitted individual in the database.
    9. A system according to any preceding claim wherein each permitted individual present in the control chamber is required to have registered their presence in the control chamber before entry or within a predetermined registration time period.
    10. A system according to any preceding claim wherein an intruder alarm state is generated when the system cannot identify an individual present in the chamber from the captured facial images within a predetermined identification time period from entry.
    11. A system according to claim 10 wherein a captured facial image of an unidentified individual is transmitted to a remote location when the intruder alarm state is generated.
    12. A system according to claim 10 or claim 11 comprising a voice recording means in the control chamber and wherein captured voice recordings from the voice recording means are transmitted at regular intervals to a remote location when the intruder alarm state is generated.
    13. A system according to any one of claims 1 to 12 wherein the system further comprises a means for collecting and recording an intoxicant level for each identified permitted individual present in the control chamber.
    14. A system according to claim 13 wherein the means for predicting a safe watch-keeping period takes into account the intoxicant level for each identified permitted individual.
  15. A system according to claim 13 or claim 14 wherein the system generates an intoxication alarm state if the intoxicant level of an identified permitted individual exceeds a predetermined maximum intoxicant level.
    16. A system according to any preceding claim wherein the system generates a fatigue alarm state if the measured fatigue level of an identified permitted individual exceeds a predetermined unsafe fatigue level for that individual.
    17. A system according to any preceding claim wherein the system generates an alarm state when, over a predetermined length of time, no permitted person has been identified as being present in the control chamber.
    18. A system according to any preceding claim wherein the control chamber is the bridge of a ship.
    19. A system according to any preceding claim wherein a current copy of the database is duplicated in a data recorder adapted to survive a disaster event.
    20. A system substantially as hereinbefore described with reference to and as shown in the accompanying figures.
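
Taken together, amended claims 1 and 3 to 7 describe predicting, for each identified permitted individual, when the measured fatigue level will reach that individual's predetermined unsafe fatigue level, using current and stored (average) fatigue level deterioration data and optionally diary data, time of day and intoxicant level. The Python sketch below is one hedged reading of that prediction as a simple linear extrapolation; the blend weight between current and average deterioration rates and the single adjustment multiplier are assumptions introduced for illustration and are not specified in the application.

# Illustrative sketch only; the linear extrapolation, blend weight and
# adjustment multiplier are assumptions, not features of the application.
def predict_safe_watch_keeping_period(current_level: float,
                                      unsafe_level: float,
                                      current_rate: float,
                                      average_rate: float,
                                      blend: float = 0.5,
                                      adjustment: float = 1.0) -> float:
    """Hours until the measured fatigue level is predicted to reach the
    individual's predetermined unsafe fatigue level.

    current_rate and average_rate are fatigue level deterioration per hour,
    taken from the current watch and from stored historical data respectively
    (amended claims 3 to 5); adjustment < 1.0 shortens the period to reflect
    diary data, time of day or intoxicant level (amended claims 6, 7 and 14).
    """
    rate = blend * current_rate + (1.0 - blend) * average_rate
    if rate <= 0.0:
        return float("inf")  # no measurable deterioration towards the unsafe level
    # Gap between the current measured level and the unsafe threshold; whether
    # the level "falls to" (claim 1) or "exceeds" (claim 16) that threshold
    # depends on the scoring convention, so only the magnitude is used here.
    headroom = abs(unsafe_level - current_level)
    return (headroom / rate) * adjustment

def fatigue_alarm(current_level: float, unsafe_level: float) -> bool:
    # Amended claim 16: alarm when the measured fatigue level exceeds the
    # predetermined unsafe fatigue level for that individual.
    return current_level > unsafe_level

A caller would shorten the returned period by passing an adjustment below 1.0 when diary data, time of day or a recorded intoxicant level (amended claims 6, 7 and 14) indicate elevated risk, and would raise the fatigue alarm state of amended claim 16 as soon as fatigue_alarm returns True.
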
GB0722986A 2007-11-23 2007-11-23 Fatigue monitoring using facial images Withdrawn GB2454916A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0722986A GB2454916A (en) 2007-11-23 2007-11-23 Fatigue monitoring using facial images
PCT/GB2008/051093 WO2009066109A1 (en) 2007-11-23 2008-11-20 Fatigue monitoring and intruder alert system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0722986A GB2454916A (en) 2007-11-23 2007-11-23 Fatigue monitoring using facial images

Publications (2)

Publication Number Publication Date
GB0722986D0 GB0722986D0 (en) 2008-01-02
GB2454916A true GB2454916A (en) 2009-05-27

Family

ID=38925938

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0722986A Withdrawn GB2454916A (en) 2007-11-23 2007-11-23 Fatigue monitoring using facial images

Country Status (2)

Country Link
GB (1) GB2454916A (en)
WO (1) WO2009066109A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012076642A1 (en) 2010-12-10 2012-06-14 Nagravision S.A. Method and device to speed up face recognition
US10514553B2 (en) 2015-06-30 2019-12-24 3M Innovative Properties Company Polarizing beam splitting system

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101814136B (en) * 2010-02-11 2012-12-05 华南理工大学 Napping behavior detection method based on fast Fourier transform
CN103413398A (en) * 2013-08-01 2013-11-27 江苏海事职业技术学院 Duty supervising device
CA2937045C (en) * 2014-01-29 2020-07-14 Dignity Health Systems and methods for using eye movements to determine states
EP3034374B1 (en) * 2014-12-19 2017-10-25 Volvo Car Corporation Vehicle safety arrangement, vehicle and a method for increasing vehicle safety
TWI582728B (en) * 2015-11-20 2017-05-11 致伸科技股份有限公司 Fatigue-warning system
US10293830B2 (en) 2016-11-07 2019-05-21 Honeywell International Inc. Systems and methods for recognizing and analyzing emotional states of a vehicle operator
CN106725364B (en) * 2016-12-07 2020-08-07 中国民用航空总局第二研究所 Controller fatigue detection method and system based on probability statistical method
US10055963B1 (en) 2017-02-07 2018-08-21 Honeywell International Inc. On-duty/off-duty work alternation planning based on sensed physiological and activity parameters
WO2019107167A1 (en) * 2017-11-30 2019-06-06 パナソニックIpマネジメント株式会社 Image processing device, image processing system, image pickup device, image pickup system, and image processing method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10218676B4 (en) * 2002-04-26 2006-05-11 Deutsches Zentrum für Luft- und Raumfahrt e.V. On-board computer in a vehicle
US20060012679A1 (en) * 2004-07-14 2006-01-19 Ressler Galen E Multifunction vehicle interior imaging system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5729619A (en) * 1995-08-08 1998-03-17 Northrop Grumman Corporation Operator identity, intoxication and drowsiness monitoring system and method
US6130617A (en) * 1999-06-09 2000-10-10 Hyundai Motor Company Driver's eye detection method of drowsy driving warning system
WO2002008023A2 (en) * 2000-07-21 2002-01-31 Trw Inc. Application of human facial features recognition to automobile safety
WO2004040531A1 * 2002-10-28 2004-05-13 Morris Steffin Method and apparatus for detection of drowsiness and for monitoring biological processes
EP1452127A1 (en) * 2003-02-28 2004-09-01 Agilent Technologies, Inc. Apparatus for detecting pupils
US20060072792A1 (en) * 2004-09-29 2006-04-06 Aisin Seiki Kabushiki Kaisha Driver monitoring system for vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Real-Time Eye, Gaze, and Face Pose Tracking for Monitoring Driver Vigilance", Qiang Ji & Xiaojie Yang, Real-Time Imaging, Volume 8, pages 357-377, 2002 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012076642A1 (en) 2010-12-10 2012-06-14 Nagravision S.A. Method and device to speed up face recognition
US9740913B2 (en) 2010-12-10 2017-08-22 Nagravision S.A. Method and device to speed up face recognition
US10192101B2 (en) 2010-12-10 2019-01-29 Nagravision S.A. Method and device to speed up face recognition
US10909350B2 (en) 2010-12-10 2021-02-02 Nagravision S.A. Method and device to speed up face recognition
US11783561B2 (en) 2010-12-10 2023-10-10 Nagravision S.A. Method and device to speed up face recognition
EP2490151A1 (en) * 2011-02-17 2012-08-22 Nagravision S.A. Method and device to speed up face recognition
US10514553B2 (en) 2015-06-30 2019-12-24 3M Innovative Properties Company Polarizing beam splitting system
US11061233B2 (en) 2015-06-30 2021-07-13 3M Innovative Properties Company Polarizing beam splitter and illuminator including same
US11693243B2 (en) 2015-06-30 2023-07-04 3M Innovative Properties Company Polarizing beam splitting system

Also Published As

Publication number Publication date
GB0722986D0 (en) 2008-01-02
WO2009066109A1 (en) 2009-05-28

Similar Documents

Publication Publication Date Title
GB2454916A (en) Fatigue monitoring using facial images
US6497658B2 (en) Alarm upon detection of impending sleep state
CN109528219A (en) System for monitoring operation person
EP1853155B1 (en) Measuring alertness
Hartley et al. Review of fatigue detection and prediction technologies
US7027621B1 (en) Method and apparatus for operator condition monitoring and assessment
US20210034053A1 (en) Pilot Health and Alertness Monitoring, Communication and Safeguarding
EP1799106B1 (en) Method for generating an indication of a level of vigilance of an individual
CN113994403A (en) Mitigating operational risks of an aircraft
WO2006000166A1 (en) Method and device for detecting operator fatigue or quality
Choudhary et al. A survey paper on drowsiness detection & alarm system for drivers
Bittner et al. Detecting of fatigue states of a car driver
EP3939020A2 (en) Mitigating operational risk in aircraft
Jiang et al. Correlation evaluation of pilots’ situation awareness in bridge simulations via eye-tracking technology
Butlewski et al. Psychomotor performance monitoring system in the context of fatigue and accident prevention
Puspasari et al. Ocular indicators as fatigue detection instruments for Indonesian drivers
KR102528032B1 (en) Method and system for checking fatigue of pilot before flying
Damousis et al. Physiological indicators based sleep onset prediction for the avoidance of driving accidents
Wright et al. Involuntary sleep during civil air operations: wrist activity and the prevention of sleep
Gruyer et al. The use of belief theory to assess driver’s vigilance
US11593734B2 (en) System and method for management and support of workplace
WO2024089479A1 (en) Method and system for early recognition of accident-prone individuals through neural-based safety in the jobs involved extreme safety hazards
US20230293090A1 (en) Neurophysiological assessment, identification, permission control, monitoring, and notification system for covid-19
Culp et al. Driver alertness monitoring techniques: a literature review
Young et al. Fatigue: identification, management and countermeasures

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)