WO2009134784A2 - Real time system for readiness monitoring of dismounted soldiers - Google Patents


Info

Publication number: WO2009134784A2 (application PCT/US2009/041954)
Authority: WIPO (PCT)
Prior art keywords: eye, system, fatigue, configured, person
Other languages: French (fr)
Other versions: WO2009134784A3 (en)
Inventors: Theodore W. Berger, Alireza A. Dibazar, Ali Yousefi
Original assignee: University of Southern California
Priority: U.S. Provisional Application No. 61/048,317, filed April 28, 2008
Application filed by University of Southern California
Publication of WO2009134784A2 (en)
Publication of WO2009134784A3 (en)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00: Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00597: Acquiring or recognising eyes, e.g. iris verification
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00: Alarms responsive to a single specified undesired or abnormal operating condition and not elsewhere provided for
    • G08B 21/02: Alarms for ensuring the safety of persons
    • G08B 21/06: Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms

Abstract

A fatigue monitoring system may monitor the fatigue of a person in real time. An electronic camera may be configured to capture substantially clear, sequential images of an area occupied by an eye of the person while in close proximity to the eye. A camera support system may be attached to the electronic camera. The camera support system may be configured to be worn by the person and to position the camera while being worn by the person in close proximity to and substantially in front of the eye. This position may enable the camera to capture substantially clear, sequential images of the area occupied by the eye. An image processing system may be configured to extract information from the images indicative of the fatigue. A transmitter system may be configured to wirelessly transmit the information indicative of the fatigue to a location remote from the person.

Description

REAL TIME SYSTEM FOR READINESS MONITORING OF DISMOUNTED SOLDIERS

CROSS-REFERENCE TO RELATED APPLICATION(S)

[0001] This application is based upon and claims priority to U.S. Provisional Patent Application number 61/048,317, entitled "REAL-TIME SYSTEM FOR READINESS MONITORING OF DISMOUNTED SOLDIERS," filed April 28, 2008, attorney docket number 028080-0343, the entire content of which is incorporated herein by reference.

BACKGROUND

TECHNICAL FIELD

[0002] This disclosure relates to monitoring fatigue in soldiers and other persons.

DESCRIPTION OF RELATED ART

[0003] A threat may be posed by fatigue in soldiers, as well as in other individuals who may be moving about, such as emergency medical service providers and factory workers. A high level of cognitive readiness, for example, may be required in connection with computerized surveillance and reconnaissance systems, complex communications and medical devices, highly interactive computerized systems, and technologically advanced diagnostic systems used in the maintenance of equipment.

[0004] The efficiency of personnel in sustained operations may also be degraded by inadequate cognitive fitness. This may cause an increase in reaction time, a mood decline, a perceptual disturbance, a motivational decrement, impaired attention, short-term memory loss, carelessness, reduced physical endurance, degraded verbal communication skill, and/or impaired judgment.

SUMMARY

[0005] A fatigue monitoring system may monitor the fatigue of a person in real time. An electronic camera may be configured to capture substantially clear, sequential images of an area occupied by an eye of the person while in close proximity to the eye. A camera support system may be attached to the electronic camera. The camera support system may be configured to be worn by the person and to position the camera while being worn by the person in close proximity to and substantially in front of the eye. This position may enable the camera to capture substantially clear, sequential images of the area occupied by the eye. An image processing system may be configured to extract information from the images indicative of the fatigue. A transmitter system may be configured to wirelessly transmit the information indicative of the fatigue to a location remote from the person.

[0006] The electronic camera may have a volume of less than 100 cubic millimeters.

[0007] The camera support system may include a helmet, pair of glasses, and/or a heads-up display, and the electronic camera may be attached to the helmet, the pair of glasses, and/or the heads-up display.

[0008] The camera support system may be configured to position the electronic camera within no more than two inches from an outer surface of the eye.

[0009] The camera support system may be configured to position the electronic camera offset from the central viewing axis of the eye at an angle of between twenty and sixty degrees.

[0010] The image processing system may include an image recognition system configured to extract from each of the images at least one feature related to the eye. One feature which the image recognition system may be configured to extract from each of the images may include the degree of eyelid closure, the location of the pupil, the size of the pupil, the location of the iris, and/or the size of the iris.

[0011] The image processing system may include a parameter computation system configured to compute at least one parameter indicative of a change in the images relating to the eye. The parameter may be indicative of the percentage of time the eye remains substantially closed, the frequency of eye blinks, the amplitude of eye blinks, the amount of time to close the eye, a change in the size of the pupil, the velocity of eye movement, and/or whether the person is dead or alive.

[0012] The fatigue monitoring system may include an ambient light sensor. The parameter computation system may be configured to substantially factor out changes in the size of the pupil caused by changes in the ambient light as sensed by the ambient light sensor.

[0013] The transmitter system may be configured to wirelessly transmit information indicative of the fatigue of the person only when the information indicates that the person has reached one or more threshold levels of fatigue.

[0014] The fatigue monitoring system may include an illumination system configured to illuminate the eye when images of the eye are being captured by the electronic camera. The illumination system may be configured to emit infrared light.

[0015] A fatigue monitoring network may monitor the fatigue of each person in a group of persons in real time. The system may include a fatigue monitoring system of any of the types described above for each of the persons or of any other type. A receiver system may be configured to receive and process the information transmitted by each of the transmitter systems.

[0016] A process may delegate tasks among several persons based on their level of fatigue. A fatigue monitoring system may be distributed to each of the persons that is configured to transmit information about the fatigue of the person, based on images of the eye of the person, when the images indicate that the fatigue of the person exceeds one or more threshold levels. Each of the persons may be directed to wear the fatigue monitoring system while performing a task. A transmission may be received from one of the fatigue monitoring systems, while it is being worn by one of the persons, indicating that the fatigue of the person has exceeded one or more threshold levels. In response to the received transmission, the person whose fatigue has exceeded one or more threshold levels may be directed to cease the task the person was directed to perform.

[0017] Each of the persons may be soldiers and the directing and receiving may be performed by one or more commanders of the soldiers.

[0018] These, as well as other components, steps, features, objects, benefits, and advantages, will now become clear from a review of the following detailed description of illustrative embodiments, the accompanying drawings, and the claims.

BRIEF DESCRIPTION OF DRAWINGS

[0019] The drawings disclose illustrative embodiments. They do not set forth all embodiments. Other embodiments may be used in addition or instead. Details that may be apparent or unnecessary may be omitted to save space or for more effective illustration. Conversely, some embodiments may be practiced without all of the details that are disclosed. When the same numeral appears in different drawings, it is intended to refer to the same or like components or steps.

[0020] Fig. 1 illustrates a fatigue monitoring system for monitoring the fatigue of a person in real time.

[0021] Fig. 2 illustrates a soldier wearing a fatigue monitoring system.

[0022] Fig. 3(a) illustrates an image of an area occupied by an eye with the eyelids, pupil, and iris demarcated by an image recognition system.

[0023] Fig. 3(b) illustrates an image of the area occupied by the eye illustrated in Fig. 3(a) with the eyelids, pupil, and iris demarcated by an image recognition system when the person is fatigued.

[0024] Fig. 4(a) illustrates changes in the degree of eyelid closure in both a normal and fatigued person.

[0025] Fig. 4(b) illustrates changes in the percentage of time an eye remains in a closed position in both a normal and fatigued person.

[0026] Fig. 4(c) illustrates eye blinks in both a normal and fatigued person.

[0027] Fig. 4(d) illustrates changes in the radius of a pupil in both a normal and fatigued person.

[0028] Fig. 5 illustrates a fatigue monitoring network for monitoring the fatigue of a group of persons in real time.

[0029] Fig. 6 illustrates a process for delegating tasks among several persons based on their level of fatigue.

DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

[0030] Illustrative embodiments are now discussed. Other embodiments may be used in addition or instead. Details that may be apparent or unnecessary may be omitted to save space or for a more effective presentation. Conversely, some embodiments may be practiced without all of the details that are disclosed.

[0031] Fig. 1 illustrates a fatigue monitoring system for monitoring the fatigue of a person in real time.

[0032] As illustrated in Fig. 1, the fatigue monitoring system may include an electronic camera 101. The electronic camera may be configured to capture substantially clear, sequential images of an area occupied by an eye of a person while in close proximity to the eye. The electronic camera may be very small so that it can be close to the eye, without substantially obstructing its view. For example, the electronic camera may occupy a volume of less than 100 cubic millimeters.

[0033] The electronic camera 101 may be of any type. For example, the electronic camera may be a video camera having a CCD. The CCD may have any number of pixels and may be of any size. For example, it may have 320x240 pixels and be 7.5x3.2 millimeters.

[0034] The fatigue monitoring system may include a camera support system 103. The camera support system may be attached to the electronic camera 101. The camera support system 103 may be configured to be worn by the person. The camera support system 103 may be configured to position the electronic camera 101 while being worn by the person in close proximity to and substantially in front of the eye.

[0035] The electronic camera 101 may be positioned with respect to the eye while being supported by the camera support system 103 such that the electronic camera 101 may capture substantially clear, sequential images of the area occupied by the eye, including the eye itself.

[0036] The camera support system 103 may be configured to position the electronic camera 101 offset at an angle from the central viewing axis of the eye, such as at an angle that is between 20 and 60 degrees. This may help ensure that the vision of the eye is not materially blocked by the electronic camera 101, while still allowing the electronic camera 101 to capture substantially clear, sequential images of the area occupied by the eye.

[0037] The camera support system 103 may be of any type. For example, the camera support system 103 may be or may include a helmet, a pair of glasses, and/or a heads-up display. In the case of a heads-up display, the camera support system 103 may include an LCD display to which the electronic camera 101 is mounted, such as at a corner of the display.

[0038] The electronic camera 101 and the camera support system 103 may be configured such that the entire field of view of the electronic camera 101 is approximately occupied by a single eye of a person, including the eyelids which surround the eye. For example, the camera support system 103 may be configured to position the electronic camera 101 within no more than two inches from an outer surface of the eye. They may instead be configured to provide a larger or smaller field of view.

[0039] The fatigue monitoring system may include an illumination system 115. The illumination system may be configured to illuminate the eye when images of the eye are being captured by the electronic camera 101. The illumination system 115 may be used at night or when it is dark for any other reason.

[0040] The illumination system 115 may be of any type. For example, the illumination system 115 may include one or more electronic lights, such as one or more LEDs. When positioned closely to the eye, only a single LED may be needed. The illumination system 115 may be configured to emit infrared light, thus ensuring that the person wearing the fatigue monitoring system is not visibly lit while in the dark.

[0041] The illumination system 115 may be configured to be worn by the person. While being worn by the person, the illumination system 115 may be configured to be positioned in close proximity to and substantially in front of the eye. For example, the illumination system 115 may be positioned within no more than two inches from the outer surface of the eye. The illumination system 115 may be configured to be offset at an angle from the central viewing axis of the eye, such as at an angle between 20 and 60 degrees. This may help ensure that the vision of the eye is not materially blocked by the illumination system 115, while still allowing the illumination system 115 to illuminate the eye when images of the eye are being captured by the electronic camera 101. The camera support system 103 may be configured to support the illumination system 115 in addition to the electronic camera 101. The illumination system 115 may instead be supported by a different support system.

[0042] The fatigue monitoring system may include an image processing system 105. The image processing system may be configured to extract, from the images provided by the electronic camera 101, information indicative of the fatigue of the person.

[0043] The image processing system 105 may include an image recognition system 107. The image recognition system 107 may be configured to extract one or more features related to the eye from each of the images from the electronic camera 101, including their relative size and/or position. For example, the image recognition system 107 may be configured to extract from each of the images the degree of eyelid closure, the location of the pupil, the size of the pupil, the location of the iris, and/or the size of the iris.

[0044] Fig. 3(a) illustrates an image of an area occupied by an eye with the eyelids, pupil, and iris demarcated by an image recognition system. This image is an example of one of the sequential images of the area of an eye that may be captured by the electronic camera 101.

[0045] To extract the degree of eyelid closure, an upper eyelid curve 301 may be fitted to the lower edge of the upper eyelid and a lower eyelid curve 303 may be fitted to the upper edge of the lower eyelid, as illustrated in Fig. 3(a). The widest separation distance between the upper eyelid curve 301 and the lower eyelid curve 303 may be determined and equated to the degree of eyelid closure.
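
As an illustrative sketch only (not part of the disclosure), the widest-separation step can be expressed in Python. The curve samples and the per-wearer `open_separation` calibration constant are assumptions introduced here for illustration:

```python
def eyelid_closure_degree(upper_y, lower_y, open_separation):
    """Degree of eyelid closure in [0, 1].

    upper_y / lower_y: pixel y-coordinates (y grows downward) of the
    fitted upper- and lower-eyelid curves, sampled at the same x
    positions.  open_separation: the widest separation observed for
    this wearer's fully open eye; an assumed calibration constant.
    """
    # Widest separation between the two fitted eyelid curves.
    widest = max(lo - up for up, lo in zip(upper_y, lower_y))
    openness = max(0.0, min(1.0, widest / open_separation))
    return 1.0 - openness  # 1.0 means fully closed

# Example: the lids have drooped to half of the calibrated opening.
upper = [40, 35, 33, 35, 40]
lower = [48, 52, 53, 52, 48]
print(eyelid_closure_degree(upper, lower, open_separation=40.0))  # 0.5
```

A real implementation would obtain the two curves from a curve-fitting stage over detected eyelid edge points; this sketch starts from already-sampled curves.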

[0046] To extract the size and location of the pupil, a pupil perimeter circle 305 may be fitted to the pupil, as also illustrated in Fig. 3(a). The diameter of the pupil perimeter circle 305 may be equated with the size of the pupil. The location of the center of the pupil perimeter circle 305 may be equated with the location of the pupil.
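
For illustration, a simplified stand-in for fitting a pupil perimeter circle can be sketched in Python. This is not the circular-edge-detection method the text cites; it assumes a binary pupil mask has already been segmented, takes the centroid as the location, and converts the pixel area to an equivalent-circle diameter:

```python
import math

def pupil_from_mask(mask):
    """Estimate pupil location and size from a binary pupil mask.

    mask: 2-D list of 0/1 where 1 marks pixels classified as pupil.
    Returns ((cx, cy), diameter).  The centroid gives the location,
    and the pixel count gives an equivalent-circle diameter
    d = 2 * sqrt(area / pi).
    """
    xs = [x for row in mask for x, v in enumerate(row) if v]
    ys = [y for y, row in enumerate(mask) for v in row if v]
    area = len(xs)
    center = (sum(xs) / area, sum(ys) / area)
    return center, 2.0 * math.sqrt(area / math.pi)

mask = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
center, diameter = pupil_from_mask(mask)
print(center)  # (1.5, 1.5)
```

A production system would instead fit the circle to detected edge points, which is more robust when the pupil is partially covered by an eyelid.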

[0047] To extract the location of the iris, an iris perimeter circle 307 may be fitted to the iris, as also illustrated in Fig. 3(a). The center of the iris perimeter circle 307 may be equated with the location of the iris.

[0048] Any type of pattern image recognition technique may be used to extract the desired features from the area occupied by the eye. For example, one or more image processing algorithms may be used to extract the features, such as one or more of the algorithms described in J. G. Daugman, "High confidence visual recognition of persons by a test of statistical independence," IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 15, No. 11, pp. 1148-1161 (Nov. 1993), the entire content of which is incorporated herein by reference.

[0049] For finding the pupil and iris boundary, for example, reference may also be made to J. Daugman and C. Downing, "Effect of Severe Image Compression on Iris Recognition Performance," IEEE Trans. Information Forensics and Security, Vol. 3, No. 1, pp. 570-578 (March 2008), and J. G. Daugman, "High confidence visual recognition of persons by a test of statistical independence," IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 15, No. 11, pp. 1148-1161 (Nov. 1993), the content of both of which is incorporated herein by reference. The method may be based on circular edge detection and may have consistent results when the pupil or iris is partially covered by an eyelid.

[0050] Curves, shapes, and other tracked indicia, other than those which have been described, and pattern recognition techniques other than curve fitting, may be used in addition or instead.

[0051] The image processing system 105 may include a parameter computation system 109. The parameter computation system 109 may be configured to compute one or more parameters, each indicative of one or more changes in the images from the electronic camera 101 relating to the eye. The parameter computation system 109 may be configured to compute these parameters based on one or more changes in one or more of the features which are extracted from the images by the image recognition system 107, including one or more changes in these features over time.

[0052] For example, the parameter computation system 109 may be configured to compute a parameter indicative of the percentage of time the eye remains substantially closed. This computation may be based on an examination of changes in the degree of eyelid closure in each image or in a selected set of the images, as computed by the image recognition system 107.

[0053] The degree of eyelid closure used to qualify as being substantially closed may vary. In some systems, for example, a closure threshold may be defined, such as at 80%, beyond which the eye may be deemed to be closed. The percentage may instead be between 60%-70%, 70%-80%, 80%-90%, or 90%-100%, or any other value.

[0054] The percentage of time the eye remains substantially closed may be computed over a time interval. Examples of such computations are set forth in F. Russo, S. Pitzalis, and D. Spinelli, "Fixation stability and saccadic latency in elite shooters," Vision Res., vol. 43, pp. 1837-1845 (2003), the content of which is incorporated herein by reference.

[0055] The computation may be based upon the following equation:

PC(n) = (1/N) Σ P(n) P(n - n_nb)

where PC is the percentage of eye closure (also known as PERCLOS), P(n) is the degree of eyelid closure at time index n, n_nb is the nominal eye-blink duration time, and N is the measurement time interval. In one embodiment, for example, N may be set to 1800 frames in a one minute time interval.

[0056] The widest distance between the upper eyelid curve 301 and the lower eyelid curve 303 may be 1 - P(t). The average of PC over 10 minutes may be calculated. If the average of PC is more than 80%, for example, then an alert may be sent to a command center, as discussed in more detail below. A different threshold or time period may instead be used, as well as a different computational algorithm or approach.
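
As a minimal Python sketch (a frame-counting reading of PERCLOS, not the exact formula above), the alert rule can be illustrated as follows; the 80% thresholds come from the text, while the sample data is invented:

```python
def perclos(P, closed_threshold=0.8):
    """Fraction of frames in which the eye is deemed closed.

    P: per-frame eyelid-closure degrees in [0, 1].  A frame counts as
    closed when its closure degree exceeds closed_threshold (the 80%
    closure level mentioned in the text).
    """
    return sum(1 for p in P if p > closed_threshold) / len(P)

# Alert rule from the text: flag when average closure tops 80%.
frames = [0.95] * 85 + [0.10] * 15  # a mostly-closed stretch of video
pc = perclos(frames)
print(pc, pc > 0.80)  # 0.85 True
```

At the 1800-frames-per-minute rate mentioned above, `P` would simply hold one closure value per captured frame over the averaging window.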

[0057] Another example of a parameter that may be computed by the parameter computation system 109 is a parameter which is indicative of eye blinks. The determination of each eye blink may be based on an examination of changes in the degree of eyelid closure in each image or in a selected set of the images, as computed by the image recognition system 107. One of the closure thresholds discussed above may be used for this purpose.

[0058] Eye blinks may be measured in terms of blink frequency, timing with respect to stimulus presentation, and/or duration of the closing and reopening movement of the eyelid.

[0059] The frequency of eyelid closure may increase under fatigue conditions. Normal adult eyes may average 10 blinks per minute. On average, a blink may take about 300 to 400 milliseconds. Blinking rate may decrease during cognitive load and object focusing to 3-4 times per minute. Emotional arousal may increase blinks. A blinking rate (BR) may be measured using the following equations:

g(n) = P^2(n) * f(n)

gd(n) = 1 if g(n) ≥ A, and 0 otherwise

BR(n) = Σ_N gd(n)

[0060] To calculate BR, P(n) may be estimated as explained above. A may be a threshold level and f(n) may be a low-pass filter. An alert may be sent to a command center if BR falls below 4 or rises above 20, for example, as discussed in more detail below. Different thresholds may be used instead, as well as a different computational algorithm.
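
For illustration, a discrete stand-in for the threshold-and-sum blink counting above may be sketched in Python. It registers a blink on each open-to-closed transition of the thresholded closure signal; the sample signal and the 0.8 threshold choice are assumptions:

```python
def blink_rate(P, close_threshold=0.8):
    """Count eye blinks in a window of per-frame closure degrees.

    A blink is registered on each open-to-closed transition, i.e. a
    rising edge of the thresholded closure signal; a simplified
    version of the g(n)/gd(n) filtering-and-thresholding pipeline.
    """
    closed = [p > close_threshold for p in P]
    # Count rising edges: frame was open, next frame is closed.
    return sum(1 for a, b in zip(closed, closed[1:]) if b and not a)

signal = [0.1, 0.1, 0.9, 0.9, 0.1, 0.1, 0.95, 0.1]
print(blink_rate(signal))  # 2
```

Scaled to blinks per minute, the text's alert rule would then fire when the rate drops below 4 or rises above 20.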

[0061] Another parameter which the parameter computation system 109 may compute may be indicative of the amount of time that is taken to close the eye. Again, this may be based on an examination of changes in the degree of eyelid closure in each image or in a selected set of the images, as computed by the image recognition system 107.

[0062] Average eye closure time ("AECT") may be a reliable measure of fatigue. The eye closure speed of a drowsy person may be distinctively slower than that of an alert person. Using the following relation, AECT may be computed:

AECT = (Σ n_o→c + Σ n_c→o) / (number_N(o→c) + number_N(c→o))

where n_o→c is the transient time during which the eyelid goes from open to closed and n_c→o is the transient time during which the eyelid goes from closed to open. number_N(o→c) is the frequency of transitioning from open to closed in the time interval N, and number_N(c→o) is the frequency of transitioning from closed to open in the time interval N.

[0063] To calculate AECT in this equation, the time between an open and a closed eye may be measured. An eye may be considered closed if the value of P(n) is less than 20% of its maximum value, for example. Other thresholds, such as those discussed above, may be used instead. The eye may be considered open, for example, if its P(n) value is more than 90% of its maximum value. Again, other threshold values may be used instead. Other algorithms or equations may be used in addition or instead to compute this parameter.
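
The ratio above can be sketched in Python as follows. This is an illustrative reading, assuming the input is a per-frame eye-opening amplitude, with the 20%/90% hysteresis levels taken from the text and the state handling an assumption:

```python
def aect(opening, lo=0.2, hi=0.9):
    """Average eye-closure transition time, in frames.

    opening: per-frame eye-opening amplitudes.  Following the text,
    the eye counts as closed below 20% of the signal's maximum and
    open above 90%; frames in between are transition frames.  AECT is
    total transition frames divided by the number of completed
    open/closed transitions.
    """
    m = max(opening)
    state, transitions, transit_frames = None, 0, 0
    for v in opening:
        if v >= hi * m:
            if state == 'closed':
                transitions += 1   # completed a closed -> open move
            state = 'open'
        elif v <= lo * m:
            if state == 'open':
                transitions += 1   # completed an open -> closed move
            state = 'closed'
        else:
            transit_frames += 1    # eyelid is mid-transition
    return transit_frames / transitions if transitions else 0.0

track = [1.0, 0.6, 0.4, 0.1, 0.1, 0.5, 1.0]
print(aect(track))  # 1.5 frames per transition
```

Slower eyelids spend more frames in the mid-transition band, so a drowsy wearer yields a larger AECT, matching the observation above.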

[0064] Another parameter which the parameter computation system 109 may compute may be indicative of the amplitude of eye blinks. Fatigue may cause smaller amplitude eye blinks. Again, this may be based on an examination of changes in the degree of eyelid closure in each image or in a selected set of the images, as computed by the image recognition system 107.

[0065] The parameter computation system 109 may be configured to compute a parameter indicative of the velocity of eye movement. This may be based upon an examination of changes in the location of the pupil and/or the iris in each image or in a selected set of images, as computed by the image recognition system 107. The location may be with respect to another point of reference in each image, such as with respect to a corner of the eye.

[0066] One parameter of eye movement velocity which the parameter computation system 109 may be configured to compute is the Saccadic Index. The Saccadic Index may be a vigilance indicator. Saccadic movement may be defined as the speed at which the eye moves when transferring the eye gaze from one fixation point to another. This movement may be initiated voluntarily. However, once started, the speed may not be under the individual's conscious control. Saccadic velocity may be sensitive to an increase in sleepiness in response to prolonged periods of partial sleep deprivation. Eye movement velocity may thus be a measure of alertness.

[0067] A Saccadic Index may be calculated by the parameter computation system 109 by using the following relations:

V_s(n) = [X_iris(n) - X_iris(n-1), Y_iris(n) - Y_iris(n-1)]^T

v_s = (1/N) Σ_n ||V_s(n)||

[0068] where (X_iris(n), Y_iris(n)) represent the iris center location in Cartesian coordinates in frame n. In general, these equations may measure the average velocity (v_s) of the iris movement.
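
The averaged-velocity reading of the relations above can be sketched directly in Python; the iris-center track below is invented sample data:

```python
import math

def saccadic_index(centers):
    """Average iris-movement speed over a sequence of frames.

    centers: (x, y) iris-center locations per frame.  V_s(n) is the
    frame-to-frame displacement vector; the index is the mean of its
    Euclidean norm, in pixels per frame.
    """
    speeds = [math.hypot(x1 - x0, y1 - y0)
              for (x0, y0), (x1, y1) in zip(centers, centers[1:])]
    return sum(speeds) / len(speeds)

track = [(0, 0), (3, 4), (3, 4), (6, 8)]
print(saccadic_index(track))  # (5 + 0 + 5) / 3
```

A lower index over a window of frames would indicate slower gaze shifts, consistent with the sleepiness sensitivity described above.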

[0069] Another parameter which the parameter computation system 109 may be configured to compute is a parameter indicative of changes in the size of the pupil. This may be based upon an examination of changes in the size of the pupil in each image or in a selected set of images, as computed by the image recognition system 107.

[0070] Pupil size changes may correlate with emotional arousal, cognitive load, and environmental light. A pupil size increase may have a correlation with skin conductance reactions, supporting the hypothesis that pupil diameter variations reflect nervous system activities. A decrease in pupil diameter or a slow fluctuation in pupil diameter may coincide with a feeling of fatigue. Similarly, a decrease in saccadic velocity and an increase in pupil constriction latency may correlate with an increase in the rate of crashes in simulated driving.

[0071] After extracting the pupil diameter, the rate of variation may be modeled by the following equations:

R_t(n) = (d(n) - d(n-1)) / d(n)

R_pup(n) = R_t(n) * h(n)

where R_pup(n) is the rate of pupil diameter variation at time n, h(n) is a low-pass filter, and d(n) is the pupil diameter.
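
As a sketch of these relations in Python: the normalized difference is computed per frame and then smoothed by a causal FIR convolution. The filter taps are an assumption, since h(n) is left unspecified in the text:

```python
def pupil_variation_rate(d, h):
    """Low-pass-filtered rate of pupil-diameter variation.

    d: pupil diameters per frame; h: FIR low-pass filter taps.
    Computes R_t(n) = (d(n) - d(n-1)) / d(n), then the causal
    convolution R_pup(n) = sum_k h[k] * R_t(n - k).
    """
    rt = [0.0] + [(d[n] - d[n - 1]) / d[n] for n in range(1, len(d))]
    return [sum(h[k] * rt[n - k] for k in range(len(h)) if n - k >= 0)
            for n in range(len(rt))]

# With a unit filter the output is just the raw rate signal.
print(pupil_variation_rate([4.0, 4.0, 2.0], [1.0]))  # [0.0, 0.0, -1.0]
```

Replacing `[1.0]` with, say, a short moving-average tap vector would suppress frame-to-frame measurement noise while keeping the slow fluctuations of interest.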

[0072] The fatigue monitoring system may include an ambient light sensor 111. The ambient light sensor 111 may include one or more sensors configured to detect the intensity of the ambient light to which the eye of the person is exposed. One or more of these detectors may be worn in close proximity to the electronic camera 101, to the eye of the person, and/or at some other location.

[0073] The parameter computation system 109 may be configured to substantially factor out changes in the size of the pupil that may be caused by changes in the ambient light as sensed by the ambient light sensor 111. In other words, the parameter computation system 109 may be configured to ensure that changes in the size of the pupil that are caused by changes in ambient light are not construed as changes in the level of fatigue of the person. This may be done based on empirical data or on any other basis.
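
One possible correction, sketched in Python: since the text says only that the compensation may be based on empirical data, the linear-in-log light model and the calibrated sensitivity constant below are entirely assumptions for illustration:

```python
import math

def light_corrected_diameter(measured, lux, sensitivity):
    """Factor ambient light back out of a pupil-diameter reading.

    Brighter light constricts the pupil, so the estimated light-driven
    component is added back before fatigue analysis.  measured: pupil
    diameter from the image recognition stage; lux: ambient-light
    reading; sensitivity: assumed empirically calibrated diameter
    change per unit of log-light.
    """
    return measured + sensitivity * math.log1p(lux)

# With no ambient light the reading is returned unchanged.
print(light_corrected_diameter(3.0, 0.0, 0.5))  # 3.0
```

After this correction, residual diameter changes can be attributed to arousal or fatigue rather than to lighting.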

[0074] The parameter computation system 109 may be configured to compute a parameter indicative of whether the person is dead or alive. Such a parameter, for example, may be computed based on a lack of change in the position of the iris or pupil or a lack of change in the degree to which the eyelids are closed in one or more of the images provided by the electronic camera 101 over a period of time.

[0075] The fatigue monitoring system may include a transmitter system 113. The transmitter system may be configured to wirelessly transmit the information indicative of the fatigue of the person to a location that is remote from the person.

[0076] The transmitter system 113 may be part of another system. For example, the transmitter system 113 may be part of a system that is used to transmit other types of information.

[0077] The transmitter system 113 may be of any type. For example, it may transmit via radio wave, microwave, or light wave. Although described as wireless, the transmitter system 113 may instead be configured to transmit over a wired connection.

[0078] The fatigue monitoring system may have a different combination of components, such as components in addition or instead of the components which have been described above. For example, the fatigue monitoring system may include one or more sources of power, such as one or more batteries and/or solar cells.

[0079] One or more of the various components of the fatigue monitoring systems which have now been described may be packaged in an ultra-compact package, such as a UC1394a-1 processing core. The core may include a series of ready-to-use embedded peripheral interface devices for integrating hardware, such as the electronic camera 101, the image processing system 105, and/or the transmitter system 113, with each other and/or with other embedded devices. The integrated device may be configured to interface with a FireWire networking environment. The entire core may be implemented as a 116-pin PLCC surface mount, multi-chipped module, which may have a size of only 30 mm x 36 mm. All of this hardware may be installed on a single article which may be worn by the person, such as a helmet, glasses, or a heads-up display. In addition or instead, some of the components may be packaged in a small package which may be carried by the person, while other components, such as the electronic camera 101, the illumination system 115, and the camera support system 103, may be separately packaged and worn by the person.

[0080] The image processing system 105 may be implemented by any type of computational platform, such as by TI's TMS320C55x DSP hardware generation plus the Code Composer Studio (CCStudio) IDE. This platform may include one or more software programs, algorithms, and/or data structures stored on one or more computer-readable storage devices, such as one or more EPROMs, USB hard drives, or flash ROM. Other digital video processing platforms may use the TMS320DM6437 from the TI DSP series, the ARM11 MPCore from the ARM processor series, or embedded single-board computers.

[0081] The transmitter system 113 may be configured to wirelessly transmit information indicative of the fatigue of the person only when the information indicates that the person has reached a threshold level of fatigue. Examples of such thresholds are discussed above. The transmitter system 113 may be configured to wirelessly transmit information indicative of the fatigue each time it reaches a different threshold level and, in such a situation, to also transmit information indicative of the threshold level of the fatigue which has been reached.
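
The send-on-threshold policy above can be sketched in Python. The `send` callable stands in for the wireless link, and the two-level PERCLOS scale is an assumed example rather than the patent's values:

```python
class ThresholdTransmitter:
    """Send fatigue reports only when a new threshold level is reached.

    send: a callable standing in for the radio link.  thresholds:
    assumed PERCLOS levels delimiting discrete fatigue levels.
    """

    def __init__(self, send, thresholds=(0.5, 0.8)):
        self.send = send
        self.thresholds = thresholds
        self.last_level = 0

    def update(self, perclos_avg):
        # Map the averaged PERCLOS value to a discrete fatigue level.
        level = sum(1 for t in self.thresholds if perclos_avg >= t)
        if level != self.last_level:  # transmit only on level changes
            self.last_level = level
            self.send({'fatigue_level': level, 'perclos': perclos_avg})

sent = []
tx = ThresholdTransmitter(sent.append)
for pc in [0.2, 0.3, 0.55, 0.6, 0.85]:
    tx.update(pc)
print(len(sent))  # 2 transmissions: level 1, then level 2
```

Gating transmissions this way conserves radio bandwidth and battery while still reporting each newly reached fatigue level, as the text describes.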

[0082] In another embodiment, the transmitter system 113 may be configured to periodically or constantly transmit information indicative of the fatigue level of the person, regardless of the degree of detected fatigue. The transmitter system 113 may in addition or instead be configured to transmit information indicative of the fatigue level of the person in response to triggers other than the level of fatigue, such as a request to check eye blinking, closure of the eye for a period, or cleaning of the camera.

[0083] Fig. 3(b) illustrates an image of the area occupied by the eye illustrated in Fig. 3(a) with the eyelids, pupil, and iris demarcated by an image recognition system when the person is fatigued. As illustrated in Fig. 3(b), the maximum separation distance between the upper eyelid curve 301 and the lower eyelid curve 303 may be substantially less than that shown in Fig. 3(a). If this semi-closed state of the eyelids continues during a sequence of images, this may be indicative of fatigue.
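The semi-closed state described above might be quantified, under illustrative assumptions, by comparing the maximum separation between the demarcated eyelid curves against a calibrated fully open separation; the function name and coordinate conventions below are assumptions, not part of the disclosure:

```python
def eyelid_closure_degree(upper_y, lower_y, open_eye_separation):
    """Estimate the degree of eyelid closure from demarcated eyelid curves.

    upper_y, lower_y: y-coordinates of the upper and lower eyelid curves,
    sampled at the same horizontal positions (image rows grow downward, so
    the lower eyelid has the larger y values).
    open_eye_separation: maximum separation measured for the fully open
    eye, e.g. from a calibration image such as Fig. 3(a).
    Returns closure in [0, 1]: 0.0 = fully open, 1.0 = fully closed.
    """
    separation = max(lo - up for up, lo in zip(upper_y, lower_y))
    separation = max(separation, 0)  # guard against crossed curves
    return 1.0 - min(separation / open_eye_separation, 1.0)
```

A closure value that stays near 1.0 across a sequence of images would then correspond to the sustained semi-closed state that may be indicative of fatigue.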

[0084] Fig. 2 illustrates a soldier wearing a fatigue monitoring system, such as one of the fatigue monitoring systems described above. As illustrated in Fig. 2, a soldier 201 may wear a helmet 203 which may have attached thereto a heads-up display 205. An electronic camera 207 of one of the types discussed above may be mounted on a corner of the heads-up display 205 such that it faces and captures the area around the right eye of the soldier 201. Other components of the fatigue monitoring system may be included within the helmet 203 or may be worn by the soldier 201 at other locations. In the event that the electronic camera 207 is separated from one or more of the other components of the fatigue monitoring system, the electronic camera 207 may be configured to communicate with these other components over a wireless communication link.

[0085] Fig. 4(a) illustrates changes in the degree of eyelid closure in both a normal and a fatigued person. Fig. 4(b) illustrates changes in the percentage of time an eye remains in a closed position in both a normal and a fatigued person. Fig. 4(c) illustrates eye blinks in both a normal and a fatigued person. Fig. 4(d) illustrates changes in the radius of a pupil in both a normal and a fatigued person.
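The percentage-of-time-closed measure illustrated in Fig. 4(b), often called PERCLOS in the fatigue-monitoring literature, can be sketched from a per-frame closure series; the 0.8 closed threshold is an assumed value for illustration:

```python
def perclos(closure_series, closed_threshold=0.8):
    """Percentage of frames in which the eye is substantially closed.

    closure_series: per-frame eyelid closure degrees in [0, 1].
    closed_threshold: closure at or above which the eye counts as
    substantially closed (assumed value).
    """
    if not closure_series:
        return 0.0
    closed = sum(1 for c in closure_series if c >= closed_threshold)
    return 100.0 * closed / len(closure_series)
```

In a fatigued person this percentage would be expected to rise over time relative to the normal trace of Fig. 4(b).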

[0086] Fig. 5 illustrates a fatigue monitoring network for monitoring the fatigue of a group of persons in real time. As illustrated in Fig. 5, one or more soldiers, such as soldiers 501, 503, 505, and 507, may each be wearing a fatigue monitoring system, such as fatigue monitoring systems 511, 513, 515, and 517, respectively. Each of the fatigue monitoring systems 511, 513, 515, and 517 may be of one of the types described above or may be of any other type.

[0087] Each of the soldiers 501, 503, 505, and 507 may be performing tasks under active conditions. They may be dismounted soldiers, or may be performing a task within a vehicle or aircraft, such as within a C2 platform. The fatigue monitoring systems may in addition or instead be worn by persons performing other types of tasks, such as emergency medical service providers and/or factory workers.

[0088] A fatigue receiver system 521 may be configured to receive and process the fatigue information transmitted by each of the fatigue monitoring systems 511, 513, 515, and 517.

[0089] The fatigue receiver system 521 may be of any type. For example, the fatigue receiver system 521 may include one or more displays, such as a display 523. The display 523 may be configured to display which of these soldiers have fatigue monitoring systems that have signaled that the fatigue of the soldier has reached one or more threshold levels. In the event that the fatigue monitoring systems constantly or periodically transmit fatigue information, the display 523 may display the latest level of fatigue of each soldier and, in certain embodiments, a history of these levels. Other means of communicating this information may be used in addition or instead.
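One way the receiver side might track the latest level, and a history, per monitored soldier as just described is sketched below; the class and method names are illustrative assumptions:

```python
class FatigueReceiver:
    """Tracks, per monitored person, every reported fatigue level and
    exposes the latest level, as a display such as display 523 might."""

    def __init__(self):
        self.history = {}  # person id -> list of reported levels

    def receive(self, person_id, fatigue_level):
        """Record one transmission from a fatigue monitoring system."""
        self.history.setdefault(person_id, []).append(fatigue_level)

    def latest(self, person_id):
        """Latest reported level, or None if nothing received yet."""
        levels = self.history.get(person_id)
        return levels[-1] if levels else None

    def fatigued(self, threshold):
        """IDs whose latest level meets or exceeds the given threshold."""
        return sorted(p for p in self.history if self.latest(p) >= threshold)
```

An alarm such as alarm 525 could then poll `fatigued()` with a level considered dangerous and alert on any non-empty result.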

[0090] The fatigue receiver system 521 may include one or more alarms, such as an alarm 525. The alarm 525 may be configured to issue an alert when one or more soldiers reaches a level of fatigue which is considered dangerous or otherwise undesirable. The alert may be in the form of an audible sound and/or a transmission to another system, such as to a commander at a different location, and/or in any other form.

[0091] Fig. 6 illustrates a process for delegating tasks among several persons based on their level of fatigue. Fatigue monitoring equipment may be distributed to each of a group of persons that is preparing to perform a task, as reflected by a Distribute Fatigue Monitoring Equipment To Person Performing Task step 601. The persons may be soldiers, factory workers, emergency medical workers, and/or persons performing other types of tasks.

[0092] The fatigue monitoring equipment may be any of the types described above or may be any other type. Each of the persons may be directed to wear the fatigue monitoring system while performing a task, as reflected by a Direct Fatigue Monitoring Equipment To Be Worn While Performing Task step 603. This direction may come from a commander, a supervisor, a communication system, or from any other source.

[0093] Each person may wear the fatigue monitoring equipment while performing his or her assigned task, as reflected by Each Person Wears Fatigue Monitoring Equipment While Performing Task step 605.

[0094] During performance of a task, one or more of the persons may experience fatigue, which may be communicated by their fatigue monitoring equipment. Each of these communications may be received, as reflected by a Receive Transmission From Fatigued Person(s) step 607.

[0095] Each of the persons who are fatigued may be directed to cease performing the task which he or she has been performing, as reflected by a Direct Each Fatigued Person To Cease Performing Task step. Each fatigued person may then stop this task.
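The receive-and-direct loop of steps 607 and [0095] might be sketched as follows; the function name and the threshold value are illustrative assumptions, not part of the disclosure:

```python
def persons_to_stand_down(transmissions, threshold=0.8):
    """Given received (person, fatigue_level) transmissions, return the
    persons who should be directed to cease performing their task.

    transmissions: iterable of (person_id, fatigue_level) pairs, in the
    order received. Each fatigued person is listed once, in the order in
    which he or she first met or exceeded the threshold.
    """
    stand_down = []
    for person, level in transmissions:
        if level >= threshold and person not in stand_down:
            stand_down.append(person)
    return stand_down
```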

[0096] The components, steps, features, objects, benefits and advantages that have been discussed are merely illustrative. None of them, nor the discussions relating to them, are intended to limit the scope of protection in any way. Numerous other embodiments are also contemplated. These include embodiments that have fewer, additional, and/or different components, steps, features, objects, benefits and advantages. These also include embodiments in which the components and/or steps are arranged and/or ordered differently.

[0097] Unless otherwise stated, all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. They are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain.

[0098] All articles, patents, patent applications, and other publications which have been cited in this application are hereby incorporated herein by reference.

[0099] The phrase "means for" when used in a claim is intended to and should be interpreted to embrace the corresponding structures and materials that have been described and their equivalents. Similarly, the phrase "step for" when used in a claim embraces the corresponding acts that have been described and their equivalents. The absence of these phrases means that the claim is not intended to and should not be interpreted to be limited to any of the corresponding structures, materials, or acts or to their equivalents.

[00100] Nothing that has been stated or illustrated is intended or should be interpreted to cause a dedication of any component, step, feature, object, benefit, advantage, or equivalent to the public, regardless of whether it is recited in the claims.

[00101] The scope of protection is limited solely by the claims that now follow. That scope is intended and should be interpreted to be as broad as is consistent with the ordinary meaning of the language that is used in the claims when interpreted in light of this specification and the prosecution history that follows and to encompass all structural and functional equivalents.

Claims

1. A fatigue monitoring system for monitoring fatigue of a person in real time comprising: an electronic camera configured to capture substantially clear, sequential images of an area occupied by an eye of the person while in close proximity to the eye; a camera support system attached to the electronic camera and configured to be worn by the person and to position the camera while being worn by the person in close proximity to and substantially in front of the eye at a position with respect to the eye that enables the camera to capture the substantially clear, sequential images of the area occupied by the eye; an image processing system configured to extract information from the images indicative of the fatigue of the person; and a transmitter system configured to wirelessly transmit the information indicative of the fatigue of the person to a location remote from the person.
2. The fatigue monitoring system of claim 1 wherein the electronic camera has a volume of less than 100 cubic millimeters.
3. The fatigue monitoring system of claim 1 wherein the camera support system includes a helmet and wherein the electronic camera is attached to the helmet.
4. The fatigue monitoring system of claim 1 wherein the camera support system includes a pair of glasses and wherein the electronic camera is attached to the pair of glasses.
5. The fatigue monitoring system of claim 1 wherein the camera support system includes a heads-up display and wherein the electronic camera is attached to the heads-up display.
6. The fatigue monitoring system of claim 1 wherein the camera support system is configured to position the electronic camera within no more than two inches from an outer surface of the eye.
7. The fatigue monitoring system of claim 1 wherein the camera support system is configured to position the electronic camera offset at an angle from the central viewing axis of the eye of between twenty and sixty degrees.
8. The fatigue monitoring system of claim 1 wherein the image processing system includes an image recognition system configured to extract from each of the images at least one feature related to the eye.
9. The fatigue monitoring system of claim 8 wherein the eye is surrounded by two eyelids and wherein the at least one feature which the image recognition system is configured to extract from each of the images includes the degree of eyelid closure.
10. The fatigue monitoring system of claim 8 wherein the eye has a pupil and wherein the at least one feature which the image recognition system is configured to extract from each of the images includes the location of the pupil.
11. The fatigue monitoring system of claim 8 wherein the eye has a pupil and wherein the at least one feature which the image recognition system is configured to extract from each of the images includes the size of the pupil.
12. The fatigue monitoring system of claim 8 wherein the eye has an iris and wherein the at least one feature which the image recognition system is configured to extract from each of the images includes the location of the iris.
13. The fatigue monitoring system of claim 8 wherein the eye has an iris and wherein the at least one feature which the image recognition system is configured to extract from each of the images includes the size of the iris.
14. The fatigue monitoring system of claim 1 wherein the image processing system includes a parameter computation system configured to compute at least one parameter indicative of a change in the images relating to the eye.
15. The fatigue monitoring system of claim 14 wherein the at least one parameter which the parameter computation system is configured to compute includes a parameter indicative of the percentage of time the eye remains substantially closed.
16. The fatigue monitoring system of claim 14 wherein the at least one parameter which the parameter computation system is configured to compute includes a parameter indicative of the frequency of eye blinks.
17. The fatigue monitoring system of claim 14 wherein the at least one parameter which the parameter computation system is configured to compute includes a parameter indicative of the amplitude of eye blinks.
18. The fatigue monitoring system of claim 14 wherein the at least one parameter which the parameter computation system is configured to compute includes a parameter indicative of the amount of time to close the eye.
19. The fatigue monitoring system of claim 14 wherein the eye has a pupil and wherein at least one parameter which the parameter computation system is configured to compute includes a parameter indicative of a change in the size of the pupil.
20. The fatigue monitoring system of claim 19 wherein: the fatigue monitoring system includes an ambient light sensor; and the parameter computation system is configured to substantially factor out changes in the size of the pupil caused by changes in the ambient light as sensed by the ambient light sensor.
21. The fatigue monitoring system of claim 14 wherein at least one parameter which the parameter computation system is configured to compute includes a parameter indicative of the velocity of eye movement.
22. The fatigue monitoring system of claim 14 wherein at least one parameter which the parameter computation system is configured to compute includes a parameter indicative of whether the person is dead or alive.
23. The fatigue monitoring system of claim 1 wherein the transmitter system is configured to wirelessly transmit information indicative of the fatigue of the person only when the information indicates that the person has reached one or more threshold levels of fatigue.
24. The fatigue monitoring system of claim 1 further comprising an illumination system configured to illuminate the eye when images of the eye are being captured by the electronic camera.
25. The fatigue monitoring system of claim 24 wherein the illumination system is configured to emit infrared light.
26. A fatigue monitoring network for monitoring fatigue of each person in a group of persons in real time comprising: for each of the persons: an electronic camera configured to capture substantially clear, sequential images of an area occupied by an eye of the person while in close proximity to the eye; a camera support system attached to the electronic camera and configured to be worn by the person and to position the camera while being worn by the person in close proximity to and substantially in front of the eye at a position with respect to the eye that enables the camera to capture the substantially clear, sequential images of the area occupied by the eye; an image processing system configured to extract information from the images indicative of the fatigue of the person; and a transmitter system configured to wirelessly transmit the information indicative of the fatigue of the person to a location remote from the person; and a receiver system configured to receive and process the information transmitted by each of the transmitter systems.
27. A process for delegating tasks among several persons based on their level of fatigue comprising: distributing a fatigue monitoring system to each of the persons that is configured to transmit information about the fatigue of the person based on images of the eye of the person when the images indicate that the fatigue of the person exceeds one or more threshold levels; directing each of the persons to wear the fatigue monitoring system while performing a task; receiving a transmission from one of the fatigue monitoring systems while it is being worn by one of the persons indicating that the fatigue of the person has exceeded one or more threshold levels; and directing the person whose fatigue has exceeded one or more threshold levels to cease the task the person was directed to perform in response to the received transmission.
28. The process of claim 27 wherein each of the persons is a soldier and wherein the directing and receiving are performed by one or more commanders of the soldiers.