WO2002082999A1 - Image analysis system and method for discriminating movements of an individual - Google Patents


Info

Publication number: WO2002082999A1
Authority: WIPO (PCT)
Prior art keywords: system, images, sequence, movement, captured
Application number: PCT/US2002/010642
Other languages: French (fr)
Inventors: Richard J. Littlefield, Paul D. Whitney, Alan Edward Turner II, James J. Thomas, Kenneth A. Perrine, Harlan P. Foote
Original assignee: Battelle Memorial Institute
Priority: US 60/282,793; US 09/944,780
Application filed by Battelle Memorial Institute
Publication of WO2002082999A1


Classifications

    • A61B 5/4094: Diagnosing or monitoring seizure diseases, e.g. epilepsy
    • A61B 5/1123: Discriminating type of movement, e.g. walking or running
    • A61B 5/1128: Measuring movement of the entire body or parts thereof using image analysis
    • G06K 9/00335: Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; lip-reading
    • G06K 9/00362: Recognising human body or animal bodies, e.g. vehicle occupant, pedestrian; recognising body parts, e.g. hand

Abstract

A method and system for automatically detecting and interpreting the motion of a subject from sequences of images captured from continuous long-term physiological recordings for improving the discrimination of epileptic seizures and non-epileptic events of the recorded subject. A discrimination function detects and characterizes the periodic movement of the subject, in which rapid and jerky periodic movement is typical of the clonic phase of many seizures, and smoother and smaller periodic movements are typical of innocent behaviors, such as scratching, rubbing, hair-brushing, and the like. Accordingly, the discrimination function improves the identification of epileptic seizures and non-epileptic events, thereby reducing the number of EEG signal artifacts that produce spurious alarms in EEG-based alarm systems.

Description

IMAGE ANALYSIS SYSTEM AND METHOD FOR DISCRIMINATING MOVEMENTS OF AN INDIVIDUAL

The present invention generally relates to image processing, and more particularly to a method and system for automatically detecting and improving the discrimination of seizures and seizure-like events in long-term physiological recordings of an individual.

Classification plays an essential role in the diagnosis and management of epileptic disorders. Diagnosis of a specific syndrome may have important therapeutic and prognostic implications. Because the seizure type determines the choice of an antiepileptic drug, the differential diagnosis is essential for determining appropriate drug therapy. Accordingly, before deeming seizures intractable, it is necessary to be certain that a patient used the correct drugs in the correct amounts.

To help classify a specific epileptic syndrome, long-term video-electroencephalogram (EEG) monitoring is clinically useful for those patients with seizures for whom a diagnosis could not be made on the basis of a neurological examination, routine EEG studies, and ambulatory cassette EEG monitoring. In long-term video-EEG monitoring, a patient's EEG is continuously recorded along with pictures and audio taken by closed-circuit television camera(s) in the patient's room for an extended period of time. The purpose of these extended recordings is to provide simultaneous documentation of the clinical manifestations correlated to EEG changes during the patient's ictal events or seizures. Because these seizures occur irregularly, such recording periods can last on average three to four days in order to capture a determinative amount of physiological readings.

With video-EEG monitoring sessions lasting several days, large volumes of data are produced. Much of this data, although monitored by experienced nurses and computers, still requires manual analysis by epileptologists in order to identify behaviors and EEG patterns that indicate the patient's ictal events. Accordingly, much effort has been made in the field of physiological monitoring to computerize the identification and classification of ictal events in the EEG readings. Such a method is discussed by U.S. Pat. Nos. 5,047,930 and 5,299,118, both to Martens et al., which are herein incorporated by reference.

The references to Martens et al. disclose an analysis system that accepts physiological sensor signals, including electroencephalogram (EEG) and other signals commonly sensed in sleep analysis, and stores the raw data for later retrieval. The analysis system then extracts features from the digitized signal data which are of a type apt to be significant in classifying and detecting physiological functions (such as the state of sleep) and matches the extracted features to patterns which indicate the type of feature that has been extracted. The matched features are then utilized to classify, for each epoch or limited period of time, the state of the physiological function, such as the stage of sleep, which is then displayed to a user of the system.

However, such conventional polygraphic physiological data analysis systems have only limited value in discriminating the difference between seizure and non-seizure activity, such as scratching, rubbing, hair-brushing, and the like. With these systems, such non-seizure activities will generate misleading EEG signals called artifacts, which often produce spurious alarms in EEG-based alarm systems. Failing to identify and eliminate such non-seizure activities in such voluminous data makes the process of manually determining actual seizure events an extremely arduous and challenging task.

Additionally, because such conventional screening techniques require both a high degree of sophistication in equipment and extensive use of manpower, few medical centers offer this type of intensive monitoring.

Accordingly, there is a continuing need in the medical profession for a method and system that would automate the designation of events that might represent epileptic and non-epileptic seizures in order to augment conventional long-term physiological data analysis methods and systems. Additionally, there is a need to reduce the number of false negative responses in conventional epilepsy diagnostic methods.

Furthermore, there is a need to increase the detection of abnormal motion or seizure like behavior.

Moreover, there is a need to improve the diagnostic sensitivity of patient monitoring in order to provide a better diagnosis.

SUMMARY OF THE INVENTION

The above-mentioned needs are met by the present invention wherein provided is an image processing system and method for detecting and improving the discrimination of seizure and seizure-like events associated with the epilepsy monitoring of an individual. The present invention focuses on the detection and characterization of periodic movement of a patient in which rapid and jerky periodic movement is typical of the clonic phase of many seizures, and smoother and smaller periodic movements are typical of innocent behaviors. The purpose of the discrimination function is to identify the difference between seizure and non-seizure activity, such as scratching, rubbing, hair-brushing, and the like, that generate misleading EEG signals, called artifacts, that can produce spurious alarms in EEG-based alarm systems.

Successive images of a region of interest from a monitored individual are captured by an optical sensor over time. The optical sensor outputs a signal representing the captured image, which is then processed to determine the monitored individual's motion and posture in the region of interest as a function of time. In particular, the region of interest is partitioned into a multitude of spatial regions, such as blocks of a pixel dimension (e.g., 32x32 pixels within a 640-by-480 or 720-by-480 image, overlapping on a center of 16x16 pixels). For each region, a temporal signal is generated that reflects movement or the degree of change in the pixel patterns as a function of time. Any known method for detecting change/movement may be used, such as a conventional statistical correlation function applied between the gray-scale pixel values of a region in one frame i and the same region in a subsequent frame i+2.

Next, each temporal signal is analyzed to identify and characterize occurrences of periodic changes in the image content of specified neighborhoods within the region of interest. A sliding-correlation strategy is used in which a multitude of relatively short signal templates are correlated with the temporal signal being analyzed to discriminate seizure and seizure-like events from innocent behavior. As an alternative to the sliding-correlation strategy, any known method could be used to identify and characterize the occurrences of periodic changes, such as, for example, wavelet decomposition followed by analysis of the wavelet coefficients. The results of the frame-by-frame analysis may be displayed in various colors (red, blue, green, black) with varying intensity/brightness, and/or an alarm may be provided should a threshold be reached.

A secondary item on or near the individual may be used in the region of interest in order to isolate and augment the motion signals generated from the monitored individual.

In accordance with one aspect of the invention, there is provided a system for automatically detecting and interpreting the motion of a subject from sequences of images captured from continuous long-term physiological recordings for improving the discrimination of seizures and seizure-like events of the recorded subject.

In accordance with another aspect of the invention, there is provided a method of automatically detecting and interpreting the motion of a subject, comprising capturing a sequence of images from a continuous long-term physiological recording, and discriminating from the sequence of images seizures and seizure-like events of the recorded subject.

The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

In the drawings, reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating principles of the invention in a clear manner.

FIG. 1 is a block diagram of an image analysis system and method for discriminating movements of an individual according to the present invention.

FIG. 2 is a block diagram of a variation of the present invention, in which a sequencer/splitter device is provided to enable the display of multiple physiological inputs on a monitor.

FIG. 3 illustrates a placement of a monitoring device of an image analysis system according to the present invention, which covers an area that includes a portion of a support structure, such as, for example, a bed.

FIG. 4 is a block diagram illustration of a preferred method according to the present invention by which an individual's relative motion and posture in a predetermined region of interest is monitored in order to discriminate between a seizure and non-seizure event.

FIG. 5 illustrates a display output of the present invention detecting an artifact according to the present invention.

FIG. 6 illustrates a display output of the present invention detecting a non- epileptic seizure event according to the present invention.

FIG. 7 illustrates a display output of the present invention detecting an epileptic seizure event according to the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

In the following discussion, the present invention is described as being used to monitor an individual lying in a bed. However, any device that may comfortably support an individual for an extended period of time, such as, but not limited to, a chair, a wheelchair, or a traction recovery device, may be used with the invention. Additionally, it is understood that the present invention may be used in a variety of environments, such as, but not limited to, a medical center, a hospital, an assisted living environment, or the home of the individual. Furthermore, while the present discussion will center on monitoring an individual with epilepsy, it is understood that the present invention, in the diagnostic monitoring environment, would be useful in detecting and discriminating sleep apnea, periodic limb movement, and parasomnia events.

FIG. 1 is a block diagram illustration of the system 10 of the present invention, which comprises an optical system 12, an image collector 14, and a control unit 16. In a preferred embodiment, the optical system 12 and the image collector 14 together form a monitoring head 18 that is lightweight and easily positionable. Alternatively, the monitoring head 18 may be any conventional CCD camera, CCTV camera, or camcorder. The control unit 16 includes control and image processing circuitry which sends operating signals, such as stabilization, tracking, and focusing (pan, tilt, and zoom) signals, to the monitoring head 18, and receives signals from the image collector 14 which are processed (digitized) for data storage and display.

The system 10 further includes a digitized image database 20, an image processor 22, a monitor 24, and an alarm 26. Preferably, the control unit 16, digitized image database 20, and image processor 22 are provided as a programmable computer 23. In such an embodiment, the control unit 16 is a digitizer coupled to the monitoring head 18, and the image processor 22 is the conventional microprocessor provided in the programmable computer 23. Accordingly, the control unit 16, the monitoring head 18, the database 20, the image processor 22, the monitor 24, and the alarm 26 are all electrically interfaced. The electrical interfaces between the elements are conventional and may comprise, for example, fiber optic cables, video cables, twisted pair wires, ribbon cables, a data bus, wireless transmission links, or a combination thereof.

In another embodiment, the system 10 may be coupled to a conventional video overlay device 28 that enables the display of multiple physiological inputs 30, such as the monitored patient's EEG, blood pressure, pulse, and the like, on the monitor 24, as illustrated by FIG. 2. It is to be appreciated that the overlay is graphical data, possibly alphanumeric or other symbology, that appears on top of the captured video data on the monitor 24. As such, the overlay device 28 is configured to accept the output from a plurality of physiological monitoring devices associated with the long-term monitoring of a patient, in order to enable the plurality of monitored physiological factors to be watched on the single monitor 24. Preferably, for this embodiment, the inputs are synchronized to and superimposed on the corresponding detected images, as illustrated in FIGS. 5-7. If desired, other information, such as, for example, the monitored individual's name, date, time, and the like, may also be superimposed with the associated image on the monitor. Overlaying physiological information of the monitored individual is extremely useful, as it provides a permanent record of the monitored individual's condition at any instant of time.

For the embodiments of FIGS. 1 and 2, the optical system 12 may be any conventional lens system. Preferably, the optical system 12 is a zoom lens system that sufficiently corrects chromatic aberrations over the entire zoom range, minimizes aliasing effects, and suppresses stray light. One such lens system is disclosed by U.S. Pat. No. 6,078,433 to Hasenauer et al., which is herein incorporated by reference. The image collector 14 is preferably a solid-state image sensor, such as, for example, a charge-coupled device (CCD) image detector. However, other types of analog or digital sensors, such as, but not limited to, linear scanning and/or multi-dimensional line sensors that cover the infrared (IR) and/or visible light spectrum, and/or other predetermined wavelength ranges of the light spectrum, a wide spectral image charge-injection device (CID), or a complementary metal oxide semiconductor (CMOS) device, may be employed without diverting from the spirit and/or scope of the invention. In use of the monitoring head 18, the optical system 12 captures a visible light image, corresponding to a detection zone having a monitored individual, and focuses the captured image onto a focal plane of the image collector 14. The image collector 14 scans the captured image and provides an image signal to the control unit 16. Successive image signals from the image collector 14 constitute a series of detection zone depictions. These depictions are digitized and formatted into a digital recording by the control unit 16. The format of the digital recordings is preferably the AVI format; however, a variety of digital video compression formats, such as, but not limited to, DV-CAM, Cinepak, and MPEG, may be used.

Accordingly, it should be apparent to those persons skilled in the art that any conventional method for capturing a time sequence of digital images to create digital recordings may be used without departing from the spirit of the invention. What is important is that the quantization of the analog-to-digital conversion of the captured images has sufficient granularity such that the image processor 22 will be able to determine the posture and motion of the monitored individual over time, as will be explained later. The dynamics of the individual's position and motion in the detection zone improves the discrimination of seizures and seizure-like events from sequences of images captured from continuous long-term physiological recordings of the monitored individual.

As shown by FIG. 3, illustrating a preferred use of the system 10 of the present invention, the monitoring head 18 is positioned in a room 32 so that a fan-shaped detection zone 34 of the optical system 12 covers at least an upper half of a patient 36 when situated in a bed 38. The remaining elements of the system 10 are remotely located; however, it is understood that all the elements comprising the present invention, or any combination thereof, may also be located within the room 32.

Typically, the monitoring head 18 will be mounted to a wall 40 of the room 32, but if desired, the device may be mounted to the ceiling or attached to a freestanding tripod. The specific manner in which the monitoring head 18 is mounted is not critical. What is important is that the optical system 12 of the device is positioned so that the fan-shaped detection zone 34 preferably monitors the upper torso of the individual 36 when situated upon the bed 38, and that the incoming image sequences are stable, so that all changes in pixel values of the image sequences are largely attributable to motion or illumination changes in the scene being observed. The precision with which the optical system 12 is aimed at the bed 38 need only be sufficient to ensure that the system 10 obtains information with enough resolution to derive features from an area of interest of the monitored patient 36 by subsequent image processing.

The sensitivity of the employed monitoring head 18 determines whether additional (external) illumination of the monitored area is required. Although the present invention is implemented using only visible light, a combination of infrared (IR) and visible light images may be used to help distinguish between the monitored patient and inanimate objects surrounding the individual. As is known, the detection of IR light corresponds to heat given off from an object. Objects such as, for example, chairs and beds give off very little, if any, heat. On the other hand, a human body produces larger quantities of heat. As such, it is recognized that the use of two light bands (e.g., visible light and infrared light) would improve the ability of the system to identify a person from among a plurality of inanimate objects in a captured image. Alternatively or additionally, the subject 36 may be illuminated with IR or other invisible wavelengths to permit monitoring during periods that the monitored individual 36 perceives as darkness.

The refresh rate of the image collector 14 is preferably thirty or more frames per second, such that sufficient images of the patient 36 are recorded by the control unit 16. The digital recording output of the control unit 16 is a series of digitized image frames, which are input to both the digitized image database 20 for raw data storage and to the image processor 22 for real-time processing. Should real-time processing not be desired, the raw data stored in the database 20 may be input to the image processor 22 at a later time.

FIG. 4 is a block diagram illustration of a preferred method according to the present invention by which the image processor 22 of the system 10 is programmed to identify, from the digital frames of the monitored individual 36 (FIG. 3), the individual's relative motion and posture in a predetermined region of interest in order to discriminate between a seizure and non-seizure event. However, it is understood that the present invention is not limited to the use of any specific image processing technique to generate a temporal signal reflecting change/movement. Current processing techniques, along with image processing techniques yet to be developed, may be employed without diverting from the spirit and/or scope of the present invention. For example, algorithms (processes) for feature tracking, optical flow, pattern recognition, edge detection, neural networks, or fuzzy logic techniques may be employed. Furthermore, combinations of these processes can be utilized to provide an outline of the individual's body (or part thereof) as well as its position changes and/or acceleration as a function of time. Such algorithms and processes are well known to those persons skilled in the art of pattern recognition, and thus are not discussed in detail herein.

Further, the sophistication of the image processing and algorithm(s) may vary, based upon the hardware employed. Accordingly, as new and/or improved sensors are developed, modifications to the image processing may be required. As such, the present invention is not limited to the use of a specific sensor and/or image process, and thus, a specific processing algorithm, but to the concept of capturing image recordings of a monitored individual in order to discriminate between a seizure and non-seizure event.

In the following discussion of the method of the present invention, reference is also made to FIGS. 1-3. In step 100, a region of interest in the detection zone 34 is selected. Preferably, the region of interest is the face of the monitored individual and the immediate surroundings. However, the region of interest could optionally be the entire image area of the detection zone 34, or a part thereof. In addition, active or passive markings might be placed on the individual's body, clothing, or in the immediate surroundings, to simplify and improve the system's ability to select and identify clinically important areas such as the subject's head, hands, and the like. Furthermore, it should be apparent to those persons skilled in the art that numerous enhancements can be introduced to avoid unnecessary processing. Such enhancements may include, for example and not by limitation, not performing signal analysis in any region where no significant change has been observed for longer than the duration of a preset time or template.

In step 110, the predetermined region of interest in each digital frame is partitioned by the image processor 22 into a multitude of spatial regions. Within a standard 640- or 720-by-480 image, preferably, the spatial regions are blocks of 32x32 pixels, which overlap on a center of 16x16 pixels. However, spatial regions of different pixel dimensions and/or shapes, such as a grid, may also be used without departing from the scope and spirit of the invention.
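As an illustrative sketch of this partitioning (the function and variable names are ours, not the patent's), the overlapping blocks of step 110 can be enumerated as follows:

```python
def block_grid(height, width, block=32, stride=16):
    """Enumerate top-left corners of overlapping spatial regions.

    Defaults follow the preferred embodiment: 32x32-pixel blocks
    overlapping on 16-pixel centers. Names are illustrative.
    """
    ys = range(0, height - block + 1, stride)
    xs = range(0, width - block + 1, stride)
    return [(y, x) for y in ys for x in xs]

# A standard 640-by-480 frame yields 29 x 39 = 1131 overlapping regions.
corners = block_grid(480, 640)
```

Each corner then anchors one spatial region whose temporal signal is computed in step 120.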

In step 120, a temporal signal for each spatial region is generated by the image processor 22 that reflects the degree of change in pixel patterns of each frame of the digital recording as a function of time. In this embodiment, the temporal signals are a statistical correlation or a measurement of the "degree of sameness" of gray-scale pixel values in each spatial region between frames that are separated from each other by some constant K, such as, for example, comparing frame i with frame i+K, frame i+1 with frame i+K+1, and so on. In other embodiments, other pixel values and comparison measures may be used. In particular, in the illustrated embodiment, the statistical correlation between the gray-scale pixel values in a small rectangular region α in one frame and the same region β in a subsequent frame is applied according to equation 1:

χ(α,β) = mean((α - mean(α)) · (β - mean(β))) / (stdev(α) · stdev(β))   (1)

As such, the temporal signal is defined as TemporalSignal[x,y,i] = correlation of image frame [x-dx:x+dx-1, y-dy:y+dy-1, i] against image frame [x-dx:x+dx-1, y-dy:y+dy-1, i+K], that is, correlating the pixel values between frames i and i+K in a rectangle 2*dx pixels wide and 2*dy pixels high, centered at pixel (x,y). It is to be appreciated that the exact separation between frames is not important because very similar temporal signals are generated for a wide range of separations, such as, for example, from 1 to 6 frames. Accordingly, as defined, the temporal signal will peak twice per physical cycle of a reciprocating motion, once for each end of the range of motion, where movement momentarily stops.
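Under the definitions above, the TemporalSignal construction might be sketched as follows with NumPy, treating gray-scale frames as 2-D arrays; the half-sizes dx, dy and separation k follow the notation in the text, while the function names and the synthetic test frames are illustrative:

```python
import numpy as np

def frame_correlation(a, b):
    """Statistical correlation (equation 1) between the gray-scale
    pixel values of the same rectangular region in two frames."""
    a = a.astype(float).ravel()
    b = b.astype(float).ravel()
    return np.mean((a - a.mean()) * (b - b.mean())) / (a.std() * b.std())

def temporal_signal(frames, y, x, dy=16, dx=16, k=2):
    """TemporalSignal[x, y, i]: correlate the rectangle centered at
    (x, y) in frame i against the same rectangle in frame i+k."""
    sig = []
    for i in range(len(frames) - k):
        a = frames[i][y - dy:y + dy, x - dx:x + dx]
        b = frames[i + k][y - dy:y + dy, x - dx:x + dx]
        sig.append(frame_correlation(a, b))
    return np.array(sig)

# Illustrative use on synthetic frames (real input would be video).
rng = np.random.default_rng(0)
frames = [rng.random((64, 64)) for _ in range(10)]
sig = temporal_signal(frames, 32, 32)
```

A region with no change yields values near 1; vigorous motion drives the correlation down, so the signal oscillates with reciprocating movement.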

In step 130, each temporal signal is then analyzed by the processor 22 to identify and characterize the occurrences of periodic change. The purpose of this step is to identify relatively brief periods where the temporal signals have a significant periodic component. For example, scratching, rubbing, hair-brushing, and the like may contain only a few cycles of hand movement, resulting in rapid but largely smooth periodic signals. Movements such as turning over, crossing arms, yawning, and the like produce longer but largely smooth aperiodic signals. Accordingly, these types of signals can be discriminated from the rapid and jerky periodic signals typically associated with a seizure in an individual.

To accomplish this discrimination in step 130, a correlation is performed by the image processor 22 in which a multitude of signal templates, which represent temporal patterns of various types of innocent and/or non-innocent behavior, are compared against the temporal signals being analyzed. In particular, a sliding correlation between each TemporalSignal, as defined above, and a set of short templates is performed, wherein each template preferably represents about four cycles of a cosine wave with periods ranging from about two to about thirty frames in the temporal signal (assuming 30 frames per second). It is to be appreciated that this period range reflects physical movements of about 0.5 to about 6 cycles per second for objects typically moving in a reciprocating manner. The principle behind a sliding correlation is to step each of the short templates along the incoming temporal signal. At each step, a correlation is computed to determine how closely the current short template matches the incoming temporal signal, and the correlation is repeated for all possible alignments of each of the short templates with the temporal signal. This correlation produces a set of new KernelStrength functions as defined below.

To avoid spurious high correlations caused by noisy signals, the correlation function is modified in step 140 to filter out or reduce potentially corrupted correlations for small-amplitude signals. In particular, the noise-adjusted correlation function is defined according to equation (2):

χ(α,β,noise) = mean((α - mean(α)) · (β - mean(β))) / max(stdev(α) · stdev(β), noise)   (2)

where noise is a pre-defined constant reflecting a threshold of signal reliability.
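A direct reading of equation (2) as code (a sketch; in practice the noise constant would be tuned to the camera and digitizer):

```python
import numpy as np

def noise_adjusted_correlation(a, b, noise):
    """Equation 2: correlation with the denominator floored at a
    pre-defined noise constant, so small-amplitude (unreliable)
    signals cannot produce spuriously high correlations."""
    a = np.asarray(a, float).ravel()
    b = np.asarray(b, float).ravel()
    num = np.mean((a - a.mean()) * (b - b.mean()))
    return num / max(a.std() * b.std(), noise)

# A strong signal correlates normally; the same signal scaled far
# below the noise floor is damped toward zero.
strong = np.tile([0.0, 1.0], 8)
full = noise_adjusted_correlation(strong, strong, 0.01)
tiny = noise_adjusted_correlation(strong * 1e-3, strong * 1e-3, 0.01)
```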

More particularly, the KernelStrength functions noted above are defined as KernelStrength[x,y,t,φ] and preferably are equal to the noise-adjusted sliding correlation of TemporalSignal[x,y,t-kφ:t+kφ] against a set of template kernels TemplateKernel[φ], where TemplateKernel[φ] is a vector of 2*kφ+1 elements that samples four cycles of a cosine function of frequency φ centered at the middle of the kernel, that is, TemplateKernel[φ][σ] = cos(2π*ξφ*(σ-kφ)), wherein: ξφ = cycles per frame at the indicated frequency φ; kφ = the smallest number of frames encompassing two complete cycles; and σ is a frame index that ranges over 0:2*kφ. Taking a temporal view, the KernelStrength function represents the result of sliding each temporal kernel along the temporal signal corresponding to each region of the image plane, and correlating the kernel with that signal at each time.
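Putting these definitions together, a sketch of TemplateKernel and the KernelStrength sliding correlation follows; the noise floor of 0.05 is an illustrative choice, not a value from the description:

```python
import numpy as np

def template_kernel(period_frames):
    """TemplateKernel[phi]: about four cycles of a cosine, one sample
    per frame. k is the smallest number of frames encompassing two
    complete cycles, giving a vector of 2*k + 1 samples."""
    xi = 1.0 / period_frames              # cycles per frame
    k = int(np.ceil(2 * period_frames))   # two cycles each side of center
    sigma = np.arange(2 * k + 1)
    return np.cos(2 * np.pi * xi * (sigma - k))

def kernel_strength(signal, kernel, noise=0.05):
    """Noise-adjusted sliding correlation of one temporal signal
    against one kernel, for every valid alignment."""
    m = len(kernel)
    kc = kernel - kernel.mean()
    out = []
    for t in range(len(signal) - m + 1):
        w = signal[t:t + m]
        wc = w - w.mean()
        out.append(np.mean(wc * kc) / max(w.std() * kernel.std(), noise))
    return np.array(out)

# A pure 10-frame-period oscillation matches its own kernel strongly.
sig = np.cos(2 * np.pi * np.arange(120) / 10)
kern = template_kernel(10)
ks = kernel_strength(sig, kern)
```

In the full system, this would be repeated per spatial region and per kernel frequency, yielding KernelStrength[x,y,t,φ].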

In step 150, the areas in which the periodic change is taking place in the detection zone 34 are located from the spatial images to help discriminate innocent motion from non-innocent motion. This step is carried out by deriving the envelope of KernelStrength in each region of the detection zone 34. The resulting function, called KernelEnvelope[x,y,t,φ], is a positive-valued function that measures how strongly TemplateKernel[φ] is represented in the temporal signal (produced by pixel correlation) at the image location [x,y] in the detection zone 34, for approximately one cycle of frequency φ around time t.
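The description does not spell out how the envelope of KernelStrength is derived; one plausible realization (purely our assumption) is a sliding maximum of the absolute strength over about one cycle of the kernel frequency:

```python
import numpy as np

def kernel_envelope(strength, period_frames):
    """Positive-valued envelope of a KernelStrength trace, sketched
    here as a sliding maximum of |strength| over roughly one cycle.
    The patent does not specify the construction; this is one
    plausible choice (a Hilbert-transform magnitude is another)."""
    w = max(1, int(period_frames))
    pad = np.pad(np.abs(strength), (w // 2, w - w // 2 - 1), mode="edge")
    return np.array([pad[i:i + w].max() for i in range(len(strength))])

strength = np.array([0.1, -0.9, 0.2, 0.3])
env = kernel_envelope(strength, 2)
```

Whatever the construction, the envelope must be non-negative and bound the instantaneous strength, which is what the display step consumes.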

In step 160, the derived values of KernelEnvelope[x,y,t,φ] are displayed as a matrix of semi-transparent color blocks showing the type and intensity (degree of matching) of periodic movement overlaid on a gray-scale rendition of the original color image recording. The color of each block is determined by mapping the three types of kernels provided for in the template kernel to a specific color. For example, short kernels, which indicate motions of predetermined high frequencies, may be represented by the color red. Medium kernels, which indicate motions of predetermined medium frequencies, may be represented by the color green, and long kernels, which indicate motions of predetermined low frequencies, may be represented by the color blue.

The intensity of the displayed color represents the degree to which the KernelEnvelope in image location [x,y] matches each type of template kernel at time t, wherein a perfect match displays the corresponding color at its brightest level. The mapping of intensity for these representative colors may be either linear or nonlinear. Typically, the mapping is piecewise linear, with KernelStrength values below 0.35 producing black (no color contribution), KernelStrength values above 0.8 producing maximum color brightness, and intermediate values interpolated linearly. In this manner, the visualization of the derived values for KernelEnvelope helps to discriminate the spatial and temporal patterns of movements appearing in the source image sequences.
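The typical piecewise-linear intensity mapping described above can be expressed directly:

```python
def brightness(kernel_strength, low=0.35, high=0.8):
    """Piecewise-linear intensity map: strengths below `low` render
    black (0.0), strengths above `high` render at maximum brightness
    (1.0), and intermediate values are interpolated linearly."""
    if kernel_strength <= low:
        return 0.0
    if kernel_strength >= high:
        return 1.0
    return (kernel_strength - low) / (high - low)
```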

FIGS. 5-7, depicting various types of classification results, are illustrations of the manner in which the representative color blocks show the results derived by the system of the invention, programmed with the above-mentioned process. FIG. 5 illustrates the typical results produced by the system 10 detecting innocent movement of the patient 36 and how the system 10 displays such results on the monitor 24. As illustrated, for such innocent movement, the system will produce very few color blocks, wherein a majority of the color blocks occur in a very localized area. In addition, the intensity levels of the color blocks, indicating the level of matching or correlation to each of the three predetermined frequencies, show that a majority of the detected motion in the detection zone 34 is of medium to low frequency. Accordingly, it is to be appreciated that for such a detected event, the system 10 is further programmed to discriminate such motion as an EEG artifact for innocent movement, such as the illustrated scratch, if the movement continues for a predetermined short period of time. Additionally, it is to be appreciated that for this detected artifact, the system may store the time index and its classification along with the detected event in the digitized image database 20 for further processing and/or analysis.

FIG. 6 illustrates the typical response of the system 10 in discriminating other types of non-epileptic events, such as snoring. Although a larger area of higher intensity color blocks is displayed, the color blocks are still localized to the region surrounding the head of the patient 36. Accordingly, the system 10 may be programmed to ignore such movement as innocent. On the other hand, FIG. 7 illustrates the typical response of the system 10 to a seizure event, wherein intense color blocks cover a large portion of the monitored patient 36. It is to be appreciated that in addition to the patient 36 showing high frequency movement, structures peripheral to the patient, such as the bed 38 and a table 42 with items located thereon, show signs of movement. Accordingly, in such detected events the system 10 may be programmed to send an alarm signal to the alarm 26, since the correlation indicates the presence of the preselected movement behavior (i.e., an epileptic seizure).
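The spatial-extent discrimination illustrated by FIGS. 5-7 might be approximated by a simple rule of thumb; the thresholds here are illustrative assumptions, not values from the patent:

```python
def classify_motion(envelope_grid, strong=0.8, spread=0.25):
    """Rough discriminator in the spirit of FIGS. 5-7: innocent movements
    produce a few localized strong blocks, while a seizure lights up a
    large fraction of the detection zone."""
    cells = [v for row in envelope_grid for v in row]
    frac_strong = sum(v >= strong for v in cells) / len(cells)
    return "seizure" if frac_strong >= spread else "innocent"

# A single strong block in a 3x3 grid reads as innocent motion ...
localized = [[0.9, 0.1, 0.0], [0.1, 0.0, 0.0], [0.0, 0.0, 0.1]]
# ... while widespread strong blocks read as a seizure event.
widespread = [[0.9, 0.8, 0.9], [0.85, 0.9, 0.2], [0.9, 0.8, 0.95]]
```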

In addition to the discrimination function, it is contemplated that in order to isolate and augment the motion signals generated from the subject, a secondary indicator 37 may be used with and/or in lieu of the monitored patient 36. In this method, secondary motion analysis can be obtained from specific indicators such as, for example, indicator items and patterns. Examples of indicator items include specifically designed "bike flag type" items in which the motion is augmented yet localized for video data capture and processing. Examples of indicator patterns include grids or other repetitive patterns placed on critical indicator surfaces such as beds, bed linens, clothing, or body elements (such as the head, hands, torso, etc.).

The present invention has been described with respect to a single stationary location. However, it is understood that in practice, the present invention can be adapted to have a plurality of remote monitoring stations, which may be stationary and/or mobile. Also, while the present invention has been described with respect to the monitoring of an individual with epilepsy, other potential fields of application include animal seizure diagnosis and/or monitoring, industrial process assessment and improvement (continuous manufacturing involving repetitive motion), sports training and rehabilitation, ergonomic assessment, and security and surveillance. Additionally, it is to be appreciated that the system of the present invention is usable with both fixed and self-tracking video monitoring systems. Furthermore, displays utilizing split-screen techniques may be used to produce a close-up of the face as well as a full view of the body.

Further, it is to be appreciated that the ability to identify periodic motions of the body also gives rise to the ability to recognize and categorize specific periodic motions themselves in order to diagnose specific conditions (as opposed to categorizing them as benign artifacts). It is a straightforward extension of the motion categorization techniques to recognize periodic motion due to breathing, heartbeat (possibly with a flag attached to the patient to amplify the motion), snoring, or leg motion (restless legs). Categorization of these motions can facilitate direct diagnosis of certain ailments. For example, utilizing this system to track breathing may aid diagnosis of sleep apnea, sudden infant death syndrome (SIDS), post-surgical respiratory distress, snoring, sudden cessation of breathing without a corresponding body movement associated with rolling over, etc. Restless legs syndrome is one of a host of sleep-related illnesses that might be identified with this video processing technique.

One of the more powerful non-medical applications of this video processing technique is to identify drowsy drivers. A video signal, generated and captured in a similar fashion as previously described, can be used to identify when the rate of eye blinking changes dramatically. In this application, an inner region of the image corresponds to an eye and an outer region corresponds to the face of the driver. The algorithm used here can detect changes in the frequency of eye blinks, which can be used to characterize and classify the onset of drowsiness or sleep. In such an embodiment, the alarm 26, as shown in FIG. 1, can be sounded when the onset of drowsiness or sleep is detected to alert the driver to this condition. The mathematical function described above to process the video of a monitored epileptic event (or other periodic motion) describes a feature of the video output that relates to an epileptic event. Current technology that uses an algorithm to process EEG waveforms and produce a trigger on a seizure occurrence produces another feature that relates to an epileptic event. Using the techniques described above, these features can be combined in a straightforward manner to produce a classifier that uses both features to characterize and classify the occurrence of an epileptic seizure.
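A hypothetical sketch of the blink-rate check follows; the threshold factor and the blink-interval representation are assumptions for illustration only:

```python
def drowsiness_onset(blink_intervals_s, baseline_s, factor=2.0):
    """Flag a dramatic change in blink rate: the mean interval between
    detected blinks differs from an alert-driver baseline by more than
    `factor` in either direction. All thresholds are illustrative."""
    mean_interval = sum(blink_intervals_s) / len(blink_intervals_s)
    return (mean_interval > factor * baseline_s
            or mean_interval < baseline_s / factor)
```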

This technique of combining the video output feature of motion processing with other physiological outputs has the ability to produce better monitoring systems than were possible before. Another related example is to combine the existing video techniques employed to detect snoring with a microphone that monitors sound levels. In this case, classifiers may be employed that relate sudden drop-offs of the snoring volume with video characteristics that show the patient has ceased the vibratory motions associated with snoring, but has not significantly shifted position (such as rolling over). In this case, an improved monitor for sleep apnea may be constructed.
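The combined snoring/apnea rule described above reduces to a conjunction of the three observations; the signal names here are illustrative, not from the patent:

```python
def apnea_alert(snore_volume_dropped, vibration_ceased, position_shifted):
    """Sketch of the combined sleep-apnea monitor: alert only when the
    microphone hears the snoring volume drop off suddenly, the video
    shows the vibratory motion of snoring has ceased, and the patient
    has NOT significantly shifted position (such as rolling over)."""
    return snore_volume_dropped and vibration_ceased and not position_shifted
```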

Although the present invention has been described in terms of specific exemplary embodiments, it will be appreciated that various modifications and alterations might be made by those persons skilled in the art without departing from the spirit and scope of the invention as set forth in the following claims.

Claims

1. A system for discriminating movement of a monitored subject from sequences of images captured from an imaging device comprising: a processing device adapted to receive the images from the imaging device and programmed to characterize movements of the monitored subject from the sequences of images as interesting versus non-interesting activities; and an alarm actuated by said processing device when said processing device characterizes said movements as interesting.
2. The system of claim 1, wherein said processing device characterizes said movements of the monitored subject by analyzing predetermined signal characteristics in a sequence of captured images with signal patterns for various types of predetermined interesting movement behavior.
3. The system of claim 2, wherein said processing device further characterizes said movement as interesting if said movement is localized to a predetermined area in the sequence of images.
4. The system of claim 2, wherein said movements are periodic movements, said signal characteristics comprise periodic information, and said signal patterns are temporal patterns.
5. The system of claim 4, wherein said alarm is actuated by said processing device when analysis of said periodic information in said sequence of captured images with at least one of said temporal patterns indicates a predetermined interesting movement behavior is likely occurring.
6. The system of claim 3, wherein said movements are periodic movements, said signal characteristics comprise periodic information, and said signal patterns are temporal patterns.
7. The system of claim 6, wherein said alarm is actuated by said processing device when analysis of said periodic information in said sequence of captured images with at least one of said temporal patterns indicates a predetermined interesting movement behavior is likely occurring.
8. The system of claim 2, wherein said predetermined movement behavior is movements characteristic of an epileptic seizure event.
9. The system of claim 8, wherein rapid and jerky periodic movement is represented in said signal patterns as interesting.
10. The system of claim 8, wherein smoother and smaller periodic movements are represented in said signal pattern as non-interesting.
11. The system of claim 9, wherein said processing device further characterizes said movement as interesting if said movement is localized to a predetermined area in the sequence of images.
12. The system of claim 2, wherein said sequence of captured images is taken from a physiological video recording.
13. The system of claim 12, further comprising a database that stores said physiological video recording and said signal patterns.
14. The system of claim 2, wherein said analyzing is a sliding correlation.
15. The system of claim 2, wherein said signal patterns represent about four cycles of a cosine wave with periods ranging from about two to about thirty frames.
16. The system of claim 2, wherein said processor analyzes a predetermined region of interest of said sequence of captured images.
17. The system of claim 16, further comprising a monitor that provides a visual display of said monitored subject, and said sequence of captured images is overlaid with a first visual indication of said predetermined region of interest and provided to said monitor.
18. The system of claim 17, wherein said sequence of captured images is further overlaid with other visual indications of said analysis and provided to said monitor.
19. The system of claim 18, wherein said sequence of captured images is further overlaid with physiological data and provided to said monitor.
20. The system of claim 19, wherein said overlaid sequence of captured images is stored in a database.
21. The system of claim 19, wherein said sequence of captured images is stored in a database when said alarm is actuated.
22. The system of claim 1, further comprising a secondary indicator to improve identification of said monitored movement behavior.
23. The system of claim 6, wherein said periodic information is temporal information for each spatial region in said sequence of captured images generated by said processing device.
24. The system of claim 23, wherein said temporal signal reflects the degree of change in pixel patterns of each frame in said sequence of captured images as a function of time.
25. A system for discriminating movement of a monitored subject, comprising: an image capturing device that captures successive images of the monitored subject; a processing device that processes said captured images to detect and characterize movements of the monitored subject by analyzing periodic information in a sequence of captured images with temporal patterns of various types of predetermined movement behavior; and an alarm that is actuated when said processing device determines a preselected movement behavior is likely occurring from said analysis.
26. The system of claim 25, wherein said image capturing device detects electromagnetic energy of a predetermined wavelength range emitted by the monitored subject.
27. The system of claim 26, wherein said electromagnetic energy is selected from the group consisting of visible light, infrared light, and combinations thereof.
28. The system of claim 25, further comprising a monitor for displaying said captured successive images and a video sequencer that displays physiological data and information associated with the monitored subject on said monitor overlaying said captured successive images.
29. The system of claim 28, wherein said processing device provides indications of said correlation on said monitor overlaying said captured successive images.
30. The system of claim 29, further comprising a database for storing said captured successive images with said overlaid physiological data, monitored subject information, and correlation indications when said alarm is activated.
31. A method for discriminating movement of a monitored subject comprising: capturing successive images of the monitored subject; processing said captured images to detect and characterize movements of the monitored subject from the sequences of images as interesting versus non- interesting activities; and indicating a response when said processing device characterizes said movements as interesting.
32. The method of claim 31, further comprising selecting a region of interest to process in said sequence of said captured successive images.
33. The method of claim 32, further comprising generating spatial regions in said region of interest.
34. The method of claim 33, further comprising deriving temporal signals for each spatial region, wherein said temporal signals reflect the degree of change in pixel patterns of each frame in said sequence of captured successive images as a function of time.
35. The method of claim 34, further comprising filtering said temporal signals for noise.
36. The method of claim 35, further comprising detecting the location and intensity of periodic change in said temporal signals.
37. The method of claim 36, further comprising displaying said location and said intensity of said periodic change in said temporal signals overlaying said successive captured images.
38. The method of claim 31, wherein said predetermined movement behavior is periodic movements that are characteristic of an epileptic seizure event.
39. The method of claim 38, wherein rapid and jerky periodic movement is represented in said signal patterns as interesting.
40. The method of claim 39, wherein smoother and smaller periodic movements are represented in said signal pattern as non-interesting.
41. The method of claim 40, wherein said processing device further characterizes said movement as interesting if said movement is localized to a predetermined area in the sequence of images.
42. The method of claim 31, wherein said response is an alarm condition.
43. The method of claim 31, wherein said analysis is a sliding correlation.
44. The method of claim 37, further comprising overlaying said displayed sequence of captured images with physiological data.
45. The method of claim 37, further comprising storing said overlaid sequence of captured images in a database.
46. The method of claim 42, further comprising storing said analysis and sequence of captured images in a database when said alarm condition is actuated.
47. The method of claim 31, further comprising providing a secondary indicator to improve detection of said movement of said monitored subject.
48. A system for discriminating movement of a monitored subject from sequences of images captured from an imaging device comprising: a processing device adapted to receive the images from the imaging device and programmed to characterize movements of the monitored subject by comparing periodic information in a sequence of captured images to temporal patterns for various types of predetermined movement behavior; and an alarm actuated by said processing device when correlation of said periodic information in said sequence of captured images to at least one of said temporal patterns indicates a preselected movement behavior is likely occurring.
49. A system for discriminating movement of a monitored subject comprising: an image capturing device that captures successive images of the monitored subject; a processing device that processes said captured images to detect and characterize movements of the monitored subject by comparing periodic information in a sequence of captured images to temporal patterns of various types of movement behavior; and an alarm that is actuated when said processing device determines that correlation of said periodic information to at least one of said temporal patterns indicates a preselected movement behavior is likely occurring.
50. A method for discriminating movement of a monitored subject comprising: capturing successive images of the monitored subject; processing the captured images to detect and characterize movements of the monitored subject by comparing periodic information in a sequence of said captured successive images to temporal patterns of various types of movement behavior; and indicating that the monitored individual is likely exhibiting a preselected movement behavior when correlation of said periodic information to at least one of said temporal patterns is above a threshold value.
PCT/US2002/010642 2001-04-10 2002-04-04 Image analysis system and method for discriminating movements of an individual WO2002082999A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US28279301P true 2001-04-10 2001-04-10
US60/282,793 2001-04-10
US94478001A true 2001-08-31 2001-08-31
US09/944,780 2001-08-31

Publications (1)

Publication Number Publication Date
WO2002082999A1 true WO2002082999A1 (en) 2002-10-24

Family

ID=26961672

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/010642 WO2002082999A1 (en) 2001-04-10 2002-04-04 Image analysis system and method for discriminating movements of an individual

Country Status (1)

Country Link
WO (1) WO2002082999A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5047930A (en) 1987-06-26 1991-09-10 Nicolet Instrument Corporation Method and system for analysis of long term physiological polygraphic recordings
US5299118A (en) 1987-06-26 1994-03-29 Nicolet Instrument Corporation Method and system for analysis of long term physiological polygraphic recordings
US5798798A (en) * 1994-04-28 1998-08-25 The Regents Of The University Of California Simultaneously acquiring video images and analog signals
EP0933726A2 (en) * 1998-01-30 1999-08-04 Mitsubishi Denki Kabushiki Kaisha System for having concise models from a signal utilizing a hidden markov model

Cited By (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004064638A1 (en) * 2003-01-24 2004-08-05 Pedro Monagas Asensio Mood analysing device for mammals
EP1651965A2 (en) * 2003-07-22 2006-05-03 Ronjo Company Method of monitoring sleeping infant
EP1651965A4 (en) * 2003-07-22 2009-05-27 Ronjo Co Llc Method of monitoring sleeping infant
WO2007034476A2 (en) 2005-09-19 2007-03-29 Biolert Ltd A device and method for detecting an epileptic event
FR2898482A1 (en) * 2006-03-15 2007-09-21 Hospices Civils De Lyon Etabli Method for monitoring a patient and system for carrying out said method.
WO2007104796A3 (en) * 2006-03-15 2007-11-01 Alexandros Arzimanoglou Method for monitoring a patient and system for implementing said method
EP2123221A2 (en) * 2008-05-19 2009-11-25 Vaidhi Nathan Abnormal motion detector and monitor
EP2123221A3 (en) * 2008-05-19 2013-09-04 Vaidhi Nathan Abnormal motion detector and monitor
EP2399513A1 (en) * 2010-06-23 2011-12-28 Qatar University System for non-invasive automated monitoring, detection, analysis, characterisation, prediction or prevention of seizures and movement disorder symptoms
US8743200B2 (en) 2012-01-16 2014-06-03 Hipass Design Llc Activity monitor
WO2015091582A1 (en) * 2013-12-19 2015-06-25 Koninklijke Philips N.V. A baby monitoring device
US9465981B2 (en) 2014-05-09 2016-10-11 Barron Associates, Inc. System and method for communication
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US9971415B2 (en) 2014-06-03 2018-05-15 Google Llc Radar-based gesture-recognition through a wearable device
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US9933908B2 (en) 2014-08-15 2018-04-03 Google Llc Interactive textiles
US10409385B2 (en) 2014-08-22 2019-09-10 Google Llc Occluded gesture recognition
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US9600080B2 (en) 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
WO2016094749A1 (en) * 2014-12-11 2016-06-16 Rdi, Llc Method of analyzing, displaying, organizing and responding to vital signals
US10064582B2 (en) 2015-01-19 2018-09-04 Google Llc Noninvasive determination of cardiac health and other functional states and trends for human physiological systems
WO2016142734A1 (en) * 2015-03-12 2016-09-15 Mis*Tic Telemedicine system using a multi sensor acquisition device
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
US9848780B1 (en) 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
US10139916B2 (en) 2015-04-30 2018-11-27 Google Llc Wide-field radar-based gesture recognition
US10310620B2 (en) 2015-04-30 2019-06-04 Google Llc Type-agnostic RF signal representations
US10241581B2 (en) 2015-04-30 2019-03-26 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10080528B2 (en) 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US10155274B2 (en) 2015-05-27 2018-12-18 Google Llc Attaching electronic components to interactive textiles
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
US10203763B1 (en) 2015-05-27 2019-02-12 Google Inc. Gesture detection and interactions
US10376195B1 (en) 2015-06-04 2019-08-13 Google Llc Automated nursing assessment
US10169662B2 (en) 2015-06-15 2019-01-01 Google Llc Remote biometric monitoring system
WO2016205246A1 (en) * 2015-06-15 2016-12-22 Knit Health, Inc. Remote biometric monitoring system
US10459080B1 (en) 2015-10-06 2019-10-29 Google Llc Radar-based object detection for vehicles
US10401490B2 (en) 2015-10-06 2019-09-03 Google Llc Radar-enabled sensor fusion
US10300370B1 (en) 2015-10-06 2019-05-28 Google Llc Advanced gaming and virtual reality control using radar
US10310621B1 (en) 2015-10-06 2019-06-04 Google Llc Radar gesture sensing using existing data protocols
US10379621B2 (en) 2015-10-06 2019-08-13 Google Llc Gesture component with gesture library
US9837760B2 (en) 2015-11-04 2017-12-05 Google Inc. Connectors for connecting electronics embedded in garments to external devices
US10175781B2 (en) 2016-05-16 2019-01-08 Google Llc Interactive object with multiple electronics modules
WO2017198894A1 (en) * 2016-05-20 2017-11-23 Nokia Technologies Oy Method and apparatus for matching vital sign information to a concurrently recorded data set
US10492302B2 (en) 2016-11-15 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
WO2019096783A1 (en) 2017-11-17 2019-05-23 Drägerwerk AG & Co. KGaA Method, computer program and device for classifying activities of a patient
DE102017010649A1 (en) 2017-11-17 2019-05-23 Drägerwerk AG & Co. KGaA Method, computer program and device for classifying activities of a patient

Legal Events

Date Code Title Description
AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG UZ VN YU ZA ZM ZW

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
WWW Wipo information: withdrawn in national office

Country of ref document: JP

NENP Non-entry into the national phase in:

Ref country code: JP