EP2507742A2 - Cost effective and robust system and method for eye tracking and driver drowsiness identification - Google Patents

Cost effective and robust system and method for eye tracking and driver drowsiness identification

Info

Publication number
EP2507742A2
EP2507742A2 (application EP10822865.1A)
Authority
EP
European Patent Office
Prior art keywords
eye
eyes
histogram
image
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10822865.1A
Other languages
German (de)
English (en)
Inventor
K. S. Chidanand
Brojeshwar Bhowmick
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tata Consultancy Services Ltd
Original Assignee
Tata Consultancy Services Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tata Consultancy Services Ltd filed Critical Tata Consultancy Services Ltd
Publication of EP2507742A2
Status: Withdrawn

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/163 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V 20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/193 Preprocessing; Feature extraction
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02 Alarms for ensuring the safety of persons
    • G08B 21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms

Definitions

  • This invention relates to a system and method for eye tracking and driver drowsiness identification. More particularly, the invention relates to a cost effective and robust system and method for localizing and tracking the drowsiness state of the eyes of a driver, in order to avoid accidents, using images captured by a near infrared (IR) camera disposed on the vehicle.
  • US6283954 to Kingman Yee teaches improved devices, systems, and methods for sensing and tracking the position of an eye that make use of the contrast between the sclera and iris to derive eye position.
  • This invention is particularly useful for tracking the position of the eye during laser eye surgery, such as photorefractive keratectomy (PRK), phototherapeutic keratectomy (PTK), laser in situ keratomileusis (LASIK), or the like.
  • US5345281 to Taboada et al. discloses devices for tracking the gaze of the human eye, and more particularly an optical device for tracking eye movement by analyzing the reflection of an infrared (IR) beam off the eye.
  • US6927694 to Smith et al. discloses tracking a person's head and facial features with a single on-board camera in a fully automatic system that can initialize automatically, reinitialize when needed, and provide outputs in real time.
  • The system proposed in '694 uses RGB array indexing on the R, G, B components of pixels marked as important, and it uses different algorithms for daytime and nighttime conditions.
  • US5689241 to Clarke Sr. et al. discloses a device that monitors, via an infrared camera, the thermal image changes in pixel color of the driver's open versus closed eyes, using the temperature-sensitive infrared portion of the digitized photographic image passed through a video charge-coupled device.
  • The combination of non-movement and a decrease in breath temperature, a physiological response to hypoventilation that initiates drowsiness, triggers the infrared camera to zoom onto the eye region of the driver.
  • US20080252745 to Tomokazu Nakamura teaches a state-of-eye (including blinking) distinguishing means that calculates a feature value representing the state of an eye for an eye area, based on pixel data of the pixels that constitute the eye area.
  • A threshold value setting means calculates a first threshold value representing the feature value at a first transition point from the open state to the closed state, and a second threshold value representing the feature value at a second transition point from the closed state to the open state, based on the feature value calculated for a targeted eye when the targeted eye is open.
  • US7130446 to Rui et al teaches that automatic detection and tracking of multiple individuals includes receiving a frame of video and/or audio content and identifying a candidate area for a new face region in the frame.
  • One or more hierarchical verification levels are used to verify whether a human face is in the candidate area, and an indication is made that the candidate area includes a face if the one or more hierarchical verification levels verify that a human face is present.
  • A plurality of audio and/or video cues is used to track each verified face in the video content from frame to frame.
  • Yet another objective of the invention is to provide a system and method which detect the state of the eyes using histogram equalization, morphological operations and texture-based parameters derived from the histogram and grey level co-occurrence matrices.
  • The invention provides a cost effective and robust system and method for localizing and tracking the drowsiness state of the eyes of a driver for avoiding accidents, using images captured by a near infrared (IR) camera disposed on the vehicle.
  • The present invention embodies a cost effective and robust method for localizing and tracking the drowsiness state of the eyes of a driver for avoiding accidents, using images captured by a near infrared (IR) camera disposed on the vehicle, the method comprising the processor-implemented steps of: tracking the face in real time and localizing the eye bounding box within the face bounding box in the captured image by comparing grey values with a threshold obtained from the segmentation process; tracking the eyes by computing the centroid of the eye and computing the target model histogram and target candidate model histogram from one location to another, comparing them to obtain a distance and subsequently calculating the displacement of the target centre by the weighted mean, wherein the target model histogram and target candidate model histogram are computed over a feature space that includes the histogram equalized image range and the morphology transformed image; and detecting the drowsiness state of the eyes using histogram equalization, morphological operations and texture-based parameters derived from the histogram and grey level co-occurrence matrices.
  • An alert means warns the driver, using the detected drowsiness state of the eyes, for avoiding collision, wherein the alert means can be an audio or audio-visual device including, but not limited to, an alarm, a voice-based caution, or an indicator with a display.
  • The near IR camera is disposed inside the vehicle, facing towards the driver.
  • Figure 1 is a flowchart which illustrates a method for localizing and tracking the drowsiness state of the eyes of a driver for avoiding accidents according to various embodiments of the invention.
  • Figure 2A illustrates the detected eye region for the closed eye in one exemplary head position of the driver in accordance with the invention.
  • Figure 2B illustrates the histogram equalization of the detected eye region for the closed eye in one exemplary head position of the driver in accordance with the invention.
  • Figure 2C illustrates morphological results of the histogram equalization of the detected eye region for the closed eye in one exemplary head position of the driver in accordance with the invention.
  • Figure 2D illustrates the detected eye region for open eye in one exemplary head position of the driver in accordance with the invention.
  • Figure 2E illustrates the histogram equalization of the detected eye region for open eye in one exemplary head position of the driver in accordance with the invention.
  • Figure 2F illustrates morphological results of the histogram equalization of the detected eye region for open eye in one exemplary head position of the driver in accordance with the invention.
  • Figure 3A illustrates the detected eye region for the closed eye in another exemplary head position of the driver in accordance with the invention.
  • Figure 3B illustrates the histogram equalization of the detected eye region for the closed eye in another exemplary head position of the driver in accordance with the invention.
  • Figure 3C illustrates morphological results of the histogram equalization of the detected eye region for the closed eye in another exemplary head position of the driver in accordance with the invention.
  • Figure 3D illustrates the detected eye region for open eye in another exemplary head position of the driver in accordance with the invention.
  • Figure 3E illustrates the histogram equalization of the detected eye region for open eye in another exemplary head position of the driver in accordance with the invention.
  • Figure 3F illustrates morphological results of the histogram equalization of the detected eye region for open eye in another exemplary head position of the driver in accordance with the invention.
  • Figure 4A & 6B shows the graphs which illustrates driver drowsiness identification statues according to the exemplary embodiments of the invention.
  • A cost effective and robust system comprises a near IR camera disposed on the vehicle facing towards the driver for capturing an image, and a processor housed therein for analyzing the captured image in real time, localizing and tracking the drowsiness state of the eyes of the driver for avoiding accidents.
  • Figure 1 is a flowchart which illustrates a method 100 for localizing and tracking the drowsiness state of the eyes of a driver for preventing accidents according to various embodiments of the invention.
  • The near IR camera can be disposed either outside or inside the vehicle, facing towards the driver.
  • The near IR camera is disposed inside the vehicle, facing towards the driver.
  • The resolution of the near IR camera is 352 × 288 pixels.
  • The IR range of the near IR camera can be selected from a range of 0.7–1 to 5 microns for detecting and tracking.
  • the temperature range of the near IR camera can be selected from 740 to 3,000-5,200 Kelvin for detecting and tracking the pedestrians.
  • The processor can be disposed either in the body of the near IR camera, outside or inside the vehicle, or on top of or on the dashboard of the vehicle. In one exemplary embodiment of the invention, the processor is disposed in the body of the near IR camera. In accordance with another aspect of the invention, the processor can be selected from the group of the Davinci DM6446 processor, the ADSP-BF533, and the 750 MHz Blackfin processor.
  • The above said cost effective method comprises various processor-implemented steps. Tracking small objects such as eyes in an entire image is difficult; hence, to localize the search for the eyes, in the first step of the proposed method the face is tracked first and then the eyes are tracked within the face bounding box.
  • The processor executes code that identifies (120) the shape of the eyes and facial landmarks such as the eyebrows, nose tip and vertical face centre using the segmentation process.
  • Face_height = 2 * (nose_tip_position - eye_brow_position) - eye_brow_position
  • The processor executes code that determines the bounding box of the eyes using the segmentation process.
  • The processor executes code that collects features, which are simply the grey values greater than the threshold obtained from the segmentation process.
  • The processor executes code that collects features consisting of the histogram equalization of the face bounding box and the morphology transformation of the face bounding box.
  • The main obstacle in tracking the eyes is getting rid of the bright-pupil effect.
  • First, morphological erosion is carried out within the face bounding box by the processor. From this eroded image, the histogram equalization and morphology transformation are computed.
  • Histogram equalization produces an output image with a uniform histogram by spreading the levels of the input image over a wide range of the intensity scale.
  • The processor executes code that applies histogram equalization to this image, so that dark regions become much darker and bright regions much brighter.
  • The processor executes code that extracts the dark objects (i.e. the eyes) from the bright background (i.e. the face bounding box) using the morphology transformation, as sketched below.
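  • The following is a minimal sketch, not the patent's actual code, of this feature extraction step. It assumes OpenCV/NumPy and an 8-bit grey-scale face crop `face_roi`; the black-hat transform stands in for the unspecified morphology transformation, and the kernel sizes are illustrative.

```python
import cv2
import numpy as np


def eye_feature_images(face_roi: np.ndarray):
    """Return (histogram-equalized, morphology-transformed) images for a grey face crop."""
    # Erode first to suppress small bright specks such as the bright-pupil reflection.
    eroded = cv2.erode(face_roi, np.ones((3, 3), np.uint8))

    # Histogram equalization spreads grey levels over the full intensity range,
    # so dark regions become darker and bright regions brighter.
    hist_eq = cv2.equalizeHist(eroded)

    # A black-hat transform (closing minus original) highlights dark objects,
    # such as the eyes, lying on a brighter background (the face).
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    morph = cv2.morphologyEx(hist_eq, cv2.MORPH_BLACKHAT, kernel)
    return hist_eq, morph
```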
  • The processor tracks the eyes by computing the centroid of the eye and computing the target model histogram and target candidate model histogram from one location to another, then comparing them to identify the distance and subsequently calculating the displacement of the target centre by the weighted mean, wherein the target model histogram and target candidate model histogram are computed over the feature space, which includes the histogram equalized image range and the morphology transformed image.
  • To do this, the processor executes the following steps. In the first step, the processor executes code that takes the centroid of the eye blob as the centre m0 and calculates the target model histogram over the feature space: if (hist_eq(i, j) < max_range && hist_eq(i, j) > min_range)
  • The centre of the target is then initialized at its previous location (y0), and the processor executes code that calculates the target candidate histogram over the same feature space: if (hist_eq(i, j) < max_range && hist_eq(i, j) > min_range)
  • Update the 32-bin histogram p on the morphology transformed image.
  • The processor executes code that calculates the distance between the target model and target candidate histograms, which is derived from the Bhattacharyya coefficient between p and q.
  • The processor executes code that calculates the displacement of the target centre by the weighted mean; a simplified sketch follows below.
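  • The sketch below is a simplified, grid-search approximation of this kernel-tracking step, not the patent's implementation: 32-bin histograms and the Bhattacharyya coefficient follow the description above, while the window size, the candidate grid and the assumption of a single grey-level feature image are illustrative (the patent computes the histograms on the histogram-equalized and morphology-transformed images).

```python
import numpy as np


def bin_histogram(patch: np.ndarray, bins: int = 32) -> np.ndarray:
    """Normalized 32-bin grey-level histogram of an image patch."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)


def bhattacharyya(p: np.ndarray, q: np.ndarray) -> float:
    """Bhattacharyya coefficient between two normalized histograms."""
    return float(np.sum(np.sqrt(p * q)))


def track_step(feature_img: np.ndarray, centre, half_size: int = 10, bins: int = 32):
    """One tracking iteration: build the target model at the previous centre,
    score candidate windows around it with the Bhattacharyya coefficient,
    and move the centre to the weighted mean of the candidate positions."""
    r, c = centre
    model = bin_histogram(feature_img[r - half_size:r + half_size,
                                      c - half_size:c + half_size], bins)
    weights, positions = [], []
    for dr in range(-4, 5, 2):
        for dc in range(-4, 5, 2):
            patch = feature_img[r + dr - half_size:r + dr + half_size,
                                c + dc - half_size:c + dc + half_size]
            if patch.shape != (2 * half_size, 2 * half_size):
                continue  # candidate window falls outside the image
            weights.append(bhattacharyya(model, bin_histogram(patch, bins)))
            positions.append((r + dr, c + dc))
    w = np.asarray(weights)
    if w.size == 0 or w.sum() == 0:
        return centre
    new_centre = (w[:, None] * np.asarray(positions, float)).sum(axis=0) / w.sum()
    return int(round(new_centre[0])), int(round(new_centre[1]))
```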
  • The processor executes code that maps the eye to the eyebrow in the morphology transformed image to identify the eye pixels, thereby avoiding the nose pixels, which appear brighter in such an image.
  • the processor executes the following steps: a) Eye to eyebrow mapping
  • The centroid of the eye position is stored in a FIFO (first-in first-out) queue.
  • The processor executes code that updates the centroids according to the equations mentioned below.
  • cr = previous.cr + (fifo.cr[3] - fifo.cr[0]) / 4;
  • cc = previous.cc + (fifo.cc[3] - fifo.cc[0]) / 4; If there is a sudden jerk, which occurs naturally in the car environment, the head position and eye position change drastically. In order to keep tracking the eyes, the processor executes code that detects the change in head position in the current frame with respect to the previous frame; the condition is
  • curr_diff = |track[0].cr - fifo.cr[2]| + |track[0].cc - fifo.cc[2]|
  • prev_diff = |fifo.cr[1] - fifo.cr[2]| + |fifo.cc[1] - fifo.cc[2]|, with the ratio of curr_diff to prev_diff used as a velocity measure
  • When such a jerk occurs, the kernel tracking algorithm fails to track the eyes, but it still tracks the face, since the face is much bigger than the eyes.
  • The bounding box of the eyes is then located again within the tracked face.
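  • Below is a rough sketch of the centroid smoothing and sudden-jerk test that the (partially garbled) update equations above appear to describe. The four-slot FIFO of recent eye centroids, the helper names and the jerk threshold `factor` are assumptions made for illustration, not values from the patent.

```python
def update_centroid(prev_cr: float, prev_cc: float, fifo_cr, fifo_cc):
    """Smooth the eye centroid using the oldest and newest entries of a 4-slot FIFO."""
    cr = prev_cr + (fifo_cr[3] - fifo_cr[0]) / 4.0
    cc = prev_cc + (fifo_cc[3] - fifo_cc[0]) / 4.0
    return cr, cc


def sudden_jerk(track_cr, track_cc, fifo_cr, fifo_cc, factor: float = 3.0) -> bool:
    """Flag a drastic head/eye displacement between the current frame and the FIFO history.

    When this returns True, kernel tracking of the small eye region is unreliable,
    so the eyes are re-localized inside the (still tracked) face bounding box.
    """
    curr_diff = abs(track_cr - fifo_cr[2]) + abs(track_cc - fifo_cc[2])
    prev_diff = abs(fifo_cr[1] - fifo_cr[2]) + abs(fifo_cc[1] - fifo_cc[2])
    return curr_diff > factor * max(prev_diff, 1e-6)
```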
  • The processor detects the drowsiness state of the eyes 150 using histogram equalization, morphological operations and texture-based parameters derived from the histogram and grey level co-occurrence matrices.
  • The processor executes code that extends the bounding box of the eyes upwards, up to the centroid of the eyebrows. In this region, the processor executes code that applies histogram equalization to the bounding box of the eyes, where histogram equalization is an image-processing method for contrast adjustment using the image's histogram.
  • Histogram equalization is a technique by which the dynamic range of the histogram of an image is increased. Histogram equalization assigns the intensity values of pixels in the input image such that the output image contains a uniform distribution of intensities. It improves contrast, and the goal of histogram equalization is to obtain a uniform histogram. The technique can be applied to a whole image or just to part of an image.
  • Histogram equalization redistributes intensity distributions. If the histogram of an image has many peaks and valleys, it will still have peaks and valleys after equalization, but they will be shifted. Because of this, "spreading" is a better term than "flattening" to describe histogram equalization. In histogram equalization, each pixel is assigned a new intensity value based on its previous intensity level.
  • The processor eliminates the brighter pupil effect in the histogram equalized image by computing a line erosion of that image with a structuring element whose width is equal to one third of the eyebrow width and whose height is equal to one, as sketched below.
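  • A minimal sketch of that line erosion, assuming OpenCV and a hypothetical `eyebrow_width` measured from the detected eyebrow box:

```python
import cv2
import numpy as np


def remove_bright_pupil(hist_eq_eye: np.ndarray, eyebrow_width: int) -> np.ndarray:
    """Grey-scale erosion with a line structuring element (width = eyebrow_width / 3, height = 1)."""
    width = max(eyebrow_width // 3, 1)
    line_se = cv2.getStructuringElement(cv2.MORPH_RECT, (width, 1))  # ksize is (cols, rows)
    return cv2.erode(hist_eq_eye, line_se)
```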
  • The processor executes the following steps:
  • L being the total number of grey levels in the image
  • n being the total number of pixels in the image
  • p_i being the image's histogram, normalized to [0, 1].
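  • The equalization transform itself is not reproduced in this extract; a standard mapping consistent with the definitions above would be:

```latex
p_i = \frac{n_i}{n}, \qquad
s_k = (L - 1)\sum_{i=0}^{k} p_i, \qquad k = 0, 1, \ldots, L - 1
```

  where n_i is the number of pixels with grey level i and s_k is the new intensity assigned to grey level k.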
  • Where A is the grey level image and B is the structuring element, for sets A and B in Z², the erosion of A by B is defined as shown below.
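  • The patent's own equation does not survive in this extract; the standard binary and grey-scale erosion definitions usually meant by this notation are:

```latex
A \ominus B = \{\, z \in \mathbb{Z}^2 \mid B_z \subseteq A \,\},
\qquad
(f \ominus B)(x, y) = \min_{(s,\, t) \in B} f(x + s,\, y + t)
```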
  • grey scale opening will be done with a structuring element height and structuring element width equal to 3.
  • opening is the dilation of the erosion of a set A by a structuring element B:
  • A is the grey level image and B is the structuring element, for sets A and B in Z 2 , the morphological opening of A by B,
  • A ∘ B = (A ⊖ B) ⊕ B
  • The processor uses a histogram-based approach to texture analysis, which is based on the intensity value concentrations in all or part of an image represented as a histogram, to identify the state of the eye; the value of the uniformity or angular second moment (ASM) texture parameter is high for a closed eye and low for an open eye.
  • the processor executes the following steps:
  • The processor executes code that implements the texture-based approach.
  • Texture is a property that represents the surface and structure of an image. Texture can be defined as a regular repetition of an element or pattern on a surface. Image textures are complex visual patterns composed of entities or regions with sub-patterns having characteristics such as brightness, color, shape and size. An image region has a constant texture if a set of its characteristics is constant, slowly changing or approximately periodic. Texture analysis is a major step in texture classification, image segmentation and image shape identification tasks. Image segmentation and shape identification are usually preprocessing steps for target or object recognition in an image.
  • Statistical approaches compute different properties and are suitable if texture primitive sizes are comparable with the pixel sizes. These include Fourier transforms, convolution filters, cooccurrence matrix, spatial autocorrelation, fractals, etc.
  • Syntactic and hybrid (combinations of statistical and syntactic) methods are suitable for textures where primitives can be described using a larger variety of properties than just tonal properties, for example shape description. Using these properties, the primitives can be identified, defined and assigned a label. For grey-level images, tone can be replaced with brightness.
  • Histogram based approach to texture analysis is based on the intensity value concentrations on all or part of an image represented as a histogram.
  • Common features include moments such as mean, variance, dispersion, mean square value or average energy, entropy, skewness and kurtosis.
  • The nth moment of z about the mean is μ_n(z) = Σ_{i=0}^{L-1} (z_i - m)^n p(z_i), where m is the mean grey level.
  • The value of uniformity lies between 0 and 1. This parameter is high for a closed eye and low for an open eye; a short sketch follows below.
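  • A short sketch of the histogram-based uniformity (angular second moment) feature, assuming NumPy and an 8-bit grey eye patch; the decision threshold mentioned in the comment is an assumption, not a value from the patent.

```python
import numpy as np


def uniformity(eye_patch: np.ndarray, levels: int = 256) -> float:
    """Angular second moment U = sum_i p(z_i)^2 of a grey-level patch, in [0, 1]."""
    hist, _ = np.histogram(eye_patch, bins=levels, range=(0, levels))
    p = hist / max(hist.sum(), 1)      # normalized histogram p(z_i)
    return float(np.sum(p ** 2))

# Illustrative use: U tends to be higher for the relatively uniform closed-eye
# patch than for the more varied open-eye patch.
# state = "closed" if uniformity(eye_patch) > threshold else "open"
```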
  • The processor uses the contrast of the grey level co-occurrence matrix to identify the state of the eyes: if a band of N consecutive frames has the same contrast property, the eye is considered closed.
  • the processor executes the following steps:
  • A co-occurrence matrix, also referred to as a co-occurrence distribution, is defined over an image as the distribution of co-occurring values at a given offset.
  • A co-occurrence matrix C is defined over an n × m image I, parameterized by an offset (Δx, Δy), as shown below.
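  • The equation itself is not reproduced in this extract; the standard co-occurrence definition consistent with this description is:

```latex
C_{\Delta x, \Delta y}(i, j) = \sum_{x=1}^{n}\sum_{y=1}^{m}
\begin{cases}
1, & \text{if } I(x, y) = i \text{ and } I(x + \Delta x,\, y + \Delta y) = j\\
0, & \text{otherwise}
\end{cases}
```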
  • The processor executes code that calculates the contrast of this matrix, as sketched below.
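  • A compact sketch of both computations, assuming NumPy: the offset (dx, dy), the 32-level quantization and the normalization are illustrative choices, and the standard contrast formula, the sum of (i - j)^2 p(i, j), is used.

```python
import numpy as np


def glcm_contrast(patch: np.ndarray, dx: int = 1, dy: int = 0, levels: int = 32) -> float:
    """Contrast of the grey-level co-occurrence matrix of `patch` for offset (dx, dy)."""
    # Quantize the 8-bit patch down to `levels` grey levels to keep the matrix small.
    q = np.clip((patch.astype(np.float64) * levels / 256.0).astype(int), 0, levels - 1)
    h, w = q.shape
    glcm = np.zeros((levels, levels), dtype=np.float64)
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1
    p = glcm / max(glcm.sum(), 1)                  # normalized co-occurrence probabilities
    i, j = np.indices(p.shape)
    return float(np.sum(((i - j) ** 2) * p))       # contrast = sum (i - j)^2 p(i, j)
```

  A run of N frames whose contrast stays essentially constant can then be flagged as a closed-eye (drowsy) episode, as the bullet above describes.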
  • Figure 4A & 4B shows the graphs which illustrates driver drowsiness identification statues according to the exemplary embodiments of the invention.
  • Finally, there is provision for warning the driver, using the detected drowsiness state of the eyes, to avoid collisions by means of an alert, wherein the alert means can be audio or audio-visual devices, including an alarm, a voice-based caution, or an indicator and display.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • General Engineering & Computer Science (AREA)
  • Psychiatry (AREA)
  • Child & Adolescent Psychology (AREA)
  • Social Psychology (AREA)
  • Pathology (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • Psychology (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The invention relates to a cost effective and robust method for localizing and tracking the drowsiness state of a driver using images captured by a near infrared camera disposed in the vehicle, the method comprising the processor-implemented steps of: tracking the face in real time and localizing the eye bounding box within the face bounding box in the captured image by comparing grey values with a threshold using the segmentation process; tracking the eyes by computing the centroid of the eye, a target model histogram and a target candidate model histogram from one location to another and comparing them to identify the distance, and calculating the displacement of the target centre by the weighted mean, the target model histogram and target candidate model histogram being computed on the basis of the feature space; and detecting the drowsiness state using histogram equalization, morphological operations and texture-based parameters based on the histogram and grey level co-occurrence matrices.
EP10822865.1A 2009-12-02 2010-12-02 Cost effective and robust system and method for eye tracking and driver drowsiness identification Withdrawn EP2507742A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN2784MU2009 2009-12-02
PCT/IN2010/000781 WO2011067788A2 (fr) 2009-12-02 2010-12-02 Cost effective and robust system and method for eye tracking and driver drowsiness identification

Publications (1)

Publication Number Publication Date
EP2507742A2 true EP2507742A2 (fr) 2012-10-10

Family

ID=44115374

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10822865.1A Withdrawn EP2507742A2 (fr) 2009-12-02 2010-12-02 Système et procédé économique et fiable de suivi du regard et d'identification de l'état de somnolence d'un conducteur

Country Status (5)

Country Link
US (1) US9483695B2 (fr)
EP (1) EP2507742A2 (fr)
JP (1) JP5680667B2 (fr)
CN (1) CN102696041B (fr)
WO (1) WO2011067788A2 (fr)


Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9460601B2 (en) * 2009-09-20 2016-10-04 Tibet MIMAR Driver distraction and drowsiness warning and sleepiness reduction for accident avoidance
US10922567B2 (en) 2010-06-07 2021-02-16 Affectiva, Inc. Cognitive state based vehicle manipulation using near-infrared image processing
KR101046677B1 (ko) * 2011-03-15 2011-07-06 동국대학교 산학협력단 눈 위치 추적방법 및 이를 이용한 의료용 헤드램프
US9050035B2 (en) 2012-03-22 2015-06-09 The Curators Of The University Of Missouri Device to measure pupillary light reflex in infants and toddlers
US9314157B2 (en) 2012-03-22 2016-04-19 The Curators Of The University Of Missouri Device to measure pupillary light reflex in infants and toddlers
CN103411535B (zh) * 2013-08-07 2015-08-05 北京信息科技大学 一种针对回光反射标志的可变权重像点定位方法
CN103955695B (zh) * 2013-11-27 2017-07-07 苏州清研微视电子科技有限公司 计算机基于灰度共生矩阵能量变化智能识别视频中人眼状态的方法
JP6442942B2 (ja) * 2014-09-11 2018-12-26 株式会社デンソー ドライバ状態判定装置
EP3040726A1 (fr) 2014-12-29 2016-07-06 General Electric Company Procédé et système pour déterminer la vitesse d'un véhicule
US10984237B2 (en) * 2016-11-10 2021-04-20 Neurotrack Technologies, Inc. Method and system for correlating an image capturing device to a human user for analyzing gaze information associated with cognitive performance
EP3562394B1 (fr) * 2016-12-28 2023-07-26 Ficosa Adas, S.L.U. Extraction de signal respiratoire
US10290158B2 (en) * 2017-02-03 2019-05-14 Ford Global Technologies, Llc System and method for assessing the interior of an autonomous vehicle
US10121084B2 (en) * 2017-03-07 2018-11-06 Wipro Limited Method and a system for detecting drowsiness state of a vehicle user
US10509974B2 (en) 2017-04-21 2019-12-17 Ford Global Technologies, Llc Stain and trash detection systems and methods
US10304165B2 (en) 2017-05-12 2019-05-28 Ford Global Technologies, Llc Vehicle stain and trash detection systems and methods
US20190012552A1 (en) * 2017-07-06 2019-01-10 Yves Lambert Hidden driver monitoring
TWI647666B (zh) * 2017-08-28 2019-01-11 緯創資通股份有限公司 瞌睡偵測裝置及其瞌睡偵測方法
US10943092B2 (en) 2018-05-23 2021-03-09 ClairLabs Ltd. Monitoring system
CN109086740B (zh) * 2018-08-25 2019-08-23 上海首安工程技术有限公司 分级式渡口安检机构
CN109584303B (zh) * 2018-12-03 2023-04-14 电子科技大学 一种基于Lp范数和核范数的红外弱小目标检测方法
US11200438B2 (en) 2018-12-07 2021-12-14 Dus Operating Inc. Sequential training method for heterogeneous convolutional neural network
CN109886780B (zh) * 2019-01-31 2022-04-08 苏州经贸职业技术学院 基于眼球跟踪的商品目标检测方法及装置
US11068069B2 (en) * 2019-02-04 2021-07-20 Dus Operating Inc. Vehicle control with facial and gesture recognition using a convolutional neural network
CN110264670A (zh) * 2019-06-24 2019-09-20 广州鹰瞰信息科技有限公司 基于客运车辆司机疲劳驾驶状态分析装置
CN110400274B (zh) * 2019-07-19 2022-02-15 西安科技大学 一种车载红外行人检测用红外图像增强方法
CN114269223A (zh) * 2019-09-27 2022-04-01 爱尔康公司 对眼科诊断装置中的测量的患者引发性触发
CN112764524A (zh) * 2019-11-05 2021-05-07 沈阳智能机器人国家研究院有限公司 一种基于纹理特征的肌电信号手势动作识别方法
CN113807126A (zh) * 2020-06-12 2021-12-17 广州汽车集团股份有限公司 一种疲劳驾驶检测方法及其系统、计算机设备、存储介质
JP7048997B2 (ja) * 2020-06-26 2022-04-06 みこらった株式会社 自動運転車及び自動運転車用プログラム
US11763595B2 (en) * 2020-08-27 2023-09-19 Sensormatic Electronics, LLC Method and system for identifying, tracking, and collecting data on a person of interest
US11861916B2 (en) * 2021-10-05 2024-01-02 Yazaki Corporation Driver alertness monitoring system
CN114758403B (zh) * 2022-06-10 2022-09-13 武汉憬然智能技术有限公司 疲劳驾驶智能分析方法及装置

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5345281A (en) 1992-12-17 1994-09-06 John Taboada Eye tracking system and method
US5481622A (en) * 1994-03-01 1996-01-02 Rensselaer Polytechnic Institute Eye tracking apparatus and method employing grayscale threshold values
US5689241A (en) 1995-04-24 1997-11-18 Clarke, Sr.; James Russell Sleep detection and driver alert apparatus
JPH1086696A (ja) * 1996-09-13 1998-04-07 Toyota Motor Corp 顔画像における特徴部位の検出方法
FR2773521B1 (fr) * 1998-01-15 2000-03-31 Carlus Magnus Limited Procede et dispositif pour surveiller en continu l'etat de vigilance du conducteur d'un vehicule automobile, afin de detecter et prevenir une tendance eventuelle a l'endormissement de celui-ci
US5900819A (en) 1998-04-21 1999-05-04 Meritor Heavy Vehicle Systems, Llc Drowsy driver detection system
US6283954B1 (en) 1998-04-21 2001-09-04 Visx, Incorporated Linear array eye tracker
JP2003006654A (ja) * 2001-06-20 2003-01-10 Nippon Telegr & Teleph Corp <Ntt> 動画像における移動体の特徴量抽出方法と自動追跡方法及びそれらの装置、並びに、それらの方法の実行プログラムとこの実行プログラムを記録した記録媒体
US6927694B1 (en) 2001-08-20 2005-08-09 Research Foundation Of The University Of Central Florida Algorithm for monitoring head/eye motion for driver alertness with one camera
US7130446B2 (en) 2001-12-03 2006-10-31 Microsoft Corporation Automatic detection and tracking of multiple individuals using multiple cues
CN1275185C (zh) * 2002-06-30 2006-09-13 贺贵明 驾驶员面像识别方法
JP2004356683A (ja) * 2003-05-27 2004-12-16 Fuji Photo Film Co Ltd 画像管理システム
US7430315B2 (en) * 2004-02-13 2008-09-30 Honda Motor Co. Face recognition system
US7362885B2 (en) 2004-04-20 2008-04-22 Delphi Technologies, Inc. Object tracking and eye state identification method
JP4501003B2 (ja) * 2005-07-15 2010-07-14 国立大学法人静岡大学 顔姿勢検出システム
US8265392B2 (en) * 2006-02-07 2012-09-11 Qualcomm Incorporated Inter-mode region-of-interest video object segmentation
US7742621B2 (en) * 2006-06-13 2010-06-22 Delphi Technologies, Inc. Dynamic eye tracking system
JP4895874B2 (ja) * 2007-03-15 2012-03-14 アイシン精機株式会社 目状態判別装置、目状態判別方法及び目状態判別プログラム
JP4307496B2 (ja) * 2007-03-19 2009-08-05 株式会社豊田中央研究所 顔部位検出装置及びプログラム
JP4898532B2 (ja) 2007-04-13 2012-03-14 富士フイルム株式会社 画像処理装置および撮影システム並びに瞬き状態検出方法、瞬き状態検出プログラムおよびそのプログラムが記録された記録媒体
CN101375796B (zh) * 2008-09-18 2010-06-02 浙江工业大学 疲劳驾驶实时检测系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2011067788A2 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107977623A (zh) * 2017-11-30 2018-05-01 睿视智觉(深圳)算法技术有限公司 一种鲁棒性人眼状态判断方法

Also Published As

Publication number Publication date
JP5680667B2 (ja) 2015-03-04
WO2011067788A3 (fr) 2011-10-06
US9483695B2 (en) 2016-11-01
WO2011067788A2 (fr) 2011-06-09
JP2013513155A (ja) 2013-04-18
CN102696041B (zh) 2016-03-02
US20130010096A1 (en) 2013-01-10
CN102696041A (zh) 2012-09-26

Similar Documents

Publication Publication Date Title
US9483695B2 (en) Cost effective and robust system and method for eye tracking and driver drowsiness identification
Eriksson et al. Driver fatigue: a vision-based approach to automatic diagnosis
US7940962B2 (en) System and method of awareness detection
Wang et al. Driver fatigue detection: a survey
Alioua et al. Driver’s fatigue detection based on yawning extraction
Junaedi et al. Driver drowsiness detection based on face feature and PERCLOS
EP1589485B1 Object tracking and eye state identification method
EP1732028A1 System and method of detecting an eye
EP2060993B1 System and method of awareness detection
KR20190083155A (ko) 운전자 상태 검출 장치 및 그 방법
Tang et al. Real-time image-based driver fatigue detection and monitoring system for monitoring driver vigilance
Panicker et al. Open-eye detection using iris–sclera pattern analysis for driver drowsiness detection
Khan et al. Efficient Car Alarming System for Fatigue Detectionduring Driving
Veeraraghavan et al. Detecting driver fatigue through the use of advanced face monitoring techniques
Murugan et al. Driver hypovigilance detection for safe driving using infrared camera
Kumar Morphology based facial feature extraction and facial expression recognition for driver vigilance
Jiao et al. Real-time eye detection and tracking under various light conditions
Tarba et al. The driver's attention level
Murawski et al. The contactless active optical sensor for vehicle driver fatigue detection
P Mathai A New Proposal for Smartphone-Based Drowsiness Detection and Warning System for Automotive Drivers
Vinoth et al. A drowsiness detection using smart sensors during driving and smart message alert system to avoid accidents
Sharran et al. Drowsy Driver Detection System
Ejidokun et al. Development Of An Eye-Blink Detection System To Monitor Drowsiness Of Automobile Drivers
Zhang et al. A fast eye state computing algorithm
Fikriyah et al. Eye Fatigue Detection in Vehicle Drivers Based on Facial Landmarks Features

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120531

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20140102

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20140715